CHECK: Is CUDA the right version (10)? weights arg is None Caution! Loading imagenet weights Creating model, this may take a second... Loading weights into model tracking anchors tracking anchors tracking anchors tracking anchors tracking anchors Model: "retinanet" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_1 (InputLayer) (None, None, None, 3 0 __________________________________________________________________________________________________ padding_conv1 (ZeroPadding2D) (None, None, None, 3 0 input_1[0][0] __________________________________________________________________________________________________ conv1 (Conv2D) (None, None, None, 6 9408 padding_conv1[0][0] __________________________________________________________________________________________________ bn_conv1 (BatchNormalization) (None, None, None, 6 256 conv1[0][0] __________________________________________________________________________________________________ conv1_relu (Activation) (None, None, None, 6 0 bn_conv1[0][0] __________________________________________________________________________________________________ pool1 (MaxPooling2D) (None, None, None, 6 0 conv1_relu[0][0] __________________________________________________________________________________________________ res2a_branch2a (Conv2D) (None, None, None, 6 4096 pool1[0][0] __________________________________________________________________________________________________ bn2a_branch2a (BatchNormalizati (None, None, None, 6 256 res2a_branch2a[0][0] __________________________________________________________________________________________________ res2a_branch2a_relu (Activation (None, None, None, 6 0 bn2a_branch2a[0][0] __________________________________________________________________________________________________ padding2a_branch2b 
(ZeroPadding (None, None, None, 6 0 res2a_branch2a_relu[0][0] __________________________________________________________________________________________________ res2a_branch2b (Conv2D) (None, None, None, 6 36864 padding2a_branch2b[0][0] __________________________________________________________________________________________________ bn2a_branch2b (BatchNormalizati (None, None, None, 6 256 res2a_branch2b[0][0] __________________________________________________________________________________________________ res2a_branch2b_relu (Activation (None, None, None, 6 0 bn2a_branch2b[0][0] __________________________________________________________________________________________________ res2a_branch2c (Conv2D) (None, None, None, 2 16384 res2a_branch2b_relu[0][0] __________________________________________________________________________________________________ res2a_branch1 (Conv2D) (None, None, None, 2 16384 pool1[0][0] __________________________________________________________________________________________________ bn2a_branch2c (BatchNormalizati (None, None, None, 2 1024 res2a_branch2c[0][0] __________________________________________________________________________________________________ bn2a_branch1 (BatchNormalizatio (None, None, None, 2 1024 res2a_branch1[0][0] __________________________________________________________________________________________________ res2a (Add) (None, None, None, 2 0 bn2a_branch2c[0][0] bn2a_branch1[0][0] __________________________________________________________________________________________________ res2a_relu (Activation) (None, None, None, 2 0 res2a[0][0] __________________________________________________________________________________________________ res2b_branch2a (Conv2D) (None, None, None, 6 16384 res2a_relu[0][0] __________________________________________________________________________________________________ bn2b_branch2a (BatchNormalizati (None, None, None, 6 256 res2b_branch2a[0][0] 
__________________________________________________________________________________________________ res2b_branch2a_relu (Activation (None, None, None, 6 0 bn2b_branch2a[0][0] __________________________________________________________________________________________________ padding2b_branch2b (ZeroPadding (None, None, None, 6 0 res2b_branch2a_relu[0][0] __________________________________________________________________________________________________ res2b_branch2b (Conv2D) (None, None, None, 6 36864 padding2b_branch2b[0][0] __________________________________________________________________________________________________ bn2b_branch2b (BatchNormalizati (None, None, None, 6 256 res2b_branch2b[0][0] __________________________________________________________________________________________________ res2b_branch2b_relu (Activation (None, None, None, 6 0 bn2b_branch2b[0][0] __________________________________________________________________________________________________ res2b_branch2c (Conv2D) (None, None, None, 2 16384 res2b_branch2b_relu[0][0] __________________________________________________________________________________________________ bn2b_branch2c (BatchNormalizati (None, None, None, 2 1024 res2b_branch2c[0][0] __________________________________________________________________________________________________ res2b (Add) (None, None, None, 2 0 bn2b_branch2c[0][0] res2a_relu[0][0] __________________________________________________________________________________________________ res2b_relu (Activation) (None, None, None, 2 0 res2b[0][0] __________________________________________________________________________________________________ res2c_branch2a (Conv2D) (None, None, None, 6 16384 res2b_relu[0][0] __________________________________________________________________________________________________ bn2c_branch2a (BatchNormalizati (None, None, None, 6 256 res2c_branch2a[0][0] 
__________________________________________________________________________________________________ res2c_branch2a_relu (Activation (None, None, None, 6 0 bn2c_branch2a[0][0] __________________________________________________________________________________________________ padding2c_branch2b (ZeroPadding (None, None, None, 6 0 res2c_branch2a_relu[0][0] __________________________________________________________________________________________________ res2c_branch2b (Conv2D) (None, None, None, 6 36864 padding2c_branch2b[0][0] __________________________________________________________________________________________________ bn2c_branch2b (BatchNormalizati (None, None, None, 6 256 res2c_branch2b[0][0] __________________________________________________________________________________________________ res2c_branch2b_relu (Activation (None, None, None, 6 0 bn2c_branch2b[0][0] __________________________________________________________________________________________________ res2c_branch2c (Conv2D) (None, None, None, 2 16384 res2c_branch2b_relu[0][0] __________________________________________________________________________________________________ bn2c_branch2c (BatchNormalizati (None, None, None, 2 1024 res2c_branch2c[0][0] __________________________________________________________________________________________________ res2c (Add) (None, None, None, 2 0 bn2c_branch2c[0][0] res2b_relu[0][0] __________________________________________________________________________________________________ res2c_relu (Activation) (None, None, None, 2 0 res2c[0][0] __________________________________________________________________________________________________ res3a_branch2a (Conv2D) (None, None, None, 1 32768 res2c_relu[0][0] __________________________________________________________________________________________________ bn3a_branch2a (BatchNormalizati (None, None, None, 1 512 res3a_branch2a[0][0] 
__________________________________________________________________________________________________ res3a_branch2a_relu (Activation (None, None, None, 1 0 bn3a_branch2a[0][0] __________________________________________________________________________________________________ padding3a_branch2b (ZeroPadding (None, None, None, 1 0 res3a_branch2a_relu[0][0] __________________________________________________________________________________________________ res3a_branch2b (Conv2D) (None, None, None, 1 147456 padding3a_branch2b[0][0] __________________________________________________________________________________________________ bn3a_branch2b (BatchNormalizati (None, None, None, 1 512 res3a_branch2b[0][0] __________________________________________________________________________________________________ res3a_branch2b_relu (Activation (None, None, None, 1 0 bn3a_branch2b[0][0] __________________________________________________________________________________________________ res3a_branch2c (Conv2D) (None, None, None, 5 65536 res3a_branch2b_relu[0][0] __________________________________________________________________________________________________ res3a_branch1 (Conv2D) (None, None, None, 5 131072 res2c_relu[0][0] __________________________________________________________________________________________________ bn3a_branch2c (BatchNormalizati (None, None, None, 5 2048 res3a_branch2c[0][0] __________________________________________________________________________________________________ bn3a_branch1 (BatchNormalizatio (None, None, None, 5 2048 res3a_branch1[0][0] __________________________________________________________________________________________________ res3a (Add) (None, None, None, 5 0 bn3a_branch2c[0][0] bn3a_branch1[0][0] __________________________________________________________________________________________________ res3a_relu (Activation) (None, None, None, 5 0 res3a[0][0] 
__________________________________________________________________________________________________ res3b_branch2a (Conv2D) (None, None, None, 1 65536 res3a_relu[0][0] __________________________________________________________________________________________________ bn3b_branch2a (BatchNormalizati (None, None, None, 1 512 res3b_branch2a[0][0] __________________________________________________________________________________________________ res3b_branch2a_relu (Activation (None, None, None, 1 0 bn3b_branch2a[0][0] __________________________________________________________________________________________________ padding3b_branch2b (ZeroPadding (None, None, None, 1 0 res3b_branch2a_relu[0][0] __________________________________________________________________________________________________ res3b_branch2b (Conv2D) (None, None, None, 1 147456 padding3b_branch2b[0][0] __________________________________________________________________________________________________ bn3b_branch2b (BatchNormalizati (None, None, None, 1 512 res3b_branch2b[0][0] __________________________________________________________________________________________________ res3b_branch2b_relu (Activation (None, None, None, 1 0 bn3b_branch2b[0][0] __________________________________________________________________________________________________ res3b_branch2c (Conv2D) (None, None, None, 5 65536 res3b_branch2b_relu[0][0] __________________________________________________________________________________________________ bn3b_branch2c (BatchNormalizati (None, None, None, 5 2048 res3b_branch2c[0][0] __________________________________________________________________________________________________ res3b (Add) (None, None, None, 5 0 bn3b_branch2c[0][0] res3a_relu[0][0] __________________________________________________________________________________________________ res3b_relu (Activation) (None, None, None, 5 0 res3b[0][0] 
__________________________________________________________________________________________________ res3c_branch2a (Conv2D) (None, None, None, 1 65536 res3b_relu[0][0] __________________________________________________________________________________________________ bn3c_branch2a (BatchNormalizati (None, None, None, 1 512 res3c_branch2a[0][0] __________________________________________________________________________________________________ res3c_branch2a_relu (Activation (None, None, None, 1 0 bn3c_branch2a[0][0] __________________________________________________________________________________________________ padding3c_branch2b (ZeroPadding (None, None, None, 1 0 res3c_branch2a_relu[0][0] __________________________________________________________________________________________________ res3c_branch2b (Conv2D) (None, None, None, 1 147456 padding3c_branch2b[0][0] __________________________________________________________________________________________________ bn3c_branch2b (BatchNormalizati (None, None, None, 1 512 res3c_branch2b[0][0] __________________________________________________________________________________________________ res3c_branch2b_relu (Activation (None, None, None, 1 0 bn3c_branch2b[0][0] __________________________________________________________________________________________________ res3c_branch2c (Conv2D) (None, None, None, 5 65536 res3c_branch2b_relu[0][0] __________________________________________________________________________________________________ bn3c_branch2c (BatchNormalizati (None, None, None, 5 2048 res3c_branch2c[0][0] __________________________________________________________________________________________________ res3c (Add) (None, None, None, 5 0 bn3c_branch2c[0][0] res3b_relu[0][0] __________________________________________________________________________________________________ res3c_relu (Activation) (None, None, None, 5 0 res3c[0][0] 
__________________________________________________________________________________________________ res3d_branch2a (Conv2D) (None, None, None, 1 65536 res3c_relu[0][0] __________________________________________________________________________________________________ bn3d_branch2a (BatchNormalizati (None, None, None, 1 512 res3d_branch2a[0][0] __________________________________________________________________________________________________ res3d_branch2a_relu (Activation (None, None, None, 1 0 bn3d_branch2a[0][0] __________________________________________________________________________________________________ padding3d_branch2b (ZeroPadding (None, None, None, 1 0 res3d_branch2a_relu[0][0] __________________________________________________________________________________________________ res3d_branch2b (Conv2D) (None, None, None, 1 147456 padding3d_branch2b[0][0] __________________________________________________________________________________________________ bn3d_branch2b (BatchNormalizati (None, None, None, 1 512 res3d_branch2b[0][0] __________________________________________________________________________________________________ res3d_branch2b_relu (Activation (None, None, None, 1 0 bn3d_branch2b[0][0] __________________________________________________________________________________________________ res3d_branch2c (Conv2D) (None, None, None, 5 65536 res3d_branch2b_relu[0][0] __________________________________________________________________________________________________ bn3d_branch2c (BatchNormalizati (None, None, None, 5 2048 res3d_branch2c[0][0] __________________________________________________________________________________________________ res3d (Add) (None, None, None, 5 0 bn3d_branch2c[0][0] res3c_relu[0][0] __________________________________________________________________________________________________ res3d_relu (Activation) (None, None, None, 5 0 res3d[0][0] 
__________________________________________________________________________________________________ res4a_branch2a (Conv2D) (None, None, None, 2 131072 res3d_relu[0][0] __________________________________________________________________________________________________ bn4a_branch2a (BatchNormalizati (None, None, None, 2 1024 res4a_branch2a[0][0] __________________________________________________________________________________________________ res4a_branch2a_relu (Activation (None, None, None, 2 0 bn4a_branch2a[0][0] __________________________________________________________________________________________________ padding4a_branch2b (ZeroPadding (None, None, None, 2 0 res4a_branch2a_relu[0][0] __________________________________________________________________________________________________ res4a_branch2b (Conv2D) (None, None, None, 2 589824 padding4a_branch2b[0][0] __________________________________________________________________________________________________ bn4a_branch2b (BatchNormalizati (None, None, None, 2 1024 res4a_branch2b[0][0] __________________________________________________________________________________________________ res4a_branch2b_relu (Activation (None, None, None, 2 0 bn4a_branch2b[0][0] __________________________________________________________________________________________________ res4a_branch2c (Conv2D) (None, None, None, 1 262144 res4a_branch2b_relu[0][0] __________________________________________________________________________________________________ res4a_branch1 (Conv2D) (None, None, None, 1 524288 res3d_relu[0][0] __________________________________________________________________________________________________ bn4a_branch2c (BatchNormalizati (None, None, None, 1 4096 res4a_branch2c[0][0] __________________________________________________________________________________________________ bn4a_branch1 (BatchNormalizatio (None, None, None, 1 4096 res4a_branch1[0][0] 
__________________________________________________________________________________________________ res4a (Add) (None, None, None, 1 0 bn4a_branch2c[0][0] bn4a_branch1[0][0] __________________________________________________________________________________________________ res4a_relu (Activation) (None, None, None, 1 0 res4a[0][0] __________________________________________________________________________________________________ res4b_branch2a (Conv2D) (None, None, None, 2 262144 res4a_relu[0][0] __________________________________________________________________________________________________ bn4b_branch2a (BatchNormalizati (None, None, None, 2 1024 res4b_branch2a[0][0] __________________________________________________________________________________________________ res4b_branch2a_relu (Activation (None, None, None, 2 0 bn4b_branch2a[0][0] __________________________________________________________________________________________________ padding4b_branch2b (ZeroPadding (None, None, None, 2 0 res4b_branch2a_relu[0][0] __________________________________________________________________________________________________ res4b_branch2b (Conv2D) (None, None, None, 2 589824 padding4b_branch2b[0][0] __________________________________________________________________________________________________ bn4b_branch2b (BatchNormalizati (None, None, None, 2 1024 res4b_branch2b[0][0] __________________________________________________________________________________________________ res4b_branch2b_relu (Activation (None, None, None, 2 0 bn4b_branch2b[0][0] __________________________________________________________________________________________________ res4b_branch2c (Conv2D) (None, None, None, 1 262144 res4b_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4b_branch2c (BatchNormalizati (None, None, None, 1 4096 res4b_branch2c[0][0] 
__________________________________________________________________________________________________ res4b (Add) (None, None, None, 1 0 bn4b_branch2c[0][0] res4a_relu[0][0] __________________________________________________________________________________________________ res4b_relu (Activation) (None, None, None, 1 0 res4b[0][0] __________________________________________________________________________________________________ res4c_branch2a (Conv2D) (None, None, None, 2 262144 res4b_relu[0][0] __________________________________________________________________________________________________ bn4c_branch2a (BatchNormalizati (None, None, None, 2 1024 res4c_branch2a[0][0] __________________________________________________________________________________________________ res4c_branch2a_relu (Activation (None, None, None, 2 0 bn4c_branch2a[0][0] __________________________________________________________________________________________________ padding4c_branch2b (ZeroPadding (None, None, None, 2 0 res4c_branch2a_relu[0][0] __________________________________________________________________________________________________ res4c_branch2b (Conv2D) (None, None, None, 2 589824 padding4c_branch2b[0][0] __________________________________________________________________________________________________ bn4c_branch2b (BatchNormalizati (None, None, None, 2 1024 res4c_branch2b[0][0] __________________________________________________________________________________________________ res4c_branch2b_relu (Activation (None, None, None, 2 0 bn4c_branch2b[0][0] __________________________________________________________________________________________________ res4c_branch2c (Conv2D) (None, None, None, 1 262144 res4c_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4c_branch2c (BatchNormalizati (None, None, None, 1 4096 res4c_branch2c[0][0] 
__________________________________________________________________________________________________ res4c (Add) (None, None, None, 1 0 bn4c_branch2c[0][0] res4b_relu[0][0] __________________________________________________________________________________________________ res4c_relu (Activation) (None, None, None, 1 0 res4c[0][0] __________________________________________________________________________________________________ res4d_branch2a (Conv2D) (None, None, None, 2 262144 res4c_relu[0][0] __________________________________________________________________________________________________ bn4d_branch2a (BatchNormalizati (None, None, None, 2 1024 res4d_branch2a[0][0] __________________________________________________________________________________________________ res4d_branch2a_relu (Activation (None, None, None, 2 0 bn4d_branch2a[0][0] __________________________________________________________________________________________________ padding4d_branch2b (ZeroPadding (None, None, None, 2 0 res4d_branch2a_relu[0][0] __________________________________________________________________________________________________ res4d_branch2b (Conv2D) (None, None, None, 2 589824 padding4d_branch2b[0][0] __________________________________________________________________________________________________ bn4d_branch2b (BatchNormalizati (None, None, None, 2 1024 res4d_branch2b[0][0] __________________________________________________________________________________________________ res4d_branch2b_relu (Activation (None, None, None, 2 0 bn4d_branch2b[0][0] __________________________________________________________________________________________________ res4d_branch2c (Conv2D) (None, None, None, 1 262144 res4d_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4d_branch2c (BatchNormalizati (None, None, None, 1 4096 res4d_branch2c[0][0] 
__________________________________________________________________________________________________ res4d (Add) (None, None, None, 1 0 bn4d_branch2c[0][0] res4c_relu[0][0] __________________________________________________________________________________________________ res4d_relu (Activation) (None, None, None, 1 0 res4d[0][0] __________________________________________________________________________________________________ res4e_branch2a (Conv2D) (None, None, None, 2 262144 res4d_relu[0][0] __________________________________________________________________________________________________ bn4e_branch2a (BatchNormalizati (None, None, None, 2 1024 res4e_branch2a[0][0] __________________________________________________________________________________________________ res4e_branch2a_relu (Activation (None, None, None, 2 0 bn4e_branch2a[0][0] __________________________________________________________________________________________________ padding4e_branch2b (ZeroPadding (None, None, None, 2 0 res4e_branch2a_relu[0][0] __________________________________________________________________________________________________ res4e_branch2b (Conv2D) (None, None, None, 2 589824 padding4e_branch2b[0][0] __________________________________________________________________________________________________ bn4e_branch2b (BatchNormalizati (None, None, None, 2 1024 res4e_branch2b[0][0] __________________________________________________________________________________________________ res4e_branch2b_relu (Activation (None, None, None, 2 0 bn4e_branch2b[0][0] __________________________________________________________________________________________________ res4e_branch2c (Conv2D) (None, None, None, 1 262144 res4e_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4e_branch2c (BatchNormalizati (None, None, None, 1 4096 res4e_branch2c[0][0] 
__________________________________________________________________________________________________ res4e (Add) (None, None, None, 1 0 bn4e_branch2c[0][0] res4d_relu[0][0] __________________________________________________________________________________________________ res4e_relu (Activation) (None, None, None, 1 0 res4e[0][0] __________________________________________________________________________________________________ res4f_branch2a (Conv2D) (None, None, None, 2 262144 res4e_relu[0][0] __________________________________________________________________________________________________ bn4f_branch2a (BatchNormalizati (None, None, None, 2 1024 res4f_branch2a[0][0] __________________________________________________________________________________________________ res4f_branch2a_relu (Activation (None, None, None, 2 0 bn4f_branch2a[0][0] __________________________________________________________________________________________________ padding4f_branch2b (ZeroPadding (None, None, None, 2 0 res4f_branch2a_relu[0][0] __________________________________________________________________________________________________ res4f_branch2b (Conv2D) (None, None, None, 2 589824 padding4f_branch2b[0][0] __________________________________________________________________________________________________ bn4f_branch2b (BatchNormalizati (None, None, None, 2 1024 res4f_branch2b[0][0] __________________________________________________________________________________________________ res4f_branch2b_relu (Activation (None, None, None, 2 0 bn4f_branch2b[0][0] __________________________________________________________________________________________________ res4f_branch2c (Conv2D) (None, None, None, 1 262144 res4f_branch2b_relu[0][0] __________________________________________________________________________________________________ bn4f_branch2c (BatchNormalizati (None, None, None, 1 4096 res4f_branch2c[0][0] 
__________________________________________________________________________________________________ res4f (Add) (None, None, None, 1 0 bn4f_branch2c[0][0] res4e_relu[0][0] __________________________________________________________________________________________________ res4f_relu (Activation) (None, None, None, 1 0 res4f[0][0] __________________________________________________________________________________________________ res5a_branch2a (Conv2D) (None, None, None, 5 524288 res4f_relu[0][0] __________________________________________________________________________________________________ bn5a_branch2a (BatchNormalizati (None, None, None, 5 2048 res5a_branch2a[0][0] __________________________________________________________________________________________________ res5a_branch2a_relu (Activation (None, None, None, 5 0 bn5a_branch2a[0][0] __________________________________________________________________________________________________ padding5a_branch2b (ZeroPadding (None, None, None, 5 0 res5a_branch2a_relu[0][0] __________________________________________________________________________________________________ res5a_branch2b (Conv2D) (None, None, None, 5 2359296 padding5a_branch2b[0][0] __________________________________________________________________________________________________ bn5a_branch2b (BatchNormalizati (None, None, None, 5 2048 res5a_branch2b[0][0] __________________________________________________________________________________________________ res5a_branch2b_relu (Activation (None, None, None, 5 0 bn5a_branch2b[0][0] __________________________________________________________________________________________________ res5a_branch2c (Conv2D) (None, None, None, 2 1048576 res5a_branch2b_relu[0][0] __________________________________________________________________________________________________ res5a_branch1 (Conv2D) (None, None, None, 2 2097152 res4f_relu[0][0] 
__________________________________________________________________________________________________
bn5a_branch2c (BatchNormalizati (None, None, None, 2 8192        res5a_branch2c[0][0]
__________________________________________________________________________________________________
bn5a_branch1 (BatchNormalizatio (None, None, None, 2 8192        res5a_branch1[0][0]
__________________________________________________________________________________________________
res5a (Add)                     (None, None, None, 2 0           bn5a_branch2c[0][0]
                                                                 bn5a_branch1[0][0]
__________________________________________________________________________________________________
res5a_relu (Activation)         (None, None, None, 2 0           res5a[0][0]
__________________________________________________________________________________________________
res5b_branch2a (Conv2D)         (None, None, None, 5 1048576     res5a_relu[0][0]
__________________________________________________________________________________________________
bn5b_branch2a (BatchNormalizati (None, None, None, 5 2048        res5b_branch2a[0][0]
__________________________________________________________________________________________________
res5b_branch2a_relu (Activation (None, None, None, 5 0           bn5b_branch2a[0][0]
__________________________________________________________________________________________________
padding5b_branch2b (ZeroPadding (None, None, None, 5 0           res5b_branch2a_relu[0][0]
__________________________________________________________________________________________________
res5b_branch2b (Conv2D)         (None, None, None, 5 2359296     padding5b_branch2b[0][0]
__________________________________________________________________________________________________
bn5b_branch2b (BatchNormalizati (None, None, None, 5 2048        res5b_branch2b[0][0]
__________________________________________________________________________________________________
res5b_branch2b_relu (Activation (None, None, None, 5 0           bn5b_branch2b[0][0]
__________________________________________________________________________________________________
res5b_branch2c (Conv2D)         (None, None, None, 2 1048576     res5b_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn5b_branch2c (BatchNormalizati (None, None, None, 2 8192        res5b_branch2c[0][0]
__________________________________________________________________________________________________
res5b (Add)                     (None, None, None, 2 0           bn5b_branch2c[0][0]
                                                                 res5a_relu[0][0]
__________________________________________________________________________________________________
res5b_relu (Activation)         (None, None, None, 2 0           res5b[0][0]
__________________________________________________________________________________________________
res5c_branch2a (Conv2D)         (None, None, None, 5 1048576     res5b_relu[0][0]
__________________________________________________________________________________________________
bn5c_branch2a (BatchNormalizati (None, None, None, 5 2048        res5c_branch2a[0][0]
__________________________________________________________________________________________________
res5c_branch2a_relu (Activation (None, None, None, 5 0           bn5c_branch2a[0][0]
__________________________________________________________________________________________________
padding5c_branch2b (ZeroPadding (None, None, None, 5 0           res5c_branch2a_relu[0][0]
__________________________________________________________________________________________________
res5c_branch2b (Conv2D)         (None, None, None, 5 2359296     padding5c_branch2b[0][0]
__________________________________________________________________________________________________
bn5c_branch2b (BatchNormalizati (None, None, None, 5 2048        res5c_branch2b[0][0]
__________________________________________________________________________________________________
res5c_branch2b_relu (Activation (None, None, None, 5 0           bn5c_branch2b[0][0]
__________________________________________________________________________________________________
res5c_branch2c (Conv2D)         (None, None, None, 2 1048576     res5c_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn5c_branch2c (BatchNormalizati (None, None, None, 2 8192        res5c_branch2c[0][0]
__________________________________________________________________________________________________
res5c (Add)                     (None, None, None, 2 0           bn5c_branch2c[0][0]
                                                                 res5b_relu[0][0]
__________________________________________________________________________________________________
res5c_relu (Activation)         (None, None, None, 2 0           res5c[0][0]
__________________________________________________________________________________________________
C5_reduced (Conv2D)             (None, None, None, 2 524544      res5c_relu[0][0]
__________________________________________________________________________________________________
P5_upsampled (UpsampleLike)     (None, None, None, 2 0           C5_reduced[0][0]
                                                                 res4f_relu[0][0]
__________________________________________________________________________________________________
C4_reduced (Conv2D)             (None, None, None, 2 262400      res4f_relu[0][0]
__________________________________________________________________________________________________
P4_merged (Add)                 (None, None, None, 2 0           P5_upsampled[0][0]
                                                                 C4_reduced[0][0]
__________________________________________________________________________________________________
P4_upsampled (UpsampleLike)     (None, None, None, 2 0           P4_merged[0][0]
                                                                 res3d_relu[0][0]
__________________________________________________________________________________________________
C3_reduced (Conv2D)             (None, None, None, 2 131328      res3d_relu[0][0]
__________________________________________________________________________________________________
P6 (Conv2D)                     (None, None, None, 2 4718848     res5c_relu[0][0]
__________________________________________________________________________________________________
P3_merged (Add)                 (None, None, None, 2 0           P4_upsampled[0][0]
                                                                 C3_reduced[0][0]
__________________________________________________________________________________________________
C6_relu (Activation)            (None, None, None, 2 0           P6[0][0]
__________________________________________________________________________________________________
P3 (Conv2D)                     (None, None, None, 2 590080      P3_merged[0][0]
__________________________________________________________________________________________________
P4 (Conv2D)                     (None, None, None, 2 590080      P4_merged[0][0]
__________________________________________________________________________________________________
P5 (Conv2D)                     (None, None, None, 2 590080      C5_reduced[0][0]
__________________________________________________________________________________________________
P7 (Conv2D)                     (None, None, None, 2 590080      C6_relu[0][0]
__________________________________________________________________________________________________
regression_submodel (Model)     (None, None, 4)      2443300     P3[0][0]
                                                                 P4[0][0]
                                                                 P5[0][0]
                                                                 P6[0][0]
                                                                 P7[0][0]
__________________________________________________________________________________________________
classification_submodel (Model) (None, None, 1)      2381065     P3[0][0]
                                                                 P4[0][0]
                                                                 P5[0][0]
                                                                 P6[0][0]
                                                                 P7[0][0]
__________________________________________________________________________________________________
regression (Concatenate)        (None, None, 4)      0           regression_submodel[1][0]
                                                                 regression_submodel[2][0]
                                                                 regression_submodel[3][0]
                                                                 regression_submodel[4][0]
                                                                 regression_submodel[5][0]
__________________________________________________________________________________________________
classification (Concatenate)    (None, None, 1)      0           classification_submodel[1][0]
                                                                 classification_submodel[2][0]
                                                                 classification_submodel[3][0]
                                                                 classification_submodel[4][0]
                                                                 classification_submodel[5][0]
==================================================================================================
Total params: 36,382,957
Trainable params: 36,276,717
Non-trainable params: 106,240
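As a sanity check on the summary above, the printed parameter counts can be reproduced from the standard Conv2D and BatchNormalization formulas. Note the channel widths used below (512/2048 in the conv5 stage, 256 across the FPN) are assumptions, since the summary's output-shape column is truncated; they are the stock ResNet-50/RetinaNet values.

```python
# Reproduce several "Param #" values from the Keras model summary above.
# Channel widths are assumed (standard ResNet-50 / RetinaNet-FPN values),
# because the summary's shape column is cut off at the column width.

def conv2d_params(k, c_in, c_out, bias=True):
    """Parameters of a k x k Conv2D: weights plus optional per-filter bias."""
    return k * k * c_in * c_out + (c_out if bias else 0)

def batchnorm_params(channels):
    """gamma, beta, moving_mean, moving_variance -> 4 per channel."""
    return 4 * channels

# ResNet conv branches carry no bias (a BatchNorm follows each conv).
assert conv2d_params(3, 512, 512, bias=False) == 2_359_296   # res5c_branch2b
assert conv2d_params(1, 2048, 512, bias=False) == 1_048_576  # res5b_branch2a
assert batchnorm_params(512) == 2_048                        # bn5b_branch2a
assert batchnorm_params(2048) == 8_192                       # bn5a_branch2c

# FPN lateral and output convs keep their bias.
assert conv2d_params(1, 2048, 256) == 524_544                # C5_reduced
assert conv2d_params(1, 1024, 256) == 262_400                # C4_reduced
assert conv2d_params(1, 512, 256) == 131_328                 # C3_reduced
assert conv2d_params(3, 2048, 256) == 4_718_848              # P6
assert conv2d_params(3, 256, 256) == 590_080                 # P3, P4, P5, P7

# Trainable + non-trainable must equal the reported total.
assert 36_276_717 + 106_240 == 36_382_957
print("all parameter counts consistent with the summary")
```

The non-trainable 106,240 is consistent with frozen BatchNormalization statistics, which is the usual keras-retinanet backbone setup.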
__________________________________________________________________________________________________
None
Epoch 1/150
[terminal progress-bar redraws condensed; loss shown at 50-step intervals]
  1/500 [..............................] - ETA: 40:22 - loss: 3.8989 - regression_loss: 2.7644 - classification_loss: 1.1346
 50/500 [==>...........................] - ETA: 2:23 - loss: 3.9982 - regression_loss: 2.8715 - classification_loss: 1.1267
100/500 [=====>........................] - ETA: 1:49 - loss: 3.8414 - regression_loss: 2.7577 - classification_loss: 1.0837
150/500 [========>.....................] - ETA: 1:30 - loss: 3.6371 - regression_loss: 2.6728 - classification_loss: 0.9644
200/500 [===========>..................] - ETA: 1:15 - loss: 3.4967 - regression_loss: 2.6275 - classification_loss: 0.8691
250/500 [==============>...............] - ETA: 1:01 - loss: 3.3863 - regression_loss: 2.5829 - classification_loss: 0.8034
271/500 [===============>..............] - ETA: 56s - loss: 3.3512 - regression_loss: 2.5734 - classification_loss: 0.7778
272/500 [===============>..............]
- ETA: 55s - loss: 3.3490 - regression_loss: 2.5724 - classification_loss: 0.7766 273/500 [===============>..............] - ETA: 55s - loss: 3.3456 - regression_loss: 2.5708 - classification_loss: 0.7748 274/500 [===============>..............] - ETA: 55s - loss: 3.3431 - regression_loss: 2.5701 - classification_loss: 0.7731 275/500 [===============>..............] - ETA: 55s - loss: 3.3416 - regression_loss: 2.5696 - classification_loss: 0.7721 276/500 [===============>..............] - ETA: 54s - loss: 3.3396 - regression_loss: 2.5687 - classification_loss: 0.7709 277/500 [===============>..............] - ETA: 54s - loss: 3.3370 - regression_loss: 2.5677 - classification_loss: 0.7693 278/500 [===============>..............] - ETA: 54s - loss: 3.3357 - regression_loss: 2.5673 - classification_loss: 0.7684 279/500 [===============>..............] - ETA: 54s - loss: 3.3335 - regression_loss: 2.5669 - classification_loss: 0.7666 280/500 [===============>..............] - ETA: 53s - loss: 3.3331 - regression_loss: 2.5670 - classification_loss: 0.7661 281/500 [===============>..............] - ETA: 53s - loss: 3.3303 - regression_loss: 2.5657 - classification_loss: 0.7646 282/500 [===============>..............] - ETA: 53s - loss: 3.3268 - regression_loss: 2.5638 - classification_loss: 0.7630 283/500 [===============>..............] - ETA: 53s - loss: 3.3254 - regression_loss: 2.5634 - classification_loss: 0.7620 284/500 [================>.............] - ETA: 52s - loss: 3.3238 - regression_loss: 2.5629 - classification_loss: 0.7609 285/500 [================>.............] - ETA: 52s - loss: 3.3219 - regression_loss: 2.5623 - classification_loss: 0.7596 286/500 [================>.............] - ETA: 52s - loss: 3.3183 - regression_loss: 2.5606 - classification_loss: 0.7577 287/500 [================>.............] - ETA: 51s - loss: 3.3162 - regression_loss: 2.5596 - classification_loss: 0.7565 288/500 [================>.............] 
- ETA: 51s - loss: 3.3155 - regression_loss: 2.5590 - classification_loss: 0.7565 289/500 [================>.............] - ETA: 51s - loss: 3.3160 - regression_loss: 2.5598 - classification_loss: 0.7561 290/500 [================>.............] - ETA: 51s - loss: 3.3181 - regression_loss: 2.5612 - classification_loss: 0.7569 291/500 [================>.............] - ETA: 50s - loss: 3.3164 - regression_loss: 2.5609 - classification_loss: 0.7555 292/500 [================>.............] - ETA: 50s - loss: 3.3155 - regression_loss: 2.5606 - classification_loss: 0.7549 293/500 [================>.............] - ETA: 50s - loss: 3.3130 - regression_loss: 2.5594 - classification_loss: 0.7536 294/500 [================>.............] - ETA: 50s - loss: 3.3115 - regression_loss: 2.5590 - classification_loss: 0.7525 295/500 [================>.............] - ETA: 49s - loss: 3.3091 - regression_loss: 2.5577 - classification_loss: 0.7514 296/500 [================>.............] - ETA: 49s - loss: 3.3078 - regression_loss: 2.5574 - classification_loss: 0.7504 297/500 [================>.............] - ETA: 49s - loss: 3.3067 - regression_loss: 2.5571 - classification_loss: 0.7496 298/500 [================>.............] - ETA: 49s - loss: 3.3050 - regression_loss: 2.5563 - classification_loss: 0.7487 299/500 [================>.............] - ETA: 48s - loss: 3.3041 - regression_loss: 2.5558 - classification_loss: 0.7483 300/500 [=================>............] - ETA: 48s - loss: 3.3046 - regression_loss: 2.5565 - classification_loss: 0.7481 301/500 [=================>............] - ETA: 48s - loss: 3.3062 - regression_loss: 2.5578 - classification_loss: 0.7483 302/500 [=================>............] - ETA: 48s - loss: 3.3043 - regression_loss: 2.5571 - classification_loss: 0.7472 303/500 [=================>............] - ETA: 47s - loss: 3.3050 - regression_loss: 2.5577 - classification_loss: 0.7473 304/500 [=================>............] 
- ETA: 47s - loss: 3.3037 - regression_loss: 2.5577 - classification_loss: 0.7460 305/500 [=================>............] - ETA: 47s - loss: 3.3017 - regression_loss: 2.5567 - classification_loss: 0.7450 306/500 [=================>............] - ETA: 47s - loss: 3.2997 - regression_loss: 2.5556 - classification_loss: 0.7441 307/500 [=================>............] - ETA: 46s - loss: 3.2983 - regression_loss: 2.5554 - classification_loss: 0.7429 308/500 [=================>............] - ETA: 46s - loss: 3.2965 - regression_loss: 2.5545 - classification_loss: 0.7420 309/500 [=================>............] - ETA: 46s - loss: 3.2948 - regression_loss: 2.5541 - classification_loss: 0.7407 310/500 [=================>............] - ETA: 46s - loss: 3.2932 - regression_loss: 2.5536 - classification_loss: 0.7395 311/500 [=================>............] - ETA: 45s - loss: 3.2922 - regression_loss: 2.5535 - classification_loss: 0.7387 312/500 [=================>............] - ETA: 45s - loss: 3.2910 - regression_loss: 2.5532 - classification_loss: 0.7378 313/500 [=================>............] - ETA: 45s - loss: 3.2917 - regression_loss: 2.5537 - classification_loss: 0.7380 314/500 [=================>............] - ETA: 45s - loss: 3.2905 - regression_loss: 2.5536 - classification_loss: 0.7368 315/500 [=================>............] - ETA: 44s - loss: 3.2893 - regression_loss: 2.5535 - classification_loss: 0.7358 316/500 [=================>............] - ETA: 44s - loss: 3.2868 - regression_loss: 2.5524 - classification_loss: 0.7344 317/500 [==================>...........] - ETA: 44s - loss: 3.2846 - regression_loss: 2.5514 - classification_loss: 0.7332 318/500 [==================>...........] - ETA: 44s - loss: 3.2832 - regression_loss: 2.5509 - classification_loss: 0.7323 319/500 [==================>...........] - ETA: 43s - loss: 3.2819 - regression_loss: 2.5502 - classification_loss: 0.7317 320/500 [==================>...........] 
- ETA: 43s - loss: 3.2796 - regression_loss: 2.5490 - classification_loss: 0.7306 321/500 [==================>...........] - ETA: 43s - loss: 3.2759 - regression_loss: 2.5468 - classification_loss: 0.7291 322/500 [==================>...........] - ETA: 43s - loss: 3.2741 - regression_loss: 2.5462 - classification_loss: 0.7279 323/500 [==================>...........] - ETA: 42s - loss: 3.2716 - regression_loss: 2.5449 - classification_loss: 0.7267 324/500 [==================>...........] - ETA: 42s - loss: 3.2686 - regression_loss: 2.5436 - classification_loss: 0.7251 325/500 [==================>...........] - ETA: 42s - loss: 3.2662 - regression_loss: 2.5424 - classification_loss: 0.7238 326/500 [==================>...........] - ETA: 42s - loss: 3.2660 - regression_loss: 2.5430 - classification_loss: 0.7229 327/500 [==================>...........] - ETA: 41s - loss: 3.2644 - regression_loss: 2.5423 - classification_loss: 0.7220 328/500 [==================>...........] - ETA: 41s - loss: 3.2619 - regression_loss: 2.5409 - classification_loss: 0.7210 329/500 [==================>...........] - ETA: 41s - loss: 3.2629 - regression_loss: 2.5408 - classification_loss: 0.7222 330/500 [==================>...........] - ETA: 41s - loss: 3.2619 - regression_loss: 2.5406 - classification_loss: 0.7213 331/500 [==================>...........] - ETA: 40s - loss: 3.2605 - regression_loss: 2.5400 - classification_loss: 0.7205 332/500 [==================>...........] - ETA: 40s - loss: 3.2586 - regression_loss: 2.5395 - classification_loss: 0.7192 333/500 [==================>...........] - ETA: 40s - loss: 3.2585 - regression_loss: 2.5395 - classification_loss: 0.7190 334/500 [===================>..........] - ETA: 40s - loss: 3.2566 - regression_loss: 2.5387 - classification_loss: 0.7179 335/500 [===================>..........] - ETA: 39s - loss: 3.2553 - regression_loss: 2.5386 - classification_loss: 0.7167 336/500 [===================>..........] 
- ETA: 39s - loss: 3.2524 - regression_loss: 2.5370 - classification_loss: 0.7153 337/500 [===================>..........] - ETA: 39s - loss: 3.2519 - regression_loss: 2.5366 - classification_loss: 0.7153 338/500 [===================>..........] - ETA: 39s - loss: 3.2523 - regression_loss: 2.5368 - classification_loss: 0.7155 339/500 [===================>..........] - ETA: 38s - loss: 3.2518 - regression_loss: 2.5368 - classification_loss: 0.7149 340/500 [===================>..........] - ETA: 38s - loss: 3.2526 - regression_loss: 2.5374 - classification_loss: 0.7152 341/500 [===================>..........] - ETA: 38s - loss: 3.2501 - regression_loss: 2.5360 - classification_loss: 0.7141 342/500 [===================>..........] - ETA: 38s - loss: 3.2496 - regression_loss: 2.5356 - classification_loss: 0.7140 343/500 [===================>..........] - ETA: 37s - loss: 3.2478 - regression_loss: 2.5345 - classification_loss: 0.7132 344/500 [===================>..........] - ETA: 37s - loss: 3.2478 - regression_loss: 2.5353 - classification_loss: 0.7125 345/500 [===================>..........] - ETA: 37s - loss: 3.2453 - regression_loss: 2.5341 - classification_loss: 0.7112 346/500 [===================>..........] - ETA: 37s - loss: 3.2429 - regression_loss: 2.5326 - classification_loss: 0.7103 347/500 [===================>..........] - ETA: 36s - loss: 3.2416 - regression_loss: 2.5320 - classification_loss: 0.7096 348/500 [===================>..........] - ETA: 36s - loss: 3.2395 - regression_loss: 2.5308 - classification_loss: 0.7086 349/500 [===================>..........] - ETA: 36s - loss: 3.2384 - regression_loss: 2.5301 - classification_loss: 0.7083 350/500 [====================>.........] - ETA: 36s - loss: 3.2373 - regression_loss: 2.5297 - classification_loss: 0.7076 351/500 [====================>.........] - ETA: 36s - loss: 3.2360 - regression_loss: 2.5292 - classification_loss: 0.7068 352/500 [====================>.........] 
- ETA: 35s - loss: 3.2382 - regression_loss: 2.5310 - classification_loss: 0.7072 353/500 [====================>.........] - ETA: 35s - loss: 3.2362 - regression_loss: 2.5299 - classification_loss: 0.7062 354/500 [====================>.........] - ETA: 35s - loss: 3.2371 - regression_loss: 2.5305 - classification_loss: 0.7066 355/500 [====================>.........] - ETA: 35s - loss: 3.2367 - regression_loss: 2.5306 - classification_loss: 0.7062 356/500 [====================>.........] - ETA: 34s - loss: 3.2342 - regression_loss: 2.5291 - classification_loss: 0.7051 357/500 [====================>.........] - ETA: 34s - loss: 3.2337 - regression_loss: 2.5291 - classification_loss: 0.7046 358/500 [====================>.........] - ETA: 34s - loss: 3.2316 - regression_loss: 2.5277 - classification_loss: 0.7039 359/500 [====================>.........] - ETA: 34s - loss: 3.2301 - regression_loss: 2.5270 - classification_loss: 0.7031 360/500 [====================>.........] - ETA: 33s - loss: 3.2293 - regression_loss: 2.5271 - classification_loss: 0.7022 361/500 [====================>.........] - ETA: 33s - loss: 3.2293 - regression_loss: 2.5274 - classification_loss: 0.7019 362/500 [====================>.........] - ETA: 33s - loss: 3.2271 - regression_loss: 2.5260 - classification_loss: 0.7010 363/500 [====================>.........] - ETA: 33s - loss: 3.2254 - regression_loss: 2.5253 - classification_loss: 0.7001 364/500 [====================>.........] - ETA: 32s - loss: 3.2266 - regression_loss: 2.5260 - classification_loss: 0.7006 365/500 [====================>.........] - ETA: 32s - loss: 3.2243 - regression_loss: 2.5247 - classification_loss: 0.6996 366/500 [====================>.........] - ETA: 32s - loss: 3.2223 - regression_loss: 2.5233 - classification_loss: 0.6989 367/500 [=====================>........] - ETA: 32s - loss: 3.2203 - regression_loss: 2.5222 - classification_loss: 0.6982 368/500 [=====================>........] 
- ETA: 31s - loss: 3.2184 - regression_loss: 2.5210 - classification_loss: 0.6973 369/500 [=====================>........] - ETA: 31s - loss: 3.2116 - regression_loss: 2.5142 - classification_loss: 0.6974 370/500 [=====================>........] - ETA: 31s - loss: 3.2076 - regression_loss: 2.5116 - classification_loss: 0.6960 371/500 [=====================>........] - ETA: 31s - loss: 3.2067 - regression_loss: 2.5113 - classification_loss: 0.6953 372/500 [=====================>........] - ETA: 30s - loss: 3.2058 - regression_loss: 2.5113 - classification_loss: 0.6945 373/500 [=====================>........] - ETA: 30s - loss: 3.2050 - regression_loss: 2.5106 - classification_loss: 0.6944 374/500 [=====================>........] - ETA: 30s - loss: 3.2042 - regression_loss: 2.5103 - classification_loss: 0.6939 375/500 [=====================>........] - ETA: 30s - loss: 3.2021 - regression_loss: 2.5092 - classification_loss: 0.6929 376/500 [=====================>........] - ETA: 29s - loss: 3.2006 - regression_loss: 2.5085 - classification_loss: 0.6921 377/500 [=====================>........] - ETA: 29s - loss: 3.2003 - regression_loss: 2.5090 - classification_loss: 0.6913 378/500 [=====================>........] - ETA: 29s - loss: 3.1986 - regression_loss: 2.5080 - classification_loss: 0.6906 379/500 [=====================>........] - ETA: 29s - loss: 3.1952 - regression_loss: 2.5058 - classification_loss: 0.6893 380/500 [=====================>........] - ETA: 28s - loss: 3.1936 - regression_loss: 2.5051 - classification_loss: 0.6885 381/500 [=====================>........] - ETA: 28s - loss: 3.1925 - regression_loss: 2.5049 - classification_loss: 0.6876 382/500 [=====================>........] - ETA: 28s - loss: 3.1895 - regression_loss: 2.5030 - classification_loss: 0.6864 383/500 [=====================>........] - ETA: 28s - loss: 3.1882 - regression_loss: 2.5024 - classification_loss: 0.6858 384/500 [======================>.......] 
- ETA: 27s - loss: 3.1867 - regression_loss: 2.5016 - classification_loss: 0.6851 385/500 [======================>.......] - ETA: 27s - loss: 3.1862 - regression_loss: 2.5014 - classification_loss: 0.6849 386/500 [======================>.......] - ETA: 27s - loss: 3.1835 - regression_loss: 2.4998 - classification_loss: 0.6837 387/500 [======================>.......] - ETA: 27s - loss: 3.1809 - regression_loss: 2.4983 - classification_loss: 0.6826 388/500 [======================>.......] - ETA: 26s - loss: 3.1785 - regression_loss: 2.4969 - classification_loss: 0.6816 389/500 [======================>.......] - ETA: 26s - loss: 3.1783 - regression_loss: 2.4975 - classification_loss: 0.6808 390/500 [======================>.......] - ETA: 26s - loss: 3.1775 - regression_loss: 2.4975 - classification_loss: 0.6800 391/500 [======================>.......] - ETA: 26s - loss: 3.1782 - regression_loss: 2.4975 - classification_loss: 0.6807 392/500 [======================>.......] - ETA: 25s - loss: 3.1783 - regression_loss: 2.4971 - classification_loss: 0.6813 393/500 [======================>.......] - ETA: 25s - loss: 3.1771 - regression_loss: 2.4964 - classification_loss: 0.6806 394/500 [======================>.......] - ETA: 25s - loss: 3.1746 - regression_loss: 2.4949 - classification_loss: 0.6797 395/500 [======================>.......] - ETA: 25s - loss: 3.1732 - regression_loss: 2.4941 - classification_loss: 0.6791 396/500 [======================>.......] - ETA: 25s - loss: 3.1722 - regression_loss: 2.4939 - classification_loss: 0.6784 397/500 [======================>.......] - ETA: 24s - loss: 3.1722 - regression_loss: 2.4941 - classification_loss: 0.6781 398/500 [======================>.......] - ETA: 24s - loss: 3.1715 - regression_loss: 2.4940 - classification_loss: 0.6774 399/500 [======================>.......] - ETA: 24s - loss: 3.1699 - regression_loss: 2.4934 - classification_loss: 0.6765 400/500 [=======================>......] 
- ETA: 24s - loss: 3.1680 - regression_loss: 2.4925 - classification_loss: 0.6755 401/500 [=======================>......] - ETA: 23s - loss: 3.1676 - regression_loss: 2.4930 - classification_loss: 0.6746 402/500 [=======================>......] - ETA: 23s - loss: 3.1659 - regression_loss: 2.4921 - classification_loss: 0.6738 403/500 [=======================>......] - ETA: 23s - loss: 3.1654 - regression_loss: 2.4923 - classification_loss: 0.6731 404/500 [=======================>......] - ETA: 23s - loss: 3.1645 - regression_loss: 2.4921 - classification_loss: 0.6724 405/500 [=======================>......] - ETA: 22s - loss: 3.1633 - regression_loss: 2.4914 - classification_loss: 0.6719 406/500 [=======================>......] - ETA: 22s - loss: 3.1639 - regression_loss: 2.4916 - classification_loss: 0.6723 407/500 [=======================>......] - ETA: 22s - loss: 3.1635 - regression_loss: 2.4918 - classification_loss: 0.6717 408/500 [=======================>......] - ETA: 22s - loss: 3.1639 - regression_loss: 2.4920 - classification_loss: 0.6719 409/500 [=======================>......] - ETA: 21s - loss: 3.1627 - regression_loss: 2.4917 - classification_loss: 0.6710 410/500 [=======================>......] - ETA: 21s - loss: 3.1632 - regression_loss: 2.4921 - classification_loss: 0.6711 411/500 [=======================>......] - ETA: 21s - loss: 3.1628 - regression_loss: 2.4920 - classification_loss: 0.6708 412/500 [=======================>......] - ETA: 21s - loss: 3.1619 - regression_loss: 2.4919 - classification_loss: 0.6700 413/500 [=======================>......] - ETA: 20s - loss: 3.1607 - regression_loss: 2.4914 - classification_loss: 0.6693 414/500 [=======================>......] - ETA: 20s - loss: 3.1602 - regression_loss: 2.4910 - classification_loss: 0.6692 415/500 [=======================>......] - ETA: 20s - loss: 3.1589 - regression_loss: 2.4903 - classification_loss: 0.6686 416/500 [=======================>......] 
- ETA: 20s - loss: 3.1568 - regression_loss: 2.4892 - classification_loss: 0.6677 417/500 [========================>.....] - ETA: 19s - loss: 3.1570 - regression_loss: 2.4893 - classification_loss: 0.6677 418/500 [========================>.....] - ETA: 19s - loss: 3.1562 - regression_loss: 2.4891 - classification_loss: 0.6671 419/500 [========================>.....] - ETA: 19s - loss: 3.1551 - regression_loss: 2.4885 - classification_loss: 0.6665 420/500 [========================>.....] - ETA: 19s - loss: 3.1541 - regression_loss: 2.4882 - classification_loss: 0.6659 421/500 [========================>.....] - ETA: 18s - loss: 3.1529 - regression_loss: 2.4877 - classification_loss: 0.6653 422/500 [========================>.....] - ETA: 18s - loss: 3.1519 - regression_loss: 2.4872 - classification_loss: 0.6647 423/500 [========================>.....] - ETA: 18s - loss: 3.1500 - regression_loss: 2.4863 - classification_loss: 0.6637 424/500 [========================>.....] - ETA: 18s - loss: 3.1497 - regression_loss: 2.4866 - classification_loss: 0.6631 425/500 [========================>.....] - ETA: 17s - loss: 3.1492 - regression_loss: 2.4863 - classification_loss: 0.6630 426/500 [========================>.....] - ETA: 17s - loss: 3.1520 - regression_loss: 2.4878 - classification_loss: 0.6642 427/500 [========================>.....] - ETA: 17s - loss: 3.1516 - regression_loss: 2.4877 - classification_loss: 0.6639 428/500 [========================>.....] - ETA: 17s - loss: 3.1510 - regression_loss: 2.4877 - classification_loss: 0.6633 429/500 [========================>.....] - ETA: 17s - loss: 3.1483 - regression_loss: 2.4860 - classification_loss: 0.6623 430/500 [========================>.....] - ETA: 16s - loss: 3.1477 - regression_loss: 2.4857 - classification_loss: 0.6620 431/500 [========================>.....] - ETA: 16s - loss: 3.1464 - regression_loss: 2.4849 - classification_loss: 0.6615 432/500 [========================>.....] 
- ETA: 16s - loss: 3.1455 - regression_loss: 2.4845 - classification_loss: 0.6610 433/500 [========================>.....] - ETA: 16s - loss: 3.1435 - regression_loss: 2.4832 - classification_loss: 0.6603 434/500 [=========================>....] - ETA: 15s - loss: 3.1427 - regression_loss: 2.4830 - classification_loss: 0.6597 435/500 [=========================>....] - ETA: 15s - loss: 3.1429 - regression_loss: 2.4832 - classification_loss: 0.6597 436/500 [=========================>....] - ETA: 15s - loss: 3.1415 - regression_loss: 2.4826 - classification_loss: 0.6590 437/500 [=========================>....] - ETA: 15s - loss: 3.1401 - regression_loss: 2.4819 - classification_loss: 0.6582 438/500 [=========================>....] - ETA: 14s - loss: 3.1402 - regression_loss: 2.4820 - classification_loss: 0.6582 439/500 [=========================>....] - ETA: 14s - loss: 3.1393 - regression_loss: 2.4818 - classification_loss: 0.6575 440/500 [=========================>....] - ETA: 14s - loss: 3.1376 - regression_loss: 2.4809 - classification_loss: 0.6567 441/500 [=========================>....] - ETA: 14s - loss: 3.1363 - regression_loss: 2.4803 - classification_loss: 0.6560 442/500 [=========================>....] - ETA: 13s - loss: 3.1347 - regression_loss: 2.4795 - classification_loss: 0.6552 443/500 [=========================>....] - ETA: 13s - loss: 3.1338 - regression_loss: 2.4789 - classification_loss: 0.6548 444/500 [=========================>....] - ETA: 13s - loss: 3.1333 - regression_loss: 2.4789 - classification_loss: 0.6545 445/500 [=========================>....] - ETA: 13s - loss: 3.1325 - regression_loss: 2.4786 - classification_loss: 0.6538 446/500 [=========================>....] - ETA: 12s - loss: 3.1313 - regression_loss: 2.4780 - classification_loss: 0.6532 447/500 [=========================>....] - ETA: 12s - loss: 3.1295 - regression_loss: 2.4768 - classification_loss: 0.6527 448/500 [=========================>....] 
- ETA: 12s - loss: 3.1283 - regression_loss: 2.4762 - classification_loss: 0.6521 449/500 [=========================>....] - ETA: 12s - loss: 3.1270 - regression_loss: 2.4756 - classification_loss: 0.6514 450/500 [==========================>...] - ETA: 11s - loss: 3.1223 - regression_loss: 2.4701 - classification_loss: 0.6522 451/500 [==========================>...] - ETA: 11s - loss: 3.1196 - regression_loss: 2.4682 - classification_loss: 0.6514 452/500 [==========================>...] - ETA: 11s - loss: 3.1189 - regression_loss: 2.4677 - classification_loss: 0.6512 453/500 [==========================>...] - ETA: 11s - loss: 3.1184 - regression_loss: 2.4675 - classification_loss: 0.6510 454/500 [==========================>...] - ETA: 11s - loss: 3.1180 - regression_loss: 2.4671 - classification_loss: 0.6509 455/500 [==========================>...] - ETA: 10s - loss: 3.1167 - regression_loss: 2.4665 - classification_loss: 0.6501 456/500 [==========================>...] - ETA: 10s - loss: 3.1139 - regression_loss: 2.4646 - classification_loss: 0.6493 457/500 [==========================>...] - ETA: 10s - loss: 3.1127 - regression_loss: 2.4640 - classification_loss: 0.6487 458/500 [==========================>...] - ETA: 10s - loss: 3.1121 - regression_loss: 2.4638 - classification_loss: 0.6484 459/500 [==========================>...] - ETA: 9s - loss: 3.1117 - regression_loss: 2.4638 - classification_loss: 0.6479 460/500 [==========================>...] - ETA: 9s - loss: 3.1114 - regression_loss: 2.4638 - classification_loss: 0.6475 461/500 [==========================>...] - ETA: 9s - loss: 3.1098 - regression_loss: 2.4631 - classification_loss: 0.6467 462/500 [==========================>...] - ETA: 9s - loss: 3.1086 - regression_loss: 2.4625 - classification_loss: 0.6461 463/500 [==========================>...] - ETA: 8s - loss: 3.1081 - regression_loss: 2.4626 - classification_loss: 0.6455 464/500 [==========================>...] 
- ETA: 8s - loss: 3.1090 - regression_loss: 2.4638 - classification_loss: 0.6452 465/500 [==========================>...] - ETA: 8s - loss: 3.1073 - regression_loss: 2.4628 - classification_loss: 0.6445 466/500 [==========================>...] - ETA: 8s - loss: 3.1056 - regression_loss: 2.4616 - classification_loss: 0.6440 467/500 [===========================>..] - ETA: 7s - loss: 3.1047 - regression_loss: 2.4614 - classification_loss: 0.6433 468/500 [===========================>..] - ETA: 7s - loss: 3.1034 - regression_loss: 2.4606 - classification_loss: 0.6428 469/500 [===========================>..] - ETA: 7s - loss: 3.1027 - regression_loss: 2.4603 - classification_loss: 0.6424 470/500 [===========================>..] - ETA: 7s - loss: 3.1023 - regression_loss: 2.4605 - classification_loss: 0.6418 471/500 [===========================>..] - ETA: 6s - loss: 3.1014 - regression_loss: 2.4602 - classification_loss: 0.6412 472/500 [===========================>..] - ETA: 6s - loss: 3.0997 - regression_loss: 2.4592 - classification_loss: 0.6405 473/500 [===========================>..] - ETA: 6s - loss: 3.0993 - regression_loss: 2.4592 - classification_loss: 0.6401 474/500 [===========================>..] - ETA: 6s - loss: 3.0972 - regression_loss: 2.4579 - classification_loss: 0.6393 475/500 [===========================>..] - ETA: 5s - loss: 3.0968 - regression_loss: 2.4578 - classification_loss: 0.6390 476/500 [===========================>..] - ETA: 5s - loss: 3.0960 - regression_loss: 2.4575 - classification_loss: 0.6385 477/500 [===========================>..] - ETA: 5s - loss: 3.0944 - regression_loss: 2.4562 - classification_loss: 0.6382 478/500 [===========================>..] - ETA: 5s - loss: 3.0924 - regression_loss: 2.4550 - classification_loss: 0.6374 479/500 [===========================>..] - ETA: 5s - loss: 3.0915 - regression_loss: 2.4544 - classification_loss: 0.6371 480/500 [===========================>..] 
500/500 [==============================] - 119s 238ms/step - loss: 3.0726 - regression_loss: 2.4422 - classification_loss: 0.6304
326 instances of class plum with average precision: 0.4224
mAP: 0.4224
Epoch 00001: saving model to ./training/snapshots/resnet50_pascal_01.h5
Epoch 2/150
1/500 [..............................] - ETA: 2:03 - loss: 3.5809 - regression_loss: 2.9736 - classification_loss: 0.6073
314/500 [=================>............] - ETA: 42s - loss: 2.4332 - regression_loss: 2.0324 - classification_loss: 0.4009
315/500 [=================>............]
- ETA: 42s - loss: 2.4315 - regression_loss: 2.0311 - classification_loss: 0.4004 316/500 [=================>............] - ETA: 42s - loss: 2.4322 - regression_loss: 2.0316 - classification_loss: 0.4005 317/500 [==================>...........] - ETA: 41s - loss: 2.4307 - regression_loss: 2.0306 - classification_loss: 0.4001 318/500 [==================>...........] - ETA: 41s - loss: 2.4293 - regression_loss: 2.0294 - classification_loss: 0.3998 319/500 [==================>...........] - ETA: 41s - loss: 2.4295 - regression_loss: 2.0299 - classification_loss: 0.3996 320/500 [==================>...........] - ETA: 41s - loss: 2.4281 - regression_loss: 2.0291 - classification_loss: 0.3991 321/500 [==================>...........] - ETA: 40s - loss: 2.4274 - regression_loss: 2.0287 - classification_loss: 0.3987 322/500 [==================>...........] - ETA: 40s - loss: 2.4220 - regression_loss: 2.0224 - classification_loss: 0.3996 323/500 [==================>...........] - ETA: 40s - loss: 2.4224 - regression_loss: 2.0227 - classification_loss: 0.3997 324/500 [==================>...........] - ETA: 40s - loss: 2.4238 - regression_loss: 2.0241 - classification_loss: 0.3997 325/500 [==================>...........] - ETA: 40s - loss: 2.4236 - regression_loss: 2.0242 - classification_loss: 0.3994 326/500 [==================>...........] - ETA: 39s - loss: 2.4252 - regression_loss: 2.0253 - classification_loss: 0.3999 327/500 [==================>...........] - ETA: 39s - loss: 2.4259 - regression_loss: 2.0259 - classification_loss: 0.3999 328/500 [==================>...........] - ETA: 39s - loss: 2.4227 - regression_loss: 2.0236 - classification_loss: 0.3991 329/500 [==================>...........] - ETA: 39s - loss: 2.4216 - regression_loss: 2.0228 - classification_loss: 0.3988 330/500 [==================>...........] - ETA: 38s - loss: 2.4180 - regression_loss: 2.0200 - classification_loss: 0.3980 331/500 [==================>...........] 
- ETA: 38s - loss: 2.4181 - regression_loss: 2.0202 - classification_loss: 0.3979 332/500 [==================>...........] - ETA: 38s - loss: 2.4158 - regression_loss: 2.0186 - classification_loss: 0.3973 333/500 [==================>...........] - ETA: 38s - loss: 2.4175 - regression_loss: 2.0199 - classification_loss: 0.3976 334/500 [===================>..........] - ETA: 38s - loss: 2.4172 - regression_loss: 2.0197 - classification_loss: 0.3975 335/500 [===================>..........] - ETA: 37s - loss: 2.4153 - regression_loss: 2.0182 - classification_loss: 0.3971 336/500 [===================>..........] - ETA: 37s - loss: 2.4148 - regression_loss: 2.0173 - classification_loss: 0.3974 337/500 [===================>..........] - ETA: 37s - loss: 2.4166 - regression_loss: 2.0183 - classification_loss: 0.3983 338/500 [===================>..........] - ETA: 37s - loss: 2.4173 - regression_loss: 2.0194 - classification_loss: 0.3980 339/500 [===================>..........] - ETA: 36s - loss: 2.4167 - regression_loss: 2.0189 - classification_loss: 0.3978 340/500 [===================>..........] - ETA: 36s - loss: 2.4140 - regression_loss: 2.0166 - classification_loss: 0.3974 341/500 [===================>..........] - ETA: 36s - loss: 2.4133 - regression_loss: 2.0161 - classification_loss: 0.3972 342/500 [===================>..........] - ETA: 36s - loss: 2.4161 - regression_loss: 2.0180 - classification_loss: 0.3981 343/500 [===================>..........] - ETA: 35s - loss: 2.4169 - regression_loss: 2.0188 - classification_loss: 0.3981 344/500 [===================>..........] - ETA: 35s - loss: 2.4180 - regression_loss: 2.0197 - classification_loss: 0.3983 345/500 [===================>..........] - ETA: 35s - loss: 2.4145 - regression_loss: 2.0169 - classification_loss: 0.3976 346/500 [===================>..........] - ETA: 35s - loss: 2.4167 - regression_loss: 2.0186 - classification_loss: 0.3981 347/500 [===================>..........] 
- ETA: 35s - loss: 2.4169 - regression_loss: 2.0187 - classification_loss: 0.3982 348/500 [===================>..........] - ETA: 34s - loss: 2.4171 - regression_loss: 2.0188 - classification_loss: 0.3983 349/500 [===================>..........] - ETA: 34s - loss: 2.4160 - regression_loss: 2.0180 - classification_loss: 0.3980 350/500 [====================>.........] - ETA: 34s - loss: 2.4148 - regression_loss: 2.0173 - classification_loss: 0.3975 351/500 [====================>.........] - ETA: 34s - loss: 2.4138 - regression_loss: 2.0167 - classification_loss: 0.3971 352/500 [====================>.........] - ETA: 33s - loss: 2.4130 - regression_loss: 2.0161 - classification_loss: 0.3969 353/500 [====================>.........] - ETA: 33s - loss: 2.4143 - regression_loss: 2.0170 - classification_loss: 0.3973 354/500 [====================>.........] - ETA: 33s - loss: 2.4142 - regression_loss: 2.0170 - classification_loss: 0.3972 355/500 [====================>.........] - ETA: 33s - loss: 2.4131 - regression_loss: 2.0163 - classification_loss: 0.3968 356/500 [====================>.........] - ETA: 32s - loss: 2.4149 - regression_loss: 2.0175 - classification_loss: 0.3974 357/500 [====================>.........] - ETA: 32s - loss: 2.4133 - regression_loss: 2.0161 - classification_loss: 0.3973 358/500 [====================>.........] - ETA: 32s - loss: 2.4144 - regression_loss: 2.0169 - classification_loss: 0.3975 359/500 [====================>.........] - ETA: 32s - loss: 2.4161 - regression_loss: 2.0189 - classification_loss: 0.3973 360/500 [====================>.........] - ETA: 32s - loss: 2.4147 - regression_loss: 2.0177 - classification_loss: 0.3970 361/500 [====================>.........] - ETA: 31s - loss: 2.4151 - regression_loss: 2.0184 - classification_loss: 0.3967 362/500 [====================>.........] - ETA: 31s - loss: 2.4166 - regression_loss: 2.0195 - classification_loss: 0.3970 363/500 [====================>.........] 
- ETA: 31s - loss: 2.4176 - regression_loss: 2.0204 - classification_loss: 0.3972 364/500 [====================>.........] - ETA: 31s - loss: 2.4179 - regression_loss: 2.0208 - classification_loss: 0.3971 365/500 [====================>.........] - ETA: 30s - loss: 2.4177 - regression_loss: 2.0207 - classification_loss: 0.3970 366/500 [====================>.........] - ETA: 30s - loss: 2.4169 - regression_loss: 2.0202 - classification_loss: 0.3967 367/500 [=====================>........] - ETA: 30s - loss: 2.4162 - regression_loss: 2.0198 - classification_loss: 0.3964 368/500 [=====================>........] - ETA: 30s - loss: 2.4153 - regression_loss: 2.0192 - classification_loss: 0.3960 369/500 [=====================>........] - ETA: 30s - loss: 2.4171 - regression_loss: 2.0203 - classification_loss: 0.3968 370/500 [=====================>........] - ETA: 29s - loss: 2.4154 - regression_loss: 2.0189 - classification_loss: 0.3965 371/500 [=====================>........] - ETA: 29s - loss: 2.4156 - regression_loss: 2.0190 - classification_loss: 0.3966 372/500 [=====================>........] - ETA: 29s - loss: 2.4173 - regression_loss: 2.0199 - classification_loss: 0.3974 373/500 [=====================>........] - ETA: 29s - loss: 2.4157 - regression_loss: 2.0187 - classification_loss: 0.3970 374/500 [=====================>........] - ETA: 28s - loss: 2.4154 - regression_loss: 2.0184 - classification_loss: 0.3970 375/500 [=====================>........] - ETA: 28s - loss: 2.4152 - regression_loss: 2.0184 - classification_loss: 0.3968 376/500 [=====================>........] - ETA: 28s - loss: 2.4176 - regression_loss: 2.0203 - classification_loss: 0.3972 377/500 [=====================>........] - ETA: 28s - loss: 2.4164 - regression_loss: 2.0196 - classification_loss: 0.3967 378/500 [=====================>........] - ETA: 27s - loss: 2.4162 - regression_loss: 2.0194 - classification_loss: 0.3967 379/500 [=====================>........] 
- ETA: 27s - loss: 2.4176 - regression_loss: 2.0206 - classification_loss: 0.3970 380/500 [=====================>........] - ETA: 27s - loss: 2.4190 - regression_loss: 2.0217 - classification_loss: 0.3972 381/500 [=====================>........] - ETA: 27s - loss: 2.4186 - regression_loss: 2.0216 - classification_loss: 0.3970 382/500 [=====================>........] - ETA: 27s - loss: 2.4200 - regression_loss: 2.0228 - classification_loss: 0.3972 383/500 [=====================>........] - ETA: 26s - loss: 2.4198 - regression_loss: 2.0227 - classification_loss: 0.3971 384/500 [======================>.......] - ETA: 26s - loss: 2.4167 - regression_loss: 2.0202 - classification_loss: 0.3965 385/500 [======================>.......] - ETA: 26s - loss: 2.4140 - regression_loss: 2.0179 - classification_loss: 0.3961 386/500 [======================>.......] - ETA: 26s - loss: 2.4151 - regression_loss: 2.0188 - classification_loss: 0.3962 387/500 [======================>.......] - ETA: 25s - loss: 2.4142 - regression_loss: 2.0183 - classification_loss: 0.3960 388/500 [======================>.......] - ETA: 25s - loss: 2.4133 - regression_loss: 2.0176 - classification_loss: 0.3957 389/500 [======================>.......] - ETA: 25s - loss: 2.4131 - regression_loss: 2.0175 - classification_loss: 0.3956 390/500 [======================>.......] - ETA: 25s - loss: 2.4120 - regression_loss: 2.0168 - classification_loss: 0.3952 391/500 [======================>.......] - ETA: 24s - loss: 2.4116 - regression_loss: 2.0167 - classification_loss: 0.3949 392/500 [======================>.......] - ETA: 24s - loss: 2.4118 - regression_loss: 2.0170 - classification_loss: 0.3948 393/500 [======================>.......] - ETA: 24s - loss: 2.4114 - regression_loss: 2.0169 - classification_loss: 0.3945 394/500 [======================>.......] - ETA: 24s - loss: 2.4108 - regression_loss: 2.0166 - classification_loss: 0.3943 395/500 [======================>.......] 
- ETA: 24s - loss: 2.4105 - regression_loss: 2.0165 - classification_loss: 0.3940 396/500 [======================>.......] - ETA: 23s - loss: 2.4113 - regression_loss: 2.0173 - classification_loss: 0.3941 397/500 [======================>.......] - ETA: 23s - loss: 2.4106 - regression_loss: 2.0169 - classification_loss: 0.3937 398/500 [======================>.......] - ETA: 23s - loss: 2.4112 - regression_loss: 2.0175 - classification_loss: 0.3937 399/500 [======================>.......] - ETA: 23s - loss: 2.4109 - regression_loss: 2.0174 - classification_loss: 0.3935 400/500 [=======================>......] - ETA: 22s - loss: 2.4105 - regression_loss: 2.0172 - classification_loss: 0.3933 401/500 [=======================>......] - ETA: 22s - loss: 2.4103 - regression_loss: 2.0172 - classification_loss: 0.3931 402/500 [=======================>......] - ETA: 22s - loss: 2.4097 - regression_loss: 2.0169 - classification_loss: 0.3928 403/500 [=======================>......] - ETA: 22s - loss: 2.4075 - regression_loss: 2.0151 - classification_loss: 0.3924 404/500 [=======================>......] - ETA: 21s - loss: 2.4069 - regression_loss: 2.0145 - classification_loss: 0.3925 405/500 [=======================>......] - ETA: 21s - loss: 2.4079 - regression_loss: 2.0152 - classification_loss: 0.3927 406/500 [=======================>......] - ETA: 21s - loss: 2.4071 - regression_loss: 2.0147 - classification_loss: 0.3925 407/500 [=======================>......] - ETA: 21s - loss: 2.4057 - regression_loss: 2.0137 - classification_loss: 0.3920 408/500 [=======================>......] - ETA: 21s - loss: 2.4044 - regression_loss: 2.0126 - classification_loss: 0.3917 409/500 [=======================>......] - ETA: 20s - loss: 2.4035 - regression_loss: 2.0120 - classification_loss: 0.3915 410/500 [=======================>......] - ETA: 20s - loss: 2.4018 - regression_loss: 2.0104 - classification_loss: 0.3914 411/500 [=======================>......] 
- ETA: 20s - loss: 2.4008 - regression_loss: 2.0096 - classification_loss: 0.3913 412/500 [=======================>......] - ETA: 20s - loss: 2.4013 - regression_loss: 2.0100 - classification_loss: 0.3913 413/500 [=======================>......] - ETA: 19s - loss: 2.4010 - regression_loss: 2.0098 - classification_loss: 0.3912 414/500 [=======================>......] - ETA: 19s - loss: 2.4009 - regression_loss: 2.0098 - classification_loss: 0.3910 415/500 [=======================>......] - ETA: 19s - loss: 2.3992 - regression_loss: 2.0086 - classification_loss: 0.3906 416/500 [=======================>......] - ETA: 19s - loss: 2.3989 - regression_loss: 2.0084 - classification_loss: 0.3905 417/500 [========================>.....] - ETA: 18s - loss: 2.3982 - regression_loss: 2.0078 - classification_loss: 0.3904 418/500 [========================>.....] - ETA: 18s - loss: 2.3986 - regression_loss: 2.0081 - classification_loss: 0.3905 419/500 [========================>.....] - ETA: 18s - loss: 2.3964 - regression_loss: 2.0063 - classification_loss: 0.3901 420/500 [========================>.....] - ETA: 18s - loss: 2.3958 - regression_loss: 2.0058 - classification_loss: 0.3899 421/500 [========================>.....] - ETA: 18s - loss: 2.3946 - regression_loss: 2.0050 - classification_loss: 0.3896 422/500 [========================>.....] - ETA: 17s - loss: 2.3948 - regression_loss: 2.0053 - classification_loss: 0.3895 423/500 [========================>.....] - ETA: 17s - loss: 2.3946 - regression_loss: 2.0052 - classification_loss: 0.3895 424/500 [========================>.....] - ETA: 17s - loss: 2.3944 - regression_loss: 2.0052 - classification_loss: 0.3892 425/500 [========================>.....] - ETA: 17s - loss: 2.3932 - regression_loss: 2.0045 - classification_loss: 0.3887 426/500 [========================>.....] - ETA: 16s - loss: 2.3932 - regression_loss: 2.0045 - classification_loss: 0.3887 427/500 [========================>.....] 
- ETA: 16s - loss: 2.3927 - regression_loss: 2.0042 - classification_loss: 0.3886 428/500 [========================>.....] - ETA: 16s - loss: 2.3913 - regression_loss: 2.0031 - classification_loss: 0.3882 429/500 [========================>.....] - ETA: 16s - loss: 2.3928 - regression_loss: 2.0042 - classification_loss: 0.3887 430/500 [========================>.....] - ETA: 16s - loss: 2.3904 - regression_loss: 2.0020 - classification_loss: 0.3884 431/500 [========================>.....] - ETA: 15s - loss: 2.3918 - regression_loss: 2.0031 - classification_loss: 0.3887 432/500 [========================>.....] - ETA: 15s - loss: 2.3908 - regression_loss: 2.0025 - classification_loss: 0.3883 433/500 [========================>.....] - ETA: 15s - loss: 2.3911 - regression_loss: 2.0028 - classification_loss: 0.3883 434/500 [=========================>....] - ETA: 15s - loss: 2.3905 - regression_loss: 2.0024 - classification_loss: 0.3881 435/500 [=========================>....] - ETA: 14s - loss: 2.3908 - regression_loss: 2.0028 - classification_loss: 0.3880 436/500 [=========================>....] - ETA: 14s - loss: 2.3879 - regression_loss: 2.0005 - classification_loss: 0.3875 437/500 [=========================>....] - ETA: 14s - loss: 2.3876 - regression_loss: 2.0002 - classification_loss: 0.3874 438/500 [=========================>....] - ETA: 14s - loss: 2.3887 - regression_loss: 2.0011 - classification_loss: 0.3876 439/500 [=========================>....] - ETA: 13s - loss: 2.3870 - regression_loss: 1.9998 - classification_loss: 0.3872 440/500 [=========================>....] - ETA: 13s - loss: 2.3849 - regression_loss: 1.9982 - classification_loss: 0.3867 441/500 [=========================>....] - ETA: 13s - loss: 2.3845 - regression_loss: 1.9979 - classification_loss: 0.3866 442/500 [=========================>....] - ETA: 13s - loss: 2.3843 - regression_loss: 1.9977 - classification_loss: 0.3866 443/500 [=========================>....] 
- ETA: 13s - loss: 2.3833 - regression_loss: 1.9969 - classification_loss: 0.3864 444/500 [=========================>....] - ETA: 12s - loss: 2.3847 - regression_loss: 1.9981 - classification_loss: 0.3866 445/500 [=========================>....] - ETA: 12s - loss: 2.3844 - regression_loss: 1.9979 - classification_loss: 0.3866 446/500 [=========================>....] - ETA: 12s - loss: 2.3841 - regression_loss: 1.9977 - classification_loss: 0.3864 447/500 [=========================>....] - ETA: 12s - loss: 2.3845 - regression_loss: 1.9982 - classification_loss: 0.3863 448/500 [=========================>....] - ETA: 11s - loss: 2.3833 - regression_loss: 1.9974 - classification_loss: 0.3859 449/500 [=========================>....] - ETA: 11s - loss: 2.3828 - regression_loss: 1.9971 - classification_loss: 0.3857 450/500 [==========================>...] - ETA: 11s - loss: 2.3809 - regression_loss: 1.9956 - classification_loss: 0.3853 451/500 [==========================>...] - ETA: 11s - loss: 2.3808 - regression_loss: 1.9954 - classification_loss: 0.3854 452/500 [==========================>...] - ETA: 10s - loss: 2.3799 - regression_loss: 1.9948 - classification_loss: 0.3851 453/500 [==========================>...] - ETA: 10s - loss: 2.3792 - regression_loss: 1.9942 - classification_loss: 0.3849 454/500 [==========================>...] - ETA: 10s - loss: 2.3797 - regression_loss: 1.9948 - classification_loss: 0.3849 455/500 [==========================>...] - ETA: 10s - loss: 2.3792 - regression_loss: 1.9946 - classification_loss: 0.3846 456/500 [==========================>...] - ETA: 10s - loss: 2.3813 - regression_loss: 1.9964 - classification_loss: 0.3849 457/500 [==========================>...] - ETA: 9s - loss: 2.3845 - regression_loss: 1.9986 - classification_loss: 0.3859  458/500 [==========================>...] - ETA: 9s - loss: 2.3843 - regression_loss: 1.9985 - classification_loss: 0.3858 459/500 [==========================>...] 
- ETA: 9s - loss: 2.3822 - regression_loss: 1.9967 - classification_loss: 0.3854 460/500 [==========================>...] - ETA: 9s - loss: 2.3808 - regression_loss: 1.9957 - classification_loss: 0.3851 461/500 [==========================>...] - ETA: 8s - loss: 2.3798 - regression_loss: 1.9950 - classification_loss: 0.3847 462/500 [==========================>...] - ETA: 8s - loss: 2.3787 - regression_loss: 1.9940 - classification_loss: 0.3847 463/500 [==========================>...] - ETA: 8s - loss: 2.3811 - regression_loss: 1.9960 - classification_loss: 0.3851 464/500 [==========================>...] - ETA: 8s - loss: 2.3793 - regression_loss: 1.9945 - classification_loss: 0.3848 465/500 [==========================>...] - ETA: 8s - loss: 2.3782 - regression_loss: 1.9937 - classification_loss: 0.3846 466/500 [==========================>...] - ETA: 7s - loss: 2.3776 - regression_loss: 1.9932 - classification_loss: 0.3843 467/500 [===========================>..] - ETA: 7s - loss: 2.3767 - regression_loss: 1.9926 - classification_loss: 0.3841 468/500 [===========================>..] - ETA: 7s - loss: 2.3773 - regression_loss: 1.9930 - classification_loss: 0.3843 469/500 [===========================>..] - ETA: 7s - loss: 2.3774 - regression_loss: 1.9930 - classification_loss: 0.3844 470/500 [===========================>..] - ETA: 6s - loss: 2.3775 - regression_loss: 1.9932 - classification_loss: 0.3844 471/500 [===========================>..] - ETA: 6s - loss: 2.3781 - regression_loss: 1.9937 - classification_loss: 0.3844 472/500 [===========================>..] - ETA: 6s - loss: 2.3760 - regression_loss: 1.9921 - classification_loss: 0.3840 473/500 [===========================>..] - ETA: 6s - loss: 2.3765 - regression_loss: 1.9923 - classification_loss: 0.3843 474/500 [===========================>..] - ETA: 5s - loss: 2.3766 - regression_loss: 1.9924 - classification_loss: 0.3842 475/500 [===========================>..] 
- ETA: 5s - loss: 2.3772 - regression_loss: 1.9929 - classification_loss: 0.3843 476/500 [===========================>..] - ETA: 5s - loss: 2.3763 - regression_loss: 1.9920 - classification_loss: 0.3843 477/500 [===========================>..] - ETA: 5s - loss: 2.3763 - regression_loss: 1.9919 - classification_loss: 0.3844 478/500 [===========================>..] - ETA: 5s - loss: 2.3767 - regression_loss: 1.9923 - classification_loss: 0.3844 479/500 [===========================>..] - ETA: 4s - loss: 2.3769 - regression_loss: 1.9926 - classification_loss: 0.3843 480/500 [===========================>..] - ETA: 4s - loss: 2.3744 - regression_loss: 1.9906 - classification_loss: 0.3838 481/500 [===========================>..] - ETA: 4s - loss: 2.3745 - regression_loss: 1.9909 - classification_loss: 0.3836 482/500 [===========================>..] - ETA: 4s - loss: 2.3774 - regression_loss: 1.9919 - classification_loss: 0.3855 483/500 [===========================>..] - ETA: 3s - loss: 2.3770 - regression_loss: 1.9917 - classification_loss: 0.3854 484/500 [============================>.] - ETA: 3s - loss: 2.3776 - regression_loss: 1.9921 - classification_loss: 0.3855 485/500 [============================>.] - ETA: 3s - loss: 2.3775 - regression_loss: 1.9922 - classification_loss: 0.3853 486/500 [============================>.] - ETA: 3s - loss: 2.3778 - regression_loss: 1.9924 - classification_loss: 0.3854 487/500 [============================>.] - ETA: 2s - loss: 2.3792 - regression_loss: 1.9933 - classification_loss: 0.3859 488/500 [============================>.] - ETA: 2s - loss: 2.3776 - regression_loss: 1.9921 - classification_loss: 0.3855 489/500 [============================>.] - ETA: 2s - loss: 2.3757 - regression_loss: 1.9905 - classification_loss: 0.3852 490/500 [============================>.] - ETA: 2s - loss: 2.3762 - regression_loss: 1.9909 - classification_loss: 0.3853 491/500 [============================>.] 
- ETA: 2s - loss: 2.3767 - regression_loss: 1.9914 - classification_loss: 0.3852 492/500 [============================>.] - ETA: 1s - loss: 2.3763 - regression_loss: 1.9912 - classification_loss: 0.3850 493/500 [============================>.] - ETA: 1s - loss: 2.3753 - regression_loss: 1.9904 - classification_loss: 0.3849 494/500 [============================>.] - ETA: 1s - loss: 2.3737 - regression_loss: 1.9894 - classification_loss: 0.3843 495/500 [============================>.] - ETA: 1s - loss: 2.3720 - regression_loss: 1.9879 - classification_loss: 0.3840 496/500 [============================>.] - ETA: 0s - loss: 2.3712 - regression_loss: 1.9875 - classification_loss: 0.3837 497/500 [============================>.] - ETA: 0s - loss: 2.3717 - regression_loss: 1.9877 - classification_loss: 0.3840 498/500 [============================>.] - ETA: 0s - loss: 2.3722 - regression_loss: 1.9878 - classification_loss: 0.3844 499/500 [============================>.] - ETA: 0s - loss: 2.3743 - regression_loss: 1.9896 - classification_loss: 0.3847 500/500 [==============================] - 114s 229ms/step - loss: 2.3743 - regression_loss: 1.9897 - classification_loss: 0.3845 326 instances of class plum with average precision: 0.6805 mAP: 0.6805 Epoch 00002: saving model to ./training/snapshots/resnet50_pascal_02.h5 Epoch 3/150 1/500 [..............................] - ETA: 1:45 - loss: 1.0991 - regression_loss: 0.8942 - classification_loss: 0.2049 2/500 [..............................] - ETA: 1:45 - loss: 1.6612 - regression_loss: 1.3822 - classification_loss: 0.2790 3/500 [..............................] - ETA: 1:52 - loss: 1.8354 - regression_loss: 1.5677 - classification_loss: 0.2677 4/500 [..............................] - ETA: 1:52 - loss: 1.8228 - regression_loss: 1.5600 - classification_loss: 0.2628 5/500 [..............................] - ETA: 1:51 - loss: 1.8574 - regression_loss: 1.5952 - classification_loss: 0.2622 6/500 [..............................] 
- ETA: 1:54 - loss: 2.3908 - regression_loss: 1.9516 - classification_loss: 0.4393 7/500 [..............................] - ETA: 1:53 - loss: 2.3369 - regression_loss: 1.9235 - classification_loss: 0.4134 8/500 [..............................] - ETA: 1:55 - loss: 2.3687 - regression_loss: 1.9608 - classification_loss: 0.4079 9/500 [..............................] - ETA: 1:54 - loss: 2.4202 - regression_loss: 2.0054 - classification_loss: 0.4148 10/500 [..............................] - ETA: 1:52 - loss: 2.3784 - regression_loss: 1.9750 - classification_loss: 0.4035 11/500 [..............................] - ETA: 1:53 - loss: 2.3631 - regression_loss: 1.9668 - classification_loss: 0.3963 12/500 [..............................] - ETA: 1:53 - loss: 2.3330 - regression_loss: 1.9450 - classification_loss: 0.3880 13/500 [..............................] - ETA: 1:53 - loss: 2.3080 - regression_loss: 1.9276 - classification_loss: 0.3804 14/500 [..............................] - ETA: 1:53 - loss: 2.3873 - regression_loss: 1.9670 - classification_loss: 0.4203 15/500 [..............................] - ETA: 1:53 - loss: 2.3750 - regression_loss: 1.9614 - classification_loss: 0.4135 16/500 [..............................] - ETA: 1:53 - loss: 2.3606 - regression_loss: 1.9497 - classification_loss: 0.4109 17/500 [>.............................] - ETA: 1:52 - loss: 2.4086 - regression_loss: 1.9843 - classification_loss: 0.4243 18/500 [>.............................] - ETA: 1:52 - loss: 2.3717 - regression_loss: 1.9509 - classification_loss: 0.4208 19/500 [>.............................] - ETA: 1:51 - loss: 2.3782 - regression_loss: 1.9628 - classification_loss: 0.4154 20/500 [>.............................] - ETA: 1:51 - loss: 2.3538 - regression_loss: 1.9487 - classification_loss: 0.4051 21/500 [>.............................] - ETA: 1:51 - loss: 2.3642 - regression_loss: 1.9561 - classification_loss: 0.4082 22/500 [>.............................] 
[training progress, batches 23-358 of 500; intermediate progress-bar redraws condensed to roughly every 50th batch; loss values are running means, loss = regression_loss + classification_loss]
 23/500 [>.............................] - ETA: 1:50 - loss: 2.3415 - regression_loss: 1.9375 - classification_loss: 0.4040
 50/500 [==>...........................] - ETA: 1:44 - loss: 2.3076 - regression_loss: 1.9168 - classification_loss: 0.3908
100/500 [=====>........................] - ETA: 1:31 - loss: 2.3107 - regression_loss: 1.9432 - classification_loss: 0.3675
150/500 [========>.....................] - ETA: 1:19 - loss: 2.2626 - regression_loss: 1.9033 - classification_loss: 0.3593
200/500 [===========>..................] - ETA: 1:07 - loss: 2.2593 - regression_loss: 1.8997 - classification_loss: 0.3596
250/500 [==============>...............] - ETA: 56s - loss: 2.2374 - regression_loss: 1.8855 - classification_loss: 0.3519
300/500 [=================>............] - ETA: 45s - loss: 2.2211 - regression_loss: 1.8761 - classification_loss: 0.3450
350/500 [====================>.........] - ETA: 33s - loss: 2.2025 - regression_loss: 1.8583 - classification_loss: 0.3442
357/500 [====================>.........] - ETA: 32s - loss: 2.2027 - regression_loss: 1.8591 - classification_loss: 0.3436
- ETA: 32s - loss: 2.2055 - regression_loss: 1.8613 - classification_loss: 0.3442 359/500 [====================>.........] - ETA: 31s - loss: 2.2040 - regression_loss: 1.8599 - classification_loss: 0.3441 360/500 [====================>.........] - ETA: 31s - loss: 2.2022 - regression_loss: 1.8584 - classification_loss: 0.3438 361/500 [====================>.........] - ETA: 31s - loss: 2.2010 - regression_loss: 1.8573 - classification_loss: 0.3438 362/500 [====================>.........] - ETA: 31s - loss: 2.2003 - regression_loss: 1.8567 - classification_loss: 0.3436 363/500 [====================>.........] - ETA: 30s - loss: 2.2004 - regression_loss: 1.8569 - classification_loss: 0.3435 364/500 [====================>.........] - ETA: 30s - loss: 2.2007 - regression_loss: 1.8571 - classification_loss: 0.3436 365/500 [====================>.........] - ETA: 30s - loss: 2.1974 - regression_loss: 1.8543 - classification_loss: 0.3430 366/500 [====================>.........] - ETA: 30s - loss: 2.1971 - regression_loss: 1.8542 - classification_loss: 0.3429 367/500 [=====================>........] - ETA: 29s - loss: 2.1973 - regression_loss: 1.8543 - classification_loss: 0.3430 368/500 [=====================>........] - ETA: 29s - loss: 2.1979 - regression_loss: 1.8552 - classification_loss: 0.3428 369/500 [=====================>........] - ETA: 29s - loss: 2.1972 - regression_loss: 1.8545 - classification_loss: 0.3426 370/500 [=====================>........] - ETA: 29s - loss: 2.1973 - regression_loss: 1.8546 - classification_loss: 0.3427 371/500 [=====================>........] - ETA: 29s - loss: 2.2010 - regression_loss: 1.8553 - classification_loss: 0.3456 372/500 [=====================>........] - ETA: 28s - loss: 2.2012 - regression_loss: 1.8556 - classification_loss: 0.3456 373/500 [=====================>........] - ETA: 28s - loss: 2.2024 - regression_loss: 1.8569 - classification_loss: 0.3454 374/500 [=====================>........] 
- ETA: 28s - loss: 2.2051 - regression_loss: 1.8590 - classification_loss: 0.3461 375/500 [=====================>........] - ETA: 28s - loss: 2.2072 - regression_loss: 1.8607 - classification_loss: 0.3464 376/500 [=====================>........] - ETA: 27s - loss: 2.2061 - regression_loss: 1.8599 - classification_loss: 0.3462 377/500 [=====================>........] - ETA: 27s - loss: 2.2053 - regression_loss: 1.8592 - classification_loss: 0.3461 378/500 [=====================>........] - ETA: 27s - loss: 2.2086 - regression_loss: 1.8622 - classification_loss: 0.3464 379/500 [=====================>........] - ETA: 27s - loss: 2.2089 - regression_loss: 1.8623 - classification_loss: 0.3467 380/500 [=====================>........] - ETA: 27s - loss: 2.2074 - regression_loss: 1.8609 - classification_loss: 0.3464 381/500 [=====================>........] - ETA: 26s - loss: 2.2074 - regression_loss: 1.8608 - classification_loss: 0.3467 382/500 [=====================>........] - ETA: 26s - loss: 2.2069 - regression_loss: 1.8603 - classification_loss: 0.3465 383/500 [=====================>........] - ETA: 26s - loss: 2.2026 - regression_loss: 1.8566 - classification_loss: 0.3460 384/500 [======================>.......] - ETA: 26s - loss: 2.2044 - regression_loss: 1.8580 - classification_loss: 0.3464 385/500 [======================>.......] - ETA: 25s - loss: 2.2033 - regression_loss: 1.8573 - classification_loss: 0.3460 386/500 [======================>.......] - ETA: 25s - loss: 2.2030 - regression_loss: 1.8574 - classification_loss: 0.3456 387/500 [======================>.......] - ETA: 25s - loss: 2.2031 - regression_loss: 1.8576 - classification_loss: 0.3455 388/500 [======================>.......] - ETA: 25s - loss: 2.2023 - regression_loss: 1.8569 - classification_loss: 0.3453 389/500 [======================>.......] - ETA: 25s - loss: 2.2006 - regression_loss: 1.8555 - classification_loss: 0.3452 390/500 [======================>.......] 
- ETA: 24s - loss: 2.1996 - regression_loss: 1.8544 - classification_loss: 0.3451 391/500 [======================>.......] - ETA: 24s - loss: 2.1989 - regression_loss: 1.8538 - classification_loss: 0.3450 392/500 [======================>.......] - ETA: 24s - loss: 2.1981 - regression_loss: 1.8533 - classification_loss: 0.3448 393/500 [======================>.......] - ETA: 24s - loss: 2.1983 - regression_loss: 1.8534 - classification_loss: 0.3449 394/500 [======================>.......] - ETA: 23s - loss: 2.1995 - regression_loss: 1.8540 - classification_loss: 0.3455 395/500 [======================>.......] - ETA: 23s - loss: 2.2009 - regression_loss: 1.8548 - classification_loss: 0.3461 396/500 [======================>.......] - ETA: 23s - loss: 2.2000 - regression_loss: 1.8541 - classification_loss: 0.3459 397/500 [======================>.......] - ETA: 23s - loss: 2.2043 - regression_loss: 1.8577 - classification_loss: 0.3466 398/500 [======================>.......] - ETA: 22s - loss: 2.2068 - regression_loss: 1.8595 - classification_loss: 0.3473 399/500 [======================>.......] - ETA: 22s - loss: 2.2091 - regression_loss: 1.8616 - classification_loss: 0.3475 400/500 [=======================>......] - ETA: 22s - loss: 2.2085 - regression_loss: 1.8611 - classification_loss: 0.3474 401/500 [=======================>......] - ETA: 22s - loss: 2.2082 - regression_loss: 1.8608 - classification_loss: 0.3473 402/500 [=======================>......] - ETA: 22s - loss: 2.2080 - regression_loss: 1.8608 - classification_loss: 0.3472 403/500 [=======================>......] - ETA: 21s - loss: 2.2066 - regression_loss: 1.8598 - classification_loss: 0.3468 404/500 [=======================>......] - ETA: 21s - loss: 2.2075 - regression_loss: 1.8605 - classification_loss: 0.3470 405/500 [=======================>......] - ETA: 21s - loss: 2.2052 - regression_loss: 1.8586 - classification_loss: 0.3467 406/500 [=======================>......] 
- ETA: 21s - loss: 2.2043 - regression_loss: 1.8578 - classification_loss: 0.3465 407/500 [=======================>......] - ETA: 20s - loss: 2.2031 - regression_loss: 1.8568 - classification_loss: 0.3462 408/500 [=======================>......] - ETA: 20s - loss: 2.2065 - regression_loss: 1.8591 - classification_loss: 0.3474 409/500 [=======================>......] - ETA: 20s - loss: 2.2064 - regression_loss: 1.8594 - classification_loss: 0.3470 410/500 [=======================>......] - ETA: 20s - loss: 2.2043 - regression_loss: 1.8576 - classification_loss: 0.3467 411/500 [=======================>......] - ETA: 20s - loss: 2.2043 - regression_loss: 1.8577 - classification_loss: 0.3467 412/500 [=======================>......] - ETA: 19s - loss: 2.2037 - regression_loss: 1.8572 - classification_loss: 0.3465 413/500 [=======================>......] - ETA: 19s - loss: 2.2040 - regression_loss: 1.8576 - classification_loss: 0.3464 414/500 [=======================>......] - ETA: 19s - loss: 2.2049 - regression_loss: 1.8581 - classification_loss: 0.3468 415/500 [=======================>......] - ETA: 19s - loss: 2.2039 - regression_loss: 1.8573 - classification_loss: 0.3466 416/500 [=======================>......] - ETA: 18s - loss: 2.2039 - regression_loss: 1.8573 - classification_loss: 0.3466 417/500 [========================>.....] - ETA: 18s - loss: 2.2038 - regression_loss: 1.8572 - classification_loss: 0.3466 418/500 [========================>.....] - ETA: 18s - loss: 2.2038 - regression_loss: 1.8573 - classification_loss: 0.3465 419/500 [========================>.....] - ETA: 18s - loss: 2.2041 - regression_loss: 1.8573 - classification_loss: 0.3469 420/500 [========================>.....] - ETA: 18s - loss: 2.2041 - regression_loss: 1.8573 - classification_loss: 0.3468 421/500 [========================>.....] - ETA: 17s - loss: 2.2055 - regression_loss: 1.8582 - classification_loss: 0.3473 422/500 [========================>.....] 
- ETA: 17s - loss: 2.2043 - regression_loss: 1.8574 - classification_loss: 0.3469 423/500 [========================>.....] - ETA: 17s - loss: 2.2059 - regression_loss: 1.8592 - classification_loss: 0.3467 424/500 [========================>.....] - ETA: 17s - loss: 2.2051 - regression_loss: 1.8584 - classification_loss: 0.3467 425/500 [========================>.....] - ETA: 16s - loss: 2.2032 - regression_loss: 1.8569 - classification_loss: 0.3463 426/500 [========================>.....] - ETA: 16s - loss: 2.2038 - regression_loss: 1.8575 - classification_loss: 0.3464 427/500 [========================>.....] - ETA: 16s - loss: 2.2039 - regression_loss: 1.8577 - classification_loss: 0.3462 428/500 [========================>.....] - ETA: 16s - loss: 2.2030 - regression_loss: 1.8571 - classification_loss: 0.3459 429/500 [========================>.....] - ETA: 16s - loss: 2.2017 - regression_loss: 1.8560 - classification_loss: 0.3456 430/500 [========================>.....] - ETA: 15s - loss: 2.2013 - regression_loss: 1.8557 - classification_loss: 0.3456 431/500 [========================>.....] - ETA: 15s - loss: 2.2010 - regression_loss: 1.8554 - classification_loss: 0.3456 432/500 [========================>.....] - ETA: 15s - loss: 2.2038 - regression_loss: 1.8575 - classification_loss: 0.3464 433/500 [========================>.....] - ETA: 15s - loss: 2.2043 - regression_loss: 1.8580 - classification_loss: 0.3464 434/500 [=========================>....] - ETA: 14s - loss: 2.2045 - regression_loss: 1.8583 - classification_loss: 0.3462 435/500 [=========================>....] - ETA: 14s - loss: 2.2068 - regression_loss: 1.8605 - classification_loss: 0.3463 436/500 [=========================>....] - ETA: 14s - loss: 2.2057 - regression_loss: 1.8598 - classification_loss: 0.3459 437/500 [=========================>....] - ETA: 14s - loss: 2.2059 - regression_loss: 1.8598 - classification_loss: 0.3460 438/500 [=========================>....] 
- ETA: 13s - loss: 2.2053 - regression_loss: 1.8595 - classification_loss: 0.3457 439/500 [=========================>....] - ETA: 13s - loss: 2.2061 - regression_loss: 1.8602 - classification_loss: 0.3459 440/500 [=========================>....] - ETA: 13s - loss: 2.2051 - regression_loss: 1.8594 - classification_loss: 0.3457 441/500 [=========================>....] - ETA: 13s - loss: 2.2043 - regression_loss: 1.8588 - classification_loss: 0.3455 442/500 [=========================>....] - ETA: 13s - loss: 2.2047 - regression_loss: 1.8591 - classification_loss: 0.3456 443/500 [=========================>....] - ETA: 12s - loss: 2.2042 - regression_loss: 1.8587 - classification_loss: 0.3454 444/500 [=========================>....] - ETA: 12s - loss: 2.2039 - regression_loss: 1.8586 - classification_loss: 0.3452 445/500 [=========================>....] - ETA: 12s - loss: 2.2043 - regression_loss: 1.8594 - classification_loss: 0.3450 446/500 [=========================>....] - ETA: 12s - loss: 2.2053 - regression_loss: 1.8602 - classification_loss: 0.3451 447/500 [=========================>....] - ETA: 11s - loss: 2.2041 - regression_loss: 1.8593 - classification_loss: 0.3448 448/500 [=========================>....] - ETA: 11s - loss: 2.2022 - regression_loss: 1.8577 - classification_loss: 0.3445 449/500 [=========================>....] - ETA: 11s - loss: 2.2020 - regression_loss: 1.8577 - classification_loss: 0.3443 450/500 [==========================>...] - ETA: 11s - loss: 2.2013 - regression_loss: 1.8572 - classification_loss: 0.3441 451/500 [==========================>...] - ETA: 11s - loss: 2.2028 - regression_loss: 1.8582 - classification_loss: 0.3445 452/500 [==========================>...] - ETA: 10s - loss: 2.2027 - regression_loss: 1.8583 - classification_loss: 0.3445 453/500 [==========================>...] - ETA: 10s - loss: 2.2033 - regression_loss: 1.8586 - classification_loss: 0.3447 454/500 [==========================>...] 
- ETA: 10s - loss: 2.2040 - regression_loss: 1.8591 - classification_loss: 0.3449 455/500 [==========================>...] - ETA: 10s - loss: 2.2044 - regression_loss: 1.8595 - classification_loss: 0.3449 456/500 [==========================>...] - ETA: 9s - loss: 2.2035 - regression_loss: 1.8590 - classification_loss: 0.3446  457/500 [==========================>...] - ETA: 9s - loss: 2.2035 - regression_loss: 1.8589 - classification_loss: 0.3445 458/500 [==========================>...] - ETA: 9s - loss: 2.2024 - regression_loss: 1.8581 - classification_loss: 0.3443 459/500 [==========================>...] - ETA: 9s - loss: 2.2022 - regression_loss: 1.8581 - classification_loss: 0.3441 460/500 [==========================>...] - ETA: 9s - loss: 2.2027 - regression_loss: 1.8584 - classification_loss: 0.3443 461/500 [==========================>...] - ETA: 8s - loss: 2.2018 - regression_loss: 1.8579 - classification_loss: 0.3439 462/500 [==========================>...] - ETA: 8s - loss: 2.1996 - regression_loss: 1.8561 - classification_loss: 0.3435 463/500 [==========================>...] - ETA: 8s - loss: 2.1997 - regression_loss: 1.8562 - classification_loss: 0.3434 464/500 [==========================>...] - ETA: 8s - loss: 2.1965 - regression_loss: 1.8522 - classification_loss: 0.3443 465/500 [==========================>...] - ETA: 7s - loss: 2.1979 - regression_loss: 1.8534 - classification_loss: 0.3445 466/500 [==========================>...] - ETA: 7s - loss: 2.1976 - regression_loss: 1.8532 - classification_loss: 0.3444 467/500 [===========================>..] - ETA: 7s - loss: 2.1975 - regression_loss: 1.8532 - classification_loss: 0.3444 468/500 [===========================>..] - ETA: 7s - loss: 2.2000 - regression_loss: 1.8553 - classification_loss: 0.3447 469/500 [===========================>..] - ETA: 7s - loss: 2.2010 - regression_loss: 1.8562 - classification_loss: 0.3448 470/500 [===========================>..] 
- ETA: 6s - loss: 2.2003 - regression_loss: 1.8557 - classification_loss: 0.3446 471/500 [===========================>..] - ETA: 6s - loss: 2.1992 - regression_loss: 1.8548 - classification_loss: 0.3443 472/500 [===========================>..] - ETA: 6s - loss: 2.1992 - regression_loss: 1.8550 - classification_loss: 0.3442 473/500 [===========================>..] - ETA: 6s - loss: 2.1980 - regression_loss: 1.8541 - classification_loss: 0.3440 474/500 [===========================>..] - ETA: 5s - loss: 2.1972 - regression_loss: 1.8535 - classification_loss: 0.3437 475/500 [===========================>..] - ETA: 5s - loss: 2.1958 - regression_loss: 1.8524 - classification_loss: 0.3434 476/500 [===========================>..] - ETA: 5s - loss: 2.1952 - regression_loss: 1.8520 - classification_loss: 0.3432 477/500 [===========================>..] - ETA: 5s - loss: 2.1948 - regression_loss: 1.8517 - classification_loss: 0.3431 478/500 [===========================>..] - ETA: 4s - loss: 2.1952 - regression_loss: 1.8521 - classification_loss: 0.3432 479/500 [===========================>..] - ETA: 4s - loss: 2.1953 - regression_loss: 1.8523 - classification_loss: 0.3430 480/500 [===========================>..] - ETA: 4s - loss: 2.1956 - regression_loss: 1.8524 - classification_loss: 0.3432 481/500 [===========================>..] - ETA: 4s - loss: 2.1957 - regression_loss: 1.8524 - classification_loss: 0.3432 482/500 [===========================>..] - ETA: 4s - loss: 2.1963 - regression_loss: 1.8529 - classification_loss: 0.3433 483/500 [===========================>..] - ETA: 3s - loss: 2.1969 - regression_loss: 1.8531 - classification_loss: 0.3437 484/500 [============================>.] - ETA: 3s - loss: 2.1975 - regression_loss: 1.8535 - classification_loss: 0.3440 485/500 [============================>.] - ETA: 3s - loss: 2.1962 - regression_loss: 1.8522 - classification_loss: 0.3439 486/500 [============================>.] 
- ETA: 3s - loss: 2.1941 - regression_loss: 1.8505 - classification_loss: 0.3436 487/500 [============================>.] - ETA: 2s - loss: 2.1963 - regression_loss: 1.8522 - classification_loss: 0.3442 488/500 [============================>.] - ETA: 2s - loss: 2.1966 - regression_loss: 1.8523 - classification_loss: 0.3443 489/500 [============================>.] - ETA: 2s - loss: 2.1980 - regression_loss: 1.8535 - classification_loss: 0.3445 490/500 [============================>.] - ETA: 2s - loss: 2.1977 - regression_loss: 1.8534 - classification_loss: 0.3443 491/500 [============================>.] - ETA: 2s - loss: 2.1959 - regression_loss: 1.8520 - classification_loss: 0.3439 492/500 [============================>.] - ETA: 1s - loss: 2.1953 - regression_loss: 1.8516 - classification_loss: 0.3436 493/500 [============================>.] - ETA: 1s - loss: 2.1955 - regression_loss: 1.8519 - classification_loss: 0.3436 494/500 [============================>.] - ETA: 1s - loss: 2.1935 - regression_loss: 1.8503 - classification_loss: 0.3432 495/500 [============================>.] - ETA: 1s - loss: 2.1934 - regression_loss: 1.8502 - classification_loss: 0.3432 496/500 [============================>.] - ETA: 0s - loss: 2.1928 - regression_loss: 1.8495 - classification_loss: 0.3433 497/500 [============================>.] - ETA: 0s - loss: 2.1928 - regression_loss: 1.8496 - classification_loss: 0.3432 498/500 [============================>.] - ETA: 0s - loss: 2.1933 - regression_loss: 1.8499 - classification_loss: 0.3434 499/500 [============================>.] - ETA: 0s - loss: 2.1924 - regression_loss: 1.8493 - classification_loss: 0.3431 500/500 [==============================] - 113s 225ms/step - loss: 2.1936 - regression_loss: 1.8509 - classification_loss: 0.3428 326 instances of class plum with average precision: 0.7117 mAP: 0.7117 Epoch 00003: saving model to ./training/snapshots/resnet50_pascal_03.h5 Epoch 4/150 1/500 [..............................] 
[per-step progress frames for steps 1-128 of epoch 4 condensed: loss started near 2.15, fluctuated between ~1.85 and ~2.37 over the first dozen steps, then settled around 2.06 (regression_loss ~1.73, classification_loss ~0.33) by step 128; ETA fell from ~1:55 to ~1:31]
129/500 [======>.......................]
- ETA: 1:31 - loss: 2.0611 - regression_loss: 1.7314 - classification_loss: 0.3297 130/500 [======>.......................] - ETA: 1:31 - loss: 2.0611 - regression_loss: 1.7318 - classification_loss: 0.3293 131/500 [======>.......................] - ETA: 1:30 - loss: 2.0661 - regression_loss: 1.7367 - classification_loss: 0.3294 132/500 [======>.......................] - ETA: 1:30 - loss: 2.0652 - regression_loss: 1.7357 - classification_loss: 0.3295 133/500 [======>.......................] - ETA: 1:30 - loss: 2.0644 - regression_loss: 1.7357 - classification_loss: 0.3288 134/500 [=======>......................] - ETA: 1:30 - loss: 2.0618 - regression_loss: 1.7340 - classification_loss: 0.3278 135/500 [=======>......................] - ETA: 1:29 - loss: 2.0615 - regression_loss: 1.7340 - classification_loss: 0.3275 136/500 [=======>......................] - ETA: 1:29 - loss: 2.0612 - regression_loss: 1.7333 - classification_loss: 0.3279 137/500 [=======>......................] - ETA: 1:29 - loss: 2.0614 - regression_loss: 1.7334 - classification_loss: 0.3280 138/500 [=======>......................] - ETA: 1:29 - loss: 2.0566 - regression_loss: 1.7294 - classification_loss: 0.3271 139/500 [=======>......................] - ETA: 1:28 - loss: 2.0558 - regression_loss: 1.7288 - classification_loss: 0.3270 140/500 [=======>......................] - ETA: 1:28 - loss: 2.0547 - regression_loss: 1.7286 - classification_loss: 0.3261 141/500 [=======>......................] - ETA: 1:28 - loss: 2.0515 - regression_loss: 1.7265 - classification_loss: 0.3250 142/500 [=======>......................] - ETA: 1:28 - loss: 2.0568 - regression_loss: 1.7311 - classification_loss: 0.3257 143/500 [=======>......................] - ETA: 1:27 - loss: 2.0548 - regression_loss: 1.7299 - classification_loss: 0.3249 144/500 [=======>......................] - ETA: 1:27 - loss: 2.0584 - regression_loss: 1.7324 - classification_loss: 0.3260 145/500 [=======>......................] 
- ETA: 1:27 - loss: 2.0553 - regression_loss: 1.7301 - classification_loss: 0.3252 146/500 [=======>......................] - ETA: 1:27 - loss: 2.0548 - regression_loss: 1.7295 - classification_loss: 0.3253 147/500 [=======>......................] - ETA: 1:26 - loss: 2.0562 - regression_loss: 1.7308 - classification_loss: 0.3254 148/500 [=======>......................] - ETA: 1:26 - loss: 2.0495 - regression_loss: 1.7254 - classification_loss: 0.3241 149/500 [=======>......................] - ETA: 1:26 - loss: 2.0477 - regression_loss: 1.7239 - classification_loss: 0.3238 150/500 [========>.....................] - ETA: 1:26 - loss: 2.0454 - regression_loss: 1.7226 - classification_loss: 0.3228 151/500 [========>.....................] - ETA: 1:25 - loss: 2.0465 - regression_loss: 1.7242 - classification_loss: 0.3223 152/500 [========>.....................] - ETA: 1:25 - loss: 2.0439 - regression_loss: 1.7223 - classification_loss: 0.3216 153/500 [========>.....................] - ETA: 1:25 - loss: 2.0467 - regression_loss: 1.7250 - classification_loss: 0.3216 154/500 [========>.....................] - ETA: 1:25 - loss: 2.0467 - regression_loss: 1.7255 - classification_loss: 0.3213 155/500 [========>.....................] - ETA: 1:25 - loss: 2.0426 - regression_loss: 1.7225 - classification_loss: 0.3201 156/500 [========>.....................] - ETA: 1:24 - loss: 2.0419 - regression_loss: 1.7222 - classification_loss: 0.3197 157/500 [========>.....................] - ETA: 1:24 - loss: 2.0390 - regression_loss: 1.7199 - classification_loss: 0.3191 158/500 [========>.....................] - ETA: 1:24 - loss: 2.0372 - regression_loss: 1.7183 - classification_loss: 0.3188 159/500 [========>.....................] - ETA: 1:24 - loss: 2.0395 - regression_loss: 1.7201 - classification_loss: 0.3193 160/500 [========>.....................] - ETA: 1:23 - loss: 2.0367 - regression_loss: 1.7178 - classification_loss: 0.3189 161/500 [========>.....................] 
- ETA: 1:23 - loss: 2.0341 - regression_loss: 1.7155 - classification_loss: 0.3186 162/500 [========>.....................] - ETA: 1:23 - loss: 2.0407 - regression_loss: 1.7209 - classification_loss: 0.3198 163/500 [========>.....................] - ETA: 1:23 - loss: 2.0497 - regression_loss: 1.7279 - classification_loss: 0.3218 164/500 [========>.....................] - ETA: 1:22 - loss: 2.0503 - regression_loss: 1.7283 - classification_loss: 0.3220 165/500 [========>.....................] - ETA: 1:22 - loss: 2.0491 - regression_loss: 1.7272 - classification_loss: 0.3219 166/500 [========>.....................] - ETA: 1:22 - loss: 2.0528 - regression_loss: 1.7299 - classification_loss: 0.3229 167/500 [=========>....................] - ETA: 1:22 - loss: 2.0536 - regression_loss: 1.7305 - classification_loss: 0.3231 168/500 [=========>....................] - ETA: 1:21 - loss: 2.0527 - regression_loss: 1.7305 - classification_loss: 0.3222 169/500 [=========>....................] - ETA: 1:21 - loss: 2.0477 - regression_loss: 1.7265 - classification_loss: 0.3212 170/500 [=========>....................] - ETA: 1:21 - loss: 2.0440 - regression_loss: 1.7237 - classification_loss: 0.3203 171/500 [=========>....................] - ETA: 1:21 - loss: 2.0421 - regression_loss: 1.7224 - classification_loss: 0.3197 172/500 [=========>....................] - ETA: 1:20 - loss: 2.0419 - regression_loss: 1.7225 - classification_loss: 0.3194 173/500 [=========>....................] - ETA: 1:20 - loss: 2.0401 - regression_loss: 1.7209 - classification_loss: 0.3192 174/500 [=========>....................] - ETA: 1:20 - loss: 2.0389 - regression_loss: 1.7200 - classification_loss: 0.3188 175/500 [=========>....................] - ETA: 1:20 - loss: 2.0387 - regression_loss: 1.7201 - classification_loss: 0.3186 176/500 [=========>....................] - ETA: 1:19 - loss: 2.0443 - regression_loss: 1.7246 - classification_loss: 0.3197 177/500 [=========>....................] 
- ETA: 1:19 - loss: 2.0418 - regression_loss: 1.7224 - classification_loss: 0.3193 178/500 [=========>....................] - ETA: 1:19 - loss: 2.0421 - regression_loss: 1.7230 - classification_loss: 0.3191 179/500 [=========>....................] - ETA: 1:19 - loss: 2.0398 - regression_loss: 1.7212 - classification_loss: 0.3186 180/500 [=========>....................] - ETA: 1:18 - loss: 2.0430 - regression_loss: 1.7234 - classification_loss: 0.3196 181/500 [=========>....................] - ETA: 1:18 - loss: 2.0436 - regression_loss: 1.7243 - classification_loss: 0.3194 182/500 [=========>....................] - ETA: 1:18 - loss: 2.0420 - regression_loss: 1.7148 - classification_loss: 0.3272 183/500 [=========>....................] - ETA: 1:18 - loss: 2.0405 - regression_loss: 1.7139 - classification_loss: 0.3266 184/500 [==========>...................] - ETA: 1:17 - loss: 2.0418 - regression_loss: 1.7150 - classification_loss: 0.3267 185/500 [==========>...................] - ETA: 1:17 - loss: 2.0409 - regression_loss: 1.7133 - classification_loss: 0.3277 186/500 [==========>...................] - ETA: 1:17 - loss: 2.0400 - regression_loss: 1.7126 - classification_loss: 0.3274 187/500 [==========>...................] - ETA: 1:17 - loss: 2.0395 - regression_loss: 1.7127 - classification_loss: 0.3268 188/500 [==========>...................] - ETA: 1:16 - loss: 2.0364 - regression_loss: 1.7102 - classification_loss: 0.3262 189/500 [==========>...................] - ETA: 1:16 - loss: 2.0349 - regression_loss: 1.7090 - classification_loss: 0.3259 190/500 [==========>...................] - ETA: 1:16 - loss: 2.0406 - regression_loss: 1.7131 - classification_loss: 0.3274 191/500 [==========>...................] - ETA: 1:16 - loss: 2.0408 - regression_loss: 1.7131 - classification_loss: 0.3277 192/500 [==========>...................] - ETA: 1:15 - loss: 2.0436 - regression_loss: 1.7152 - classification_loss: 0.3284 193/500 [==========>...................] 
- ETA: 1:15 - loss: 2.0433 - regression_loss: 1.7153 - classification_loss: 0.3279 194/500 [==========>...................] - ETA: 1:15 - loss: 2.0411 - regression_loss: 1.7132 - classification_loss: 0.3280 195/500 [==========>...................] - ETA: 1:15 - loss: 2.0394 - regression_loss: 1.7115 - classification_loss: 0.3278 196/500 [==========>...................] - ETA: 1:14 - loss: 2.0419 - regression_loss: 1.7137 - classification_loss: 0.3282 197/500 [==========>...................] - ETA: 1:14 - loss: 2.0367 - regression_loss: 1.7097 - classification_loss: 0.3270 198/500 [==========>...................] - ETA: 1:14 - loss: 2.0348 - regression_loss: 1.7084 - classification_loss: 0.3264 199/500 [==========>...................] - ETA: 1:14 - loss: 2.0382 - regression_loss: 1.7110 - classification_loss: 0.3272 200/500 [===========>..................] - ETA: 1:13 - loss: 2.0383 - regression_loss: 1.7113 - classification_loss: 0.3270 201/500 [===========>..................] - ETA: 1:13 - loss: 2.0385 - regression_loss: 1.7118 - classification_loss: 0.3267 202/500 [===========>..................] - ETA: 1:13 - loss: 2.0364 - regression_loss: 1.7103 - classification_loss: 0.3262 203/500 [===========>..................] - ETA: 1:13 - loss: 2.0372 - regression_loss: 1.7104 - classification_loss: 0.3268 204/500 [===========>..................] - ETA: 1:12 - loss: 2.0396 - regression_loss: 1.7128 - classification_loss: 0.3267 205/500 [===========>..................] - ETA: 1:12 - loss: 2.0401 - regression_loss: 1.7133 - classification_loss: 0.3269 206/500 [===========>..................] - ETA: 1:12 - loss: 2.0359 - regression_loss: 1.7095 - classification_loss: 0.3264 207/500 [===========>..................] - ETA: 1:12 - loss: 2.0402 - regression_loss: 1.7125 - classification_loss: 0.3277 208/500 [===========>..................] - ETA: 1:11 - loss: 2.0549 - regression_loss: 1.7187 - classification_loss: 0.3362 209/500 [===========>..................] 
- ETA: 1:11 - loss: 2.0543 - regression_loss: 1.7185 - classification_loss: 0.3358 210/500 [===========>..................] - ETA: 1:11 - loss: 2.0581 - regression_loss: 1.7220 - classification_loss: 0.3360 211/500 [===========>..................] - ETA: 1:11 - loss: 2.0566 - regression_loss: 1.7209 - classification_loss: 0.3356 212/500 [===========>..................] - ETA: 1:10 - loss: 2.0551 - regression_loss: 1.7201 - classification_loss: 0.3350 213/500 [===========>..................] - ETA: 1:10 - loss: 2.0529 - regression_loss: 1.7185 - classification_loss: 0.3344 214/500 [===========>..................] - ETA: 1:10 - loss: 2.0499 - regression_loss: 1.7160 - classification_loss: 0.3339 215/500 [===========>..................] - ETA: 1:10 - loss: 2.0441 - regression_loss: 1.7113 - classification_loss: 0.3328 216/500 [===========>..................] - ETA: 1:09 - loss: 2.0472 - regression_loss: 1.7131 - classification_loss: 0.3341 217/500 [============>.................] - ETA: 1:09 - loss: 2.0444 - regression_loss: 1.7109 - classification_loss: 0.3335 218/500 [============>.................] - ETA: 1:09 - loss: 2.0447 - regression_loss: 1.7108 - classification_loss: 0.3340 219/500 [============>.................] - ETA: 1:09 - loss: 2.0435 - regression_loss: 1.7102 - classification_loss: 0.3333 220/500 [============>.................] - ETA: 1:08 - loss: 2.0406 - regression_loss: 1.7078 - classification_loss: 0.3328 221/500 [============>.................] - ETA: 1:08 - loss: 2.0383 - regression_loss: 1.7060 - classification_loss: 0.3323 222/500 [============>.................] - ETA: 1:08 - loss: 2.0340 - regression_loss: 1.7025 - classification_loss: 0.3315 223/500 [============>.................] - ETA: 1:08 - loss: 2.0379 - regression_loss: 1.7056 - classification_loss: 0.3323 224/500 [============>.................] - ETA: 1:07 - loss: 2.0399 - regression_loss: 1.7075 - classification_loss: 0.3324 225/500 [============>.................] 
- ETA: 1:07 - loss: 2.0389 - regression_loss: 1.7070 - classification_loss: 0.3319 226/500 [============>.................] - ETA: 1:07 - loss: 2.0393 - regression_loss: 1.7076 - classification_loss: 0.3317 227/500 [============>.................] - ETA: 1:07 - loss: 2.0386 - regression_loss: 1.7069 - classification_loss: 0.3317 228/500 [============>.................] - ETA: 1:06 - loss: 2.0417 - regression_loss: 1.7101 - classification_loss: 0.3316 229/500 [============>.................] - ETA: 1:06 - loss: 2.0451 - regression_loss: 1.7128 - classification_loss: 0.3324 230/500 [============>.................] - ETA: 1:06 - loss: 2.0468 - regression_loss: 1.7141 - classification_loss: 0.3328 231/500 [============>.................] - ETA: 1:06 - loss: 2.0456 - regression_loss: 1.7131 - classification_loss: 0.3325 232/500 [============>.................] - ETA: 1:06 - loss: 2.0449 - regression_loss: 1.7121 - classification_loss: 0.3328 233/500 [============>.................] - ETA: 1:05 - loss: 2.0448 - regression_loss: 1.7122 - classification_loss: 0.3326 234/500 [=============>................] - ETA: 1:05 - loss: 2.0460 - regression_loss: 1.7129 - classification_loss: 0.3330 235/500 [=============>................] - ETA: 1:05 - loss: 2.0464 - regression_loss: 1.7136 - classification_loss: 0.3328 236/500 [=============>................] - ETA: 1:05 - loss: 2.0449 - regression_loss: 1.7123 - classification_loss: 0.3326 237/500 [=============>................] - ETA: 1:04 - loss: 2.0449 - regression_loss: 1.7127 - classification_loss: 0.3322 238/500 [=============>................] - ETA: 1:04 - loss: 2.0464 - regression_loss: 1.7141 - classification_loss: 0.3323 239/500 [=============>................] - ETA: 1:04 - loss: 2.0434 - regression_loss: 1.7118 - classification_loss: 0.3316 240/500 [=============>................] - ETA: 1:04 - loss: 2.0413 - regression_loss: 1.7100 - classification_loss: 0.3313 241/500 [=============>................] 
- ETA: 1:03 - loss: 2.0464 - regression_loss: 1.7140 - classification_loss: 0.3325 242/500 [=============>................] - ETA: 1:03 - loss: 2.0446 - regression_loss: 1.7126 - classification_loss: 0.3320 243/500 [=============>................] - ETA: 1:03 - loss: 2.0450 - regression_loss: 1.7133 - classification_loss: 0.3318 244/500 [=============>................] - ETA: 1:03 - loss: 2.0439 - regression_loss: 1.7125 - classification_loss: 0.3314 245/500 [=============>................] - ETA: 1:02 - loss: 2.0424 - regression_loss: 1.7114 - classification_loss: 0.3310 246/500 [=============>................] - ETA: 1:02 - loss: 2.0438 - regression_loss: 1.7126 - classification_loss: 0.3312 247/500 [=============>................] - ETA: 1:02 - loss: 2.0433 - regression_loss: 1.7124 - classification_loss: 0.3310 248/500 [=============>................] - ETA: 1:02 - loss: 2.0419 - regression_loss: 1.7114 - classification_loss: 0.3305 249/500 [=============>................] - ETA: 1:01 - loss: 2.0381 - regression_loss: 1.7083 - classification_loss: 0.3298 250/500 [==============>...............] - ETA: 1:01 - loss: 2.0388 - regression_loss: 1.7088 - classification_loss: 0.3300 251/500 [==============>...............] - ETA: 1:01 - loss: 2.0380 - regression_loss: 1.7085 - classification_loss: 0.3295 252/500 [==============>...............] - ETA: 1:01 - loss: 2.0359 - regression_loss: 1.7068 - classification_loss: 0.3291 253/500 [==============>...............] - ETA: 1:00 - loss: 2.0368 - regression_loss: 1.7077 - classification_loss: 0.3291 254/500 [==============>...............] - ETA: 1:00 - loss: 2.0357 - regression_loss: 1.7070 - classification_loss: 0.3287 255/500 [==============>...............] - ETA: 1:00 - loss: 2.0355 - regression_loss: 1.7071 - classification_loss: 0.3284 256/500 [==============>...............] - ETA: 1:00 - loss: 2.0338 - regression_loss: 1.7060 - classification_loss: 0.3278 257/500 [==============>...............] 
- ETA: 59s - loss: 2.0335 - regression_loss: 1.7058 - classification_loss: 0.3276  258/500 [==============>...............] - ETA: 59s - loss: 2.0370 - regression_loss: 1.7089 - classification_loss: 0.3281 259/500 [==============>...............] - ETA: 59s - loss: 2.0369 - regression_loss: 1.7090 - classification_loss: 0.3279 260/500 [==============>...............] - ETA: 59s - loss: 2.0379 - regression_loss: 1.7097 - classification_loss: 0.3283 261/500 [==============>...............] - ETA: 58s - loss: 2.0422 - regression_loss: 1.7136 - classification_loss: 0.3286 262/500 [==============>...............] - ETA: 58s - loss: 2.0418 - regression_loss: 1.7132 - classification_loss: 0.3286 263/500 [==============>...............] - ETA: 58s - loss: 2.0480 - regression_loss: 1.7184 - classification_loss: 0.3295 264/500 [==============>...............] - ETA: 57s - loss: 2.0462 - regression_loss: 1.7170 - classification_loss: 0.3292 265/500 [==============>...............] - ETA: 57s - loss: 2.0460 - regression_loss: 1.7170 - classification_loss: 0.3290 266/500 [==============>...............] - ETA: 57s - loss: 2.0449 - regression_loss: 1.7164 - classification_loss: 0.3285 267/500 [===============>..............] - ETA: 57s - loss: 2.0467 - regression_loss: 1.7178 - classification_loss: 0.3290 268/500 [===============>..............] - ETA: 57s - loss: 2.0450 - regression_loss: 1.7165 - classification_loss: 0.3285 269/500 [===============>..............] - ETA: 56s - loss: 2.0450 - regression_loss: 1.7165 - classification_loss: 0.3284 270/500 [===============>..............] - ETA: 56s - loss: 2.0433 - regression_loss: 1.7153 - classification_loss: 0.3280 271/500 [===============>..............] - ETA: 56s - loss: 2.0459 - regression_loss: 1.7174 - classification_loss: 0.3285 272/500 [===============>..............] - ETA: 56s - loss: 2.0480 - regression_loss: 1.7191 - classification_loss: 0.3289 273/500 [===============>..............] 
- ETA: 55s - loss: 2.0474 - regression_loss: 1.7185 - classification_loss: 0.3289 274/500 [===============>..............] - ETA: 55s - loss: 2.0446 - regression_loss: 1.7161 - classification_loss: 0.3285 275/500 [===============>..............] - ETA: 55s - loss: 2.0444 - regression_loss: 1.7162 - classification_loss: 0.3282 276/500 [===============>..............] - ETA: 55s - loss: 2.0553 - regression_loss: 1.7235 - classification_loss: 0.3319 277/500 [===============>..............] - ETA: 54s - loss: 2.0546 - regression_loss: 1.7231 - classification_loss: 0.3315 278/500 [===============>..............] - ETA: 54s - loss: 2.0541 - regression_loss: 1.7230 - classification_loss: 0.3311 279/500 [===============>..............] - ETA: 54s - loss: 2.0492 - regression_loss: 1.7187 - classification_loss: 0.3305 280/500 [===============>..............] - ETA: 54s - loss: 2.0486 - regression_loss: 1.7185 - classification_loss: 0.3301 281/500 [===============>..............] - ETA: 53s - loss: 2.0489 - regression_loss: 1.7188 - classification_loss: 0.3301 282/500 [===============>..............] - ETA: 53s - loss: 2.0459 - regression_loss: 1.7163 - classification_loss: 0.3296 283/500 [===============>..............] - ETA: 53s - loss: 2.0435 - regression_loss: 1.7144 - classification_loss: 0.3291 284/500 [================>.............] - ETA: 53s - loss: 2.0430 - regression_loss: 1.7140 - classification_loss: 0.3289 285/500 [================>.............] - ETA: 52s - loss: 2.0411 - regression_loss: 1.7124 - classification_loss: 0.3286 286/500 [================>.............] - ETA: 52s - loss: 2.0411 - regression_loss: 1.7122 - classification_loss: 0.3288 287/500 [================>.............] - ETA: 52s - loss: 2.0402 - regression_loss: 1.7115 - classification_loss: 0.3288 288/500 [================>.............] - ETA: 52s - loss: 2.0418 - regression_loss: 1.7134 - classification_loss: 0.3284 289/500 [================>.............] 
- ETA: 51s - loss: 2.0420 - regression_loss: 1.7137 - classification_loss: 0.3283 290/500 [================>.............] - ETA: 51s - loss: 2.0412 - regression_loss: 1.7131 - classification_loss: 0.3281 291/500 [================>.............] - ETA: 51s - loss: 2.0417 - regression_loss: 1.7138 - classification_loss: 0.3279 292/500 [================>.............] - ETA: 51s - loss: 2.0411 - regression_loss: 1.7131 - classification_loss: 0.3280 293/500 [================>.............] - ETA: 50s - loss: 2.0419 - regression_loss: 1.7139 - classification_loss: 0.3280 294/500 [================>.............] - ETA: 50s - loss: 2.0401 - regression_loss: 1.7125 - classification_loss: 0.3276 295/500 [================>.............] - ETA: 50s - loss: 2.0386 - regression_loss: 1.7114 - classification_loss: 0.3273 296/500 [================>.............] - ETA: 50s - loss: 2.0361 - regression_loss: 1.7093 - classification_loss: 0.3267 297/500 [================>.............] - ETA: 49s - loss: 2.0337 - regression_loss: 1.7077 - classification_loss: 0.3260 298/500 [================>.............] - ETA: 49s - loss: 2.0339 - regression_loss: 1.7081 - classification_loss: 0.3258 299/500 [================>.............] - ETA: 49s - loss: 2.0301 - regression_loss: 1.7050 - classification_loss: 0.3252 300/500 [=================>............] - ETA: 49s - loss: 2.0297 - regression_loss: 1.7048 - classification_loss: 0.3248 301/500 [=================>............] - ETA: 48s - loss: 2.0298 - regression_loss: 1.7048 - classification_loss: 0.3250 302/500 [=================>............] - ETA: 48s - loss: 2.0323 - regression_loss: 1.7070 - classification_loss: 0.3253 303/500 [=================>............] - ETA: 48s - loss: 2.0306 - regression_loss: 1.7058 - classification_loss: 0.3248 304/500 [=================>............] - ETA: 48s - loss: 2.0308 - regression_loss: 1.7059 - classification_loss: 0.3249 305/500 [=================>............] 
- ETA: 47s - loss: 2.0302 - regression_loss: 1.7057 - classification_loss: 0.3246 306/500 [=================>............] - ETA: 47s - loss: 2.0305 - regression_loss: 1.7058 - classification_loss: 0.3246 307/500 [=================>............] - ETA: 47s - loss: 2.0300 - regression_loss: 1.7057 - classification_loss: 0.3243 308/500 [=================>............] - ETA: 47s - loss: 2.0286 - regression_loss: 1.7049 - classification_loss: 0.3237 309/500 [=================>............] - ETA: 47s - loss: 2.0269 - regression_loss: 1.7037 - classification_loss: 0.3233 310/500 [=================>............] - ETA: 46s - loss: 2.0248 - regression_loss: 1.7020 - classification_loss: 0.3228 311/500 [=================>............] - ETA: 46s - loss: 2.0271 - regression_loss: 1.7042 - classification_loss: 0.3230 312/500 [=================>............] - ETA: 46s - loss: 2.0257 - regression_loss: 1.7032 - classification_loss: 0.3226 313/500 [=================>............] - ETA: 46s - loss: 2.0237 - regression_loss: 1.7016 - classification_loss: 0.3221 314/500 [=================>............] - ETA: 45s - loss: 2.0226 - regression_loss: 1.7010 - classification_loss: 0.3217 315/500 [=================>............] - ETA: 45s - loss: 2.0248 - regression_loss: 1.7027 - classification_loss: 0.3221 316/500 [=================>............] - ETA: 45s - loss: 2.0249 - regression_loss: 1.7030 - classification_loss: 0.3219 317/500 [==================>...........] - ETA: 45s - loss: 2.0277 - regression_loss: 1.7056 - classification_loss: 0.3221 318/500 [==================>...........] - ETA: 44s - loss: 2.0290 - regression_loss: 1.7070 - classification_loss: 0.3220 319/500 [==================>...........] - ETA: 44s - loss: 2.0287 - regression_loss: 1.7071 - classification_loss: 0.3216 320/500 [==================>...........] - ETA: 44s - loss: 2.0283 - regression_loss: 1.7068 - classification_loss: 0.3215 321/500 [==================>...........] 
- ETA: 44s - loss: 2.0281 - regression_loss: 1.7069 - classification_loss: 0.3212 322/500 [==================>...........] - ETA: 43s - loss: 2.0289 - regression_loss: 1.7078 - classification_loss: 0.3211 323/500 [==================>...........] - ETA: 43s - loss: 2.0274 - regression_loss: 1.7066 - classification_loss: 0.3208 324/500 [==================>...........] - ETA: 43s - loss: 2.0250 - regression_loss: 1.7048 - classification_loss: 0.3202 325/500 [==================>...........] - ETA: 43s - loss: 2.0230 - regression_loss: 1.7032 - classification_loss: 0.3197 326/500 [==================>...........] - ETA: 42s - loss: 2.0224 - regression_loss: 1.7027 - classification_loss: 0.3197 327/500 [==================>...........] - ETA: 42s - loss: 2.0277 - regression_loss: 1.7056 - classification_loss: 0.3220 328/500 [==================>...........] - ETA: 42s - loss: 2.0292 - regression_loss: 1.7071 - classification_loss: 0.3221 329/500 [==================>...........] - ETA: 42s - loss: 2.0280 - regression_loss: 1.7064 - classification_loss: 0.3216 330/500 [==================>...........] - ETA: 41s - loss: 2.0252 - regression_loss: 1.7041 - classification_loss: 0.3211 331/500 [==================>...........] - ETA: 41s - loss: 2.0246 - regression_loss: 1.7038 - classification_loss: 0.3208 332/500 [==================>...........] - ETA: 41s - loss: 2.0279 - regression_loss: 1.7066 - classification_loss: 0.3214 333/500 [==================>...........] - ETA: 41s - loss: 2.0290 - regression_loss: 1.7074 - classification_loss: 0.3216 334/500 [===================>..........] - ETA: 40s - loss: 2.0272 - regression_loss: 1.7060 - classification_loss: 0.3212 335/500 [===================>..........] - ETA: 40s - loss: 2.0253 - regression_loss: 1.7045 - classification_loss: 0.3208 336/500 [===================>..........] - ETA: 40s - loss: 2.0225 - regression_loss: 1.7023 - classification_loss: 0.3202 337/500 [===================>..........] 
[per-batch progress output for steps 337–499 of epoch 4 omitted]
500/500 [==============================] - 124s 247ms/step - loss: 2.0069 - regression_loss: 1.6954 - classification_loss: 0.3116
326 instances of class plum with average precision: 0.7597
mAP: 0.7597
Epoch 00004: saving model to ./training/snapshots/resnet50_pascal_04.h5
Epoch 5/150
[per-batch progress output for steps 1–172 of epoch 5 omitted; epoch in progress]
- ETA: 1:20 - loss: 1.9934 - regression_loss: 1.6560 - classification_loss: 0.3374 173/500 [=========>....................] - ETA: 1:20 - loss: 1.9930 - regression_loss: 1.6556 - classification_loss: 0.3374 174/500 [=========>....................] - ETA: 1:20 - loss: 1.9919 - regression_loss: 1.6546 - classification_loss: 0.3373 175/500 [=========>....................] - ETA: 1:20 - loss: 1.9919 - regression_loss: 1.6545 - classification_loss: 0.3374 176/500 [=========>....................] - ETA: 1:19 - loss: 1.9878 - regression_loss: 1.6511 - classification_loss: 0.3367 177/500 [=========>....................] - ETA: 1:19 - loss: 1.9864 - regression_loss: 1.6503 - classification_loss: 0.3362 178/500 [=========>....................] - ETA: 1:19 - loss: 1.9912 - regression_loss: 1.6547 - classification_loss: 0.3364 179/500 [=========>....................] - ETA: 1:19 - loss: 1.9836 - regression_loss: 1.6485 - classification_loss: 0.3351 180/500 [=========>....................] - ETA: 1:18 - loss: 1.9849 - regression_loss: 1.6498 - classification_loss: 0.3350 181/500 [=========>....................] - ETA: 1:18 - loss: 1.9815 - regression_loss: 1.6471 - classification_loss: 0.3344 182/500 [=========>....................] - ETA: 1:18 - loss: 1.9821 - regression_loss: 1.6481 - classification_loss: 0.3341 183/500 [=========>....................] - ETA: 1:18 - loss: 1.9820 - regression_loss: 1.6480 - classification_loss: 0.3341 184/500 [==========>...................] - ETA: 1:17 - loss: 1.9825 - regression_loss: 1.6487 - classification_loss: 0.3338 185/500 [==========>...................] - ETA: 1:17 - loss: 1.9878 - regression_loss: 1.6534 - classification_loss: 0.3344 186/500 [==========>...................] - ETA: 1:17 - loss: 1.9844 - regression_loss: 1.6510 - classification_loss: 0.3334 187/500 [==========>...................] - ETA: 1:17 - loss: 1.9904 - regression_loss: 1.6558 - classification_loss: 0.3346 188/500 [==========>...................] 
- ETA: 1:16 - loss: 1.9900 - regression_loss: 1.6557 - classification_loss: 0.3343 189/500 [==========>...................] - ETA: 1:16 - loss: 1.9925 - regression_loss: 1.6576 - classification_loss: 0.3349 190/500 [==========>...................] - ETA: 1:16 - loss: 1.9918 - regression_loss: 1.6569 - classification_loss: 0.3349 191/500 [==========>...................] - ETA: 1:16 - loss: 1.9898 - regression_loss: 1.6558 - classification_loss: 0.3341 192/500 [==========>...................] - ETA: 1:15 - loss: 1.9896 - regression_loss: 1.6559 - classification_loss: 0.3337 193/500 [==========>...................] - ETA: 1:15 - loss: 1.9864 - regression_loss: 1.6533 - classification_loss: 0.3331 194/500 [==========>...................] - ETA: 1:15 - loss: 1.9852 - regression_loss: 1.6523 - classification_loss: 0.3329 195/500 [==========>...................] - ETA: 1:15 - loss: 1.9825 - regression_loss: 1.6503 - classification_loss: 0.3323 196/500 [==========>...................] - ETA: 1:14 - loss: 1.9797 - regression_loss: 1.6483 - classification_loss: 0.3314 197/500 [==========>...................] - ETA: 1:14 - loss: 1.9728 - regression_loss: 1.6427 - classification_loss: 0.3301 198/500 [==========>...................] - ETA: 1:14 - loss: 1.9718 - regression_loss: 1.6419 - classification_loss: 0.3299 199/500 [==========>...................] - ETA: 1:14 - loss: 1.9741 - regression_loss: 1.6442 - classification_loss: 0.3299 200/500 [===========>..................] - ETA: 1:14 - loss: 1.9732 - regression_loss: 1.6437 - classification_loss: 0.3295 201/500 [===========>..................] - ETA: 1:13 - loss: 1.9725 - regression_loss: 1.6438 - classification_loss: 0.3288 202/500 [===========>..................] - ETA: 1:13 - loss: 1.9722 - regression_loss: 1.6440 - classification_loss: 0.3282 203/500 [===========>..................] - ETA: 1:13 - loss: 1.9714 - regression_loss: 1.6436 - classification_loss: 0.3278 204/500 [===========>..................] 
- ETA: 1:13 - loss: 1.9716 - regression_loss: 1.6440 - classification_loss: 0.3277 205/500 [===========>..................] - ETA: 1:12 - loss: 1.9704 - regression_loss: 1.6428 - classification_loss: 0.3276 206/500 [===========>..................] - ETA: 1:12 - loss: 1.9767 - regression_loss: 1.6482 - classification_loss: 0.3285 207/500 [===========>..................] - ETA: 1:12 - loss: 1.9756 - regression_loss: 1.6472 - classification_loss: 0.3284 208/500 [===========>..................] - ETA: 1:11 - loss: 1.9753 - regression_loss: 1.6474 - classification_loss: 0.3279 209/500 [===========>..................] - ETA: 1:11 - loss: 1.9730 - regression_loss: 1.6455 - classification_loss: 0.3276 210/500 [===========>..................] - ETA: 1:11 - loss: 1.9708 - regression_loss: 1.6441 - classification_loss: 0.3268 211/500 [===========>..................] - ETA: 1:11 - loss: 1.9707 - regression_loss: 1.6440 - classification_loss: 0.3267 212/500 [===========>..................] - ETA: 1:11 - loss: 1.9676 - regression_loss: 1.6416 - classification_loss: 0.3260 213/500 [===========>..................] - ETA: 1:10 - loss: 1.9671 - regression_loss: 1.6414 - classification_loss: 0.3257 214/500 [===========>..................] - ETA: 1:10 - loss: 1.9673 - regression_loss: 1.6416 - classification_loss: 0.3258 215/500 [===========>..................] - ETA: 1:10 - loss: 1.9674 - regression_loss: 1.6416 - classification_loss: 0.3258 216/500 [===========>..................] - ETA: 1:10 - loss: 1.9702 - regression_loss: 1.6441 - classification_loss: 0.3262 217/500 [============>.................] - ETA: 1:09 - loss: 1.9707 - regression_loss: 1.6448 - classification_loss: 0.3259 218/500 [============>.................] - ETA: 1:09 - loss: 1.9692 - regression_loss: 1.6437 - classification_loss: 0.3255 219/500 [============>.................] - ETA: 1:09 - loss: 1.9742 - regression_loss: 1.6477 - classification_loss: 0.3265 220/500 [============>.................] 
- ETA: 1:09 - loss: 1.9714 - regression_loss: 1.6456 - classification_loss: 0.3258 221/500 [============>.................] - ETA: 1:08 - loss: 1.9715 - regression_loss: 1.6458 - classification_loss: 0.3257 222/500 [============>.................] - ETA: 1:08 - loss: 1.9708 - regression_loss: 1.6456 - classification_loss: 0.3252 223/500 [============>.................] - ETA: 1:08 - loss: 1.9757 - regression_loss: 1.6508 - classification_loss: 0.3249 224/500 [============>.................] - ETA: 1:08 - loss: 1.9755 - regression_loss: 1.6509 - classification_loss: 0.3246 225/500 [============>.................] - ETA: 1:07 - loss: 1.9765 - regression_loss: 1.6520 - classification_loss: 0.3245 226/500 [============>.................] - ETA: 1:07 - loss: 1.9749 - regression_loss: 1.6510 - classification_loss: 0.3240 227/500 [============>.................] - ETA: 1:07 - loss: 1.9736 - regression_loss: 1.6501 - classification_loss: 0.3235 228/500 [============>.................] - ETA: 1:07 - loss: 1.9701 - regression_loss: 1.6474 - classification_loss: 0.3227 229/500 [============>.................] - ETA: 1:06 - loss: 1.9701 - regression_loss: 1.6477 - classification_loss: 0.3225 230/500 [============>.................] - ETA: 1:06 - loss: 1.9678 - regression_loss: 1.6459 - classification_loss: 0.3219 231/500 [============>.................] - ETA: 1:06 - loss: 1.9680 - regression_loss: 1.6465 - classification_loss: 0.3215 232/500 [============>.................] - ETA: 1:06 - loss: 1.9705 - regression_loss: 1.6485 - classification_loss: 0.3220 233/500 [============>.................] - ETA: 1:05 - loss: 1.9725 - regression_loss: 1.6503 - classification_loss: 0.3222 234/500 [=============>................] - ETA: 1:05 - loss: 1.9715 - regression_loss: 1.6497 - classification_loss: 0.3218 235/500 [=============>................] - ETA: 1:05 - loss: 1.9702 - regression_loss: 1.6489 - classification_loss: 0.3213 236/500 [=============>................] 
- ETA: 1:05 - loss: 1.9725 - regression_loss: 1.6508 - classification_loss: 0.3217 237/500 [=============>................] - ETA: 1:04 - loss: 1.9695 - regression_loss: 1.6484 - classification_loss: 0.3211 238/500 [=============>................] - ETA: 1:04 - loss: 1.9687 - regression_loss: 1.6479 - classification_loss: 0.3208 239/500 [=============>................] - ETA: 1:04 - loss: 1.9663 - regression_loss: 1.6460 - classification_loss: 0.3203 240/500 [=============>................] - ETA: 1:04 - loss: 1.9678 - regression_loss: 1.6474 - classification_loss: 0.3203 241/500 [=============>................] - ETA: 1:03 - loss: 1.9699 - regression_loss: 1.6494 - classification_loss: 0.3205 242/500 [=============>................] - ETA: 1:03 - loss: 1.9691 - regression_loss: 1.6489 - classification_loss: 0.3201 243/500 [=============>................] - ETA: 1:03 - loss: 1.9690 - regression_loss: 1.6490 - classification_loss: 0.3201 244/500 [=============>................] - ETA: 1:03 - loss: 1.9685 - regression_loss: 1.6485 - classification_loss: 0.3199 245/500 [=============>................] - ETA: 1:02 - loss: 1.9671 - regression_loss: 1.6477 - classification_loss: 0.3195 246/500 [=============>................] - ETA: 1:02 - loss: 1.9649 - regression_loss: 1.6462 - classification_loss: 0.3187 247/500 [=============>................] - ETA: 1:02 - loss: 1.9649 - regression_loss: 1.6465 - classification_loss: 0.3184 248/500 [=============>................] - ETA: 1:02 - loss: 1.9681 - regression_loss: 1.6490 - classification_loss: 0.3191 249/500 [=============>................] - ETA: 1:01 - loss: 1.9687 - regression_loss: 1.6494 - classification_loss: 0.3193 250/500 [==============>...............] - ETA: 1:01 - loss: 1.9685 - regression_loss: 1.6499 - classification_loss: 0.3186 251/500 [==============>...............] - ETA: 1:01 - loss: 1.9671 - regression_loss: 1.6486 - classification_loss: 0.3185 252/500 [==============>...............] 
- ETA: 1:01 - loss: 1.9675 - regression_loss: 1.6493 - classification_loss: 0.3182 253/500 [==============>...............] - ETA: 1:00 - loss: 1.9660 - regression_loss: 1.6482 - classification_loss: 0.3178 254/500 [==============>...............] - ETA: 1:00 - loss: 1.9636 - regression_loss: 1.6463 - classification_loss: 0.3172 255/500 [==============>...............] - ETA: 1:00 - loss: 1.9650 - regression_loss: 1.6479 - classification_loss: 0.3171 256/500 [==============>...............] - ETA: 1:00 - loss: 1.9635 - regression_loss: 1.6468 - classification_loss: 0.3167 257/500 [==============>...............] - ETA: 1:00 - loss: 1.9631 - regression_loss: 1.6467 - classification_loss: 0.3164 258/500 [==============>...............] - ETA: 59s - loss: 1.9627 - regression_loss: 1.6466 - classification_loss: 0.3160  259/500 [==============>...............] - ETA: 59s - loss: 1.9597 - regression_loss: 1.6444 - classification_loss: 0.3152 260/500 [==============>...............] - ETA: 59s - loss: 1.9582 - regression_loss: 1.6434 - classification_loss: 0.3148 261/500 [==============>...............] - ETA: 59s - loss: 1.9596 - regression_loss: 1.6445 - classification_loss: 0.3150 262/500 [==============>...............] - ETA: 58s - loss: 1.9595 - regression_loss: 1.6446 - classification_loss: 0.3148 263/500 [==============>...............] - ETA: 58s - loss: 1.9591 - regression_loss: 1.6444 - classification_loss: 0.3147 264/500 [==============>...............] - ETA: 58s - loss: 1.9554 - regression_loss: 1.6414 - classification_loss: 0.3140 265/500 [==============>...............] - ETA: 58s - loss: 1.9536 - regression_loss: 1.6400 - classification_loss: 0.3135 266/500 [==============>...............] - ETA: 57s - loss: 1.9592 - regression_loss: 1.6441 - classification_loss: 0.3151 267/500 [===============>..............] - ETA: 57s - loss: 1.9590 - regression_loss: 1.6440 - classification_loss: 0.3150 268/500 [===============>..............] 
- ETA: 57s - loss: 1.9609 - regression_loss: 1.6457 - classification_loss: 0.3152 269/500 [===============>..............] - ETA: 57s - loss: 1.9613 - regression_loss: 1.6462 - classification_loss: 0.3151 270/500 [===============>..............] - ETA: 56s - loss: 1.9614 - regression_loss: 1.6461 - classification_loss: 0.3153 271/500 [===============>..............] - ETA: 56s - loss: 1.9655 - regression_loss: 1.6491 - classification_loss: 0.3164 272/500 [===============>..............] - ETA: 56s - loss: 1.9671 - regression_loss: 1.6504 - classification_loss: 0.3167 273/500 [===============>..............] - ETA: 56s - loss: 1.9654 - regression_loss: 1.6492 - classification_loss: 0.3163 274/500 [===============>..............] - ETA: 55s - loss: 1.9679 - regression_loss: 1.6513 - classification_loss: 0.3166 275/500 [===============>..............] - ETA: 55s - loss: 1.9699 - regression_loss: 1.6534 - classification_loss: 0.3164 276/500 [===============>..............] - ETA: 55s - loss: 1.9683 - regression_loss: 1.6522 - classification_loss: 0.3161 277/500 [===============>..............] - ETA: 55s - loss: 1.9696 - regression_loss: 1.6533 - classification_loss: 0.3163 278/500 [===============>..............] - ETA: 54s - loss: 1.9689 - regression_loss: 1.6531 - classification_loss: 0.3159 279/500 [===============>..............] - ETA: 54s - loss: 1.9684 - regression_loss: 1.6526 - classification_loss: 0.3158 280/500 [===============>..............] - ETA: 54s - loss: 1.9641 - regression_loss: 1.6489 - classification_loss: 0.3152 281/500 [===============>..............] - ETA: 54s - loss: 1.9654 - regression_loss: 1.6503 - classification_loss: 0.3151 282/500 [===============>..............] - ETA: 53s - loss: 1.9653 - regression_loss: 1.6502 - classification_loss: 0.3151 283/500 [===============>..............] - ETA: 53s - loss: 1.9644 - regression_loss: 1.6495 - classification_loss: 0.3149 284/500 [================>.............] 
- ETA: 53s - loss: 1.9636 - regression_loss: 1.6490 - classification_loss: 0.3146 285/500 [================>.............] - ETA: 53s - loss: 1.9620 - regression_loss: 1.6478 - classification_loss: 0.3142 286/500 [================>.............] - ETA: 52s - loss: 1.9622 - regression_loss: 1.6481 - classification_loss: 0.3141 287/500 [================>.............] - ETA: 52s - loss: 1.9624 - regression_loss: 1.6484 - classification_loss: 0.3141 288/500 [================>.............] - ETA: 52s - loss: 1.9642 - regression_loss: 1.6499 - classification_loss: 0.3144 289/500 [================>.............] - ETA: 52s - loss: 1.9700 - regression_loss: 1.6531 - classification_loss: 0.3169 290/500 [================>.............] - ETA: 51s - loss: 1.9695 - regression_loss: 1.6524 - classification_loss: 0.3170 291/500 [================>.............] - ETA: 51s - loss: 1.9676 - regression_loss: 1.6510 - classification_loss: 0.3166 292/500 [================>.............] - ETA: 51s - loss: 1.9667 - regression_loss: 1.6505 - classification_loss: 0.3162 293/500 [================>.............] - ETA: 51s - loss: 1.9668 - regression_loss: 1.6506 - classification_loss: 0.3163 294/500 [================>.............] - ETA: 50s - loss: 1.9661 - regression_loss: 1.6500 - classification_loss: 0.3161 295/500 [================>.............] - ETA: 50s - loss: 1.9659 - regression_loss: 1.6502 - classification_loss: 0.3156 296/500 [================>.............] - ETA: 50s - loss: 1.9642 - regression_loss: 1.6492 - classification_loss: 0.3150 297/500 [================>.............] - ETA: 50s - loss: 1.9629 - regression_loss: 1.6483 - classification_loss: 0.3146 298/500 [================>.............] - ETA: 49s - loss: 1.9659 - regression_loss: 1.6506 - classification_loss: 0.3153 299/500 [================>.............] - ETA: 49s - loss: 1.9661 - regression_loss: 1.6508 - classification_loss: 0.3153 300/500 [=================>............] 
- ETA: 49s - loss: 1.9660 - regression_loss: 1.6507 - classification_loss: 0.3153 301/500 [=================>............] - ETA: 49s - loss: 1.9655 - regression_loss: 1.6506 - classification_loss: 0.3149 302/500 [=================>............] - ETA: 48s - loss: 1.9647 - regression_loss: 1.6499 - classification_loss: 0.3148 303/500 [=================>............] - ETA: 48s - loss: 1.9646 - regression_loss: 1.6499 - classification_loss: 0.3146 304/500 [=================>............] - ETA: 48s - loss: 1.9649 - regression_loss: 1.6505 - classification_loss: 0.3144 305/500 [=================>............] - ETA: 48s - loss: 1.9704 - regression_loss: 1.6556 - classification_loss: 0.3149 306/500 [=================>............] - ETA: 47s - loss: 1.9713 - regression_loss: 1.6563 - classification_loss: 0.3150 307/500 [=================>............] - ETA: 47s - loss: 1.9696 - regression_loss: 1.6551 - classification_loss: 0.3145 308/500 [=================>............] - ETA: 47s - loss: 1.9690 - regression_loss: 1.6548 - classification_loss: 0.3142 309/500 [=================>............] - ETA: 47s - loss: 1.9667 - regression_loss: 1.6530 - classification_loss: 0.3137 310/500 [=================>............] - ETA: 46s - loss: 1.9679 - regression_loss: 1.6540 - classification_loss: 0.3139 311/500 [=================>............] - ETA: 46s - loss: 1.9673 - regression_loss: 1.6538 - classification_loss: 0.3135 312/500 [=================>............] - ETA: 46s - loss: 1.9661 - regression_loss: 1.6529 - classification_loss: 0.3132 313/500 [=================>............] - ETA: 46s - loss: 1.9684 - regression_loss: 1.6544 - classification_loss: 0.3140 314/500 [=================>............] - ETA: 45s - loss: 1.9696 - regression_loss: 1.6551 - classification_loss: 0.3144 315/500 [=================>............] - ETA: 45s - loss: 1.9673 - regression_loss: 1.6532 - classification_loss: 0.3140 316/500 [=================>............] 
- ETA: 45s - loss: 1.9681 - regression_loss: 1.6541 - classification_loss: 0.3140 317/500 [==================>...........] - ETA: 45s - loss: 1.9674 - regression_loss: 1.6536 - classification_loss: 0.3138 318/500 [==================>...........] - ETA: 44s - loss: 1.9658 - regression_loss: 1.6525 - classification_loss: 0.3133 319/500 [==================>...........] - ETA: 44s - loss: 1.9652 - regression_loss: 1.6521 - classification_loss: 0.3131 320/500 [==================>...........] - ETA: 44s - loss: 1.9654 - regression_loss: 1.6523 - classification_loss: 0.3130 321/500 [==================>...........] - ETA: 44s - loss: 1.9651 - regression_loss: 1.6523 - classification_loss: 0.3128 322/500 [==================>...........] - ETA: 43s - loss: 1.9629 - regression_loss: 1.6506 - classification_loss: 0.3123 323/500 [==================>...........] - ETA: 43s - loss: 1.9645 - regression_loss: 1.6521 - classification_loss: 0.3124 324/500 [==================>...........] - ETA: 43s - loss: 1.9665 - regression_loss: 1.6537 - classification_loss: 0.3128 325/500 [==================>...........] - ETA: 43s - loss: 1.9669 - regression_loss: 1.6540 - classification_loss: 0.3129 326/500 [==================>...........] - ETA: 43s - loss: 1.9658 - regression_loss: 1.6531 - classification_loss: 0.3127 327/500 [==================>...........] - ETA: 42s - loss: 1.9646 - regression_loss: 1.6524 - classification_loss: 0.3122 328/500 [==================>...........] - ETA: 42s - loss: 1.9647 - regression_loss: 1.6526 - classification_loss: 0.3122 329/500 [==================>...........] - ETA: 42s - loss: 1.9646 - regression_loss: 1.6526 - classification_loss: 0.3119 330/500 [==================>...........] - ETA: 42s - loss: 1.9631 - regression_loss: 1.6516 - classification_loss: 0.3115 331/500 [==================>...........] - ETA: 41s - loss: 1.9606 - regression_loss: 1.6496 - classification_loss: 0.3110 332/500 [==================>...........] 
- ETA: 41s - loss: 1.9580 - regression_loss: 1.6474 - classification_loss: 0.3105 333/500 [==================>...........] - ETA: 41s - loss: 1.9571 - regression_loss: 1.6468 - classification_loss: 0.3103 334/500 [===================>..........] - ETA: 41s - loss: 1.9545 - regression_loss: 1.6447 - classification_loss: 0.3098 335/500 [===================>..........] - ETA: 40s - loss: 1.9550 - regression_loss: 1.6452 - classification_loss: 0.3098 336/500 [===================>..........] - ETA: 40s - loss: 1.9572 - regression_loss: 1.6470 - classification_loss: 0.3102 337/500 [===================>..........] - ETA: 40s - loss: 1.9560 - regression_loss: 1.6462 - classification_loss: 0.3097 338/500 [===================>..........] - ETA: 40s - loss: 1.9567 - regression_loss: 1.6471 - classification_loss: 0.3096 339/500 [===================>..........] - ETA: 39s - loss: 1.9577 - regression_loss: 1.6478 - classification_loss: 0.3099 340/500 [===================>..........] - ETA: 39s - loss: 1.9636 - regression_loss: 1.6514 - classification_loss: 0.3122 341/500 [===================>..........] - ETA: 39s - loss: 1.9633 - regression_loss: 1.6512 - classification_loss: 0.3121 342/500 [===================>..........] - ETA: 39s - loss: 1.9601 - regression_loss: 1.6486 - classification_loss: 0.3115 343/500 [===================>..........] - ETA: 38s - loss: 1.9570 - regression_loss: 1.6461 - classification_loss: 0.3110 344/500 [===================>..........] - ETA: 38s - loss: 1.9567 - regression_loss: 1.6457 - classification_loss: 0.3110 345/500 [===================>..........] - ETA: 38s - loss: 1.9555 - regression_loss: 1.6450 - classification_loss: 0.3106 346/500 [===================>..........] - ETA: 38s - loss: 1.9550 - regression_loss: 1.6446 - classification_loss: 0.3104 347/500 [===================>..........] - ETA: 37s - loss: 1.9548 - regression_loss: 1.6445 - classification_loss: 0.3103 348/500 [===================>..........] 
- ETA: 37s - loss: 1.9530 - regression_loss: 1.6431 - classification_loss: 0.3099 349/500 [===================>..........] - ETA: 37s - loss: 1.9538 - regression_loss: 1.6436 - classification_loss: 0.3102 350/500 [====================>.........] - ETA: 37s - loss: 1.9499 - regression_loss: 1.6404 - classification_loss: 0.3095 351/500 [====================>.........] - ETA: 36s - loss: 1.9477 - regression_loss: 1.6386 - classification_loss: 0.3091 352/500 [====================>.........] - ETA: 36s - loss: 1.9484 - regression_loss: 1.6395 - classification_loss: 0.3089 353/500 [====================>.........] - ETA: 36s - loss: 1.9466 - regression_loss: 1.6381 - classification_loss: 0.3085 354/500 [====================>.........] - ETA: 36s - loss: 1.9458 - regression_loss: 1.6375 - classification_loss: 0.3083 355/500 [====================>.........] - ETA: 35s - loss: 1.9448 - regression_loss: 1.6367 - classification_loss: 0.3081 356/500 [====================>.........] - ETA: 35s - loss: 1.9442 - regression_loss: 1.6363 - classification_loss: 0.3079 357/500 [====================>.........] - ETA: 35s - loss: 1.9410 - regression_loss: 1.6336 - classification_loss: 0.3073 358/500 [====================>.........] - ETA: 35s - loss: 1.9390 - regression_loss: 1.6319 - classification_loss: 0.3071 359/500 [====================>.........] - ETA: 34s - loss: 1.9403 - regression_loss: 1.6332 - classification_loss: 0.3071 360/500 [====================>.........] - ETA: 34s - loss: 1.9394 - regression_loss: 1.6325 - classification_loss: 0.3069 361/500 [====================>.........] - ETA: 34s - loss: 1.9394 - regression_loss: 1.6326 - classification_loss: 0.3068 362/500 [====================>.........] - ETA: 34s - loss: 1.9360 - regression_loss: 1.6298 - classification_loss: 0.3062 363/500 [====================>.........] - ETA: 33s - loss: 1.9355 - regression_loss: 1.6295 - classification_loss: 0.3060 364/500 [====================>.........] 
- ETA: 33s - loss: 1.9360 - regression_loss: 1.6301 - classification_loss: 0.3060 365/500 [====================>.........] - ETA: 33s - loss: 1.9370 - regression_loss: 1.6307 - classification_loss: 0.3063 366/500 [====================>.........] - ETA: 33s - loss: 1.9356 - regression_loss: 1.6297 - classification_loss: 0.3059 367/500 [=====================>........] - ETA: 32s - loss: 1.9354 - regression_loss: 1.6297 - classification_loss: 0.3057 368/500 [=====================>........] - ETA: 32s - loss: 1.9378 - regression_loss: 1.6315 - classification_loss: 0.3063 369/500 [=====================>........] - ETA: 32s - loss: 1.9388 - regression_loss: 1.6322 - classification_loss: 0.3066 370/500 [=====================>........] - ETA: 32s - loss: 1.9394 - regression_loss: 1.6328 - classification_loss: 0.3065 371/500 [=====================>........] - ETA: 31s - loss: 1.9369 - regression_loss: 1.6308 - classification_loss: 0.3061 372/500 [=====================>........] - ETA: 31s - loss: 1.9365 - regression_loss: 1.6304 - classification_loss: 0.3061 373/500 [=====================>........] - ETA: 31s - loss: 1.9357 - regression_loss: 1.6297 - classification_loss: 0.3060 374/500 [=====================>........] - ETA: 31s - loss: 1.9347 - regression_loss: 1.6289 - classification_loss: 0.3058 375/500 [=====================>........] - ETA: 30s - loss: 1.9382 - regression_loss: 1.6318 - classification_loss: 0.3064 376/500 [=====================>........] - ETA: 30s - loss: 1.9377 - regression_loss: 1.6317 - classification_loss: 0.3060 377/500 [=====================>........] - ETA: 30s - loss: 1.9376 - regression_loss: 1.6316 - classification_loss: 0.3060 378/500 [=====================>........] - ETA: 30s - loss: 1.9375 - regression_loss: 1.6316 - classification_loss: 0.3059 379/500 [=====================>........] - ETA: 29s - loss: 1.9369 - regression_loss: 1.6310 - classification_loss: 0.3059 380/500 [=====================>........] 
500/500 [==============================] - 124s 247ms/step - loss: 1.9008 - regression_loss: 1.6047 - classification_loss: 0.2961
326 instances of class plum with average precision: 0.7409
mAP: 0.7409
Epoch 00005: saving model to ./training/snapshots/resnet50_pascal_05.h5
Epoch 6/150
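Note that the per-step numbers in this log are cumulative running means for the current epoch; only the final `500/500` line holds the finished epoch averages. A minimal sketch for pulling those per-epoch summary values out of a log like this one (the helper name and the fixed 500 steps-per-epoch are assumptions based on this run, not part of keras-retinanet):

```python
import re

# Match only the completed-epoch summary line ("500/500 [...]"), which
# carries the final averaged losses; intermediate updates are running means.
SUMMARY = re.compile(
    r"500/500 \[=+\].*?"
    r"loss: (?P<loss>[\d.]+) - "
    r"regression_loss: (?P<reg>[\d.]+) - "
    r"classification_loss: (?P<cls>[\d.]+)"
)

def epoch_summaries(log_text):
    """Return (loss, regression_loss, classification_loss) per epoch."""
    return [
        (float(m["loss"]), float(m["reg"]), float(m["cls"]))
        for m in SUMMARY.finditer(log_text)
    ]

line = ("500/500 [==============================] - 124s 247ms/step - "
        "loss: 1.9008 - regression_loss: 1.6047 - classification_loss: 0.2961")
print(epoch_summaries(line))  # [(1.9008, 1.6047, 0.2961)]
```

Feeding the whole log file through `epoch_summaries` yields the loss curve across epochs, which is usually easier to inspect than the raw progress-bar output.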
- ETA: 1:10 - loss: 1.8650 - regression_loss: 1.5954 - classification_loss: 0.2696 216/500 [===========>..................] - ETA: 1:10 - loss: 1.8622 - regression_loss: 1.5929 - classification_loss: 0.2692 217/500 [============>.................] - ETA: 1:09 - loss: 1.8638 - regression_loss: 1.5946 - classification_loss: 0.2692 218/500 [============>.................] - ETA: 1:09 - loss: 1.8659 - regression_loss: 1.5960 - classification_loss: 0.2699 219/500 [============>.................] - ETA: 1:09 - loss: 1.8635 - regression_loss: 1.5940 - classification_loss: 0.2695 220/500 [============>.................] - ETA: 1:09 - loss: 1.8664 - regression_loss: 1.5964 - classification_loss: 0.2700 221/500 [============>.................] - ETA: 1:08 - loss: 1.8682 - regression_loss: 1.5984 - classification_loss: 0.2698 222/500 [============>.................] - ETA: 1:08 - loss: 1.8682 - regression_loss: 1.5982 - classification_loss: 0.2701 223/500 [============>.................] - ETA: 1:08 - loss: 1.8665 - regression_loss: 1.5970 - classification_loss: 0.2695 224/500 [============>.................] - ETA: 1:08 - loss: 1.8662 - regression_loss: 1.5969 - classification_loss: 0.2694 225/500 [============>.................] - ETA: 1:07 - loss: 1.8655 - regression_loss: 1.5959 - classification_loss: 0.2696 226/500 [============>.................] - ETA: 1:07 - loss: 1.8629 - regression_loss: 1.5938 - classification_loss: 0.2691 227/500 [============>.................] - ETA: 1:07 - loss: 1.8691 - regression_loss: 1.5988 - classification_loss: 0.2703 228/500 [============>.................] - ETA: 1:07 - loss: 1.8706 - regression_loss: 1.6002 - classification_loss: 0.2704 229/500 [============>.................] - ETA: 1:06 - loss: 1.8688 - regression_loss: 1.5987 - classification_loss: 0.2701 230/500 [============>.................] - ETA: 1:06 - loss: 1.8665 - regression_loss: 1.5969 - classification_loss: 0.2696 231/500 [============>.................] 
- ETA: 1:06 - loss: 1.8666 - regression_loss: 1.5975 - classification_loss: 0.2692 232/500 [============>.................] - ETA: 1:06 - loss: 1.8616 - regression_loss: 1.5932 - classification_loss: 0.2683 233/500 [============>.................] - ETA: 1:06 - loss: 1.8635 - regression_loss: 1.5949 - classification_loss: 0.2686 234/500 [=============>................] - ETA: 1:05 - loss: 1.8664 - regression_loss: 1.5972 - classification_loss: 0.2692 235/500 [=============>................] - ETA: 1:05 - loss: 1.8629 - regression_loss: 1.5944 - classification_loss: 0.2685 236/500 [=============>................] - ETA: 1:05 - loss: 1.8640 - regression_loss: 1.5955 - classification_loss: 0.2685 237/500 [=============>................] - ETA: 1:05 - loss: 1.8668 - regression_loss: 1.5979 - classification_loss: 0.2688 238/500 [=============>................] - ETA: 1:04 - loss: 1.8657 - regression_loss: 1.5970 - classification_loss: 0.2687 239/500 [=============>................] - ETA: 1:04 - loss: 1.8656 - regression_loss: 1.5970 - classification_loss: 0.2686 240/500 [=============>................] - ETA: 1:04 - loss: 1.8688 - regression_loss: 1.5995 - classification_loss: 0.2693 241/500 [=============>................] - ETA: 1:04 - loss: 1.8710 - regression_loss: 1.6014 - classification_loss: 0.2696 242/500 [=============>................] - ETA: 1:03 - loss: 1.8693 - regression_loss: 1.5993 - classification_loss: 0.2700 243/500 [=============>................] - ETA: 1:03 - loss: 1.8704 - regression_loss: 1.6001 - classification_loss: 0.2703 244/500 [=============>................] - ETA: 1:03 - loss: 1.8777 - regression_loss: 1.6068 - classification_loss: 0.2709 245/500 [=============>................] - ETA: 1:03 - loss: 1.8786 - regression_loss: 1.6076 - classification_loss: 0.2710 246/500 [=============>................] - ETA: 1:02 - loss: 1.8800 - regression_loss: 1.6089 - classification_loss: 0.2711 247/500 [=============>................] 
- ETA: 1:02 - loss: 1.8792 - regression_loss: 1.6082 - classification_loss: 0.2709 248/500 [=============>................] - ETA: 1:02 - loss: 1.8799 - regression_loss: 1.6087 - classification_loss: 0.2711 249/500 [=============>................] - ETA: 1:02 - loss: 1.8793 - regression_loss: 1.6084 - classification_loss: 0.2709 250/500 [==============>...............] - ETA: 1:01 - loss: 1.8796 - regression_loss: 1.6086 - classification_loss: 0.2710 251/500 [==============>...............] - ETA: 1:01 - loss: 1.8766 - regression_loss: 1.6062 - classification_loss: 0.2705 252/500 [==============>...............] - ETA: 1:01 - loss: 1.8756 - regression_loss: 1.6054 - classification_loss: 0.2702 253/500 [==============>...............] - ETA: 1:01 - loss: 1.8749 - regression_loss: 1.6045 - classification_loss: 0.2704 254/500 [==============>...............] - ETA: 1:00 - loss: 1.8747 - regression_loss: 1.6044 - classification_loss: 0.2703 255/500 [==============>...............] - ETA: 1:00 - loss: 1.8724 - regression_loss: 1.6026 - classification_loss: 0.2698 256/500 [==============>...............] - ETA: 1:00 - loss: 1.8730 - regression_loss: 1.6033 - classification_loss: 0.2697 257/500 [==============>...............] - ETA: 1:00 - loss: 1.8715 - regression_loss: 1.6019 - classification_loss: 0.2696 258/500 [==============>...............] - ETA: 59s - loss: 1.8717 - regression_loss: 1.6023 - classification_loss: 0.2694  259/500 [==============>...............] - ETA: 59s - loss: 1.8709 - regression_loss: 1.6012 - classification_loss: 0.2696 260/500 [==============>...............] - ETA: 59s - loss: 1.8719 - regression_loss: 1.6019 - classification_loss: 0.2699 261/500 [==============>...............] - ETA: 59s - loss: 1.8738 - regression_loss: 1.6036 - classification_loss: 0.2702 262/500 [==============>...............] - ETA: 58s - loss: 1.8775 - regression_loss: 1.6070 - classification_loss: 0.2705 263/500 [==============>...............] 
- ETA: 58s - loss: 1.8732 - regression_loss: 1.6034 - classification_loss: 0.2698 264/500 [==============>...............] - ETA: 58s - loss: 1.8719 - regression_loss: 1.6021 - classification_loss: 0.2698 265/500 [==============>...............] - ETA: 58s - loss: 1.8711 - regression_loss: 1.6016 - classification_loss: 0.2695 266/500 [==============>...............] - ETA: 57s - loss: 1.8723 - regression_loss: 1.6023 - classification_loss: 0.2700 267/500 [===============>..............] - ETA: 57s - loss: 1.8726 - regression_loss: 1.6026 - classification_loss: 0.2700 268/500 [===============>..............] - ETA: 57s - loss: 1.8743 - regression_loss: 1.6037 - classification_loss: 0.2706 269/500 [===============>..............] - ETA: 57s - loss: 1.8727 - regression_loss: 1.6021 - classification_loss: 0.2706 270/500 [===============>..............] - ETA: 56s - loss: 1.8721 - regression_loss: 1.6015 - classification_loss: 0.2706 271/500 [===============>..............] - ETA: 56s - loss: 1.8736 - regression_loss: 1.6016 - classification_loss: 0.2720 272/500 [===============>..............] - ETA: 56s - loss: 1.8724 - regression_loss: 1.6006 - classification_loss: 0.2718 273/500 [===============>..............] - ETA: 56s - loss: 1.8704 - regression_loss: 1.5991 - classification_loss: 0.2713 274/500 [===============>..............] - ETA: 55s - loss: 1.8718 - regression_loss: 1.6001 - classification_loss: 0.2717 275/500 [===============>..............] - ETA: 55s - loss: 1.8709 - regression_loss: 1.5995 - classification_loss: 0.2714 276/500 [===============>..............] - ETA: 55s - loss: 1.8748 - regression_loss: 1.6028 - classification_loss: 0.2721 277/500 [===============>..............] - ETA: 55s - loss: 1.8761 - regression_loss: 1.6037 - classification_loss: 0.2724 278/500 [===============>..............] - ETA: 54s - loss: 1.8709 - regression_loss: 1.5979 - classification_loss: 0.2730 279/500 [===============>..............] 
- ETA: 54s - loss: 1.8730 - regression_loss: 1.5997 - classification_loss: 0.2732 280/500 [===============>..............] - ETA: 54s - loss: 1.8760 - regression_loss: 1.6023 - classification_loss: 0.2737 281/500 [===============>..............] - ETA: 54s - loss: 1.8752 - regression_loss: 1.6018 - classification_loss: 0.2734 282/500 [===============>..............] - ETA: 53s - loss: 1.8759 - regression_loss: 1.6026 - classification_loss: 0.2733 283/500 [===============>..............] - ETA: 53s - loss: 1.8764 - regression_loss: 1.6031 - classification_loss: 0.2732 284/500 [================>.............] - ETA: 53s - loss: 1.8759 - regression_loss: 1.6027 - classification_loss: 0.2733 285/500 [================>.............] - ETA: 53s - loss: 1.8761 - regression_loss: 1.6025 - classification_loss: 0.2736 286/500 [================>.............] - ETA: 52s - loss: 1.8744 - regression_loss: 1.6012 - classification_loss: 0.2732 287/500 [================>.............] - ETA: 52s - loss: 1.8744 - regression_loss: 1.6012 - classification_loss: 0.2732 288/500 [================>.............] - ETA: 52s - loss: 1.8728 - regression_loss: 1.6001 - classification_loss: 0.2728 289/500 [================>.............] - ETA: 52s - loss: 1.8735 - regression_loss: 1.6004 - classification_loss: 0.2730 290/500 [================>.............] - ETA: 51s - loss: 1.8761 - regression_loss: 1.6032 - classification_loss: 0.2728 291/500 [================>.............] - ETA: 51s - loss: 1.8742 - regression_loss: 1.6015 - classification_loss: 0.2727 292/500 [================>.............] - ETA: 51s - loss: 1.8725 - regression_loss: 1.6002 - classification_loss: 0.2724 293/500 [================>.............] - ETA: 51s - loss: 1.8730 - regression_loss: 1.6006 - classification_loss: 0.2725 294/500 [================>.............] - ETA: 50s - loss: 1.8698 - regression_loss: 1.5979 - classification_loss: 0.2720 295/500 [================>.............] 
- ETA: 50s - loss: 1.8729 - regression_loss: 1.6007 - classification_loss: 0.2722 296/500 [================>.............] - ETA: 50s - loss: 1.8723 - regression_loss: 1.6003 - classification_loss: 0.2721 297/500 [================>.............] - ETA: 50s - loss: 1.8733 - regression_loss: 1.6010 - classification_loss: 0.2723 298/500 [================>.............] - ETA: 49s - loss: 1.8729 - regression_loss: 1.6007 - classification_loss: 0.2721 299/500 [================>.............] - ETA: 49s - loss: 1.8724 - regression_loss: 1.6005 - classification_loss: 0.2720 300/500 [=================>............] - ETA: 49s - loss: 1.8704 - regression_loss: 1.5987 - classification_loss: 0.2717 301/500 [=================>............] - ETA: 49s - loss: 1.8701 - regression_loss: 1.5986 - classification_loss: 0.2715 302/500 [=================>............] - ETA: 48s - loss: 1.8694 - regression_loss: 1.5979 - classification_loss: 0.2716 303/500 [=================>............] - ETA: 48s - loss: 1.8701 - regression_loss: 1.5986 - classification_loss: 0.2716 304/500 [=================>............] - ETA: 48s - loss: 1.8690 - regression_loss: 1.5979 - classification_loss: 0.2712 305/500 [=================>............] - ETA: 48s - loss: 1.8716 - regression_loss: 1.6000 - classification_loss: 0.2717 306/500 [=================>............] - ETA: 47s - loss: 1.8706 - regression_loss: 1.5993 - classification_loss: 0.2713 307/500 [=================>............] - ETA: 47s - loss: 1.8690 - regression_loss: 1.5982 - classification_loss: 0.2708 308/500 [=================>............] - ETA: 47s - loss: 1.8685 - regression_loss: 1.5979 - classification_loss: 0.2706 309/500 [=================>............] - ETA: 47s - loss: 1.8694 - regression_loss: 1.5987 - classification_loss: 0.2708 310/500 [=================>............] - ETA: 47s - loss: 1.8704 - regression_loss: 1.5993 - classification_loss: 0.2711 311/500 [=================>............] 
- ETA: 46s - loss: 1.8690 - regression_loss: 1.5983 - classification_loss: 0.2707 312/500 [=================>............] - ETA: 46s - loss: 1.8677 - regression_loss: 1.5972 - classification_loss: 0.2705 313/500 [=================>............] - ETA: 46s - loss: 1.8677 - regression_loss: 1.5973 - classification_loss: 0.2704 314/500 [=================>............] - ETA: 46s - loss: 1.8673 - regression_loss: 1.5968 - classification_loss: 0.2705 315/500 [=================>............] - ETA: 45s - loss: 1.8684 - regression_loss: 1.5977 - classification_loss: 0.2706 316/500 [=================>............] - ETA: 45s - loss: 1.8694 - regression_loss: 1.5986 - classification_loss: 0.2708 317/500 [==================>...........] - ETA: 45s - loss: 1.8659 - regression_loss: 1.5958 - classification_loss: 0.2702 318/500 [==================>...........] - ETA: 44s - loss: 1.8670 - regression_loss: 1.5965 - classification_loss: 0.2705 319/500 [==================>...........] - ETA: 44s - loss: 1.8647 - regression_loss: 1.5947 - classification_loss: 0.2700 320/500 [==================>...........] - ETA: 44s - loss: 1.8656 - regression_loss: 1.5955 - classification_loss: 0.2701 321/500 [==================>...........] - ETA: 44s - loss: 1.8657 - regression_loss: 1.5956 - classification_loss: 0.2701 322/500 [==================>...........] - ETA: 43s - loss: 1.8665 - regression_loss: 1.5960 - classification_loss: 0.2705 323/500 [==================>...........] - ETA: 43s - loss: 1.8649 - regression_loss: 1.5947 - classification_loss: 0.2702 324/500 [==================>...........] - ETA: 43s - loss: 1.8667 - regression_loss: 1.5960 - classification_loss: 0.2707 325/500 [==================>...........] - ETA: 43s - loss: 1.8661 - regression_loss: 1.5955 - classification_loss: 0.2707 326/500 [==================>...........] - ETA: 42s - loss: 1.8648 - regression_loss: 1.5944 - classification_loss: 0.2704 327/500 [==================>...........] 
- ETA: 42s - loss: 1.8659 - regression_loss: 1.5958 - classification_loss: 0.2701 328/500 [==================>...........] - ETA: 42s - loss: 1.8768 - regression_loss: 1.6000 - classification_loss: 0.2767 329/500 [==================>...........] - ETA: 42s - loss: 1.8765 - regression_loss: 1.5999 - classification_loss: 0.2766 330/500 [==================>...........] - ETA: 41s - loss: 1.8733 - regression_loss: 1.5973 - classification_loss: 0.2760 331/500 [==================>...........] - ETA: 41s - loss: 1.8741 - regression_loss: 1.5979 - classification_loss: 0.2761 332/500 [==================>...........] - ETA: 41s - loss: 1.8709 - regression_loss: 1.5953 - classification_loss: 0.2756 333/500 [==================>...........] - ETA: 41s - loss: 1.8700 - regression_loss: 1.5944 - classification_loss: 0.2756 334/500 [===================>..........] - ETA: 41s - loss: 1.8712 - regression_loss: 1.5952 - classification_loss: 0.2760 335/500 [===================>..........] - ETA: 40s - loss: 1.8706 - regression_loss: 1.5948 - classification_loss: 0.2758 336/500 [===================>..........] - ETA: 40s - loss: 1.8672 - regression_loss: 1.5919 - classification_loss: 0.2753 337/500 [===================>..........] - ETA: 40s - loss: 1.8705 - regression_loss: 1.5946 - classification_loss: 0.2758 338/500 [===================>..........] - ETA: 40s - loss: 1.8686 - regression_loss: 1.5931 - classification_loss: 0.2755 339/500 [===================>..........] - ETA: 39s - loss: 1.8666 - regression_loss: 1.5914 - classification_loss: 0.2752 340/500 [===================>..........] - ETA: 39s - loss: 1.8655 - regression_loss: 1.5906 - classification_loss: 0.2749 341/500 [===================>..........] - ETA: 39s - loss: 1.8671 - regression_loss: 1.5919 - classification_loss: 0.2751 342/500 [===================>..........] - ETA: 39s - loss: 1.8661 - regression_loss: 1.5912 - classification_loss: 0.2749 343/500 [===================>..........] 
- ETA: 38s - loss: 1.8664 - regression_loss: 1.5915 - classification_loss: 0.2749 344/500 [===================>..........] - ETA: 38s - loss: 1.8667 - regression_loss: 1.5916 - classification_loss: 0.2751 345/500 [===================>..........] - ETA: 38s - loss: 1.8660 - regression_loss: 1.5911 - classification_loss: 0.2749 346/500 [===================>..........] - ETA: 38s - loss: 1.8675 - regression_loss: 1.5927 - classification_loss: 0.2749 347/500 [===================>..........] - ETA: 37s - loss: 1.8686 - regression_loss: 1.5938 - classification_loss: 0.2748 348/500 [===================>..........] - ETA: 37s - loss: 1.8674 - regression_loss: 1.5929 - classification_loss: 0.2745 349/500 [===================>..........] - ETA: 37s - loss: 1.8686 - regression_loss: 1.5937 - classification_loss: 0.2749 350/500 [====================>.........] - ETA: 37s - loss: 1.8695 - regression_loss: 1.5945 - classification_loss: 0.2750 351/500 [====================>.........] - ETA: 36s - loss: 1.8720 - regression_loss: 1.5966 - classification_loss: 0.2754 352/500 [====================>.........] - ETA: 36s - loss: 1.8713 - regression_loss: 1.5960 - classification_loss: 0.2752 353/500 [====================>.........] - ETA: 36s - loss: 1.8708 - regression_loss: 1.5956 - classification_loss: 0.2752 354/500 [====================>.........] - ETA: 36s - loss: 1.8701 - regression_loss: 1.5953 - classification_loss: 0.2749 355/500 [====================>.........] - ETA: 35s - loss: 1.8710 - regression_loss: 1.5961 - classification_loss: 0.2749 356/500 [====================>.........] - ETA: 35s - loss: 1.8714 - regression_loss: 1.5965 - classification_loss: 0.2750 357/500 [====================>.........] - ETA: 35s - loss: 1.8691 - regression_loss: 1.5944 - classification_loss: 0.2747 358/500 [====================>.........] - ETA: 35s - loss: 1.8684 - regression_loss: 1.5939 - classification_loss: 0.2745 359/500 [====================>.........] 
- ETA: 34s - loss: 1.8664 - regression_loss: 1.5924 - classification_loss: 0.2740 360/500 [====================>.........] - ETA: 34s - loss: 1.8661 - regression_loss: 1.5919 - classification_loss: 0.2742 361/500 [====================>.........] - ETA: 34s - loss: 1.8661 - regression_loss: 1.5920 - classification_loss: 0.2742 362/500 [====================>.........] - ETA: 34s - loss: 1.8665 - regression_loss: 1.5923 - classification_loss: 0.2742 363/500 [====================>.........] - ETA: 33s - loss: 1.8658 - regression_loss: 1.5917 - classification_loss: 0.2742 364/500 [====================>.........] - ETA: 33s - loss: 1.8655 - regression_loss: 1.5914 - classification_loss: 0.2741 365/500 [====================>.........] - ETA: 33s - loss: 1.8656 - regression_loss: 1.5914 - classification_loss: 0.2742 366/500 [====================>.........] - ETA: 33s - loss: 1.8660 - regression_loss: 1.5917 - classification_loss: 0.2743 367/500 [=====================>........] - ETA: 32s - loss: 1.8641 - regression_loss: 1.5902 - classification_loss: 0.2739 368/500 [=====================>........] - ETA: 32s - loss: 1.8665 - regression_loss: 1.5921 - classification_loss: 0.2744 369/500 [=====================>........] - ETA: 32s - loss: 1.8652 - regression_loss: 1.5911 - classification_loss: 0.2741 370/500 [=====================>........] - ETA: 32s - loss: 1.8655 - regression_loss: 1.5914 - classification_loss: 0.2742 371/500 [=====================>........] - ETA: 31s - loss: 1.8675 - regression_loss: 1.5927 - classification_loss: 0.2747 372/500 [=====================>........] - ETA: 31s - loss: 1.8656 - regression_loss: 1.5911 - classification_loss: 0.2744 373/500 [=====================>........] - ETA: 31s - loss: 1.8659 - regression_loss: 1.5915 - classification_loss: 0.2744 374/500 [=====================>........] - ETA: 31s - loss: 1.8627 - regression_loss: 1.5888 - classification_loss: 0.2738 375/500 [=====================>........] 
- ETA: 30s - loss: 1.8612 - regression_loss: 1.5877 - classification_loss: 0.2736 376/500 [=====================>........] - ETA: 30s - loss: 1.8622 - regression_loss: 1.5883 - classification_loss: 0.2739 377/500 [=====================>........] - ETA: 30s - loss: 1.8642 - regression_loss: 1.5898 - classification_loss: 0.2744 378/500 [=====================>........] - ETA: 30s - loss: 1.8636 - regression_loss: 1.5893 - classification_loss: 0.2743 379/500 [=====================>........] - ETA: 29s - loss: 1.8607 - regression_loss: 1.5869 - classification_loss: 0.2737 380/500 [=====================>........] - ETA: 29s - loss: 1.8618 - regression_loss: 1.5878 - classification_loss: 0.2739 381/500 [=====================>........] - ETA: 29s - loss: 1.8616 - regression_loss: 1.5876 - classification_loss: 0.2740 382/500 [=====================>........] - ETA: 29s - loss: 1.8611 - regression_loss: 1.5871 - classification_loss: 0.2740 383/500 [=====================>........] - ETA: 28s - loss: 1.8590 - regression_loss: 1.5848 - classification_loss: 0.2742 384/500 [======================>.......] - ETA: 28s - loss: 1.8600 - regression_loss: 1.5857 - classification_loss: 0.2743 385/500 [======================>.......] - ETA: 28s - loss: 1.8591 - regression_loss: 1.5847 - classification_loss: 0.2744 386/500 [======================>.......] - ETA: 28s - loss: 1.8578 - regression_loss: 1.5836 - classification_loss: 0.2742 387/500 [======================>.......] - ETA: 27s - loss: 1.8569 - regression_loss: 1.5829 - classification_loss: 0.2740 388/500 [======================>.......] - ETA: 27s - loss: 1.8574 - regression_loss: 1.5834 - classification_loss: 0.2740 389/500 [======================>.......] - ETA: 27s - loss: 1.8585 - regression_loss: 1.5841 - classification_loss: 0.2744 390/500 [======================>.......] - ETA: 27s - loss: 1.8579 - regression_loss: 1.5835 - classification_loss: 0.2745 391/500 [======================>.......] 
- ETA: 26s - loss: 1.8567 - regression_loss: 1.5825 - classification_loss: 0.2742 392/500 [======================>.......] - ETA: 26s - loss: 1.8568 - regression_loss: 1.5828 - classification_loss: 0.2740 393/500 [======================>.......] - ETA: 26s - loss: 1.8565 - regression_loss: 1.5827 - classification_loss: 0.2738 394/500 [======================>.......] - ETA: 26s - loss: 1.8536 - regression_loss: 1.5802 - classification_loss: 0.2734 395/500 [======================>.......] - ETA: 25s - loss: 1.8552 - regression_loss: 1.5817 - classification_loss: 0.2735 396/500 [======================>.......] - ETA: 25s - loss: 1.8574 - regression_loss: 1.5836 - classification_loss: 0.2738 397/500 [======================>.......] - ETA: 25s - loss: 1.8612 - regression_loss: 1.5868 - classification_loss: 0.2744 398/500 [======================>.......] - ETA: 25s - loss: 1.8601 - regression_loss: 1.5858 - classification_loss: 0.2743 399/500 [======================>.......] - ETA: 24s - loss: 1.8602 - regression_loss: 1.5860 - classification_loss: 0.2743 400/500 [=======================>......] - ETA: 24s - loss: 1.8587 - regression_loss: 1.5847 - classification_loss: 0.2740 401/500 [=======================>......] - ETA: 24s - loss: 1.8590 - regression_loss: 1.5852 - classification_loss: 0.2738 402/500 [=======================>......] - ETA: 24s - loss: 1.8588 - regression_loss: 1.5852 - classification_loss: 0.2736 403/500 [=======================>......] - ETA: 24s - loss: 1.8558 - regression_loss: 1.5813 - classification_loss: 0.2745 404/500 [=======================>......] - ETA: 23s - loss: 1.8572 - regression_loss: 1.5824 - classification_loss: 0.2748 405/500 [=======================>......] - ETA: 23s - loss: 1.8569 - regression_loss: 1.5822 - classification_loss: 0.2747 406/500 [=======================>......] - ETA: 23s - loss: 1.8573 - regression_loss: 1.5825 - classification_loss: 0.2748 407/500 [=======================>......] 
- ETA: 23s - loss: 1.8563 - regression_loss: 1.5818 - classification_loss: 0.2745 408/500 [=======================>......] - ETA: 22s - loss: 1.8558 - regression_loss: 1.5816 - classification_loss: 0.2742 409/500 [=======================>......] - ETA: 22s - loss: 1.8545 - regression_loss: 1.5806 - classification_loss: 0.2739 410/500 [=======================>......] - ETA: 22s - loss: 1.8529 - regression_loss: 1.5793 - classification_loss: 0.2735 411/500 [=======================>......] - ETA: 22s - loss: 1.8519 - regression_loss: 1.5786 - classification_loss: 0.2733 412/500 [=======================>......] - ETA: 21s - loss: 1.8528 - regression_loss: 1.5794 - classification_loss: 0.2734 413/500 [=======================>......] - ETA: 21s - loss: 1.8536 - regression_loss: 1.5799 - classification_loss: 0.2736 414/500 [=======================>......] - ETA: 21s - loss: 1.8541 - regression_loss: 1.5803 - classification_loss: 0.2738 415/500 [=======================>......] - ETA: 21s - loss: 1.8550 - regression_loss: 1.5811 - classification_loss: 0.2739 416/500 [=======================>......] - ETA: 20s - loss: 1.8539 - regression_loss: 1.5802 - classification_loss: 0.2737 417/500 [========================>.....] - ETA: 20s - loss: 1.8524 - regression_loss: 1.5789 - classification_loss: 0.2734 418/500 [========================>.....] - ETA: 20s - loss: 1.8507 - regression_loss: 1.5777 - classification_loss: 0.2730 419/500 [========================>.....] - ETA: 20s - loss: 1.8497 - regression_loss: 1.5771 - classification_loss: 0.2727 420/500 [========================>.....] - ETA: 19s - loss: 1.8474 - regression_loss: 1.5751 - classification_loss: 0.2723 421/500 [========================>.....] - ETA: 19s - loss: 1.8471 - regression_loss: 1.5752 - classification_loss: 0.2719 422/500 [========================>.....] - ETA: 19s - loss: 1.8469 - regression_loss: 1.5751 - classification_loss: 0.2719 423/500 [========================>.....] 
500/500 [==============================] - 123s 246ms/step - loss: 1.8450 - regression_loss: 1.5722 - classification_loss: 0.2727
326 instances of class plum with average precision: 0.7988
mAP: 0.7988
Epoch 00006: saving model to ./training/snapshots/resnet50_pascal_06.h5
Epoch 7/150
- ETA: 55s - loss: 1.7143 - regression_loss: 1.4617 - classification_loss: 0.2527 259/500 [==============>...............] - ETA: 55s - loss: 1.7146 - regression_loss: 1.4623 - classification_loss: 0.2523 260/500 [==============>...............] - ETA: 55s - loss: 1.7183 - regression_loss: 1.4648 - classification_loss: 0.2535 261/500 [==============>...............] - ETA: 55s - loss: 1.7201 - regression_loss: 1.4659 - classification_loss: 0.2542 262/500 [==============>...............] - ETA: 55s - loss: 1.7215 - regression_loss: 1.4672 - classification_loss: 0.2543 263/500 [==============>...............] - ETA: 54s - loss: 1.7247 - regression_loss: 1.4699 - classification_loss: 0.2548 264/500 [==============>...............] - ETA: 54s - loss: 1.7258 - regression_loss: 1.4703 - classification_loss: 0.2555 265/500 [==============>...............] - ETA: 54s - loss: 1.7281 - regression_loss: 1.4726 - classification_loss: 0.2555 266/500 [==============>...............] - ETA: 54s - loss: 1.7271 - regression_loss: 1.4719 - classification_loss: 0.2552 267/500 [===============>..............] - ETA: 53s - loss: 1.7254 - regression_loss: 1.4706 - classification_loss: 0.2548 268/500 [===============>..............] - ETA: 53s - loss: 1.7243 - regression_loss: 1.4698 - classification_loss: 0.2544 269/500 [===============>..............] - ETA: 53s - loss: 1.7220 - regression_loss: 1.4680 - classification_loss: 0.2540 270/500 [===============>..............] - ETA: 53s - loss: 1.7227 - regression_loss: 1.4687 - classification_loss: 0.2540 271/500 [===============>..............] - ETA: 52s - loss: 1.7239 - regression_loss: 1.4703 - classification_loss: 0.2536 272/500 [===============>..............] - ETA: 52s - loss: 1.7235 - regression_loss: 1.4699 - classification_loss: 0.2536 273/500 [===============>..............] - ETA: 52s - loss: 1.7214 - regression_loss: 1.4683 - classification_loss: 0.2531 274/500 [===============>..............] 
- ETA: 52s - loss: 1.7181 - regression_loss: 1.4655 - classification_loss: 0.2526 275/500 [===============>..............] - ETA: 52s - loss: 1.7176 - regression_loss: 1.4651 - classification_loss: 0.2525 276/500 [===============>..............] - ETA: 51s - loss: 1.7173 - regression_loss: 1.4650 - classification_loss: 0.2523 277/500 [===============>..............] - ETA: 51s - loss: 1.7190 - regression_loss: 1.4669 - classification_loss: 0.2522 278/500 [===============>..............] - ETA: 51s - loss: 1.7179 - regression_loss: 1.4661 - classification_loss: 0.2518 279/500 [===============>..............] - ETA: 51s - loss: 1.7201 - regression_loss: 1.4677 - classification_loss: 0.2524 280/500 [===============>..............] - ETA: 50s - loss: 1.7177 - regression_loss: 1.4656 - classification_loss: 0.2520 281/500 [===============>..............] - ETA: 50s - loss: 1.7185 - regression_loss: 1.4662 - classification_loss: 0.2523 282/500 [===============>..............] - ETA: 50s - loss: 1.7207 - regression_loss: 1.4679 - classification_loss: 0.2528 283/500 [===============>..............] - ETA: 50s - loss: 1.7182 - regression_loss: 1.4659 - classification_loss: 0.2523 284/500 [================>.............] - ETA: 49s - loss: 1.7191 - regression_loss: 1.4670 - classification_loss: 0.2521 285/500 [================>.............] - ETA: 49s - loss: 1.7187 - regression_loss: 1.4666 - classification_loss: 0.2521 286/500 [================>.............] - ETA: 49s - loss: 1.7182 - regression_loss: 1.4663 - classification_loss: 0.2519 287/500 [================>.............] - ETA: 49s - loss: 1.7190 - regression_loss: 1.4670 - classification_loss: 0.2520 288/500 [================>.............] - ETA: 48s - loss: 1.7231 - regression_loss: 1.4700 - classification_loss: 0.2531 289/500 [================>.............] - ETA: 48s - loss: 1.7236 - regression_loss: 1.4705 - classification_loss: 0.2530 290/500 [================>.............] 
- ETA: 48s - loss: 1.7255 - regression_loss: 1.4723 - classification_loss: 0.2532 291/500 [================>.............] - ETA: 48s - loss: 1.7249 - regression_loss: 1.4718 - classification_loss: 0.2531 292/500 [================>.............] - ETA: 48s - loss: 1.7261 - regression_loss: 1.4730 - classification_loss: 0.2530 293/500 [================>.............] - ETA: 47s - loss: 1.7260 - regression_loss: 1.4730 - classification_loss: 0.2530 294/500 [================>.............] - ETA: 47s - loss: 1.7236 - regression_loss: 1.4710 - classification_loss: 0.2526 295/500 [================>.............] - ETA: 47s - loss: 1.7229 - regression_loss: 1.4702 - classification_loss: 0.2527 296/500 [================>.............] - ETA: 47s - loss: 1.7222 - regression_loss: 1.4696 - classification_loss: 0.2526 297/500 [================>.............] - ETA: 46s - loss: 1.7207 - regression_loss: 1.4683 - classification_loss: 0.2524 298/500 [================>.............] - ETA: 46s - loss: 1.7204 - regression_loss: 1.4684 - classification_loss: 0.2520 299/500 [================>.............] - ETA: 46s - loss: 1.7187 - regression_loss: 1.4663 - classification_loss: 0.2525 300/500 [=================>............] - ETA: 46s - loss: 1.7189 - regression_loss: 1.4663 - classification_loss: 0.2526 301/500 [=================>............] - ETA: 45s - loss: 1.7168 - regression_loss: 1.4641 - classification_loss: 0.2527 302/500 [=================>............] - ETA: 45s - loss: 1.7206 - regression_loss: 1.4671 - classification_loss: 0.2536 303/500 [=================>............] - ETA: 45s - loss: 1.7204 - regression_loss: 1.4669 - classification_loss: 0.2535 304/500 [=================>............] - ETA: 45s - loss: 1.7246 - regression_loss: 1.4702 - classification_loss: 0.2545 305/500 [=================>............] - ETA: 45s - loss: 1.7232 - regression_loss: 1.4692 - classification_loss: 0.2540 306/500 [=================>............] 
- ETA: 44s - loss: 1.7203 - regression_loss: 1.4668 - classification_loss: 0.2535 307/500 [=================>............] - ETA: 44s - loss: 1.7184 - regression_loss: 1.4652 - classification_loss: 0.2532 308/500 [=================>............] - ETA: 44s - loss: 1.7186 - regression_loss: 1.4654 - classification_loss: 0.2532 309/500 [=================>............] - ETA: 44s - loss: 1.7188 - regression_loss: 1.4656 - classification_loss: 0.2532 310/500 [=================>............] - ETA: 43s - loss: 1.7178 - regression_loss: 1.4650 - classification_loss: 0.2528 311/500 [=================>............] - ETA: 43s - loss: 1.7170 - regression_loss: 1.4643 - classification_loss: 0.2527 312/500 [=================>............] - ETA: 43s - loss: 1.7178 - regression_loss: 1.4649 - classification_loss: 0.2529 313/500 [=================>............] - ETA: 43s - loss: 1.7178 - regression_loss: 1.4648 - classification_loss: 0.2530 314/500 [=================>............] - ETA: 42s - loss: 1.7147 - regression_loss: 1.4623 - classification_loss: 0.2524 315/500 [=================>............] - ETA: 42s - loss: 1.7146 - regression_loss: 1.4622 - classification_loss: 0.2523 316/500 [=================>............] - ETA: 42s - loss: 1.7143 - regression_loss: 1.4621 - classification_loss: 0.2522 317/500 [==================>...........] - ETA: 42s - loss: 1.7113 - regression_loss: 1.4597 - classification_loss: 0.2517 318/500 [==================>...........] - ETA: 42s - loss: 1.7123 - regression_loss: 1.4604 - classification_loss: 0.2519 319/500 [==================>...........] - ETA: 41s - loss: 1.7125 - regression_loss: 1.4605 - classification_loss: 0.2519 320/500 [==================>...........] - ETA: 41s - loss: 1.7118 - regression_loss: 1.4600 - classification_loss: 0.2518 321/500 [==================>...........] - ETA: 41s - loss: 1.7159 - regression_loss: 1.4635 - classification_loss: 0.2525 322/500 [==================>...........] 
- ETA: 41s - loss: 1.7145 - regression_loss: 1.4624 - classification_loss: 0.2521 323/500 [==================>...........] - ETA: 40s - loss: 1.7169 - regression_loss: 1.4641 - classification_loss: 0.2528 324/500 [==================>...........] - ETA: 40s - loss: 1.7162 - regression_loss: 1.4636 - classification_loss: 0.2526 325/500 [==================>...........] - ETA: 40s - loss: 1.7148 - regression_loss: 1.4626 - classification_loss: 0.2522 326/500 [==================>...........] - ETA: 40s - loss: 1.7164 - regression_loss: 1.4640 - classification_loss: 0.2524 327/500 [==================>...........] - ETA: 40s - loss: 1.7149 - regression_loss: 1.4629 - classification_loss: 0.2520 328/500 [==================>...........] - ETA: 39s - loss: 1.7145 - regression_loss: 1.4625 - classification_loss: 0.2520 329/500 [==================>...........] - ETA: 39s - loss: 1.7156 - regression_loss: 1.4633 - classification_loss: 0.2523 330/500 [==================>...........] - ETA: 39s - loss: 1.7172 - regression_loss: 1.4651 - classification_loss: 0.2521 331/500 [==================>...........] - ETA: 39s - loss: 1.7137 - regression_loss: 1.4621 - classification_loss: 0.2516 332/500 [==================>...........] - ETA: 38s - loss: 1.7114 - regression_loss: 1.4602 - classification_loss: 0.2512 333/500 [==================>...........] - ETA: 38s - loss: 1.7094 - regression_loss: 1.4585 - classification_loss: 0.2509 334/500 [===================>..........] - ETA: 38s - loss: 1.7091 - regression_loss: 1.4586 - classification_loss: 0.2505 335/500 [===================>..........] - ETA: 38s - loss: 1.7105 - regression_loss: 1.4596 - classification_loss: 0.2509 336/500 [===================>..........] - ETA: 37s - loss: 1.7081 - regression_loss: 1.4577 - classification_loss: 0.2504 337/500 [===================>..........] - ETA: 37s - loss: 1.7078 - regression_loss: 1.4576 - classification_loss: 0.2502 338/500 [===================>..........] 
- ETA: 37s - loss: 1.7060 - regression_loss: 1.4559 - classification_loss: 0.2501 339/500 [===================>..........] - ETA: 37s - loss: 1.7067 - regression_loss: 1.4564 - classification_loss: 0.2502 340/500 [===================>..........] - ETA: 36s - loss: 1.7051 - regression_loss: 1.4551 - classification_loss: 0.2500 341/500 [===================>..........] - ETA: 36s - loss: 1.7068 - regression_loss: 1.4566 - classification_loss: 0.2502 342/500 [===================>..........] - ETA: 36s - loss: 1.7070 - regression_loss: 1.4569 - classification_loss: 0.2501 343/500 [===================>..........] - ETA: 36s - loss: 1.7063 - regression_loss: 1.4564 - classification_loss: 0.2499 344/500 [===================>..........] - ETA: 36s - loss: 1.7088 - regression_loss: 1.4580 - classification_loss: 0.2508 345/500 [===================>..........] - ETA: 35s - loss: 1.7086 - regression_loss: 1.4581 - classification_loss: 0.2505 346/500 [===================>..........] - ETA: 35s - loss: 1.7066 - regression_loss: 1.4564 - classification_loss: 0.2501 347/500 [===================>..........] - ETA: 35s - loss: 1.7058 - regression_loss: 1.4560 - classification_loss: 0.2498 348/500 [===================>..........] - ETA: 35s - loss: 1.7057 - regression_loss: 1.4560 - classification_loss: 0.2497 349/500 [===================>..........] - ETA: 34s - loss: 1.7061 - regression_loss: 1.4565 - classification_loss: 0.2496 350/500 [====================>.........] - ETA: 34s - loss: 1.7057 - regression_loss: 1.4561 - classification_loss: 0.2495 351/500 [====================>.........] - ETA: 34s - loss: 1.7065 - regression_loss: 1.4571 - classification_loss: 0.2494 352/500 [====================>.........] - ETA: 34s - loss: 1.7079 - regression_loss: 1.4585 - classification_loss: 0.2494 353/500 [====================>.........] - ETA: 33s - loss: 1.7082 - regression_loss: 1.4588 - classification_loss: 0.2494 354/500 [====================>.........] 
- ETA: 33s - loss: 1.7101 - regression_loss: 1.4605 - classification_loss: 0.2497 355/500 [====================>.........] - ETA: 33s - loss: 1.7105 - regression_loss: 1.4607 - classification_loss: 0.2498 356/500 [====================>.........] - ETA: 33s - loss: 1.7080 - regression_loss: 1.4587 - classification_loss: 0.2493 357/500 [====================>.........] - ETA: 33s - loss: 1.7065 - regression_loss: 1.4575 - classification_loss: 0.2490 358/500 [====================>.........] - ETA: 32s - loss: 1.7059 - regression_loss: 1.4571 - classification_loss: 0.2489 359/500 [====================>.........] - ETA: 32s - loss: 1.7068 - regression_loss: 1.4571 - classification_loss: 0.2497 360/500 [====================>.........] - ETA: 32s - loss: 1.7069 - regression_loss: 1.4573 - classification_loss: 0.2496 361/500 [====================>.........] - ETA: 32s - loss: 1.7092 - regression_loss: 1.4589 - classification_loss: 0.2503 362/500 [====================>.........] - ETA: 31s - loss: 1.7113 - regression_loss: 1.4608 - classification_loss: 0.2506 363/500 [====================>.........] - ETA: 31s - loss: 1.7119 - regression_loss: 1.4615 - classification_loss: 0.2504 364/500 [====================>.........] - ETA: 31s - loss: 1.7112 - regression_loss: 1.4610 - classification_loss: 0.2502 365/500 [====================>.........] - ETA: 31s - loss: 1.7098 - regression_loss: 1.4598 - classification_loss: 0.2499 366/500 [====================>.........] - ETA: 30s - loss: 1.7098 - regression_loss: 1.4600 - classification_loss: 0.2498 367/500 [=====================>........] - ETA: 30s - loss: 1.7102 - regression_loss: 1.4604 - classification_loss: 0.2498 368/500 [=====================>........] - ETA: 30s - loss: 1.7134 - regression_loss: 1.4631 - classification_loss: 0.2503 369/500 [=====================>........] - ETA: 30s - loss: 1.7113 - regression_loss: 1.4615 - classification_loss: 0.2498 370/500 [=====================>........] 
- ETA: 30s - loss: 1.7125 - regression_loss: 1.4622 - classification_loss: 0.2503 371/500 [=====================>........] - ETA: 29s - loss: 1.7129 - regression_loss: 1.4624 - classification_loss: 0.2505 372/500 [=====================>........] - ETA: 29s - loss: 1.7120 - regression_loss: 1.4619 - classification_loss: 0.2501 373/500 [=====================>........] - ETA: 29s - loss: 1.7131 - regression_loss: 1.4631 - classification_loss: 0.2500 374/500 [=====================>........] - ETA: 29s - loss: 1.7124 - regression_loss: 1.4624 - classification_loss: 0.2500 375/500 [=====================>........] - ETA: 28s - loss: 1.7125 - regression_loss: 1.4624 - classification_loss: 0.2501 376/500 [=====================>........] - ETA: 28s - loss: 1.7115 - regression_loss: 1.4617 - classification_loss: 0.2498 377/500 [=====================>........] - ETA: 28s - loss: 1.7096 - regression_loss: 1.4600 - classification_loss: 0.2496 378/500 [=====================>........] - ETA: 28s - loss: 1.7095 - regression_loss: 1.4601 - classification_loss: 0.2494 379/500 [=====================>........] - ETA: 27s - loss: 1.7091 - regression_loss: 1.4600 - classification_loss: 0.2491 380/500 [=====================>........] - ETA: 27s - loss: 1.7089 - regression_loss: 1.4599 - classification_loss: 0.2490 381/500 [=====================>........] - ETA: 27s - loss: 1.7099 - regression_loss: 1.4611 - classification_loss: 0.2489 382/500 [=====================>........] - ETA: 27s - loss: 1.7101 - regression_loss: 1.4614 - classification_loss: 0.2486 383/500 [=====================>........] - ETA: 27s - loss: 1.7102 - regression_loss: 1.4615 - classification_loss: 0.2487 384/500 [======================>.......] - ETA: 26s - loss: 1.7106 - regression_loss: 1.4619 - classification_loss: 0.2486 385/500 [======================>.......] - ETA: 26s - loss: 1.7103 - regression_loss: 1.4619 - classification_loss: 0.2485 386/500 [======================>.......] 
- ETA: 26s - loss: 1.7102 - regression_loss: 1.4618 - classification_loss: 0.2485 387/500 [======================>.......] - ETA: 26s - loss: 1.7130 - regression_loss: 1.4638 - classification_loss: 0.2492 388/500 [======================>.......] - ETA: 25s - loss: 1.7135 - regression_loss: 1.4643 - classification_loss: 0.2493 389/500 [======================>.......] - ETA: 25s - loss: 1.7128 - regression_loss: 1.4637 - classification_loss: 0.2491 390/500 [======================>.......] - ETA: 25s - loss: 1.7117 - regression_loss: 1.4629 - classification_loss: 0.2488 391/500 [======================>.......] - ETA: 25s - loss: 1.7117 - regression_loss: 1.4628 - classification_loss: 0.2489 392/500 [======================>.......] - ETA: 24s - loss: 1.7110 - regression_loss: 1.4624 - classification_loss: 0.2486 393/500 [======================>.......] - ETA: 24s - loss: 1.7114 - regression_loss: 1.4628 - classification_loss: 0.2485 394/500 [======================>.......] - ETA: 24s - loss: 1.7112 - regression_loss: 1.4628 - classification_loss: 0.2484 395/500 [======================>.......] - ETA: 24s - loss: 1.7106 - regression_loss: 1.4623 - classification_loss: 0.2483 396/500 [======================>.......] - ETA: 24s - loss: 1.7112 - regression_loss: 1.4629 - classification_loss: 0.2483 397/500 [======================>.......] - ETA: 23s - loss: 1.7095 - regression_loss: 1.4615 - classification_loss: 0.2480 398/500 [======================>.......] - ETA: 23s - loss: 1.7091 - regression_loss: 1.4612 - classification_loss: 0.2479 399/500 [======================>.......] - ETA: 23s - loss: 1.7091 - regression_loss: 1.4613 - classification_loss: 0.2479 400/500 [=======================>......] - ETA: 23s - loss: 1.7085 - regression_loss: 1.4610 - classification_loss: 0.2475 401/500 [=======================>......] - ETA: 22s - loss: 1.7115 - regression_loss: 1.4639 - classification_loss: 0.2476 402/500 [=======================>......] 
- ETA: 22s - loss: 1.7101 - regression_loss: 1.4628 - classification_loss: 0.2474 403/500 [=======================>......] - ETA: 22s - loss: 1.7116 - regression_loss: 1.4640 - classification_loss: 0.2477 404/500 [=======================>......] - ETA: 22s - loss: 1.7128 - regression_loss: 1.4650 - classification_loss: 0.2478 405/500 [=======================>......] - ETA: 21s - loss: 1.7160 - regression_loss: 1.4677 - classification_loss: 0.2483 406/500 [=======================>......] - ETA: 21s - loss: 1.7146 - regression_loss: 1.4664 - classification_loss: 0.2482 407/500 [=======================>......] - ETA: 21s - loss: 1.7147 - regression_loss: 1.4663 - classification_loss: 0.2484 408/500 [=======================>......] - ETA: 21s - loss: 1.7168 - regression_loss: 1.4678 - classification_loss: 0.2490 409/500 [=======================>......] - ETA: 21s - loss: 1.7155 - regression_loss: 1.4667 - classification_loss: 0.2487 410/500 [=======================>......] - ETA: 20s - loss: 1.7182 - regression_loss: 1.4691 - classification_loss: 0.2491 411/500 [=======================>......] - ETA: 20s - loss: 1.7174 - regression_loss: 1.4685 - classification_loss: 0.2489 412/500 [=======================>......] - ETA: 20s - loss: 1.7209 - regression_loss: 1.4711 - classification_loss: 0.2498 413/500 [=======================>......] - ETA: 20s - loss: 1.7220 - regression_loss: 1.4721 - classification_loss: 0.2499 414/500 [=======================>......] - ETA: 19s - loss: 1.7240 - regression_loss: 1.4737 - classification_loss: 0.2502 415/500 [=======================>......] - ETA: 19s - loss: 1.7250 - regression_loss: 1.4748 - classification_loss: 0.2502 416/500 [=======================>......] - ETA: 19s - loss: 1.7270 - regression_loss: 1.4766 - classification_loss: 0.2504 417/500 [========================>.....] - ETA: 19s - loss: 1.7283 - regression_loss: 1.4776 - classification_loss: 0.2507 418/500 [========================>.....] 
- ETA: 18s - loss: 1.7272 - regression_loss: 1.4769 - classification_loss: 0.2503 419/500 [========================>.....] - ETA: 18s - loss: 1.7282 - regression_loss: 1.4781 - classification_loss: 0.2501 420/500 [========================>.....] - ETA: 18s - loss: 1.7269 - regression_loss: 1.4771 - classification_loss: 0.2498 421/500 [========================>.....] - ETA: 18s - loss: 1.7257 - regression_loss: 1.4762 - classification_loss: 0.2495 422/500 [========================>.....] - ETA: 18s - loss: 1.7259 - regression_loss: 1.4765 - classification_loss: 0.2494 423/500 [========================>.....] - ETA: 17s - loss: 1.7283 - regression_loss: 1.4783 - classification_loss: 0.2500 424/500 [========================>.....] - ETA: 17s - loss: 1.7284 - regression_loss: 1.4784 - classification_loss: 0.2500 425/500 [========================>.....] - ETA: 17s - loss: 1.7296 - regression_loss: 1.4795 - classification_loss: 0.2501 426/500 [========================>.....] - ETA: 17s - loss: 1.7306 - regression_loss: 1.4802 - classification_loss: 0.2503 427/500 [========================>.....] - ETA: 16s - loss: 1.7297 - regression_loss: 1.4796 - classification_loss: 0.2501 428/500 [========================>.....] - ETA: 16s - loss: 1.7344 - regression_loss: 1.4827 - classification_loss: 0.2517 429/500 [========================>.....] - ETA: 16s - loss: 1.7335 - regression_loss: 1.4820 - classification_loss: 0.2515 430/500 [========================>.....] - ETA: 16s - loss: 1.7336 - regression_loss: 1.4821 - classification_loss: 0.2515 431/500 [========================>.....] - ETA: 15s - loss: 1.7327 - regression_loss: 1.4813 - classification_loss: 0.2513 432/500 [========================>.....] - ETA: 15s - loss: 1.7334 - regression_loss: 1.4818 - classification_loss: 0.2515 433/500 [========================>.....] - ETA: 15s - loss: 1.7340 - regression_loss: 1.4823 - classification_loss: 0.2516 434/500 [=========================>....] 
- ETA: 15s - loss: 1.7339 - regression_loss: 1.4822 - classification_loss: 0.2517 435/500 [=========================>....] - ETA: 15s - loss: 1.7328 - regression_loss: 1.4812 - classification_loss: 0.2516 436/500 [=========================>....] - ETA: 14s - loss: 1.7324 - regression_loss: 1.4809 - classification_loss: 0.2515 437/500 [=========================>....] - ETA: 14s - loss: 1.7324 - regression_loss: 1.4809 - classification_loss: 0.2515 438/500 [=========================>....] - ETA: 14s - loss: 1.7321 - regression_loss: 1.4807 - classification_loss: 0.2513 439/500 [=========================>....] - ETA: 14s - loss: 1.7307 - regression_loss: 1.4797 - classification_loss: 0.2510 440/500 [=========================>....] - ETA: 13s - loss: 1.7303 - regression_loss: 1.4794 - classification_loss: 0.2509 441/500 [=========================>....] - ETA: 13s - loss: 1.7325 - regression_loss: 1.4809 - classification_loss: 0.2515 442/500 [=========================>....] - ETA: 13s - loss: 1.7319 - regression_loss: 1.4805 - classification_loss: 0.2514 443/500 [=========================>....] - ETA: 13s - loss: 1.7330 - regression_loss: 1.4813 - classification_loss: 0.2517 444/500 [=========================>....] - ETA: 12s - loss: 1.7330 - regression_loss: 1.4813 - classification_loss: 0.2517 445/500 [=========================>....] - ETA: 12s - loss: 1.7374 - regression_loss: 1.4846 - classification_loss: 0.2528 446/500 [=========================>....] - ETA: 12s - loss: 1.7365 - regression_loss: 1.4839 - classification_loss: 0.2526 447/500 [=========================>....] - ETA: 12s - loss: 1.7361 - regression_loss: 1.4835 - classification_loss: 0.2525 448/500 [=========================>....] - ETA: 12s - loss: 1.7351 - regression_loss: 1.4826 - classification_loss: 0.2525 449/500 [=========================>....] - ETA: 11s - loss: 1.7390 - regression_loss: 1.4852 - classification_loss: 0.2538 450/500 [==========================>...] 
- ETA: 11s - loss: 1.7366 - regression_loss: 1.4832 - classification_loss: 0.2534 451/500 [==========================>...] - ETA: 11s - loss: 1.7368 - regression_loss: 1.4834 - classification_loss: 0.2534 452/500 [==========================>...] - ETA: 11s - loss: 1.7396 - regression_loss: 1.4856 - classification_loss: 0.2540 453/500 [==========================>...] - ETA: 10s - loss: 1.7392 - regression_loss: 1.4852 - classification_loss: 0.2540 454/500 [==========================>...] - ETA: 10s - loss: 1.7381 - regression_loss: 1.4843 - classification_loss: 0.2538 455/500 [==========================>...] - ETA: 10s - loss: 1.7421 - regression_loss: 1.4878 - classification_loss: 0.2543 456/500 [==========================>...] - ETA: 10s - loss: 1.7404 - regression_loss: 1.4863 - classification_loss: 0.2540 457/500 [==========================>...] - ETA: 9s - loss: 1.7389 - regression_loss: 1.4851 - classification_loss: 0.2538  458/500 [==========================>...] - ETA: 9s - loss: 1.7399 - regression_loss: 1.4858 - classification_loss: 0.2541 459/500 [==========================>...] - ETA: 9s - loss: 1.7383 - regression_loss: 1.4846 - classification_loss: 0.2538 460/500 [==========================>...] - ETA: 9s - loss: 1.7386 - regression_loss: 1.4850 - classification_loss: 0.2535 461/500 [==========================>...] - ETA: 9s - loss: 1.7396 - regression_loss: 1.4859 - classification_loss: 0.2537 462/500 [==========================>...] - ETA: 8s - loss: 1.7372 - regression_loss: 1.4837 - classification_loss: 0.2535 463/500 [==========================>...] - ETA: 8s - loss: 1.7370 - regression_loss: 1.4835 - classification_loss: 0.2534 464/500 [==========================>...] - ETA: 8s - loss: 1.7365 - regression_loss: 1.4831 - classification_loss: 0.2534 465/500 [==========================>...] - ETA: 8s - loss: 1.7360 - regression_loss: 1.4827 - classification_loss: 0.2534 466/500 [==========================>...] 
- ETA: 7s - loss: 1.7357 - regression_loss: 1.4824 - classification_loss: 0.2533
500/500 [==============================] - 116s 231ms/step - loss: 1.7232 - regression_loss: 1.4724 - classification_loss: 0.2508
326 instances of class plum with average precision: 0.7875
mAP: 0.7875
Epoch 00007: saving model to ./training/snapshots/resnet50_pascal_07.h5
Epoch 8/150
  1/500 [..............................] - ETA: 1:50 - loss: 1.3897 - regression_loss: 1.2201 - classification_loss: 0.1696
 50/500 [==>...........................] - ETA: 1:39 - loss: 1.6348 - regression_loss: 1.4025 - classification_loss: 0.2323
100/500 [=====>........................] - ETA: 1:28 - loss: 1.6184 - regression_loss: 1.3913 - classification_loss: 0.2270
150/500 [========>.....................] - ETA: 1:17 - loss: 1.6334 - regression_loss: 1.4005 - classification_loss: 0.2330
200/500 [===========>..................] - ETA: 1:06 - loss: 1.6190 - regression_loss: 1.3901 - classification_loss: 0.2289
250/500 [==============>...............] - ETA: 56s - loss: 1.6122 - regression_loss: 1.3840 - classification_loss: 0.2281
300/500 [=================>............] - ETA: 45s - loss: 1.6252 - regression_loss: 1.3890 - classification_loss: 0.2362
301/500 [=================>............]
- ETA: 45s - loss: 1.6236 - regression_loss: 1.3876 - classification_loss: 0.2360 302/500 [=================>............] - ETA: 44s - loss: 1.6239 - regression_loss: 1.3878 - classification_loss: 0.2361 303/500 [=================>............] - ETA: 44s - loss: 1.6243 - regression_loss: 1.3881 - classification_loss: 0.2363 304/500 [=================>............] - ETA: 44s - loss: 1.6230 - regression_loss: 1.3871 - classification_loss: 0.2359 305/500 [=================>............] - ETA: 44s - loss: 1.6238 - regression_loss: 1.3880 - classification_loss: 0.2359 306/500 [=================>............] - ETA: 44s - loss: 1.6210 - regression_loss: 1.3856 - classification_loss: 0.2354 307/500 [=================>............] - ETA: 43s - loss: 1.6221 - regression_loss: 1.3864 - classification_loss: 0.2357 308/500 [=================>............] - ETA: 43s - loss: 1.6222 - regression_loss: 1.3864 - classification_loss: 0.2358 309/500 [=================>............] - ETA: 43s - loss: 1.6224 - regression_loss: 1.3865 - classification_loss: 0.2358 310/500 [=================>............] - ETA: 43s - loss: 1.6241 - regression_loss: 1.3878 - classification_loss: 0.2363 311/500 [=================>............] - ETA: 42s - loss: 1.6232 - regression_loss: 1.3871 - classification_loss: 0.2361 312/500 [=================>............] - ETA: 42s - loss: 1.6231 - regression_loss: 1.3873 - classification_loss: 0.2358 313/500 [=================>............] - ETA: 42s - loss: 1.6201 - regression_loss: 1.3849 - classification_loss: 0.2352 314/500 [=================>............] - ETA: 42s - loss: 1.6207 - regression_loss: 1.3852 - classification_loss: 0.2356 315/500 [=================>............] - ETA: 42s - loss: 1.6194 - regression_loss: 1.3840 - classification_loss: 0.2354 316/500 [=================>............] - ETA: 41s - loss: 1.6184 - regression_loss: 1.3833 - classification_loss: 0.2351 317/500 [==================>...........] 
- ETA: 41s - loss: 1.6189 - regression_loss: 1.3838 - classification_loss: 0.2351 318/500 [==================>...........] - ETA: 41s - loss: 1.6181 - regression_loss: 1.3832 - classification_loss: 0.2349 319/500 [==================>...........] - ETA: 41s - loss: 1.6177 - regression_loss: 1.3830 - classification_loss: 0.2347 320/500 [==================>...........] - ETA: 40s - loss: 1.6161 - regression_loss: 1.3816 - classification_loss: 0.2345 321/500 [==================>...........] - ETA: 40s - loss: 1.6194 - regression_loss: 1.3835 - classification_loss: 0.2360 322/500 [==================>...........] - ETA: 40s - loss: 1.6216 - regression_loss: 1.3850 - classification_loss: 0.2366 323/500 [==================>...........] - ETA: 40s - loss: 1.6222 - regression_loss: 1.3855 - classification_loss: 0.2367 324/500 [==================>...........] - ETA: 40s - loss: 1.6260 - regression_loss: 1.3890 - classification_loss: 0.2370 325/500 [==================>...........] - ETA: 39s - loss: 1.6251 - regression_loss: 1.3883 - classification_loss: 0.2368 326/500 [==================>...........] - ETA: 39s - loss: 1.6280 - regression_loss: 1.3903 - classification_loss: 0.2377 327/500 [==================>...........] - ETA: 39s - loss: 1.6259 - regression_loss: 1.3886 - classification_loss: 0.2373 328/500 [==================>...........] - ETA: 39s - loss: 1.6243 - regression_loss: 1.3872 - classification_loss: 0.2370 329/500 [==================>...........] - ETA: 38s - loss: 1.6255 - regression_loss: 1.3884 - classification_loss: 0.2370 330/500 [==================>...........] - ETA: 38s - loss: 1.6247 - regression_loss: 1.3878 - classification_loss: 0.2369 331/500 [==================>...........] - ETA: 38s - loss: 1.6230 - regression_loss: 1.3867 - classification_loss: 0.2364 332/500 [==================>...........] - ETA: 38s - loss: 1.6214 - regression_loss: 1.3854 - classification_loss: 0.2360 333/500 [==================>...........] 
- ETA: 38s - loss: 1.6192 - regression_loss: 1.3836 - classification_loss: 0.2356 334/500 [===================>..........] - ETA: 37s - loss: 1.6175 - regression_loss: 1.3822 - classification_loss: 0.2354 335/500 [===================>..........] - ETA: 37s - loss: 1.6183 - regression_loss: 1.3828 - classification_loss: 0.2355 336/500 [===================>..........] - ETA: 37s - loss: 1.6151 - regression_loss: 1.3800 - classification_loss: 0.2350 337/500 [===================>..........] - ETA: 37s - loss: 1.6161 - regression_loss: 1.3806 - classification_loss: 0.2355 338/500 [===================>..........] - ETA: 36s - loss: 1.6157 - regression_loss: 1.3803 - classification_loss: 0.2354 339/500 [===================>..........] - ETA: 36s - loss: 1.6164 - regression_loss: 1.3809 - classification_loss: 0.2356 340/500 [===================>..........] - ETA: 36s - loss: 1.6196 - regression_loss: 1.3835 - classification_loss: 0.2361 341/500 [===================>..........] - ETA: 36s - loss: 1.6227 - regression_loss: 1.3861 - classification_loss: 0.2366 342/500 [===================>..........] - ETA: 36s - loss: 1.6213 - regression_loss: 1.3851 - classification_loss: 0.2362 343/500 [===================>..........] - ETA: 35s - loss: 1.6280 - regression_loss: 1.3906 - classification_loss: 0.2375 344/500 [===================>..........] - ETA: 35s - loss: 1.6285 - regression_loss: 1.3907 - classification_loss: 0.2378 345/500 [===================>..........] - ETA: 35s - loss: 1.6271 - regression_loss: 1.3891 - classification_loss: 0.2380 346/500 [===================>..........] - ETA: 35s - loss: 1.6245 - regression_loss: 1.3868 - classification_loss: 0.2377 347/500 [===================>..........] - ETA: 34s - loss: 1.6242 - regression_loss: 1.3866 - classification_loss: 0.2377 348/500 [===================>..........] - ETA: 34s - loss: 1.6233 - regression_loss: 1.3859 - classification_loss: 0.2374 349/500 [===================>..........] 
- ETA: 34s - loss: 1.6248 - regression_loss: 1.3869 - classification_loss: 0.2379 350/500 [====================>.........] - ETA: 34s - loss: 1.6253 - regression_loss: 1.3872 - classification_loss: 0.2381 351/500 [====================>.........] - ETA: 34s - loss: 1.6254 - regression_loss: 1.3873 - classification_loss: 0.2380 352/500 [====================>.........] - ETA: 33s - loss: 1.6249 - regression_loss: 1.3871 - classification_loss: 0.2378 353/500 [====================>.........] - ETA: 33s - loss: 1.6250 - regression_loss: 1.3872 - classification_loss: 0.2378 354/500 [====================>.........] - ETA: 33s - loss: 1.6262 - regression_loss: 1.3882 - classification_loss: 0.2380 355/500 [====================>.........] - ETA: 33s - loss: 1.6260 - regression_loss: 1.3881 - classification_loss: 0.2379 356/500 [====================>.........] - ETA: 32s - loss: 1.6278 - regression_loss: 1.3892 - classification_loss: 0.2385 357/500 [====================>.........] - ETA: 32s - loss: 1.6266 - regression_loss: 1.3884 - classification_loss: 0.2381 358/500 [====================>.........] - ETA: 32s - loss: 1.6268 - regression_loss: 1.3887 - classification_loss: 0.2380 359/500 [====================>.........] - ETA: 32s - loss: 1.6265 - regression_loss: 1.3886 - classification_loss: 0.2379 360/500 [====================>.........] - ETA: 32s - loss: 1.6258 - regression_loss: 1.3881 - classification_loss: 0.2377 361/500 [====================>.........] - ETA: 31s - loss: 1.6260 - regression_loss: 1.3883 - classification_loss: 0.2377 362/500 [====================>.........] - ETA: 31s - loss: 1.6261 - regression_loss: 1.3884 - classification_loss: 0.2377 363/500 [====================>.........] - ETA: 31s - loss: 1.6247 - regression_loss: 1.3874 - classification_loss: 0.2374 364/500 [====================>.........] - ETA: 31s - loss: 1.6240 - regression_loss: 1.3869 - classification_loss: 0.2371 365/500 [====================>.........] 
- ETA: 30s - loss: 1.6251 - regression_loss: 1.3879 - classification_loss: 0.2372 366/500 [====================>.........] - ETA: 30s - loss: 1.6253 - regression_loss: 1.3882 - classification_loss: 0.2371 367/500 [=====================>........] - ETA: 30s - loss: 1.6242 - regression_loss: 1.3874 - classification_loss: 0.2368 368/500 [=====================>........] - ETA: 30s - loss: 1.6230 - regression_loss: 1.3864 - classification_loss: 0.2366 369/500 [=====================>........] - ETA: 30s - loss: 1.6216 - regression_loss: 1.3851 - classification_loss: 0.2365 370/500 [=====================>........] - ETA: 29s - loss: 1.6208 - regression_loss: 1.3844 - classification_loss: 0.2363 371/500 [=====================>........] - ETA: 29s - loss: 1.6217 - regression_loss: 1.3853 - classification_loss: 0.2364 372/500 [=====================>........] - ETA: 29s - loss: 1.6221 - regression_loss: 1.3855 - classification_loss: 0.2366 373/500 [=====================>........] - ETA: 29s - loss: 1.6226 - regression_loss: 1.3862 - classification_loss: 0.2364 374/500 [=====================>........] - ETA: 28s - loss: 1.6216 - regression_loss: 1.3854 - classification_loss: 0.2362 375/500 [=====================>........] - ETA: 28s - loss: 1.6207 - regression_loss: 1.3846 - classification_loss: 0.2361 376/500 [=====================>........] - ETA: 28s - loss: 1.6214 - regression_loss: 1.3851 - classification_loss: 0.2362 377/500 [=====================>........] - ETA: 28s - loss: 1.6201 - regression_loss: 1.3842 - classification_loss: 0.2359 378/500 [=====================>........] - ETA: 27s - loss: 1.6183 - regression_loss: 1.3829 - classification_loss: 0.2355 379/500 [=====================>........] - ETA: 27s - loss: 1.6171 - regression_loss: 1.3818 - classification_loss: 0.2353 380/500 [=====================>........] - ETA: 27s - loss: 1.6156 - regression_loss: 1.3805 - classification_loss: 0.2350 381/500 [=====================>........] 
- ETA: 27s - loss: 1.6150 - regression_loss: 1.3801 - classification_loss: 0.2349 382/500 [=====================>........] - ETA: 27s - loss: 1.6152 - regression_loss: 1.3803 - classification_loss: 0.2348 383/500 [=====================>........] - ETA: 26s - loss: 1.6140 - regression_loss: 1.3794 - classification_loss: 0.2346 384/500 [======================>.......] - ETA: 26s - loss: 1.6139 - regression_loss: 1.3792 - classification_loss: 0.2347 385/500 [======================>.......] - ETA: 26s - loss: 1.6138 - regression_loss: 1.3792 - classification_loss: 0.2346 386/500 [======================>.......] - ETA: 26s - loss: 1.6123 - regression_loss: 1.3780 - classification_loss: 0.2343 387/500 [======================>.......] - ETA: 25s - loss: 1.6127 - regression_loss: 1.3784 - classification_loss: 0.2343 388/500 [======================>.......] - ETA: 25s - loss: 1.6113 - regression_loss: 1.3772 - classification_loss: 0.2341 389/500 [======================>.......] - ETA: 25s - loss: 1.6111 - regression_loss: 1.3771 - classification_loss: 0.2340 390/500 [======================>.......] - ETA: 25s - loss: 1.6101 - regression_loss: 1.3764 - classification_loss: 0.2337 391/500 [======================>.......] - ETA: 25s - loss: 1.6100 - regression_loss: 1.3762 - classification_loss: 0.2338 392/500 [======================>.......] - ETA: 24s - loss: 1.6113 - regression_loss: 1.3772 - classification_loss: 0.2342 393/500 [======================>.......] - ETA: 24s - loss: 1.6110 - regression_loss: 1.3770 - classification_loss: 0.2340 394/500 [======================>.......] - ETA: 24s - loss: 1.6102 - regression_loss: 1.3762 - classification_loss: 0.2340 395/500 [======================>.......] - ETA: 24s - loss: 1.6092 - regression_loss: 1.3754 - classification_loss: 0.2338 396/500 [======================>.......] - ETA: 23s - loss: 1.6084 - regression_loss: 1.3748 - classification_loss: 0.2336 397/500 [======================>.......] 
- ETA: 23s - loss: 1.6142 - regression_loss: 1.3714 - classification_loss: 0.2428 398/500 [======================>.......] - ETA: 23s - loss: 1.6147 - regression_loss: 1.3717 - classification_loss: 0.2431 399/500 [======================>.......] - ETA: 23s - loss: 1.6178 - regression_loss: 1.3749 - classification_loss: 0.2429 400/500 [=======================>......] - ETA: 22s - loss: 1.6182 - regression_loss: 1.3754 - classification_loss: 0.2428 401/500 [=======================>......] - ETA: 22s - loss: 1.6183 - regression_loss: 1.3756 - classification_loss: 0.2427 402/500 [=======================>......] - ETA: 22s - loss: 1.6174 - regression_loss: 1.3749 - classification_loss: 0.2424 403/500 [=======================>......] - ETA: 22s - loss: 1.6174 - regression_loss: 1.3750 - classification_loss: 0.2424 404/500 [=======================>......] - ETA: 22s - loss: 1.6164 - regression_loss: 1.3742 - classification_loss: 0.2422 405/500 [=======================>......] - ETA: 21s - loss: 1.6171 - regression_loss: 1.3745 - classification_loss: 0.2425 406/500 [=======================>......] - ETA: 21s - loss: 1.6169 - regression_loss: 1.3746 - classification_loss: 0.2423 407/500 [=======================>......] - ETA: 21s - loss: 1.6182 - regression_loss: 1.3754 - classification_loss: 0.2429 408/500 [=======================>......] - ETA: 21s - loss: 1.6196 - regression_loss: 1.3768 - classification_loss: 0.2428 409/500 [=======================>......] - ETA: 20s - loss: 1.6189 - regression_loss: 1.3760 - classification_loss: 0.2429 410/500 [=======================>......] - ETA: 20s - loss: 1.6179 - regression_loss: 1.3753 - classification_loss: 0.2426 411/500 [=======================>......] - ETA: 20s - loss: 1.6186 - regression_loss: 1.3761 - classification_loss: 0.2425 412/500 [=======================>......] - ETA: 20s - loss: 1.6186 - regression_loss: 1.3763 - classification_loss: 0.2423 413/500 [=======================>......] 
- ETA: 19s - loss: 1.6182 - regression_loss: 1.3758 - classification_loss: 0.2424 414/500 [=======================>......] - ETA: 19s - loss: 1.6203 - regression_loss: 1.3777 - classification_loss: 0.2426 415/500 [=======================>......] - ETA: 19s - loss: 1.6194 - regression_loss: 1.3769 - classification_loss: 0.2424 416/500 [=======================>......] - ETA: 19s - loss: 1.6199 - regression_loss: 1.3775 - classification_loss: 0.2424 417/500 [========================>.....] - ETA: 19s - loss: 1.6204 - regression_loss: 1.3781 - classification_loss: 0.2424 418/500 [========================>.....] - ETA: 18s - loss: 1.6200 - regression_loss: 1.3778 - classification_loss: 0.2421 419/500 [========================>.....] - ETA: 18s - loss: 1.6218 - regression_loss: 1.3793 - classification_loss: 0.2425 420/500 [========================>.....] - ETA: 18s - loss: 1.6201 - regression_loss: 1.3778 - classification_loss: 0.2423 421/500 [========================>.....] - ETA: 18s - loss: 1.6192 - regression_loss: 1.3773 - classification_loss: 0.2420 422/500 [========================>.....] - ETA: 17s - loss: 1.6191 - regression_loss: 1.3771 - classification_loss: 0.2419 423/500 [========================>.....] - ETA: 17s - loss: 1.6181 - regression_loss: 1.3765 - classification_loss: 0.2416 424/500 [========================>.....] - ETA: 17s - loss: 1.6161 - regression_loss: 1.3748 - classification_loss: 0.2413 425/500 [========================>.....] - ETA: 17s - loss: 1.6173 - regression_loss: 1.3757 - classification_loss: 0.2416 426/500 [========================>.....] - ETA: 17s - loss: 1.6180 - regression_loss: 1.3765 - classification_loss: 0.2415 427/500 [========================>.....] - ETA: 16s - loss: 1.6180 - regression_loss: 1.3768 - classification_loss: 0.2412 428/500 [========================>.....] - ETA: 16s - loss: 1.6163 - regression_loss: 1.3753 - classification_loss: 0.2410 429/500 [========================>.....] 
- ETA: 16s - loss: 1.6166 - regression_loss: 1.3756 - classification_loss: 0.2409 430/500 [========================>.....] - ETA: 16s - loss: 1.6165 - regression_loss: 1.3757 - classification_loss: 0.2408 431/500 [========================>.....] - ETA: 15s - loss: 1.6190 - regression_loss: 1.3777 - classification_loss: 0.2413 432/500 [========================>.....] - ETA: 15s - loss: 1.6176 - regression_loss: 1.3767 - classification_loss: 0.2409 433/500 [========================>.....] - ETA: 15s - loss: 1.6176 - regression_loss: 1.3768 - classification_loss: 0.2408 434/500 [=========================>....] - ETA: 15s - loss: 1.6175 - regression_loss: 1.3769 - classification_loss: 0.2406 435/500 [=========================>....] - ETA: 14s - loss: 1.6149 - regression_loss: 1.3747 - classification_loss: 0.2402 436/500 [=========================>....] - ETA: 14s - loss: 1.6138 - regression_loss: 1.3737 - classification_loss: 0.2400 437/500 [=========================>....] - ETA: 14s - loss: 1.6144 - regression_loss: 1.3744 - classification_loss: 0.2400 438/500 [=========================>....] - ETA: 14s - loss: 1.6137 - regression_loss: 1.3738 - classification_loss: 0.2399 439/500 [=========================>....] - ETA: 14s - loss: 1.6170 - regression_loss: 1.3767 - classification_loss: 0.2403 440/500 [=========================>....] - ETA: 13s - loss: 1.6182 - regression_loss: 1.3778 - classification_loss: 0.2404 441/500 [=========================>....] - ETA: 13s - loss: 1.6184 - regression_loss: 1.3781 - classification_loss: 0.2403 442/500 [=========================>....] - ETA: 13s - loss: 1.6192 - regression_loss: 1.3788 - classification_loss: 0.2405 443/500 [=========================>....] - ETA: 13s - loss: 1.6182 - regression_loss: 1.3780 - classification_loss: 0.2402 444/500 [=========================>....] - ETA: 12s - loss: 1.6182 - regression_loss: 1.3781 - classification_loss: 0.2401 445/500 [=========================>....] 
- ETA: 12s - loss: 1.6176 - regression_loss: 1.3776 - classification_loss: 0.2400 446/500 [=========================>....] - ETA: 12s - loss: 1.6173 - regression_loss: 1.3774 - classification_loss: 0.2399 447/500 [=========================>....] - ETA: 12s - loss: 1.6171 - regression_loss: 1.3773 - classification_loss: 0.2398 448/500 [=========================>....] - ETA: 11s - loss: 1.6172 - regression_loss: 1.3774 - classification_loss: 0.2398 449/500 [=========================>....] - ETA: 11s - loss: 1.6184 - regression_loss: 1.3781 - classification_loss: 0.2403 450/500 [==========================>...] - ETA: 11s - loss: 1.6172 - regression_loss: 1.3771 - classification_loss: 0.2401 451/500 [==========================>...] - ETA: 11s - loss: 1.6165 - regression_loss: 1.3766 - classification_loss: 0.2399 452/500 [==========================>...] - ETA: 11s - loss: 1.6153 - regression_loss: 1.3757 - classification_loss: 0.2396 453/500 [==========================>...] - ETA: 10s - loss: 1.6152 - regression_loss: 1.3756 - classification_loss: 0.2396 454/500 [==========================>...] - ETA: 10s - loss: 1.6162 - regression_loss: 1.3765 - classification_loss: 0.2397 455/500 [==========================>...] - ETA: 10s - loss: 1.6145 - regression_loss: 1.3751 - classification_loss: 0.2394 456/500 [==========================>...] - ETA: 10s - loss: 1.6141 - regression_loss: 1.3748 - classification_loss: 0.2393 457/500 [==========================>...] - ETA: 9s - loss: 1.6144 - regression_loss: 1.3750 - classification_loss: 0.2394  458/500 [==========================>...] - ETA: 9s - loss: 1.6140 - regression_loss: 1.3748 - classification_loss: 0.2392 459/500 [==========================>...] - ETA: 9s - loss: 1.6136 - regression_loss: 1.3745 - classification_loss: 0.2391 460/500 [==========================>...] - ETA: 9s - loss: 1.6109 - regression_loss: 1.3721 - classification_loss: 0.2388 461/500 [==========================>...] 
- ETA: 8s - loss: 1.6117 - regression_loss: 1.3727 - classification_loss: 0.2389 462/500 [==========================>...] - ETA: 8s - loss: 1.6105 - regression_loss: 1.3719 - classification_loss: 0.2386 463/500 [==========================>...] - ETA: 8s - loss: 1.6129 - regression_loss: 1.3739 - classification_loss: 0.2391 464/500 [==========================>...] - ETA: 8s - loss: 1.6138 - regression_loss: 1.3745 - classification_loss: 0.2393 465/500 [==========================>...] - ETA: 8s - loss: 1.6125 - regression_loss: 1.3735 - classification_loss: 0.2390 466/500 [==========================>...] - ETA: 7s - loss: 1.6113 - regression_loss: 1.3725 - classification_loss: 0.2388 467/500 [===========================>..] - ETA: 7s - loss: 1.6120 - regression_loss: 1.3733 - classification_loss: 0.2388 468/500 [===========================>..] - ETA: 7s - loss: 1.6118 - regression_loss: 1.3731 - classification_loss: 0.2387 469/500 [===========================>..] - ETA: 7s - loss: 1.6110 - regression_loss: 1.3724 - classification_loss: 0.2385 470/500 [===========================>..] - ETA: 6s - loss: 1.6110 - regression_loss: 1.3726 - classification_loss: 0.2384 471/500 [===========================>..] - ETA: 6s - loss: 1.6096 - regression_loss: 1.3714 - classification_loss: 0.2381 472/500 [===========================>..] - ETA: 6s - loss: 1.6119 - regression_loss: 1.3734 - classification_loss: 0.2385 473/500 [===========================>..] - ETA: 6s - loss: 1.6141 - regression_loss: 1.3756 - classification_loss: 0.2385 474/500 [===========================>..] - ETA: 6s - loss: 1.6122 - regression_loss: 1.3740 - classification_loss: 0.2383 475/500 [===========================>..] - ETA: 5s - loss: 1.6140 - regression_loss: 1.3751 - classification_loss: 0.2389 476/500 [===========================>..] - ETA: 5s - loss: 1.6145 - regression_loss: 1.3755 - classification_loss: 0.2389 477/500 [===========================>..] 
- ETA: 5s - loss: 1.6140 - regression_loss: 1.3752 - classification_loss: 0.2388 478/500 [===========================>..] - ETA: 5s - loss: 1.6126 - regression_loss: 1.3742 - classification_loss: 0.2384 479/500 [===========================>..] - ETA: 4s - loss: 1.6128 - regression_loss: 1.3743 - classification_loss: 0.2386 480/500 [===========================>..] - ETA: 4s - loss: 1.6131 - regression_loss: 1.3748 - classification_loss: 0.2383 481/500 [===========================>..] - ETA: 4s - loss: 1.6143 - regression_loss: 1.3757 - classification_loss: 0.2387 482/500 [===========================>..] - ETA: 4s - loss: 1.6139 - regression_loss: 1.3755 - classification_loss: 0.2384 483/500 [===========================>..] - ETA: 3s - loss: 1.6138 - regression_loss: 1.3756 - classification_loss: 0.2382 484/500 [============================>.] - ETA: 3s - loss: 1.6146 - regression_loss: 1.3764 - classification_loss: 0.2382 485/500 [============================>.] - ETA: 3s - loss: 1.6163 - regression_loss: 1.3776 - classification_loss: 0.2387 486/500 [============================>.] - ETA: 3s - loss: 1.6150 - regression_loss: 1.3761 - classification_loss: 0.2390 487/500 [============================>.] - ETA: 3s - loss: 1.6157 - regression_loss: 1.3766 - classification_loss: 0.2391 488/500 [============================>.] - ETA: 2s - loss: 1.6144 - regression_loss: 1.3756 - classification_loss: 0.2388 489/500 [============================>.] - ETA: 2s - loss: 1.6140 - regression_loss: 1.3753 - classification_loss: 0.2387 490/500 [============================>.] - ETA: 2s - loss: 1.6143 - regression_loss: 1.3757 - classification_loss: 0.2386 491/500 [============================>.] - ETA: 2s - loss: 1.6135 - regression_loss: 1.3750 - classification_loss: 0.2385 492/500 [============================>.] - ETA: 1s - loss: 1.6133 - regression_loss: 1.3749 - classification_loss: 0.2384 493/500 [============================>.] 
- ETA: 1s - loss: 1.6122 - regression_loss: 1.3740 - classification_loss: 0.2382 494/500 [============================>.] - ETA: 1s - loss: 1.6119 - regression_loss: 1.3738 - classification_loss: 0.2381 495/500 [============================>.] - ETA: 1s - loss: 1.6127 - regression_loss: 1.3745 - classification_loss: 0.2383 496/500 [============================>.] - ETA: 0s - loss: 1.6125 - regression_loss: 1.3744 - classification_loss: 0.2381 497/500 [============================>.] - ETA: 0s - loss: 1.6107 - regression_loss: 1.3730 - classification_loss: 0.2378 498/500 [============================>.] - ETA: 0s - loss: 1.6105 - regression_loss: 1.3729 - classification_loss: 0.2377 499/500 [============================>.] - ETA: 0s - loss: 1.6088 - regression_loss: 1.3715 - classification_loss: 0.2373 500/500 [==============================] - 116s 231ms/step - loss: 1.6085 - regression_loss: 1.3713 - classification_loss: 0.2372 326 instances of class plum with average precision: 0.8105 mAP: 0.8105 Epoch 00008: saving model to ./training/snapshots/resnet50_pascal_08.h5 Epoch 9/150 1/500 [..............................] - ETA: 1:47 - loss: 1.6816 - regression_loss: 1.4565 - classification_loss: 0.2251 2/500 [..............................] - ETA: 1:48 - loss: 1.7270 - regression_loss: 1.5136 - classification_loss: 0.2134 3/500 [..............................] - ETA: 1:48 - loss: 1.4306 - regression_loss: 1.2558 - classification_loss: 0.1748 4/500 [..............................] - ETA: 1:53 - loss: 1.5274 - regression_loss: 1.3272 - classification_loss: 0.2002 5/500 [..............................] - ETA: 1:56 - loss: 1.4773 - regression_loss: 1.2851 - classification_loss: 0.1922 6/500 [..............................] - ETA: 1:56 - loss: 1.4853 - regression_loss: 1.3032 - classification_loss: 0.1821 7/500 [..............................] - ETA: 1:55 - loss: 1.5597 - regression_loss: 1.3638 - classification_loss: 0.1959 8/500 [..............................] 
[progress-bar updates for steps 9–343/500, originally overwritten in place by carriage returns and flattened here into repeated text; representative steps retained]
  9/500 [..............................] - ETA: 1:57 - loss: 1.6213 - regression_loss: 1.4160 - classification_loss: 0.2053
 50/500 [==>...........................] - ETA: 1:47 - loss: 1.6626 - regression_loss: 1.4278 - classification_loss: 0.2348
100/500 [=====>........................] - ETA: 1:35 - loss: 1.6705 - regression_loss: 1.4319 - classification_loss: 0.2386
150/500 [========>.....................] - ETA: 1:24 - loss: 1.6042 - regression_loss: 1.3794 - classification_loss: 0.2248
200/500 [===========>..................] - ETA: 1:11 - loss: 1.5871 - regression_loss: 1.3671 - classification_loss: 0.2200
250/500 [==============>...............] - ETA: 59s - loss: 1.5788 - regression_loss: 1.3621 - classification_loss: 0.2168
300/500 [=================>............] - ETA: 47s - loss: 1.5950 - regression_loss: 1.3744 - classification_loss: 0.2205
343/500 [===================>..........] - ETA: 37s - loss: 1.5948 - regression_loss: 1.3749 - classification_loss: 0.2198
- ETA: 36s - loss: 1.5925 - regression_loss: 1.3729 - classification_loss: 0.2196 345/500 [===================>..........] - ETA: 36s - loss: 1.5903 - regression_loss: 1.3710 - classification_loss: 0.2193 346/500 [===================>..........] - ETA: 36s - loss: 1.5925 - regression_loss: 1.3724 - classification_loss: 0.2201 347/500 [===================>..........] - ETA: 36s - loss: 1.5908 - regression_loss: 1.3710 - classification_loss: 0.2198 348/500 [===================>..........] - ETA: 35s - loss: 1.5900 - regression_loss: 1.3704 - classification_loss: 0.2196 349/500 [===================>..........] - ETA: 35s - loss: 1.5900 - regression_loss: 1.3706 - classification_loss: 0.2194 350/500 [====================>.........] - ETA: 35s - loss: 1.5901 - regression_loss: 1.3707 - classification_loss: 0.2194 351/500 [====================>.........] - ETA: 35s - loss: 1.5912 - regression_loss: 1.3715 - classification_loss: 0.2197 352/500 [====================>.........] - ETA: 34s - loss: 1.5910 - regression_loss: 1.3714 - classification_loss: 0.2196 353/500 [====================>.........] - ETA: 34s - loss: 1.5913 - regression_loss: 1.3716 - classification_loss: 0.2197 354/500 [====================>.........] - ETA: 34s - loss: 1.5887 - regression_loss: 1.3695 - classification_loss: 0.2192 355/500 [====================>.........] - ETA: 34s - loss: 1.5986 - regression_loss: 1.3715 - classification_loss: 0.2271 356/500 [====================>.........] - ETA: 34s - loss: 1.5988 - regression_loss: 1.3716 - classification_loss: 0.2271 357/500 [====================>.........] - ETA: 33s - loss: 1.5978 - regression_loss: 1.3709 - classification_loss: 0.2270 358/500 [====================>.........] - ETA: 33s - loss: 1.5975 - regression_loss: 1.3706 - classification_loss: 0.2269 359/500 [====================>.........] - ETA: 33s - loss: 1.5974 - regression_loss: 1.3705 - classification_loss: 0.2268 360/500 [====================>.........] 
- ETA: 33s - loss: 1.5964 - regression_loss: 1.3698 - classification_loss: 0.2266 361/500 [====================>.........] - ETA: 32s - loss: 1.5961 - regression_loss: 1.3694 - classification_loss: 0.2267 362/500 [====================>.........] - ETA: 32s - loss: 1.5953 - regression_loss: 1.3689 - classification_loss: 0.2264 363/500 [====================>.........] - ETA: 32s - loss: 1.5943 - regression_loss: 1.3679 - classification_loss: 0.2264 364/500 [====================>.........] - ETA: 32s - loss: 1.5929 - regression_loss: 1.3667 - classification_loss: 0.2262 365/500 [====================>.........] - ETA: 31s - loss: 1.5926 - regression_loss: 1.3665 - classification_loss: 0.2261 366/500 [====================>.........] - ETA: 31s - loss: 1.5901 - regression_loss: 1.3644 - classification_loss: 0.2258 367/500 [=====================>........] - ETA: 31s - loss: 1.5906 - regression_loss: 1.3646 - classification_loss: 0.2260 368/500 [=====================>........] - ETA: 31s - loss: 1.5912 - regression_loss: 1.3653 - classification_loss: 0.2259 369/500 [=====================>........] - ETA: 30s - loss: 1.5912 - regression_loss: 1.3653 - classification_loss: 0.2259 370/500 [=====================>........] - ETA: 30s - loss: 1.5928 - regression_loss: 1.3664 - classification_loss: 0.2264 371/500 [=====================>........] - ETA: 30s - loss: 1.5934 - regression_loss: 1.3671 - classification_loss: 0.2263 372/500 [=====================>........] - ETA: 30s - loss: 1.5928 - regression_loss: 1.3667 - classification_loss: 0.2261 373/500 [=====================>........] - ETA: 29s - loss: 1.5922 - regression_loss: 1.3661 - classification_loss: 0.2262 374/500 [=====================>........] - ETA: 29s - loss: 1.5923 - regression_loss: 1.3662 - classification_loss: 0.2261 375/500 [=====================>........] - ETA: 29s - loss: 1.5913 - regression_loss: 1.3653 - classification_loss: 0.2260 376/500 [=====================>........] 
- ETA: 29s - loss: 1.5922 - regression_loss: 1.3661 - classification_loss: 0.2261 377/500 [=====================>........] - ETA: 29s - loss: 1.5915 - regression_loss: 1.3657 - classification_loss: 0.2258 378/500 [=====================>........] - ETA: 28s - loss: 1.5907 - regression_loss: 1.3649 - classification_loss: 0.2258 379/500 [=====================>........] - ETA: 28s - loss: 1.5908 - regression_loss: 1.3648 - classification_loss: 0.2260 380/500 [=====================>........] - ETA: 28s - loss: 1.5916 - regression_loss: 1.3654 - classification_loss: 0.2262 381/500 [=====================>........] - ETA: 28s - loss: 1.5889 - regression_loss: 1.3631 - classification_loss: 0.2258 382/500 [=====================>........] - ETA: 27s - loss: 1.5876 - regression_loss: 1.3617 - classification_loss: 0.2259 383/500 [=====================>........] - ETA: 27s - loss: 1.5888 - regression_loss: 1.3627 - classification_loss: 0.2261 384/500 [======================>.......] - ETA: 27s - loss: 1.5878 - regression_loss: 1.3619 - classification_loss: 0.2259 385/500 [======================>.......] - ETA: 27s - loss: 1.5870 - regression_loss: 1.3612 - classification_loss: 0.2257 386/500 [======================>.......] - ETA: 26s - loss: 1.5884 - regression_loss: 1.3626 - classification_loss: 0.2257 387/500 [======================>.......] - ETA: 26s - loss: 1.5884 - regression_loss: 1.3628 - classification_loss: 0.2256 388/500 [======================>.......] - ETA: 26s - loss: 1.5883 - regression_loss: 1.3627 - classification_loss: 0.2256 389/500 [======================>.......] - ETA: 26s - loss: 1.5905 - regression_loss: 1.3644 - classification_loss: 0.2261 390/500 [======================>.......] - ETA: 25s - loss: 1.5932 - regression_loss: 1.3666 - classification_loss: 0.2266 391/500 [======================>.......] - ETA: 25s - loss: 1.5928 - regression_loss: 1.3664 - classification_loss: 0.2264 392/500 [======================>.......] 
- ETA: 25s - loss: 1.5919 - regression_loss: 1.3657 - classification_loss: 0.2262 393/500 [======================>.......] - ETA: 25s - loss: 1.5898 - regression_loss: 1.3639 - classification_loss: 0.2259 394/500 [======================>.......] - ETA: 25s - loss: 1.5925 - regression_loss: 1.3666 - classification_loss: 0.2259 395/500 [======================>.......] - ETA: 24s - loss: 1.5911 - regression_loss: 1.3651 - classification_loss: 0.2260 396/500 [======================>.......] - ETA: 24s - loss: 1.5896 - regression_loss: 1.3639 - classification_loss: 0.2256 397/500 [======================>.......] - ETA: 24s - loss: 1.5886 - regression_loss: 1.3633 - classification_loss: 0.2253 398/500 [======================>.......] - ETA: 24s - loss: 1.5877 - regression_loss: 1.3626 - classification_loss: 0.2251 399/500 [======================>.......] - ETA: 23s - loss: 1.5862 - regression_loss: 1.3614 - classification_loss: 0.2248 400/500 [=======================>......] - ETA: 23s - loss: 1.5869 - regression_loss: 1.3618 - classification_loss: 0.2250 401/500 [=======================>......] - ETA: 23s - loss: 1.5856 - regression_loss: 1.3607 - classification_loss: 0.2249 402/500 [=======================>......] - ETA: 23s - loss: 1.5839 - regression_loss: 1.3593 - classification_loss: 0.2246 403/500 [=======================>......] - ETA: 22s - loss: 1.5832 - regression_loss: 1.3587 - classification_loss: 0.2245 404/500 [=======================>......] - ETA: 22s - loss: 1.5839 - regression_loss: 1.3594 - classification_loss: 0.2245 405/500 [=======================>......] - ETA: 22s - loss: 1.5855 - regression_loss: 1.3610 - classification_loss: 0.2245 406/500 [=======================>......] - ETA: 22s - loss: 1.5867 - regression_loss: 1.3622 - classification_loss: 0.2245 407/500 [=======================>......] - ETA: 21s - loss: 1.5862 - regression_loss: 1.3620 - classification_loss: 0.2242 408/500 [=======================>......] 
- ETA: 21s - loss: 1.5860 - regression_loss: 1.3619 - classification_loss: 0.2241 409/500 [=======================>......] - ETA: 21s - loss: 1.5847 - regression_loss: 1.3609 - classification_loss: 0.2238 410/500 [=======================>......] - ETA: 21s - loss: 1.5844 - regression_loss: 1.3607 - classification_loss: 0.2237 411/500 [=======================>......] - ETA: 20s - loss: 1.5851 - regression_loss: 1.3613 - classification_loss: 0.2238 412/500 [=======================>......] - ETA: 20s - loss: 1.5850 - regression_loss: 1.3612 - classification_loss: 0.2238 413/500 [=======================>......] - ETA: 20s - loss: 1.5850 - regression_loss: 1.3609 - classification_loss: 0.2241 414/500 [=======================>......] - ETA: 20s - loss: 1.5836 - regression_loss: 1.3597 - classification_loss: 0.2239 415/500 [=======================>......] - ETA: 20s - loss: 1.5841 - regression_loss: 1.3601 - classification_loss: 0.2240 416/500 [=======================>......] - ETA: 19s - loss: 1.5834 - regression_loss: 1.3597 - classification_loss: 0.2237 417/500 [========================>.....] - ETA: 19s - loss: 1.5840 - regression_loss: 1.3600 - classification_loss: 0.2240 418/500 [========================>.....] - ETA: 19s - loss: 1.5839 - regression_loss: 1.3600 - classification_loss: 0.2239 419/500 [========================>.....] - ETA: 19s - loss: 1.5823 - regression_loss: 1.3585 - classification_loss: 0.2238 420/500 [========================>.....] - ETA: 18s - loss: 1.5893 - regression_loss: 1.3631 - classification_loss: 0.2262 421/500 [========================>.....] - ETA: 18s - loss: 1.5900 - regression_loss: 1.3635 - classification_loss: 0.2265 422/500 [========================>.....] - ETA: 18s - loss: 1.5908 - regression_loss: 1.3640 - classification_loss: 0.2268 423/500 [========================>.....] - ETA: 18s - loss: 1.5911 - regression_loss: 1.3642 - classification_loss: 0.2269 424/500 [========================>.....] 
- ETA: 17s - loss: 1.5907 - regression_loss: 1.3639 - classification_loss: 0.2268 425/500 [========================>.....] - ETA: 17s - loss: 1.5905 - regression_loss: 1.3636 - classification_loss: 0.2269 426/500 [========================>.....] - ETA: 17s - loss: 1.5916 - regression_loss: 1.3646 - classification_loss: 0.2270 427/500 [========================>.....] - ETA: 17s - loss: 1.5936 - regression_loss: 1.3661 - classification_loss: 0.2275 428/500 [========================>.....] - ETA: 16s - loss: 1.5949 - regression_loss: 1.3672 - classification_loss: 0.2277 429/500 [========================>.....] - ETA: 16s - loss: 1.5989 - regression_loss: 1.3709 - classification_loss: 0.2280 430/500 [========================>.....] - ETA: 16s - loss: 1.5997 - regression_loss: 1.3715 - classification_loss: 0.2282 431/500 [========================>.....] - ETA: 16s - loss: 1.5999 - regression_loss: 1.3719 - classification_loss: 0.2281 432/500 [========================>.....] - ETA: 16s - loss: 1.5989 - regression_loss: 1.3710 - classification_loss: 0.2280 433/500 [========================>.....] - ETA: 15s - loss: 1.5987 - regression_loss: 1.3709 - classification_loss: 0.2278 434/500 [=========================>....] - ETA: 15s - loss: 1.5981 - regression_loss: 1.3705 - classification_loss: 0.2276 435/500 [=========================>....] - ETA: 15s - loss: 1.5966 - regression_loss: 1.3692 - classification_loss: 0.2273 436/500 [=========================>....] - ETA: 15s - loss: 1.5968 - regression_loss: 1.3695 - classification_loss: 0.2273 437/500 [=========================>....] - ETA: 14s - loss: 1.5964 - regression_loss: 1.3692 - classification_loss: 0.2272 438/500 [=========================>....] - ETA: 14s - loss: 1.5967 - regression_loss: 1.3694 - classification_loss: 0.2273 439/500 [=========================>....] - ETA: 14s - loss: 1.5952 - regression_loss: 1.3682 - classification_loss: 0.2270 440/500 [=========================>....] 
- ETA: 14s - loss: 1.5955 - regression_loss: 1.3683 - classification_loss: 0.2272 441/500 [=========================>....] - ETA: 13s - loss: 1.5949 - regression_loss: 1.3679 - classification_loss: 0.2271 442/500 [=========================>....] - ETA: 13s - loss: 1.5940 - regression_loss: 1.3670 - classification_loss: 0.2270 443/500 [=========================>....] - ETA: 13s - loss: 1.5931 - regression_loss: 1.3663 - classification_loss: 0.2268 444/500 [=========================>....] - ETA: 13s - loss: 1.5930 - regression_loss: 1.3663 - classification_loss: 0.2267 445/500 [=========================>....] - ETA: 12s - loss: 1.5932 - regression_loss: 1.3663 - classification_loss: 0.2269 446/500 [=========================>....] - ETA: 12s - loss: 1.5927 - regression_loss: 1.3660 - classification_loss: 0.2267 447/500 [=========================>....] - ETA: 12s - loss: 1.5920 - regression_loss: 1.3652 - classification_loss: 0.2268 448/500 [=========================>....] - ETA: 12s - loss: 1.5912 - regression_loss: 1.3646 - classification_loss: 0.2266 449/500 [=========================>....] - ETA: 12s - loss: 1.5904 - regression_loss: 1.3641 - classification_loss: 0.2264 450/500 [==========================>...] - ETA: 11s - loss: 1.5912 - regression_loss: 1.3648 - classification_loss: 0.2265 451/500 [==========================>...] - ETA: 11s - loss: 1.5901 - regression_loss: 1.3638 - classification_loss: 0.2263 452/500 [==========================>...] - ETA: 11s - loss: 1.5903 - regression_loss: 1.3642 - classification_loss: 0.2261 453/500 [==========================>...] - ETA: 11s - loss: 1.5902 - regression_loss: 1.3641 - classification_loss: 0.2261 454/500 [==========================>...] - ETA: 10s - loss: 1.5892 - regression_loss: 1.3632 - classification_loss: 0.2260 455/500 [==========================>...] - ETA: 10s - loss: 1.5885 - regression_loss: 1.3625 - classification_loss: 0.2259 456/500 [==========================>...] 
- ETA: 10s - loss: 1.5876 - regression_loss: 1.3620 - classification_loss: 0.2257 457/500 [==========================>...] - ETA: 10s - loss: 1.5869 - regression_loss: 1.3614 - classification_loss: 0.2255 458/500 [==========================>...] - ETA: 9s - loss: 1.5867 - regression_loss: 1.3614 - classification_loss: 0.2253  459/500 [==========================>...] - ETA: 9s - loss: 1.5913 - regression_loss: 1.3645 - classification_loss: 0.2268 460/500 [==========================>...] - ETA: 9s - loss: 1.5910 - regression_loss: 1.3644 - classification_loss: 0.2266 461/500 [==========================>...] - ETA: 9s - loss: 1.5923 - regression_loss: 1.3653 - classification_loss: 0.2270 462/500 [==========================>...] - ETA: 8s - loss: 1.5938 - regression_loss: 1.3667 - classification_loss: 0.2271 463/500 [==========================>...] - ETA: 8s - loss: 1.5942 - regression_loss: 1.3671 - classification_loss: 0.2270 464/500 [==========================>...] - ETA: 8s - loss: 1.5944 - regression_loss: 1.3675 - classification_loss: 0.2269 465/500 [==========================>...] - ETA: 8s - loss: 1.5942 - regression_loss: 1.3674 - classification_loss: 0.2268 466/500 [==========================>...] - ETA: 8s - loss: 1.5948 - regression_loss: 1.3677 - classification_loss: 0.2270 467/500 [===========================>..] - ETA: 7s - loss: 1.5928 - regression_loss: 1.3661 - classification_loss: 0.2267 468/500 [===========================>..] - ETA: 7s - loss: 1.5918 - regression_loss: 1.3653 - classification_loss: 0.2265 469/500 [===========================>..] - ETA: 7s - loss: 1.5929 - regression_loss: 1.3660 - classification_loss: 0.2268 470/500 [===========================>..] - ETA: 7s - loss: 1.5915 - regression_loss: 1.3650 - classification_loss: 0.2265 471/500 [===========================>..] - ETA: 6s - loss: 1.5921 - regression_loss: 1.3656 - classification_loss: 0.2265 472/500 [===========================>..] 
- ETA: 6s - loss: 1.5916 - regression_loss: 1.3652 - classification_loss: 0.2264 473/500 [===========================>..] - ETA: 6s - loss: 1.5921 - regression_loss: 1.3654 - classification_loss: 0.2267 474/500 [===========================>..] - ETA: 6s - loss: 1.5904 - regression_loss: 1.3640 - classification_loss: 0.2264 475/500 [===========================>..] - ETA: 5s - loss: 1.5898 - regression_loss: 1.3634 - classification_loss: 0.2264 476/500 [===========================>..] - ETA: 5s - loss: 1.5884 - regression_loss: 1.3623 - classification_loss: 0.2261 477/500 [===========================>..] - ETA: 5s - loss: 1.5888 - regression_loss: 1.3626 - classification_loss: 0.2262 478/500 [===========================>..] - ETA: 5s - loss: 1.5901 - regression_loss: 1.3640 - classification_loss: 0.2261 479/500 [===========================>..] - ETA: 4s - loss: 1.5892 - regression_loss: 1.3631 - classification_loss: 0.2261 480/500 [===========================>..] - ETA: 4s - loss: 1.5912 - regression_loss: 1.3645 - classification_loss: 0.2267 481/500 [===========================>..] - ETA: 4s - loss: 1.5910 - regression_loss: 1.3644 - classification_loss: 0.2267 482/500 [===========================>..] - ETA: 4s - loss: 1.5917 - regression_loss: 1.3649 - classification_loss: 0.2268 483/500 [===========================>..] - ETA: 4s - loss: 1.5909 - regression_loss: 1.3643 - classification_loss: 0.2266 484/500 [============================>.] - ETA: 3s - loss: 1.5897 - regression_loss: 1.3632 - classification_loss: 0.2265 485/500 [============================>.] - ETA: 3s - loss: 1.5903 - regression_loss: 1.3638 - classification_loss: 0.2265 486/500 [============================>.] - ETA: 3s - loss: 1.5899 - regression_loss: 1.3635 - classification_loss: 0.2263 487/500 [============================>.] - ETA: 3s - loss: 1.5900 - regression_loss: 1.3636 - classification_loss: 0.2264 488/500 [============================>.] 
- ETA: 2s - loss: 1.5904 - regression_loss: 1.3641 - classification_loss: 0.2262 489/500 [============================>.] - ETA: 2s - loss: 1.5910 - regression_loss: 1.3647 - classification_loss: 0.2262 490/500 [============================>.] - ETA: 2s - loss: 1.5906 - regression_loss: 1.3645 - classification_loss: 0.2261 491/500 [============================>.] - ETA: 2s - loss: 1.5906 - regression_loss: 1.3646 - classification_loss: 0.2259 492/500 [============================>.] - ETA: 1s - loss: 1.5902 - regression_loss: 1.3643 - classification_loss: 0.2259 493/500 [============================>.] - ETA: 1s - loss: 1.5888 - regression_loss: 1.3632 - classification_loss: 0.2256 494/500 [============================>.] - ETA: 1s - loss: 1.5873 - regression_loss: 1.3619 - classification_loss: 0.2254 495/500 [============================>.] - ETA: 1s - loss: 1.5865 - regression_loss: 1.3613 - classification_loss: 0.2252 496/500 [============================>.] - ETA: 0s - loss: 1.5850 - regression_loss: 1.3600 - classification_loss: 0.2250 497/500 [============================>.] - ETA: 0s - loss: 1.5851 - regression_loss: 1.3600 - classification_loss: 0.2252 498/500 [============================>.] - ETA: 0s - loss: 1.5846 - regression_loss: 1.3594 - classification_loss: 0.2252 499/500 [============================>.] - ETA: 0s - loss: 1.5848 - regression_loss: 1.3597 - classification_loss: 0.2251 500/500 [==============================] - 118s 236ms/step - loss: 1.5832 - regression_loss: 1.3584 - classification_loss: 0.2248 326 instances of class plum with average precision: 0.8005 mAP: 0.8005 Epoch 00009: saving model to ./training/snapshots/resnet50_pascal_09.h5 Epoch 10/150 1/500 [..............................] - ETA: 1:45 - loss: 1.5584 - regression_loss: 1.3620 - classification_loss: 0.1965 2/500 [..............................] - ETA: 1:46 - loss: 1.6225 - regression_loss: 1.4729 - classification_loss: 0.1497 3/500 [..............................] 
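Every per-batch update in this log follows the same fixed pattern, with the total `loss` equal to `regression_loss + classification_loss` (e.g. 1.3584 + 0.2248 = 1.5832 at the end of epoch 9). A minimal sketch of pulling those three values out of a log line with Python's `re` module; `parse_losses` is a hypothetical helper, not part of keras-retinanet:

```python
import re

# Each progress update has the form:
#   "<step>/500 [...] - ETA: ... - loss: X - regression_loss: Y - classification_loss: Z"
# The metric names "loss", "regression_loss", "classification_loss" are taken
# verbatim from the log above.
PATTERN = re.compile(
    r"loss: (?P<loss>\d+\.\d+) - "
    r"regression_loss: (?P<regression>\d+\.\d+) - "
    r"classification_loss: (?P<classification>\d+\.\d+)"
)

def parse_losses(line):
    """Return (loss, regression_loss, classification_loss) for one update, or None."""
    m = PATTERN.search(line)
    if m is None:
        return None
    return tuple(float(m.group(g)) for g in ("loss", "regression", "classification"))

sample = ("500/500 [==============================] - 118s 236ms/step - "
          "loss: 1.5832 - regression_loss: 1.3584 - classification_loss: 0.2248")
print(parse_losses(sample))  # (1.5832, 1.3584, 0.2248)
```

Applied over the whole file, a helper like this recovers per-epoch loss curves from the raw console capture.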
114/500 [=====>........................] - ETA: 1:30 - loss: 1.5986 - regression_loss: 1.3791 - classification_loss: 0.2194 115/500 [=====>........................] 
- ETA: 1:30 - loss: 1.5935 - regression_loss: 1.3751 - classification_loss: 0.2184 116/500 [=====>........................] - ETA: 1:30 - loss: 1.5889 - regression_loss: 1.3714 - classification_loss: 0.2175 117/500 [======>.......................] - ETA: 1:30 - loss: 1.5972 - regression_loss: 1.3786 - classification_loss: 0.2186 118/500 [======>.......................] - ETA: 1:29 - loss: 1.6004 - regression_loss: 1.3813 - classification_loss: 0.2191 119/500 [======>.......................] - ETA: 1:29 - loss: 1.5956 - regression_loss: 1.3773 - classification_loss: 0.2183 120/500 [======>.......................] - ETA: 1:29 - loss: 1.5974 - regression_loss: 1.3790 - classification_loss: 0.2184 121/500 [======>.......................] - ETA: 1:29 - loss: 1.6002 - regression_loss: 1.3811 - classification_loss: 0.2191 122/500 [======>.......................] - ETA: 1:29 - loss: 1.5918 - regression_loss: 1.3737 - classification_loss: 0.2181 123/500 [======>.......................] - ETA: 1:28 - loss: 1.5849 - regression_loss: 1.3680 - classification_loss: 0.2169 124/500 [======>.......................] - ETA: 1:28 - loss: 1.5834 - regression_loss: 1.3667 - classification_loss: 0.2167 125/500 [======>.......................] - ETA: 1:28 - loss: 1.5853 - regression_loss: 1.3684 - classification_loss: 0.2169 126/500 [======>.......................] - ETA: 1:28 - loss: 1.5860 - regression_loss: 1.3686 - classification_loss: 0.2174 127/500 [======>.......................] - ETA: 1:27 - loss: 1.5854 - regression_loss: 1.3681 - classification_loss: 0.2173 128/500 [======>.......................] - ETA: 1:27 - loss: 1.5826 - regression_loss: 1.3659 - classification_loss: 0.2167 129/500 [======>.......................] - ETA: 1:27 - loss: 1.5821 - regression_loss: 1.3658 - classification_loss: 0.2163 130/500 [======>.......................] - ETA: 1:27 - loss: 1.5798 - regression_loss: 1.3641 - classification_loss: 0.2157 131/500 [======>.......................] 
- ETA: 1:26 - loss: 1.5821 - regression_loss: 1.3658 - classification_loss: 0.2163 132/500 [======>.......................] - ETA: 1:26 - loss: 1.5802 - regression_loss: 1.3647 - classification_loss: 0.2154 133/500 [======>.......................] - ETA: 1:26 - loss: 1.5798 - regression_loss: 1.3645 - classification_loss: 0.2153 134/500 [=======>......................] - ETA: 1:26 - loss: 1.5762 - regression_loss: 1.3615 - classification_loss: 0.2147 135/500 [=======>......................] - ETA: 1:25 - loss: 1.5743 - regression_loss: 1.3598 - classification_loss: 0.2145 136/500 [=======>......................] - ETA: 1:25 - loss: 1.5653 - regression_loss: 1.3498 - classification_loss: 0.2155 137/500 [=======>......................] - ETA: 1:25 - loss: 1.5668 - regression_loss: 1.3510 - classification_loss: 0.2157 138/500 [=======>......................] - ETA: 1:25 - loss: 1.5644 - regression_loss: 1.3491 - classification_loss: 0.2153 139/500 [=======>......................] - ETA: 1:25 - loss: 1.5575 - regression_loss: 1.3432 - classification_loss: 0.2143 140/500 [=======>......................] - ETA: 1:24 - loss: 1.5547 - regression_loss: 1.3412 - classification_loss: 0.2135 141/500 [=======>......................] - ETA: 1:24 - loss: 1.5522 - regression_loss: 1.3390 - classification_loss: 0.2132 142/500 [=======>......................] - ETA: 1:24 - loss: 1.5525 - regression_loss: 1.3393 - classification_loss: 0.2133 143/500 [=======>......................] - ETA: 1:24 - loss: 1.5516 - regression_loss: 1.3385 - classification_loss: 0.2131 144/500 [=======>......................] - ETA: 1:23 - loss: 1.5500 - regression_loss: 1.3371 - classification_loss: 0.2129 145/500 [=======>......................] - ETA: 1:23 - loss: 1.5503 - regression_loss: 1.3367 - classification_loss: 0.2136 146/500 [=======>......................] - ETA: 1:23 - loss: 1.5481 - regression_loss: 1.3353 - classification_loss: 0.2129 147/500 [=======>......................] 
- ETA: 1:23 - loss: 1.5432 - regression_loss: 1.3309 - classification_loss: 0.2123 148/500 [=======>......................] - ETA: 1:22 - loss: 1.5432 - regression_loss: 1.3307 - classification_loss: 0.2125 149/500 [=======>......................] - ETA: 1:22 - loss: 1.5441 - regression_loss: 1.3313 - classification_loss: 0.2128 150/500 [========>.....................] - ETA: 1:22 - loss: 1.5415 - regression_loss: 1.3293 - classification_loss: 0.2122 151/500 [========>.....................] - ETA: 1:22 - loss: 1.5451 - regression_loss: 1.3326 - classification_loss: 0.2125 152/500 [========>.....................] - ETA: 1:21 - loss: 1.5425 - regression_loss: 1.3302 - classification_loss: 0.2123 153/500 [========>.....................] - ETA: 1:21 - loss: 1.5456 - regression_loss: 1.3327 - classification_loss: 0.2129 154/500 [========>.....................] - ETA: 1:21 - loss: 1.5464 - regression_loss: 1.3333 - classification_loss: 0.2131 155/500 [========>.....................] - ETA: 1:21 - loss: 1.5415 - regression_loss: 1.3293 - classification_loss: 0.2122 156/500 [========>.....................] - ETA: 1:20 - loss: 1.5431 - regression_loss: 1.3308 - classification_loss: 0.2123 157/500 [========>.....................] - ETA: 1:20 - loss: 1.5445 - regression_loss: 1.3323 - classification_loss: 0.2121 158/500 [========>.....................] - ETA: 1:20 - loss: 1.5463 - regression_loss: 1.3341 - classification_loss: 0.2122 159/500 [========>.....................] - ETA: 1:20 - loss: 1.5470 - regression_loss: 1.3346 - classification_loss: 0.2124 160/500 [========>.....................] - ETA: 1:20 - loss: 1.5466 - regression_loss: 1.3344 - classification_loss: 0.2122 161/500 [========>.....................] - ETA: 1:19 - loss: 1.5477 - regression_loss: 1.3352 - classification_loss: 0.2125 162/500 [========>.....................] - ETA: 1:19 - loss: 1.5461 - regression_loss: 1.3338 - classification_loss: 0.2123 163/500 [========>.....................] 
- ETA: 1:19 - loss: 1.5420 - regression_loss: 1.3303 - classification_loss: 0.2117 164/500 [========>.....................] - ETA: 1:19 - loss: 1.5404 - regression_loss: 1.3293 - classification_loss: 0.2111 165/500 [========>.....................] - ETA: 1:18 - loss: 1.5387 - regression_loss: 1.3278 - classification_loss: 0.2109 166/500 [========>.....................] - ETA: 1:18 - loss: 1.5432 - regression_loss: 1.3313 - classification_loss: 0.2119 167/500 [=========>....................] - ETA: 1:18 - loss: 1.5433 - regression_loss: 1.3313 - classification_loss: 0.2120 168/500 [=========>....................] - ETA: 1:18 - loss: 1.5420 - regression_loss: 1.3301 - classification_loss: 0.2118 169/500 [=========>....................] - ETA: 1:17 - loss: 1.5394 - regression_loss: 1.3275 - classification_loss: 0.2119 170/500 [=========>....................] - ETA: 1:17 - loss: 1.5431 - regression_loss: 1.3313 - classification_loss: 0.2118 171/500 [=========>....................] - ETA: 1:17 - loss: 1.5445 - regression_loss: 1.3325 - classification_loss: 0.2121 172/500 [=========>....................] - ETA: 1:17 - loss: 1.5409 - regression_loss: 1.3292 - classification_loss: 0.2117 173/500 [=========>....................] - ETA: 1:17 - loss: 1.5458 - regression_loss: 1.3328 - classification_loss: 0.2131 174/500 [=========>....................] - ETA: 1:16 - loss: 1.5471 - regression_loss: 1.3331 - classification_loss: 0.2140 175/500 [=========>....................] - ETA: 1:16 - loss: 1.5473 - regression_loss: 1.3336 - classification_loss: 0.2137 176/500 [=========>....................] - ETA: 1:16 - loss: 1.5425 - regression_loss: 1.3294 - classification_loss: 0.2130 177/500 [=========>....................] - ETA: 1:16 - loss: 1.5401 - regression_loss: 1.3277 - classification_loss: 0.2125 178/500 [=========>....................] - ETA: 1:15 - loss: 1.5390 - regression_loss: 1.3266 - classification_loss: 0.2124 179/500 [=========>....................] 
- ETA: 1:15 - loss: 1.5408 - regression_loss: 1.3279 - classification_loss: 0.2129 180/500 [=========>....................] - ETA: 1:15 - loss: 1.5364 - regression_loss: 1.3239 - classification_loss: 0.2125 181/500 [=========>....................] - ETA: 1:15 - loss: 1.5356 - regression_loss: 1.3233 - classification_loss: 0.2123 182/500 [=========>....................] - ETA: 1:14 - loss: 1.5308 - regression_loss: 1.3192 - classification_loss: 0.2116 183/500 [=========>....................] - ETA: 1:14 - loss: 1.5296 - regression_loss: 1.3181 - classification_loss: 0.2115 184/500 [==========>...................] - ETA: 1:14 - loss: 1.5299 - regression_loss: 1.3177 - classification_loss: 0.2122 185/500 [==========>...................] - ETA: 1:14 - loss: 1.5274 - regression_loss: 1.3157 - classification_loss: 0.2118 186/500 [==========>...................] - ETA: 1:13 - loss: 1.5295 - regression_loss: 1.3175 - classification_loss: 0.2120 187/500 [==========>...................] - ETA: 1:13 - loss: 1.5277 - regression_loss: 1.3163 - classification_loss: 0.2115 188/500 [==========>...................] - ETA: 1:13 - loss: 1.5239 - regression_loss: 1.3131 - classification_loss: 0.2107 189/500 [==========>...................] - ETA: 1:13 - loss: 1.5247 - regression_loss: 1.3144 - classification_loss: 0.2103 190/500 [==========>...................] - ETA: 1:12 - loss: 1.5225 - regression_loss: 1.3125 - classification_loss: 0.2099 191/500 [==========>...................] - ETA: 1:12 - loss: 1.5217 - regression_loss: 1.3118 - classification_loss: 0.2099 192/500 [==========>...................] - ETA: 1:12 - loss: 1.5212 - regression_loss: 1.3114 - classification_loss: 0.2098 193/500 [==========>...................] - ETA: 1:12 - loss: 1.5298 - regression_loss: 1.3180 - classification_loss: 0.2117 194/500 [==========>...................] - ETA: 1:12 - loss: 1.5298 - regression_loss: 1.3178 - classification_loss: 0.2119 195/500 [==========>...................] 
- ETA: 1:11 - loss: 1.5333 - regression_loss: 1.3200 - classification_loss: 0.2133 196/500 [==========>...................] - ETA: 1:11 - loss: 1.5313 - regression_loss: 1.3184 - classification_loss: 0.2129 197/500 [==========>...................] - ETA: 1:11 - loss: 1.5303 - regression_loss: 1.3177 - classification_loss: 0.2125 198/500 [==========>...................] - ETA: 1:11 - loss: 1.5334 - regression_loss: 1.3201 - classification_loss: 0.2133 199/500 [==========>...................] - ETA: 1:10 - loss: 1.5344 - regression_loss: 1.3214 - classification_loss: 0.2131 200/500 [===========>..................] - ETA: 1:10 - loss: 1.5363 - regression_loss: 1.3231 - classification_loss: 0.2132 201/500 [===========>..................] - ETA: 1:10 - loss: 1.5358 - regression_loss: 1.3229 - classification_loss: 0.2129 202/500 [===========>..................] - ETA: 1:10 - loss: 1.5352 - regression_loss: 1.3221 - classification_loss: 0.2131 203/500 [===========>..................] - ETA: 1:09 - loss: 1.5344 - regression_loss: 1.3216 - classification_loss: 0.2128 204/500 [===========>..................] - ETA: 1:09 - loss: 1.5381 - regression_loss: 1.3256 - classification_loss: 0.2125 205/500 [===========>..................] - ETA: 1:09 - loss: 1.5374 - regression_loss: 1.3252 - classification_loss: 0.2123 206/500 [===========>..................] - ETA: 1:09 - loss: 1.5362 - regression_loss: 1.3243 - classification_loss: 0.2119 207/500 [===========>..................] - ETA: 1:08 - loss: 1.5363 - regression_loss: 1.3243 - classification_loss: 0.2119 208/500 [===========>..................] - ETA: 1:08 - loss: 1.5361 - regression_loss: 1.3242 - classification_loss: 0.2118 209/500 [===========>..................] - ETA: 1:08 - loss: 1.5339 - regression_loss: 1.3223 - classification_loss: 0.2116 210/500 [===========>..................] - ETA: 1:08 - loss: 1.5369 - regression_loss: 1.3250 - classification_loss: 0.2119 211/500 [===========>..................] 
- ETA: 1:08 - loss: 1.5361 - regression_loss: 1.3243 - classification_loss: 0.2118 212/500 [===========>..................] - ETA: 1:07 - loss: 1.5380 - regression_loss: 1.3258 - classification_loss: 0.2122 213/500 [===========>..................] - ETA: 1:07 - loss: 1.5426 - regression_loss: 1.3291 - classification_loss: 0.2135 214/500 [===========>..................] - ETA: 1:07 - loss: 1.5407 - regression_loss: 1.3278 - classification_loss: 0.2129 215/500 [===========>..................] - ETA: 1:07 - loss: 1.5456 - regression_loss: 1.3313 - classification_loss: 0.2143 216/500 [===========>..................] - ETA: 1:06 - loss: 1.5453 - regression_loss: 1.3310 - classification_loss: 0.2143 217/500 [============>.................] - ETA: 1:06 - loss: 1.5484 - regression_loss: 1.3340 - classification_loss: 0.2144 218/500 [============>.................] - ETA: 1:06 - loss: 1.5496 - regression_loss: 1.3349 - classification_loss: 0.2147 219/500 [============>.................] - ETA: 1:06 - loss: 1.5516 - regression_loss: 1.3367 - classification_loss: 0.2149 220/500 [============>.................] - ETA: 1:05 - loss: 1.5487 - regression_loss: 1.3343 - classification_loss: 0.2144 221/500 [============>.................] - ETA: 1:05 - loss: 1.5475 - regression_loss: 1.3329 - classification_loss: 0.2147 222/500 [============>.................] - ETA: 1:05 - loss: 1.5466 - regression_loss: 1.3321 - classification_loss: 0.2146 223/500 [============>.................] - ETA: 1:05 - loss: 1.5466 - regression_loss: 1.3320 - classification_loss: 0.2146 224/500 [============>.................] - ETA: 1:04 - loss: 1.5476 - regression_loss: 1.3327 - classification_loss: 0.2149 225/500 [============>.................] - ETA: 1:04 - loss: 1.5474 - regression_loss: 1.3328 - classification_loss: 0.2146 226/500 [============>.................] - ETA: 1:04 - loss: 1.5497 - regression_loss: 1.3347 - classification_loss: 0.2150 227/500 [============>.................] 
- ETA: 1:04 - loss: 1.5520 - regression_loss: 1.3364 - classification_loss: 0.2156 228/500 [============>.................] - ETA: 1:04 - loss: 1.5532 - regression_loss: 1.3374 - classification_loss: 0.2158 229/500 [============>.................] - ETA: 1:03 - loss: 1.5511 - regression_loss: 1.3356 - classification_loss: 0.2155 230/500 [============>.................] - ETA: 1:03 - loss: 1.5477 - regression_loss: 1.3327 - classification_loss: 0.2150 231/500 [============>.................] - ETA: 1:03 - loss: 1.5498 - regression_loss: 1.3342 - classification_loss: 0.2157 232/500 [============>.................] - ETA: 1:03 - loss: 1.5457 - regression_loss: 1.3305 - classification_loss: 0.2152 233/500 [============>.................] - ETA: 1:02 - loss: 1.5501 - regression_loss: 1.3341 - classification_loss: 0.2159 234/500 [=============>................] - ETA: 1:02 - loss: 1.5480 - regression_loss: 1.3322 - classification_loss: 0.2158 235/500 [=============>................] - ETA: 1:02 - loss: 1.5456 - regression_loss: 1.3303 - classification_loss: 0.2153 236/500 [=============>................] - ETA: 1:02 - loss: 1.5437 - regression_loss: 1.3288 - classification_loss: 0.2149 237/500 [=============>................] - ETA: 1:01 - loss: 1.5420 - regression_loss: 1.3275 - classification_loss: 0.2145 238/500 [=============>................] - ETA: 1:01 - loss: 1.5424 - regression_loss: 1.3279 - classification_loss: 0.2145 239/500 [=============>................] - ETA: 1:01 - loss: 1.5420 - regression_loss: 1.3278 - classification_loss: 0.2143 240/500 [=============>................] - ETA: 1:01 - loss: 1.5421 - regression_loss: 1.3280 - classification_loss: 0.2142 241/500 [=============>................] - ETA: 1:00 - loss: 1.5415 - regression_loss: 1.3275 - classification_loss: 0.2140 242/500 [=============>................] - ETA: 1:00 - loss: 1.5437 - regression_loss: 1.3290 - classification_loss: 0.2147 243/500 [=============>................] 
- ETA: 1:00 - loss: 1.5467 - regression_loss: 1.3312 - classification_loss: 0.2155 244/500 [=============>................] - ETA: 1:00 - loss: 1.5452 - regression_loss: 1.3300 - classification_loss: 0.2152 245/500 [=============>................] - ETA: 1:00 - loss: 1.5427 - regression_loss: 1.3281 - classification_loss: 0.2146 246/500 [=============>................] - ETA: 59s - loss: 1.5439 - regression_loss: 1.3294 - classification_loss: 0.2145  247/500 [=============>................] - ETA: 59s - loss: 1.5433 - regression_loss: 1.3291 - classification_loss: 0.2142 248/500 [=============>................] - ETA: 59s - loss: 1.5461 - regression_loss: 1.3314 - classification_loss: 0.2147 249/500 [=============>................] - ETA: 59s - loss: 1.5475 - regression_loss: 1.3325 - classification_loss: 0.2150 250/500 [==============>...............] - ETA: 58s - loss: 1.5477 - regression_loss: 1.3327 - classification_loss: 0.2150 251/500 [==============>...............] - ETA: 58s - loss: 1.5452 - regression_loss: 1.3306 - classification_loss: 0.2147 252/500 [==============>...............] - ETA: 58s - loss: 1.5431 - regression_loss: 1.3290 - classification_loss: 0.2142 253/500 [==============>...............] - ETA: 58s - loss: 1.5415 - regression_loss: 1.3278 - classification_loss: 0.2137 254/500 [==============>...............] - ETA: 57s - loss: 1.5378 - regression_loss: 1.3245 - classification_loss: 0.2133 255/500 [==============>...............] - ETA: 57s - loss: 1.5346 - regression_loss: 1.3213 - classification_loss: 0.2133 256/500 [==============>...............] - ETA: 57s - loss: 1.5376 - regression_loss: 1.3237 - classification_loss: 0.2139 257/500 [==============>...............] - ETA: 57s - loss: 1.5364 - regression_loss: 1.3228 - classification_loss: 0.2136 258/500 [==============>...............] - ETA: 56s - loss: 1.5370 - regression_loss: 1.3232 - classification_loss: 0.2138 259/500 [==============>...............] 
- ETA: 56s - loss: 1.5335 - regression_loss: 1.3202 - classification_loss: 0.2133 260/500 [==============>...............] - ETA: 56s - loss: 1.5332 - regression_loss: 1.3198 - classification_loss: 0.2134 261/500 [==============>...............] - ETA: 56s - loss: 1.5341 - regression_loss: 1.3204 - classification_loss: 0.2137 262/500 [==============>...............] - ETA: 55s - loss: 1.5338 - regression_loss: 1.3202 - classification_loss: 0.2136 263/500 [==============>...............] - ETA: 55s - loss: 1.5309 - regression_loss: 1.3177 - classification_loss: 0.2132 264/500 [==============>...............] - ETA: 55s - loss: 1.5307 - regression_loss: 1.3178 - classification_loss: 0.2130 265/500 [==============>...............] - ETA: 55s - loss: 1.5327 - regression_loss: 1.3191 - classification_loss: 0.2135 266/500 [==============>...............] - ETA: 55s - loss: 1.5338 - regression_loss: 1.3201 - classification_loss: 0.2137 267/500 [===============>..............] - ETA: 54s - loss: 1.5323 - regression_loss: 1.3188 - classification_loss: 0.2134 268/500 [===============>..............] - ETA: 54s - loss: 1.5325 - regression_loss: 1.3193 - classification_loss: 0.2132 269/500 [===============>..............] - ETA: 54s - loss: 1.5344 - regression_loss: 1.3207 - classification_loss: 0.2137 270/500 [===============>..............] - ETA: 54s - loss: 1.5345 - regression_loss: 1.3211 - classification_loss: 0.2134 271/500 [===============>..............] - ETA: 53s - loss: 1.5396 - regression_loss: 1.3247 - classification_loss: 0.2149 272/500 [===============>..............] - ETA: 53s - loss: 1.5395 - regression_loss: 1.3246 - classification_loss: 0.2149 273/500 [===============>..............] - ETA: 53s - loss: 1.5355 - regression_loss: 1.3212 - classification_loss: 0.2143 274/500 [===============>..............] - ETA: 53s - loss: 1.5342 - regression_loss: 1.3203 - classification_loss: 0.2139 275/500 [===============>..............] 
- ETA: 52s - loss: 1.5324 - regression_loss: 1.3188 - classification_loss: 0.2135 276/500 [===============>..............] - ETA: 52s - loss: 1.5318 - regression_loss: 1.3184 - classification_loss: 0.2134 277/500 [===============>..............] - ETA: 52s - loss: 1.5355 - regression_loss: 1.3216 - classification_loss: 0.2139 278/500 [===============>..............] - ETA: 52s - loss: 1.5383 - regression_loss: 1.3243 - classification_loss: 0.2140 279/500 [===============>..............] - ETA: 51s - loss: 1.5352 - regression_loss: 1.3218 - classification_loss: 0.2135 280/500 [===============>..............] - ETA: 51s - loss: 1.5336 - regression_loss: 1.3203 - classification_loss: 0.2132 281/500 [===============>..............] - ETA: 51s - loss: 1.5327 - regression_loss: 1.3197 - classification_loss: 0.2130 282/500 [===============>..............] - ETA: 51s - loss: 1.5318 - regression_loss: 1.3190 - classification_loss: 0.2128 283/500 [===============>..............] - ETA: 50s - loss: 1.5349 - regression_loss: 1.3220 - classification_loss: 0.2129 284/500 [================>.............] - ETA: 50s - loss: 1.5364 - regression_loss: 1.3226 - classification_loss: 0.2137 285/500 [================>.............] - ETA: 50s - loss: 1.5359 - regression_loss: 1.3223 - classification_loss: 0.2136 286/500 [================>.............] - ETA: 50s - loss: 1.5356 - regression_loss: 1.3220 - classification_loss: 0.2136 287/500 [================>.............] - ETA: 50s - loss: 1.5362 - regression_loss: 1.3224 - classification_loss: 0.2138 288/500 [================>.............] - ETA: 49s - loss: 1.5365 - regression_loss: 1.3228 - classification_loss: 0.2137 289/500 [================>.............] - ETA: 49s - loss: 1.5376 - regression_loss: 1.3239 - classification_loss: 0.2137 290/500 [================>.............] - ETA: 49s - loss: 1.5367 - regression_loss: 1.3232 - classification_loss: 0.2135 291/500 [================>.............] 
- ETA: 49s - loss: 1.5386 - regression_loss: 1.3249 - classification_loss: 0.2137 292/500 [================>.............] - ETA: 48s - loss: 1.5405 - regression_loss: 1.3268 - classification_loss: 0.2137 293/500 [================>.............] - ETA: 48s - loss: 1.5404 - regression_loss: 1.3267 - classification_loss: 0.2137 294/500 [================>.............] - ETA: 48s - loss: 1.5410 - regression_loss: 1.3273 - classification_loss: 0.2138 295/500 [================>.............] - ETA: 48s - loss: 1.5402 - regression_loss: 1.3264 - classification_loss: 0.2137 296/500 [================>.............] - ETA: 47s - loss: 1.5428 - regression_loss: 1.3287 - classification_loss: 0.2141 297/500 [================>.............] - ETA: 47s - loss: 1.5430 - regression_loss: 1.3289 - classification_loss: 0.2141 298/500 [================>.............] - ETA: 47s - loss: 1.5406 - regression_loss: 1.3268 - classification_loss: 0.2138 299/500 [================>.............] - ETA: 47s - loss: 1.5450 - regression_loss: 1.3302 - classification_loss: 0.2148 300/500 [=================>............] - ETA: 46s - loss: 1.5429 - regression_loss: 1.3284 - classification_loss: 0.2144 301/500 [=================>............] - ETA: 46s - loss: 1.5457 - regression_loss: 1.3309 - classification_loss: 0.2147 302/500 [=================>............] - ETA: 46s - loss: 1.5477 - regression_loss: 1.3328 - classification_loss: 0.2149 303/500 [=================>............] - ETA: 46s - loss: 1.5470 - regression_loss: 1.3322 - classification_loss: 0.2148 304/500 [=================>............] - ETA: 46s - loss: 1.5472 - regression_loss: 1.3325 - classification_loss: 0.2147 305/500 [=================>............] - ETA: 45s - loss: 1.5450 - regression_loss: 1.3307 - classification_loss: 0.2143 306/500 [=================>............] - ETA: 45s - loss: 1.5486 - regression_loss: 1.3339 - classification_loss: 0.2147 307/500 [=================>............] 
- ETA: 45s - loss: 1.5496 - regression_loss: 1.3346 - classification_loss: 0.2150 308/500 [=================>............] - ETA: 45s - loss: 1.5475 - regression_loss: 1.3329 - classification_loss: 0.2147 309/500 [=================>............] - ETA: 44s - loss: 1.5458 - regression_loss: 1.3313 - classification_loss: 0.2145 310/500 [=================>............] - ETA: 44s - loss: 1.5447 - regression_loss: 1.3302 - classification_loss: 0.2145 311/500 [=================>............] - ETA: 44s - loss: 1.5437 - regression_loss: 1.3294 - classification_loss: 0.2142 312/500 [=================>............] - ETA: 44s - loss: 1.5422 - regression_loss: 1.3283 - classification_loss: 0.2139 313/500 [=================>............] - ETA: 43s - loss: 1.5410 - regression_loss: 1.3273 - classification_loss: 0.2137 314/500 [=================>............] - ETA: 43s - loss: 1.5427 - regression_loss: 1.3286 - classification_loss: 0.2141 315/500 [=================>............] - ETA: 43s - loss: 1.5409 - regression_loss: 1.3272 - classification_loss: 0.2137 316/500 [=================>............] - ETA: 43s - loss: 1.5429 - regression_loss: 1.3291 - classification_loss: 0.2139 317/500 [==================>...........] - ETA: 43s - loss: 1.5418 - regression_loss: 1.3282 - classification_loss: 0.2136 318/500 [==================>...........] - ETA: 42s - loss: 1.5431 - regression_loss: 1.3292 - classification_loss: 0.2139 319/500 [==================>...........] - ETA: 42s - loss: 1.5429 - regression_loss: 1.3291 - classification_loss: 0.2138 320/500 [==================>...........] - ETA: 42s - loss: 1.5421 - regression_loss: 1.3283 - classification_loss: 0.2138 321/500 [==================>...........] - ETA: 42s - loss: 1.5418 - regression_loss: 1.3282 - classification_loss: 0.2136 322/500 [==================>...........] - ETA: 41s - loss: 1.5415 - regression_loss: 1.3282 - classification_loss: 0.2133 323/500 [==================>...........] 
[per-batch progress output for epoch 10, steps 324-499, elided]
500/500 [==============================] - 117s 234ms/step - loss: 1.5192 - regression_loss: 1.3078 - classification_loss: 0.2114
326 instances of class plum with average precision: 0.7892
mAP: 0.7892
Epoch 00010: saving model to ./training/snapshots/resnet50_pascal_10.h5
Epoch 11/150
[per-batch progress output for epoch 11, steps 1-157, elided]
- ETA: 1:20 - loss: 1.4649 - regression_loss: 1.2545 - classification_loss: 0.2104 159/500 [========>.....................] - ETA: 1:20 - loss: 1.4635 - regression_loss: 1.2539 - classification_loss: 0.2096 160/500 [========>.....................] - ETA: 1:20 - loss: 1.4599 - regression_loss: 1.2510 - classification_loss: 0.2089 161/500 [========>.....................] - ETA: 1:20 - loss: 1.4646 - regression_loss: 1.2546 - classification_loss: 0.2100 162/500 [========>.....................] - ETA: 1:19 - loss: 1.4646 - regression_loss: 1.2547 - classification_loss: 0.2098 163/500 [========>.....................] - ETA: 1:19 - loss: 1.4627 - regression_loss: 1.2532 - classification_loss: 0.2095 164/500 [========>.....................] - ETA: 1:19 - loss: 1.4640 - regression_loss: 1.2546 - classification_loss: 0.2094 165/500 [========>.....................] - ETA: 1:19 - loss: 1.4599 - regression_loss: 1.2514 - classification_loss: 0.2085 166/500 [========>.....................] - ETA: 1:19 - loss: 1.4727 - regression_loss: 1.2596 - classification_loss: 0.2131 167/500 [=========>....................] - ETA: 1:18 - loss: 1.4712 - regression_loss: 1.2589 - classification_loss: 0.2123 168/500 [=========>....................] - ETA: 1:18 - loss: 1.4748 - regression_loss: 1.2621 - classification_loss: 0.2126 169/500 [=========>....................] - ETA: 1:18 - loss: 1.4740 - regression_loss: 1.2617 - classification_loss: 0.2123 170/500 [=========>....................] - ETA: 1:18 - loss: 1.4757 - regression_loss: 1.2630 - classification_loss: 0.2127 171/500 [=========>....................] - ETA: 1:17 - loss: 1.4735 - regression_loss: 1.2611 - classification_loss: 0.2124 172/500 [=========>....................] - ETA: 1:17 - loss: 1.4744 - regression_loss: 1.2621 - classification_loss: 0.2123 173/500 [=========>....................] - ETA: 1:17 - loss: 1.4751 - regression_loss: 1.2625 - classification_loss: 0.2126 174/500 [=========>....................] 
- ETA: 1:17 - loss: 1.4835 - regression_loss: 1.2680 - classification_loss: 0.2155 175/500 [=========>....................] - ETA: 1:16 - loss: 1.4820 - regression_loss: 1.2668 - classification_loss: 0.2152 176/500 [=========>....................] - ETA: 1:16 - loss: 1.4821 - regression_loss: 1.2672 - classification_loss: 0.2149 177/500 [=========>....................] - ETA: 1:16 - loss: 1.4822 - regression_loss: 1.2674 - classification_loss: 0.2148 178/500 [=========>....................] - ETA: 1:16 - loss: 1.4826 - regression_loss: 1.2679 - classification_loss: 0.2148 179/500 [=========>....................] - ETA: 1:15 - loss: 1.4821 - regression_loss: 1.2673 - classification_loss: 0.2148 180/500 [=========>....................] - ETA: 1:15 - loss: 1.4842 - regression_loss: 1.2688 - classification_loss: 0.2154 181/500 [=========>....................] - ETA: 1:15 - loss: 1.4812 - regression_loss: 1.2661 - classification_loss: 0.2151 182/500 [=========>....................] - ETA: 1:15 - loss: 1.4831 - regression_loss: 1.2677 - classification_loss: 0.2154 183/500 [=========>....................] - ETA: 1:15 - loss: 1.4797 - regression_loss: 1.2646 - classification_loss: 0.2151 184/500 [==========>...................] - ETA: 1:14 - loss: 1.4783 - regression_loss: 1.2634 - classification_loss: 0.2148 185/500 [==========>...................] - ETA: 1:14 - loss: 1.4760 - regression_loss: 1.2617 - classification_loss: 0.2144 186/500 [==========>...................] - ETA: 1:14 - loss: 1.4719 - regression_loss: 1.2582 - classification_loss: 0.2136 187/500 [==========>...................] - ETA: 1:14 - loss: 1.4707 - regression_loss: 1.2573 - classification_loss: 0.2135 188/500 [==========>...................] - ETA: 1:13 - loss: 1.4701 - regression_loss: 1.2571 - classification_loss: 0.2129 189/500 [==========>...................] - ETA: 1:13 - loss: 1.4694 - regression_loss: 1.2568 - classification_loss: 0.2127 190/500 [==========>...................] 
- ETA: 1:13 - loss: 1.4701 - regression_loss: 1.2574 - classification_loss: 0.2128 191/500 [==========>...................] - ETA: 1:13 - loss: 1.4685 - regression_loss: 1.2561 - classification_loss: 0.2124 192/500 [==========>...................] - ETA: 1:12 - loss: 1.4676 - regression_loss: 1.2555 - classification_loss: 0.2121 193/500 [==========>...................] - ETA: 1:12 - loss: 1.4638 - regression_loss: 1.2519 - classification_loss: 0.2119 194/500 [==========>...................] - ETA: 1:12 - loss: 1.4625 - regression_loss: 1.2510 - classification_loss: 0.2115 195/500 [==========>...................] - ETA: 1:12 - loss: 1.4582 - regression_loss: 1.2474 - classification_loss: 0.2108 196/500 [==========>...................] - ETA: 1:12 - loss: 1.4590 - regression_loss: 1.2478 - classification_loss: 0.2112 197/500 [==========>...................] - ETA: 1:11 - loss: 1.4564 - regression_loss: 1.2457 - classification_loss: 0.2106 198/500 [==========>...................] - ETA: 1:11 - loss: 1.4589 - regression_loss: 1.2478 - classification_loss: 0.2111 199/500 [==========>...................] - ETA: 1:11 - loss: 1.4588 - regression_loss: 1.2478 - classification_loss: 0.2111 200/500 [===========>..................] - ETA: 1:11 - loss: 1.4648 - regression_loss: 1.2523 - classification_loss: 0.2125 201/500 [===========>..................] - ETA: 1:10 - loss: 1.4646 - regression_loss: 1.2526 - classification_loss: 0.2120 202/500 [===========>..................] - ETA: 1:10 - loss: 1.4640 - regression_loss: 1.2521 - classification_loss: 0.2119 203/500 [===========>..................] - ETA: 1:10 - loss: 1.4639 - regression_loss: 1.2521 - classification_loss: 0.2118 204/500 [===========>..................] - ETA: 1:10 - loss: 1.4646 - regression_loss: 1.2527 - classification_loss: 0.2120 205/500 [===========>..................] - ETA: 1:09 - loss: 1.4648 - regression_loss: 1.2527 - classification_loss: 0.2121 206/500 [===========>..................] 
- ETA: 1:09 - loss: 1.4693 - regression_loss: 1.2560 - classification_loss: 0.2134 207/500 [===========>..................] - ETA: 1:09 - loss: 1.4669 - regression_loss: 1.2540 - classification_loss: 0.2129 208/500 [===========>..................] - ETA: 1:09 - loss: 1.4732 - regression_loss: 1.2598 - classification_loss: 0.2135 209/500 [===========>..................] - ETA: 1:09 - loss: 1.4686 - regression_loss: 1.2557 - classification_loss: 0.2129 210/500 [===========>..................] - ETA: 1:08 - loss: 1.4696 - regression_loss: 1.2566 - classification_loss: 0.2130 211/500 [===========>..................] - ETA: 1:08 - loss: 1.4686 - regression_loss: 1.2555 - classification_loss: 0.2130 212/500 [===========>..................] - ETA: 1:08 - loss: 1.4693 - regression_loss: 1.2558 - classification_loss: 0.2135 213/500 [===========>..................] - ETA: 1:08 - loss: 1.4664 - regression_loss: 1.2537 - classification_loss: 0.2128 214/500 [===========>..................] - ETA: 1:07 - loss: 1.4650 - regression_loss: 1.2523 - classification_loss: 0.2127 215/500 [===========>..................] - ETA: 1:07 - loss: 1.4648 - regression_loss: 1.2517 - classification_loss: 0.2131 216/500 [===========>..................] - ETA: 1:07 - loss: 1.4626 - regression_loss: 1.2498 - classification_loss: 0.2128 217/500 [============>.................] - ETA: 1:07 - loss: 1.4648 - regression_loss: 1.2519 - classification_loss: 0.2129 218/500 [============>.................] - ETA: 1:06 - loss: 1.4647 - regression_loss: 1.2520 - classification_loss: 0.2127 219/500 [============>.................] - ETA: 1:06 - loss: 1.4619 - regression_loss: 1.2496 - classification_loss: 0.2123 220/500 [============>.................] - ETA: 1:06 - loss: 1.4636 - regression_loss: 1.2512 - classification_loss: 0.2125 221/500 [============>.................] - ETA: 1:06 - loss: 1.4634 - regression_loss: 1.2512 - classification_loss: 0.2122 222/500 [============>.................] 
- ETA: 1:05 - loss: 1.4633 - regression_loss: 1.2513 - classification_loss: 0.2120 223/500 [============>.................] - ETA: 1:05 - loss: 1.4659 - regression_loss: 1.2532 - classification_loss: 0.2127 224/500 [============>.................] - ETA: 1:05 - loss: 1.4675 - regression_loss: 1.2548 - classification_loss: 0.2127 225/500 [============>.................] - ETA: 1:05 - loss: 1.4662 - regression_loss: 1.2539 - classification_loss: 0.2123 226/500 [============>.................] - ETA: 1:04 - loss: 1.4653 - regression_loss: 1.2535 - classification_loss: 0.2119 227/500 [============>.................] - ETA: 1:04 - loss: 1.4614 - regression_loss: 1.2502 - classification_loss: 0.2112 228/500 [============>.................] - ETA: 1:04 - loss: 1.4635 - regression_loss: 1.2522 - classification_loss: 0.2114 229/500 [============>.................] - ETA: 1:04 - loss: 1.4627 - regression_loss: 1.2516 - classification_loss: 0.2111 230/500 [============>.................] - ETA: 1:03 - loss: 1.4599 - regression_loss: 1.2494 - classification_loss: 0.2105 231/500 [============>.................] - ETA: 1:03 - loss: 1.4635 - regression_loss: 1.2529 - classification_loss: 0.2106 232/500 [============>.................] - ETA: 1:03 - loss: 1.4606 - regression_loss: 1.2506 - classification_loss: 0.2100 233/500 [============>.................] - ETA: 1:03 - loss: 1.4608 - regression_loss: 1.2505 - classification_loss: 0.2103 234/500 [=============>................] - ETA: 1:03 - loss: 1.4569 - regression_loss: 1.2473 - classification_loss: 0.2096 235/500 [=============>................] - ETA: 1:02 - loss: 1.4557 - regression_loss: 1.2465 - classification_loss: 0.2092 236/500 [=============>................] - ETA: 1:02 - loss: 1.4530 - regression_loss: 1.2444 - classification_loss: 0.2087 237/500 [=============>................] - ETA: 1:02 - loss: 1.4507 - regression_loss: 1.2424 - classification_loss: 0.2083 238/500 [=============>................] 
- ETA: 1:02 - loss: 1.4525 - regression_loss: 1.2437 - classification_loss: 0.2088 239/500 [=============>................] - ETA: 1:01 - loss: 1.4525 - regression_loss: 1.2433 - classification_loss: 0.2091 240/500 [=============>................] - ETA: 1:01 - loss: 1.4492 - regression_loss: 1.2405 - classification_loss: 0.2087 241/500 [=============>................] - ETA: 1:01 - loss: 1.4484 - regression_loss: 1.2396 - classification_loss: 0.2088 242/500 [=============>................] - ETA: 1:01 - loss: 1.4485 - regression_loss: 1.2397 - classification_loss: 0.2088 243/500 [=============>................] - ETA: 1:00 - loss: 1.4497 - regression_loss: 1.2404 - classification_loss: 0.2093 244/500 [=============>................] - ETA: 1:00 - loss: 1.4486 - regression_loss: 1.2394 - classification_loss: 0.2092 245/500 [=============>................] - ETA: 1:00 - loss: 1.4470 - regression_loss: 1.2383 - classification_loss: 0.2087 246/500 [=============>................] - ETA: 1:00 - loss: 1.4455 - regression_loss: 1.2372 - classification_loss: 0.2083 247/500 [=============>................] - ETA: 59s - loss: 1.4480 - regression_loss: 1.2392 - classification_loss: 0.2088  248/500 [=============>................] - ETA: 59s - loss: 1.4493 - regression_loss: 1.2400 - classification_loss: 0.2093 249/500 [=============>................] - ETA: 59s - loss: 1.4508 - regression_loss: 1.2417 - classification_loss: 0.2091 250/500 [==============>...............] - ETA: 59s - loss: 1.4502 - regression_loss: 1.2412 - classification_loss: 0.2090 251/500 [==============>...............] - ETA: 58s - loss: 1.4502 - regression_loss: 1.2413 - classification_loss: 0.2088 252/500 [==============>...............] - ETA: 58s - loss: 1.4528 - regression_loss: 1.2436 - classification_loss: 0.2092 253/500 [==============>...............] - ETA: 58s - loss: 1.4555 - regression_loss: 1.2455 - classification_loss: 0.2101 254/500 [==============>...............] 
- ETA: 58s - loss: 1.4548 - regression_loss: 1.2449 - classification_loss: 0.2099 255/500 [==============>...............] - ETA: 57s - loss: 1.4533 - regression_loss: 1.2438 - classification_loss: 0.2095 256/500 [==============>...............] - ETA: 57s - loss: 1.4538 - regression_loss: 1.2443 - classification_loss: 0.2095 257/500 [==============>...............] - ETA: 57s - loss: 1.4559 - regression_loss: 1.2461 - classification_loss: 0.2097 258/500 [==============>...............] - ETA: 57s - loss: 1.4545 - regression_loss: 1.2451 - classification_loss: 0.2094 259/500 [==============>...............] - ETA: 57s - loss: 1.4516 - regression_loss: 1.2427 - classification_loss: 0.2089 260/500 [==============>...............] - ETA: 56s - loss: 1.4515 - regression_loss: 1.2427 - classification_loss: 0.2089 261/500 [==============>...............] - ETA: 56s - loss: 1.4518 - regression_loss: 1.2429 - classification_loss: 0.2089 262/500 [==============>...............] - ETA: 56s - loss: 1.4491 - regression_loss: 1.2408 - classification_loss: 0.2084 263/500 [==============>...............] - ETA: 56s - loss: 1.4478 - regression_loss: 1.2398 - classification_loss: 0.2080 264/500 [==============>...............] - ETA: 55s - loss: 1.4499 - regression_loss: 1.2416 - classification_loss: 0.2083 265/500 [==============>...............] - ETA: 55s - loss: 1.4501 - regression_loss: 1.2418 - classification_loss: 0.2083 266/500 [==============>...............] - ETA: 55s - loss: 1.4493 - regression_loss: 1.2412 - classification_loss: 0.2081 267/500 [===============>..............] - ETA: 55s - loss: 1.4474 - regression_loss: 1.2397 - classification_loss: 0.2077 268/500 [===============>..............] - ETA: 54s - loss: 1.4489 - regression_loss: 1.2409 - classification_loss: 0.2081 269/500 [===============>..............] - ETA: 54s - loss: 1.4492 - regression_loss: 1.2414 - classification_loss: 0.2079 270/500 [===============>..............] 
- ETA: 54s - loss: 1.4484 - regression_loss: 1.2406 - classification_loss: 0.2078 271/500 [===============>..............] - ETA: 54s - loss: 1.4497 - regression_loss: 1.2418 - classification_loss: 0.2079 272/500 [===============>..............] - ETA: 53s - loss: 1.4517 - regression_loss: 1.2433 - classification_loss: 0.2085 273/500 [===============>..............] - ETA: 53s - loss: 1.4492 - regression_loss: 1.2413 - classification_loss: 0.2079 274/500 [===============>..............] - ETA: 53s - loss: 1.4456 - regression_loss: 1.2382 - classification_loss: 0.2073 275/500 [===============>..............] - ETA: 53s - loss: 1.4465 - regression_loss: 1.2391 - classification_loss: 0.2073 276/500 [===============>..............] - ETA: 53s - loss: 1.4454 - regression_loss: 1.2384 - classification_loss: 0.2071 277/500 [===============>..............] - ETA: 52s - loss: 1.4449 - regression_loss: 1.2379 - classification_loss: 0.2070 278/500 [===============>..............] - ETA: 52s - loss: 1.4454 - regression_loss: 1.2377 - classification_loss: 0.2077 279/500 [===============>..............] - ETA: 52s - loss: 1.4460 - regression_loss: 1.2384 - classification_loss: 0.2076 280/500 [===============>..............] - ETA: 52s - loss: 1.4450 - regression_loss: 1.2377 - classification_loss: 0.2073 281/500 [===============>..............] - ETA: 51s - loss: 1.4428 - regression_loss: 1.2354 - classification_loss: 0.2074 282/500 [===============>..............] - ETA: 51s - loss: 1.4408 - regression_loss: 1.2339 - classification_loss: 0.2069 283/500 [===============>..............] - ETA: 51s - loss: 1.4418 - regression_loss: 1.2347 - classification_loss: 0.2070 284/500 [================>.............] - ETA: 51s - loss: 1.4423 - regression_loss: 1.2352 - classification_loss: 0.2071 285/500 [================>.............] - ETA: 50s - loss: 1.4399 - regression_loss: 1.2330 - classification_loss: 0.2070 286/500 [================>.............] 
- ETA: 50s - loss: 1.4375 - regression_loss: 1.2311 - classification_loss: 0.2065 287/500 [================>.............] - ETA: 50s - loss: 1.4355 - regression_loss: 1.2295 - classification_loss: 0.2060 288/500 [================>.............] - ETA: 50s - loss: 1.4380 - regression_loss: 1.2306 - classification_loss: 0.2073 289/500 [================>.............] - ETA: 49s - loss: 1.4418 - regression_loss: 1.2344 - classification_loss: 0.2074 290/500 [================>.............] - ETA: 49s - loss: 1.4444 - regression_loss: 1.2365 - classification_loss: 0.2079 291/500 [================>.............] - ETA: 49s - loss: 1.4440 - regression_loss: 1.2364 - classification_loss: 0.2077 292/500 [================>.............] - ETA: 49s - loss: 1.4415 - regression_loss: 1.2342 - classification_loss: 0.2073 293/500 [================>.............] - ETA: 48s - loss: 1.4413 - regression_loss: 1.2343 - classification_loss: 0.2070 294/500 [================>.............] - ETA: 48s - loss: 1.4419 - regression_loss: 1.2345 - classification_loss: 0.2074 295/500 [================>.............] - ETA: 48s - loss: 1.4422 - regression_loss: 1.2349 - classification_loss: 0.2073 296/500 [================>.............] - ETA: 48s - loss: 1.4415 - regression_loss: 1.2345 - classification_loss: 0.2070 297/500 [================>.............] - ETA: 48s - loss: 1.4409 - regression_loss: 1.2341 - classification_loss: 0.2068 298/500 [================>.............] - ETA: 47s - loss: 1.4415 - regression_loss: 1.2345 - classification_loss: 0.2070 299/500 [================>.............] - ETA: 47s - loss: 1.4413 - regression_loss: 1.2344 - classification_loss: 0.2070 300/500 [=================>............] - ETA: 47s - loss: 1.4416 - regression_loss: 1.2346 - classification_loss: 0.2069 301/500 [=================>............] - ETA: 47s - loss: 1.4416 - regression_loss: 1.2348 - classification_loss: 0.2068 302/500 [=================>............] 
- ETA: 46s - loss: 1.4403 - regression_loss: 1.2338 - classification_loss: 0.2064 303/500 [=================>............] - ETA: 46s - loss: 1.4390 - regression_loss: 1.2329 - classification_loss: 0.2061 304/500 [=================>............] - ETA: 46s - loss: 1.4379 - regression_loss: 1.2320 - classification_loss: 0.2058 305/500 [=================>............] - ETA: 46s - loss: 1.4367 - regression_loss: 1.2312 - classification_loss: 0.2055 306/500 [=================>............] - ETA: 45s - loss: 1.4370 - regression_loss: 1.2316 - classification_loss: 0.2054 307/500 [=================>............] - ETA: 45s - loss: 1.4366 - regression_loss: 1.2311 - classification_loss: 0.2055 308/500 [=================>............] - ETA: 45s - loss: 1.4350 - regression_loss: 1.2299 - classification_loss: 0.2051 309/500 [=================>............] - ETA: 45s - loss: 1.4345 - regression_loss: 1.2294 - classification_loss: 0.2051 310/500 [=================>............] - ETA: 44s - loss: 1.4350 - regression_loss: 1.2298 - classification_loss: 0.2051 311/500 [=================>............] - ETA: 44s - loss: 1.4339 - regression_loss: 1.2291 - classification_loss: 0.2049 312/500 [=================>............] - ETA: 44s - loss: 1.4319 - regression_loss: 1.2275 - classification_loss: 0.2044 313/500 [=================>............] - ETA: 44s - loss: 1.4314 - regression_loss: 1.2272 - classification_loss: 0.2042 314/500 [=================>............] - ETA: 44s - loss: 1.4313 - regression_loss: 1.2272 - classification_loss: 0.2041 315/500 [=================>............] - ETA: 43s - loss: 1.4362 - regression_loss: 1.2298 - classification_loss: 0.2064 316/500 [=================>............] - ETA: 43s - loss: 1.4380 - regression_loss: 1.2316 - classification_loss: 0.2065 317/500 [==================>...........] - ETA: 43s - loss: 1.4354 - regression_loss: 1.2292 - classification_loss: 0.2062 318/500 [==================>...........] 
- ETA: 43s - loss: 1.4344 - regression_loss: 1.2284 - classification_loss: 0.2061 319/500 [==================>...........] - ETA: 42s - loss: 1.4362 - regression_loss: 1.2300 - classification_loss: 0.2062 320/500 [==================>...........] - ETA: 42s - loss: 1.4371 - regression_loss: 1.2308 - classification_loss: 0.2063 321/500 [==================>...........] - ETA: 42s - loss: 1.4373 - regression_loss: 1.2310 - classification_loss: 0.2063 322/500 [==================>...........] - ETA: 42s - loss: 1.4395 - regression_loss: 1.2327 - classification_loss: 0.2068 323/500 [==================>...........] - ETA: 41s - loss: 1.4374 - regression_loss: 1.2309 - classification_loss: 0.2064 324/500 [==================>...........] - ETA: 41s - loss: 1.4386 - regression_loss: 1.2319 - classification_loss: 0.2068 325/500 [==================>...........] - ETA: 41s - loss: 1.4395 - regression_loss: 1.2324 - classification_loss: 0.2072 326/500 [==================>...........] - ETA: 41s - loss: 1.4410 - regression_loss: 1.2338 - classification_loss: 0.2072 327/500 [==================>...........] - ETA: 41s - loss: 1.4422 - regression_loss: 1.2349 - classification_loss: 0.2073 328/500 [==================>...........] - ETA: 40s - loss: 1.4404 - regression_loss: 1.2335 - classification_loss: 0.2069 329/500 [==================>...........] - ETA: 40s - loss: 1.4415 - regression_loss: 1.2344 - classification_loss: 0.2071 330/500 [==================>...........] - ETA: 40s - loss: 1.4406 - regression_loss: 1.2337 - classification_loss: 0.2069 331/500 [==================>...........] - ETA: 40s - loss: 1.4420 - regression_loss: 1.2350 - classification_loss: 0.2070 332/500 [==================>...........] - ETA: 39s - loss: 1.4427 - regression_loss: 1.2359 - classification_loss: 0.2068 333/500 [==================>...........] - ETA: 39s - loss: 1.4458 - regression_loss: 1.2385 - classification_loss: 0.2074 334/500 [===================>..........] 
- ETA: 39s - loss: 1.4440 - regression_loss: 1.2371 - classification_loss: 0.2069 335/500 [===================>..........] - ETA: 39s - loss: 1.4456 - regression_loss: 1.2387 - classification_loss: 0.2069 336/500 [===================>..........] - ETA: 38s - loss: 1.4456 - regression_loss: 1.2386 - classification_loss: 0.2070 337/500 [===================>..........] - ETA: 38s - loss: 1.4445 - regression_loss: 1.2379 - classification_loss: 0.2065 338/500 [===================>..........] - ETA: 38s - loss: 1.4440 - regression_loss: 1.2374 - classification_loss: 0.2066 339/500 [===================>..........] - ETA: 38s - loss: 1.4430 - regression_loss: 1.2367 - classification_loss: 0.2064 340/500 [===================>..........] - ETA: 37s - loss: 1.4450 - regression_loss: 1.2382 - classification_loss: 0.2068 341/500 [===================>..........] - ETA: 37s - loss: 1.4434 - regression_loss: 1.2369 - classification_loss: 0.2064 342/500 [===================>..........] - ETA: 37s - loss: 1.4439 - regression_loss: 1.2374 - classification_loss: 0.2065 343/500 [===================>..........] - ETA: 37s - loss: 1.4442 - regression_loss: 1.2377 - classification_loss: 0.2065 344/500 [===================>..........] - ETA: 36s - loss: 1.4441 - regression_loss: 1.2378 - classification_loss: 0.2063 345/500 [===================>..........] - ETA: 36s - loss: 1.4433 - regression_loss: 1.2372 - classification_loss: 0.2061 346/500 [===================>..........] - ETA: 36s - loss: 1.4429 - regression_loss: 1.2367 - classification_loss: 0.2062 347/500 [===================>..........] - ETA: 36s - loss: 1.4434 - regression_loss: 1.2372 - classification_loss: 0.2063 348/500 [===================>..........] - ETA: 36s - loss: 1.4433 - regression_loss: 1.2372 - classification_loss: 0.2061 349/500 [===================>..........] - ETA: 35s - loss: 1.4464 - regression_loss: 1.2393 - classification_loss: 0.2071 350/500 [====================>.........] 
- ETA: 35s - loss: 1.4473 - regression_loss: 1.2402 - classification_loss: 0.2071 351/500 [====================>.........] - ETA: 35s - loss: 1.4476 - regression_loss: 1.2406 - classification_loss: 0.2070 352/500 [====================>.........] - ETA: 35s - loss: 1.4505 - regression_loss: 1.2427 - classification_loss: 0.2078 353/500 [====================>.........] - ETA: 34s - loss: 1.4513 - regression_loss: 1.2433 - classification_loss: 0.2080 354/500 [====================>.........] - ETA: 34s - loss: 1.4514 - regression_loss: 1.2434 - classification_loss: 0.2080 355/500 [====================>.........] - ETA: 34s - loss: 1.4531 - regression_loss: 1.2448 - classification_loss: 0.2083 356/500 [====================>.........] - ETA: 34s - loss: 1.4533 - regression_loss: 1.2450 - classification_loss: 0.2083 357/500 [====================>.........] - ETA: 33s - loss: 1.4533 - regression_loss: 1.2452 - classification_loss: 0.2081 358/500 [====================>.........] - ETA: 33s - loss: 1.4533 - regression_loss: 1.2454 - classification_loss: 0.2079 359/500 [====================>.........] - ETA: 33s - loss: 1.4520 - regression_loss: 1.2443 - classification_loss: 0.2077 360/500 [====================>.........] - ETA: 33s - loss: 1.4544 - regression_loss: 1.2463 - classification_loss: 0.2081 361/500 [====================>.........] - ETA: 32s - loss: 1.4544 - regression_loss: 1.2464 - classification_loss: 0.2080 362/500 [====================>.........] - ETA: 32s - loss: 1.4540 - regression_loss: 1.2460 - classification_loss: 0.2080 363/500 [====================>.........] - ETA: 32s - loss: 1.4526 - regression_loss: 1.2447 - classification_loss: 0.2078 364/500 [====================>.........] - ETA: 32s - loss: 1.4534 - regression_loss: 1.2454 - classification_loss: 0.2080 365/500 [====================>.........] - ETA: 31s - loss: 1.4530 - regression_loss: 1.2451 - classification_loss: 0.2079 366/500 [====================>.........] 
500/500 [==============================] - 118s 237ms/step - loss: 1.4509 - regression_loss: 1.2441 - classification_loss: 0.2068
326 instances of class plum with average precision: 0.8224
mAP: 0.8224
Epoch 00011: saving model to ./training/snapshots/resnet50_pascal_11.h5
Epoch 12/150
200/500 [===========>..................] - ETA: 1:10 - loss: 1.4902 - regression_loss: 1.2747 - classification_loss: 0.2155
- ETA: 1:10 - loss: 1.4869 - regression_loss: 1.2722 - classification_loss: 0.2148 202/500 [===========>..................] - ETA: 1:10 - loss: 1.4860 - regression_loss: 1.2712 - classification_loss: 0.2147 203/500 [===========>..................] - ETA: 1:10 - loss: 1.4866 - regression_loss: 1.2719 - classification_loss: 0.2147 204/500 [===========>..................] - ETA: 1:09 - loss: 1.4831 - regression_loss: 1.2691 - classification_loss: 0.2140 205/500 [===========>..................] - ETA: 1:09 - loss: 1.4851 - regression_loss: 1.2708 - classification_loss: 0.2143 206/500 [===========>..................] - ETA: 1:09 - loss: 1.4829 - regression_loss: 1.2691 - classification_loss: 0.2138 207/500 [===========>..................] - ETA: 1:09 - loss: 1.4839 - regression_loss: 1.2702 - classification_loss: 0.2137 208/500 [===========>..................] - ETA: 1:08 - loss: 1.4829 - regression_loss: 1.2695 - classification_loss: 0.2134 209/500 [===========>..................] - ETA: 1:08 - loss: 1.4826 - regression_loss: 1.2694 - classification_loss: 0.2132 210/500 [===========>..................] - ETA: 1:08 - loss: 1.4797 - regression_loss: 1.2671 - classification_loss: 0.2126 211/500 [===========>..................] - ETA: 1:08 - loss: 1.4813 - regression_loss: 1.2671 - classification_loss: 0.2142 212/500 [===========>..................] - ETA: 1:07 - loss: 1.4821 - regression_loss: 1.2676 - classification_loss: 0.2145 213/500 [===========>..................] - ETA: 1:07 - loss: 1.4805 - regression_loss: 1.2664 - classification_loss: 0.2141 214/500 [===========>..................] - ETA: 1:07 - loss: 1.4829 - regression_loss: 1.2685 - classification_loss: 0.2144 215/500 [===========>..................] - ETA: 1:07 - loss: 1.4818 - regression_loss: 1.2676 - classification_loss: 0.2142 216/500 [===========>..................] - ETA: 1:07 - loss: 1.4820 - regression_loss: 1.2677 - classification_loss: 0.2143 217/500 [============>.................] 
- ETA: 1:06 - loss: 1.4807 - regression_loss: 1.2666 - classification_loss: 0.2141 218/500 [============>.................] - ETA: 1:06 - loss: 1.4773 - regression_loss: 1.2638 - classification_loss: 0.2135 219/500 [============>.................] - ETA: 1:06 - loss: 1.4795 - regression_loss: 1.2652 - classification_loss: 0.2142 220/500 [============>.................] - ETA: 1:06 - loss: 1.4774 - regression_loss: 1.2637 - classification_loss: 0.2137 221/500 [============>.................] - ETA: 1:05 - loss: 1.4782 - regression_loss: 1.2643 - classification_loss: 0.2138 222/500 [============>.................] - ETA: 1:05 - loss: 1.4787 - regression_loss: 1.2645 - classification_loss: 0.2142 223/500 [============>.................] - ETA: 1:05 - loss: 1.4778 - regression_loss: 1.2639 - classification_loss: 0.2139 224/500 [============>.................] - ETA: 1:05 - loss: 1.4759 - regression_loss: 1.2624 - classification_loss: 0.2135 225/500 [============>.................] - ETA: 1:04 - loss: 1.4753 - regression_loss: 1.2619 - classification_loss: 0.2134 226/500 [============>.................] - ETA: 1:04 - loss: 1.4736 - regression_loss: 1.2607 - classification_loss: 0.2129 227/500 [============>.................] - ETA: 1:04 - loss: 1.4717 - regression_loss: 1.2592 - classification_loss: 0.2125 228/500 [============>.................] - ETA: 1:04 - loss: 1.4680 - regression_loss: 1.2562 - classification_loss: 0.2118 229/500 [============>.................] - ETA: 1:03 - loss: 1.4703 - regression_loss: 1.2582 - classification_loss: 0.2121 230/500 [============>.................] - ETA: 1:03 - loss: 1.4697 - regression_loss: 1.2578 - classification_loss: 0.2119 231/500 [============>.................] - ETA: 1:03 - loss: 1.4715 - regression_loss: 1.2588 - classification_loss: 0.2127 232/500 [============>.................] - ETA: 1:03 - loss: 1.4706 - regression_loss: 1.2580 - classification_loss: 0.2126 233/500 [============>.................] 
- ETA: 1:03 - loss: 1.4700 - regression_loss: 1.2573 - classification_loss: 0.2127 234/500 [=============>................] - ETA: 1:02 - loss: 1.4730 - regression_loss: 1.2600 - classification_loss: 0.2130 235/500 [=============>................] - ETA: 1:02 - loss: 1.4747 - regression_loss: 1.2616 - classification_loss: 0.2131 236/500 [=============>................] - ETA: 1:02 - loss: 1.4745 - regression_loss: 1.2614 - classification_loss: 0.2131 237/500 [=============>................] - ETA: 1:02 - loss: 1.4786 - regression_loss: 1.2643 - classification_loss: 0.2143 238/500 [=============>................] - ETA: 1:01 - loss: 1.4772 - regression_loss: 1.2632 - classification_loss: 0.2139 239/500 [=============>................] - ETA: 1:01 - loss: 1.4766 - regression_loss: 1.2629 - classification_loss: 0.2137 240/500 [=============>................] - ETA: 1:01 - loss: 1.4781 - regression_loss: 1.2643 - classification_loss: 0.2138 241/500 [=============>................] - ETA: 1:01 - loss: 1.4783 - regression_loss: 1.2648 - classification_loss: 0.2135 242/500 [=============>................] - ETA: 1:00 - loss: 1.4761 - regression_loss: 1.2629 - classification_loss: 0.2131 243/500 [=============>................] - ETA: 1:00 - loss: 1.4732 - regression_loss: 1.2605 - classification_loss: 0.2127 244/500 [=============>................] - ETA: 1:00 - loss: 1.4746 - regression_loss: 1.2615 - classification_loss: 0.2132 245/500 [=============>................] - ETA: 1:00 - loss: 1.4753 - regression_loss: 1.2622 - classification_loss: 0.2130 246/500 [=============>................] - ETA: 59s - loss: 1.4733 - regression_loss: 1.2601 - classification_loss: 0.2133  247/500 [=============>................] - ETA: 59s - loss: 1.4766 - regression_loss: 1.2634 - classification_loss: 0.2133 248/500 [=============>................] - ETA: 59s - loss: 1.4787 - regression_loss: 1.2651 - classification_loss: 0.2137 249/500 [=============>................] 
- ETA: 59s - loss: 1.4799 - regression_loss: 1.2661 - classification_loss: 0.2138 250/500 [==============>...............] - ETA: 59s - loss: 1.4792 - regression_loss: 1.2656 - classification_loss: 0.2136 251/500 [==============>...............] - ETA: 58s - loss: 1.4793 - regression_loss: 1.2659 - classification_loss: 0.2135 252/500 [==============>...............] - ETA: 58s - loss: 1.4791 - regression_loss: 1.2656 - classification_loss: 0.2136 253/500 [==============>...............] - ETA: 58s - loss: 1.4796 - regression_loss: 1.2660 - classification_loss: 0.2136 254/500 [==============>...............] - ETA: 58s - loss: 1.4799 - regression_loss: 1.2664 - classification_loss: 0.2135 255/500 [==============>...............] - ETA: 57s - loss: 1.4781 - regression_loss: 1.2651 - classification_loss: 0.2130 256/500 [==============>...............] - ETA: 57s - loss: 1.4780 - regression_loss: 1.2652 - classification_loss: 0.2128 257/500 [==============>...............] - ETA: 57s - loss: 1.4756 - regression_loss: 1.2634 - classification_loss: 0.2123 258/500 [==============>...............] - ETA: 57s - loss: 1.4746 - regression_loss: 1.2626 - classification_loss: 0.2120 259/500 [==============>...............] - ETA: 56s - loss: 1.4756 - regression_loss: 1.2632 - classification_loss: 0.2123 260/500 [==============>...............] - ETA: 56s - loss: 1.4752 - regression_loss: 1.2628 - classification_loss: 0.2124 261/500 [==============>...............] - ETA: 56s - loss: 1.4764 - regression_loss: 1.2635 - classification_loss: 0.2129 262/500 [==============>...............] - ETA: 56s - loss: 1.4804 - regression_loss: 1.2664 - classification_loss: 0.2140 263/500 [==============>...............] - ETA: 55s - loss: 1.4791 - regression_loss: 1.2652 - classification_loss: 0.2139 264/500 [==============>...............] - ETA: 55s - loss: 1.4787 - regression_loss: 1.2652 - classification_loss: 0.2136 265/500 [==============>...............] 
- ETA: 55s - loss: 1.4791 - regression_loss: 1.2655 - classification_loss: 0.2137 266/500 [==============>...............] - ETA: 55s - loss: 1.4760 - regression_loss: 1.2629 - classification_loss: 0.2131 267/500 [===============>..............] - ETA: 55s - loss: 1.4748 - regression_loss: 1.2620 - classification_loss: 0.2128 268/500 [===============>..............] - ETA: 54s - loss: 1.4781 - regression_loss: 1.2644 - classification_loss: 0.2137 269/500 [===============>..............] - ETA: 54s - loss: 1.4752 - regression_loss: 1.2619 - classification_loss: 0.2133 270/500 [===============>..............] - ETA: 54s - loss: 1.4746 - regression_loss: 1.2616 - classification_loss: 0.2131 271/500 [===============>..............] - ETA: 54s - loss: 1.4746 - regression_loss: 1.2616 - classification_loss: 0.2130 272/500 [===============>..............] - ETA: 53s - loss: 1.4743 - regression_loss: 1.2613 - classification_loss: 0.2130 273/500 [===============>..............] - ETA: 53s - loss: 1.4751 - regression_loss: 1.2620 - classification_loss: 0.2131 274/500 [===============>..............] - ETA: 53s - loss: 1.4740 - regression_loss: 1.2611 - classification_loss: 0.2129 275/500 [===============>..............] - ETA: 53s - loss: 1.4766 - regression_loss: 1.2630 - classification_loss: 0.2136 276/500 [===============>..............] - ETA: 52s - loss: 1.4777 - regression_loss: 1.2637 - classification_loss: 0.2140 277/500 [===============>..............] - ETA: 52s - loss: 1.4766 - regression_loss: 1.2627 - classification_loss: 0.2139 278/500 [===============>..............] - ETA: 52s - loss: 1.4777 - regression_loss: 1.2635 - classification_loss: 0.2141 279/500 [===============>..............] - ETA: 52s - loss: 1.4762 - regression_loss: 1.2620 - classification_loss: 0.2142 280/500 [===============>..............] - ETA: 51s - loss: 1.4752 - regression_loss: 1.2613 - classification_loss: 0.2138 281/500 [===============>..............] 
- ETA: 51s - loss: 1.4757 - regression_loss: 1.2620 - classification_loss: 0.2137 282/500 [===============>..............] - ETA: 51s - loss: 1.4729 - regression_loss: 1.2595 - classification_loss: 0.2134 283/500 [===============>..............] - ETA: 51s - loss: 1.4715 - regression_loss: 1.2581 - classification_loss: 0.2134 284/500 [================>.............] - ETA: 51s - loss: 1.4714 - regression_loss: 1.2577 - classification_loss: 0.2136 285/500 [================>.............] - ETA: 50s - loss: 1.4720 - regression_loss: 1.2583 - classification_loss: 0.2137 286/500 [================>.............] - ETA: 50s - loss: 1.4728 - regression_loss: 1.2593 - classification_loss: 0.2135 287/500 [================>.............] - ETA: 50s - loss: 1.4748 - regression_loss: 1.2610 - classification_loss: 0.2138 288/500 [================>.............] - ETA: 50s - loss: 1.4728 - regression_loss: 1.2593 - classification_loss: 0.2136 289/500 [================>.............] - ETA: 49s - loss: 1.4726 - regression_loss: 1.2594 - classification_loss: 0.2132 290/500 [================>.............] - ETA: 49s - loss: 1.4726 - regression_loss: 1.2595 - classification_loss: 0.2131 291/500 [================>.............] - ETA: 49s - loss: 1.4733 - regression_loss: 1.2603 - classification_loss: 0.2131 292/500 [================>.............] - ETA: 49s - loss: 1.4706 - regression_loss: 1.2581 - classification_loss: 0.2125 293/500 [================>.............] - ETA: 48s - loss: 1.4681 - regression_loss: 1.2561 - classification_loss: 0.2120 294/500 [================>.............] - ETA: 48s - loss: 1.4676 - regression_loss: 1.2558 - classification_loss: 0.2118 295/500 [================>.............] - ETA: 48s - loss: 1.4639 - regression_loss: 1.2527 - classification_loss: 0.2113 296/500 [================>.............] - ETA: 48s - loss: 1.4633 - regression_loss: 1.2523 - classification_loss: 0.2110 297/500 [================>.............] 
- ETA: 47s - loss: 1.4622 - regression_loss: 1.2516 - classification_loss: 0.2106 298/500 [================>.............] - ETA: 47s - loss: 1.4619 - regression_loss: 1.2514 - classification_loss: 0.2105 299/500 [================>.............] - ETA: 47s - loss: 1.4622 - regression_loss: 1.2516 - classification_loss: 0.2106 300/500 [=================>............] - ETA: 47s - loss: 1.4637 - regression_loss: 1.2529 - classification_loss: 0.2108 301/500 [=================>............] - ETA: 47s - loss: 1.4657 - regression_loss: 1.2547 - classification_loss: 0.2110 302/500 [=================>............] - ETA: 46s - loss: 1.4653 - regression_loss: 1.2543 - classification_loss: 0.2110 303/500 [=================>............] - ETA: 46s - loss: 1.4641 - regression_loss: 1.2534 - classification_loss: 0.2107 304/500 [=================>............] - ETA: 46s - loss: 1.4621 - regression_loss: 1.2517 - classification_loss: 0.2104 305/500 [=================>............] - ETA: 46s - loss: 1.4602 - regression_loss: 1.2503 - classification_loss: 0.2099 306/500 [=================>............] - ETA: 45s - loss: 1.4588 - regression_loss: 1.2492 - classification_loss: 0.2096 307/500 [=================>............] - ETA: 45s - loss: 1.4572 - regression_loss: 1.2479 - classification_loss: 0.2094 308/500 [=================>............] - ETA: 45s - loss: 1.4574 - regression_loss: 1.2482 - classification_loss: 0.2093 309/500 [=================>............] - ETA: 45s - loss: 1.4583 - regression_loss: 1.2484 - classification_loss: 0.2099 310/500 [=================>............] - ETA: 44s - loss: 1.4593 - regression_loss: 1.2494 - classification_loss: 0.2099 311/500 [=================>............] - ETA: 44s - loss: 1.4610 - regression_loss: 1.2510 - classification_loss: 0.2099 312/500 [=================>............] - ETA: 44s - loss: 1.4614 - regression_loss: 1.2517 - classification_loss: 0.2097 313/500 [=================>............] 
- ETA: 44s - loss: 1.4620 - regression_loss: 1.2523 - classification_loss: 0.2098 314/500 [=================>............] - ETA: 43s - loss: 1.4621 - regression_loss: 1.2524 - classification_loss: 0.2097 315/500 [=================>............] - ETA: 43s - loss: 1.4619 - regression_loss: 1.2523 - classification_loss: 0.2096 316/500 [=================>............] - ETA: 43s - loss: 1.4616 - regression_loss: 1.2521 - classification_loss: 0.2095 317/500 [==================>...........] - ETA: 43s - loss: 1.4628 - regression_loss: 1.2531 - classification_loss: 0.2097 318/500 [==================>...........] - ETA: 42s - loss: 1.4621 - regression_loss: 1.2522 - classification_loss: 0.2099 319/500 [==================>...........] - ETA: 42s - loss: 1.4610 - regression_loss: 1.2512 - classification_loss: 0.2098 320/500 [==================>...........] - ETA: 42s - loss: 1.4600 - regression_loss: 1.2505 - classification_loss: 0.2095 321/500 [==================>...........] - ETA: 42s - loss: 1.4585 - regression_loss: 1.2494 - classification_loss: 0.2091 322/500 [==================>...........] - ETA: 41s - loss: 1.4567 - regression_loss: 1.2480 - classification_loss: 0.2087 323/500 [==================>...........] - ETA: 41s - loss: 1.4551 - regression_loss: 1.2466 - classification_loss: 0.2085 324/500 [==================>...........] - ETA: 41s - loss: 1.4540 - regression_loss: 1.2459 - classification_loss: 0.2081 325/500 [==================>...........] - ETA: 41s - loss: 1.4525 - regression_loss: 1.2446 - classification_loss: 0.2079 326/500 [==================>...........] - ETA: 41s - loss: 1.4500 - regression_loss: 1.2425 - classification_loss: 0.2075 327/500 [==================>...........] - ETA: 40s - loss: 1.4514 - regression_loss: 1.2438 - classification_loss: 0.2077 328/500 [==================>...........] - ETA: 40s - loss: 1.4516 - regression_loss: 1.2441 - classification_loss: 0.2075 329/500 [==================>...........] 
- ETA: 40s - loss: 1.4521 - regression_loss: 1.2448 - classification_loss: 0.2073 330/500 [==================>...........] - ETA: 40s - loss: 1.4524 - regression_loss: 1.2450 - classification_loss: 0.2075 331/500 [==================>...........] - ETA: 39s - loss: 1.4499 - regression_loss: 1.2428 - classification_loss: 0.2071 332/500 [==================>...........] - ETA: 39s - loss: 1.4472 - regression_loss: 1.2404 - classification_loss: 0.2068 333/500 [==================>...........] - ETA: 39s - loss: 1.4465 - regression_loss: 1.2400 - classification_loss: 0.2065 334/500 [===================>..........] - ETA: 39s - loss: 1.4456 - regression_loss: 1.2391 - classification_loss: 0.2065 335/500 [===================>..........] - ETA: 38s - loss: 1.4440 - regression_loss: 1.2378 - classification_loss: 0.2062 336/500 [===================>..........] - ETA: 38s - loss: 1.4452 - regression_loss: 1.2341 - classification_loss: 0.2111 337/500 [===================>..........] - ETA: 38s - loss: 1.4490 - regression_loss: 1.2370 - classification_loss: 0.2120 338/500 [===================>..........] - ETA: 38s - loss: 1.4487 - regression_loss: 1.2367 - classification_loss: 0.2120 339/500 [===================>..........] - ETA: 37s - loss: 1.4472 - regression_loss: 1.2354 - classification_loss: 0.2118 340/500 [===================>..........] - ETA: 37s - loss: 1.4443 - regression_loss: 1.2328 - classification_loss: 0.2115 341/500 [===================>..........] - ETA: 37s - loss: 1.4446 - regression_loss: 1.2329 - classification_loss: 0.2117 342/500 [===================>..........] - ETA: 37s - loss: 1.4455 - regression_loss: 1.2337 - classification_loss: 0.2119 343/500 [===================>..........] - ETA: 36s - loss: 1.4459 - regression_loss: 1.2339 - classification_loss: 0.2119 344/500 [===================>..........] - ETA: 36s - loss: 1.4448 - regression_loss: 1.2329 - classification_loss: 0.2119 345/500 [===================>..........] 
- ETA: 36s - loss: 1.4438 - regression_loss: 1.2323 - classification_loss: 0.2115 346/500 [===================>..........] - ETA: 36s - loss: 1.4464 - regression_loss: 1.2342 - classification_loss: 0.2122 347/500 [===================>..........] - ETA: 36s - loss: 1.4461 - regression_loss: 1.2340 - classification_loss: 0.2122 348/500 [===================>..........] - ETA: 35s - loss: 1.4477 - regression_loss: 1.2351 - classification_loss: 0.2125 349/500 [===================>..........] - ETA: 35s - loss: 1.4468 - regression_loss: 1.2345 - classification_loss: 0.2123 350/500 [====================>.........] - ETA: 35s - loss: 1.4468 - regression_loss: 1.2344 - classification_loss: 0.2124 351/500 [====================>.........] - ETA: 35s - loss: 1.4482 - regression_loss: 1.2356 - classification_loss: 0.2126 352/500 [====================>.........] - ETA: 34s - loss: 1.4496 - regression_loss: 1.2369 - classification_loss: 0.2127 353/500 [====================>.........] - ETA: 34s - loss: 1.4513 - regression_loss: 1.2385 - classification_loss: 0.2128 354/500 [====================>.........] - ETA: 34s - loss: 1.4508 - regression_loss: 1.2381 - classification_loss: 0.2127 355/500 [====================>.........] - ETA: 34s - loss: 1.4485 - regression_loss: 1.2362 - classification_loss: 0.2123 356/500 [====================>.........] - ETA: 33s - loss: 1.4491 - regression_loss: 1.2366 - classification_loss: 0.2125 357/500 [====================>.........] - ETA: 33s - loss: 1.4504 - regression_loss: 1.2379 - classification_loss: 0.2125 358/500 [====================>.........] - ETA: 33s - loss: 1.4514 - regression_loss: 1.2387 - classification_loss: 0.2128 359/500 [====================>.........] - ETA: 33s - loss: 1.4497 - regression_loss: 1.2373 - classification_loss: 0.2124 360/500 [====================>.........] - ETA: 32s - loss: 1.4504 - regression_loss: 1.2380 - classification_loss: 0.2124 361/500 [====================>.........] 
- ETA: 32s - loss: 1.4509 - regression_loss: 1.2387 - classification_loss: 0.2122 362/500 [====================>.........] - ETA: 32s - loss: 1.4507 - regression_loss: 1.2386 - classification_loss: 0.2121 363/500 [====================>.........] - ETA: 32s - loss: 1.4505 - regression_loss: 1.2384 - classification_loss: 0.2121 364/500 [====================>.........] - ETA: 32s - loss: 1.4500 - regression_loss: 1.2380 - classification_loss: 0.2120 365/500 [====================>.........] - ETA: 31s - loss: 1.4494 - regression_loss: 1.2374 - classification_loss: 0.2120 366/500 [====================>.........] - ETA: 31s - loss: 1.4488 - regression_loss: 1.2369 - classification_loss: 0.2119 367/500 [=====================>........] - ETA: 31s - loss: 1.4495 - regression_loss: 1.2375 - classification_loss: 0.2120 368/500 [=====================>........] - ETA: 31s - loss: 1.4484 - regression_loss: 1.2366 - classification_loss: 0.2118 369/500 [=====================>........] - ETA: 30s - loss: 1.4485 - regression_loss: 1.2369 - classification_loss: 0.2116 370/500 [=====================>........] - ETA: 30s - loss: 1.4481 - regression_loss: 1.2367 - classification_loss: 0.2114 371/500 [=====================>........] - ETA: 30s - loss: 1.4488 - regression_loss: 1.2366 - classification_loss: 0.2122 372/500 [=====================>........] - ETA: 30s - loss: 1.4480 - regression_loss: 1.2361 - classification_loss: 0.2120 373/500 [=====================>........] - ETA: 29s - loss: 1.4463 - regression_loss: 1.2346 - classification_loss: 0.2117 374/500 [=====================>........] - ETA: 29s - loss: 1.4446 - regression_loss: 1.2332 - classification_loss: 0.2114 375/500 [=====================>........] - ETA: 29s - loss: 1.4460 - regression_loss: 1.2342 - classification_loss: 0.2118 376/500 [=====================>........] - ETA: 29s - loss: 1.4433 - regression_loss: 1.2319 - classification_loss: 0.2113 377/500 [=====================>........] 
- ETA: 28s - loss: 1.4428 - regression_loss: 1.2317 - classification_loss: 0.2112 378/500 [=====================>........] - ETA: 28s - loss: 1.4409 - regression_loss: 1.2301 - classification_loss: 0.2109 379/500 [=====================>........] - ETA: 28s - loss: 1.4410 - regression_loss: 1.2301 - classification_loss: 0.2108 380/500 [=====================>........] - ETA: 28s - loss: 1.4434 - regression_loss: 1.2317 - classification_loss: 0.2117 381/500 [=====================>........] - ETA: 28s - loss: 1.4435 - regression_loss: 1.2319 - classification_loss: 0.2116 382/500 [=====================>........] - ETA: 27s - loss: 1.4441 - regression_loss: 1.2325 - classification_loss: 0.2116 383/500 [=====================>........] - ETA: 27s - loss: 1.4434 - regression_loss: 1.2319 - classification_loss: 0.2115 384/500 [======================>.......] - ETA: 27s - loss: 1.4416 - regression_loss: 1.2305 - classification_loss: 0.2111 385/500 [======================>.......] - ETA: 27s - loss: 1.4412 - regression_loss: 1.2301 - classification_loss: 0.2111 386/500 [======================>.......] - ETA: 26s - loss: 1.4402 - regression_loss: 1.2293 - classification_loss: 0.2109 387/500 [======================>.......] - ETA: 26s - loss: 1.4407 - regression_loss: 1.2298 - classification_loss: 0.2109 388/500 [======================>.......] - ETA: 26s - loss: 1.4387 - regression_loss: 1.2280 - classification_loss: 0.2107 389/500 [======================>.......] - ETA: 26s - loss: 1.4379 - regression_loss: 1.2275 - classification_loss: 0.2104 390/500 [======================>.......] - ETA: 25s - loss: 1.4383 - regression_loss: 1.2278 - classification_loss: 0.2105 391/500 [======================>.......] - ETA: 25s - loss: 1.4402 - regression_loss: 1.2292 - classification_loss: 0.2110 392/500 [======================>.......] - ETA: 25s - loss: 1.4406 - regression_loss: 1.2298 - classification_loss: 0.2107 393/500 [======================>.......] 
- ETA: 25s - loss: 1.4404 - regression_loss: 1.2295 - classification_loss: 0.2108 394/500 [======================>.......] - ETA: 24s - loss: 1.4385 - regression_loss: 1.2280 - classification_loss: 0.2105 395/500 [======================>.......] - ETA: 24s - loss: 1.4397 - regression_loss: 1.2292 - classification_loss: 0.2105 396/500 [======================>.......] - ETA: 24s - loss: 1.4387 - regression_loss: 1.2284 - classification_loss: 0.2104 397/500 [======================>.......] - ETA: 24s - loss: 1.4380 - regression_loss: 1.2277 - classification_loss: 0.2103 398/500 [======================>.......] - ETA: 24s - loss: 1.4378 - regression_loss: 1.2276 - classification_loss: 0.2101 399/500 [======================>.......] - ETA: 23s - loss: 1.4364 - regression_loss: 1.2265 - classification_loss: 0.2099 400/500 [=======================>......] - ETA: 23s - loss: 1.4362 - regression_loss: 1.2262 - classification_loss: 0.2099 401/500 [=======================>......] - ETA: 23s - loss: 1.4372 - regression_loss: 1.2273 - classification_loss: 0.2099 402/500 [=======================>......] - ETA: 23s - loss: 1.4368 - regression_loss: 1.2272 - classification_loss: 0.2096 403/500 [=======================>......] - ETA: 22s - loss: 1.4360 - regression_loss: 1.2264 - classification_loss: 0.2096 404/500 [=======================>......] - ETA: 22s - loss: 1.4357 - regression_loss: 1.2262 - classification_loss: 0.2095 405/500 [=======================>......] - ETA: 22s - loss: 1.4361 - regression_loss: 1.2267 - classification_loss: 0.2094 406/500 [=======================>......] - ETA: 22s - loss: 1.4353 - regression_loss: 1.2261 - classification_loss: 0.2092 407/500 [=======================>......] - ETA: 21s - loss: 1.4342 - regression_loss: 1.2252 - classification_loss: 0.2089 408/500 [=======================>......] - ETA: 21s - loss: 1.4338 - regression_loss: 1.2251 - classification_loss: 0.2087 409/500 [=======================>......] 
[... per-batch progress updates (ETA, loss, regression_loss, classification_loss) truncated ...]
500/500 [==============================] - 118s 236ms/step - loss: 1.4338 - regression_loss: 1.2239 - classification_loss: 0.2099
326 instances of class plum with average precision: 0.8193
mAP: 0.8193
Epoch 00012: saving model to ./training/snapshots/resnet50_pascal_12.h5
Epoch 13/150
[... per-batch progress updates truncated; running loss fluctuates around 1.38-1.48 through batch 244/500 ...]
- ETA: 1:00 - loss: 1.3867 - regression_loss: 1.2034 - classification_loss: 0.1833 245/500 [=============>................] - ETA: 1:00 - loss: 1.3832 - regression_loss: 1.2003 - classification_loss: 0.1830 246/500 [=============>................] - ETA: 59s - loss: 1.3819 - regression_loss: 1.1991 - classification_loss: 0.1828  247/500 [=============>................] - ETA: 59s - loss: 1.3826 - regression_loss: 1.1996 - classification_loss: 0.1830 248/500 [=============>................] - ETA: 59s - loss: 1.3803 - regression_loss: 1.1978 - classification_loss: 0.1826 249/500 [=============>................] - ETA: 59s - loss: 1.3856 - regression_loss: 1.2025 - classification_loss: 0.1831 250/500 [==============>...............] - ETA: 58s - loss: 1.3860 - regression_loss: 1.2029 - classification_loss: 0.1831 251/500 [==============>...............] - ETA: 58s - loss: 1.3912 - regression_loss: 1.2069 - classification_loss: 0.1843 252/500 [==============>...............] - ETA: 58s - loss: 1.3899 - regression_loss: 1.2061 - classification_loss: 0.1838 253/500 [==============>...............] - ETA: 58s - loss: 1.3920 - regression_loss: 1.2071 - classification_loss: 0.1848 254/500 [==============>...............] - ETA: 57s - loss: 1.3921 - regression_loss: 1.2073 - classification_loss: 0.1848 255/500 [==============>...............] - ETA: 57s - loss: 1.3909 - regression_loss: 1.2061 - classification_loss: 0.1847 256/500 [==============>...............] - ETA: 57s - loss: 1.3950 - regression_loss: 1.2092 - classification_loss: 0.1859 257/500 [==============>...............] - ETA: 57s - loss: 1.3962 - regression_loss: 1.2104 - classification_loss: 0.1859 258/500 [==============>...............] - ETA: 56s - loss: 1.3992 - regression_loss: 1.2126 - classification_loss: 0.1866 259/500 [==============>...............] - ETA: 56s - loss: 1.3959 - regression_loss: 1.2097 - classification_loss: 0.1862 260/500 [==============>...............] 
- ETA: 56s - loss: 1.4017 - regression_loss: 1.2146 - classification_loss: 0.1871 261/500 [==============>...............] - ETA: 56s - loss: 1.4028 - regression_loss: 1.2152 - classification_loss: 0.1876 262/500 [==============>...............] - ETA: 56s - loss: 1.4002 - regression_loss: 1.2130 - classification_loss: 0.1873 263/500 [==============>...............] - ETA: 55s - loss: 1.3997 - regression_loss: 1.2126 - classification_loss: 0.1871 264/500 [==============>...............] - ETA: 55s - loss: 1.3973 - regression_loss: 1.2104 - classification_loss: 0.1869 265/500 [==============>...............] - ETA: 55s - loss: 1.3958 - regression_loss: 1.2091 - classification_loss: 0.1867 266/500 [==============>...............] - ETA: 55s - loss: 1.3941 - regression_loss: 1.2078 - classification_loss: 0.1863 267/500 [===============>..............] - ETA: 54s - loss: 1.3937 - regression_loss: 1.2077 - classification_loss: 0.1860 268/500 [===============>..............] - ETA: 54s - loss: 1.3919 - regression_loss: 1.2063 - classification_loss: 0.1857 269/500 [===============>..............] - ETA: 54s - loss: 1.3922 - regression_loss: 1.2067 - classification_loss: 0.1856 270/500 [===============>..............] - ETA: 54s - loss: 1.3951 - regression_loss: 1.2094 - classification_loss: 0.1857 271/500 [===============>..............] - ETA: 53s - loss: 1.3942 - regression_loss: 1.2086 - classification_loss: 0.1856 272/500 [===============>..............] - ETA: 53s - loss: 1.3942 - regression_loss: 1.2087 - classification_loss: 0.1855 273/500 [===============>..............] - ETA: 53s - loss: 1.3942 - regression_loss: 1.2086 - classification_loss: 0.1856 274/500 [===============>..............] - ETA: 53s - loss: 1.3929 - regression_loss: 1.2076 - classification_loss: 0.1854 275/500 [===============>..............] - ETA: 52s - loss: 1.3927 - regression_loss: 1.2074 - classification_loss: 0.1852 276/500 [===============>..............] 
- ETA: 52s - loss: 1.3931 - regression_loss: 1.2079 - classification_loss: 0.1852 277/500 [===============>..............] - ETA: 52s - loss: 1.4008 - regression_loss: 1.2121 - classification_loss: 0.1887 278/500 [===============>..............] - ETA: 52s - loss: 1.4059 - regression_loss: 1.2161 - classification_loss: 0.1899 279/500 [===============>..............] - ETA: 51s - loss: 1.4071 - regression_loss: 1.2169 - classification_loss: 0.1902 280/500 [===============>..............] - ETA: 51s - loss: 1.4062 - regression_loss: 1.2161 - classification_loss: 0.1902 281/500 [===============>..............] - ETA: 51s - loss: 1.4071 - regression_loss: 1.2170 - classification_loss: 0.1902 282/500 [===============>..............] - ETA: 51s - loss: 1.4050 - regression_loss: 1.2153 - classification_loss: 0.1897 283/500 [===============>..............] - ETA: 51s - loss: 1.4072 - regression_loss: 1.2170 - classification_loss: 0.1902 284/500 [================>.............] - ETA: 50s - loss: 1.4071 - regression_loss: 1.2166 - classification_loss: 0.1905 285/500 [================>.............] - ETA: 50s - loss: 1.4050 - regression_loss: 1.2147 - classification_loss: 0.1902 286/500 [================>.............] - ETA: 50s - loss: 1.4039 - regression_loss: 1.2138 - classification_loss: 0.1901 287/500 [================>.............] - ETA: 50s - loss: 1.4039 - regression_loss: 1.2137 - classification_loss: 0.1902 288/500 [================>.............] - ETA: 49s - loss: 1.4021 - regression_loss: 1.2122 - classification_loss: 0.1898 289/500 [================>.............] - ETA: 49s - loss: 1.4050 - regression_loss: 1.2145 - classification_loss: 0.1905 290/500 [================>.............] - ETA: 49s - loss: 1.4080 - regression_loss: 1.2165 - classification_loss: 0.1914 291/500 [================>.............] - ETA: 49s - loss: 1.4085 - regression_loss: 1.2170 - classification_loss: 0.1915 292/500 [================>.............] 
- ETA: 48s - loss: 1.4085 - regression_loss: 1.2171 - classification_loss: 0.1914 293/500 [================>.............] - ETA: 48s - loss: 1.4160 - regression_loss: 1.2231 - classification_loss: 0.1929 294/500 [================>.............] - ETA: 48s - loss: 1.4164 - regression_loss: 1.2237 - classification_loss: 0.1927 295/500 [================>.............] - ETA: 48s - loss: 1.4156 - regression_loss: 1.2231 - classification_loss: 0.1925 296/500 [================>.............] - ETA: 47s - loss: 1.4147 - regression_loss: 1.2223 - classification_loss: 0.1924 297/500 [================>.............] - ETA: 47s - loss: 1.4152 - regression_loss: 1.2228 - classification_loss: 0.1924 298/500 [================>.............] - ETA: 47s - loss: 1.4156 - regression_loss: 1.2229 - classification_loss: 0.1926 299/500 [================>.............] - ETA: 47s - loss: 1.4146 - regression_loss: 1.2222 - classification_loss: 0.1924 300/500 [=================>............] - ETA: 47s - loss: 1.4115 - regression_loss: 1.2196 - classification_loss: 0.1919 301/500 [=================>............] - ETA: 46s - loss: 1.4081 - regression_loss: 1.2168 - classification_loss: 0.1913 302/500 [=================>............] - ETA: 46s - loss: 1.4056 - regression_loss: 1.2147 - classification_loss: 0.1909 303/500 [=================>............] - ETA: 46s - loss: 1.4040 - regression_loss: 1.2134 - classification_loss: 0.1906 304/500 [=================>............] - ETA: 46s - loss: 1.4036 - regression_loss: 1.2132 - classification_loss: 0.1904 305/500 [=================>............] - ETA: 45s - loss: 1.4038 - regression_loss: 1.2133 - classification_loss: 0.1906 306/500 [=================>............] - ETA: 45s - loss: 1.4025 - regression_loss: 1.2118 - classification_loss: 0.1906 307/500 [=================>............] - ETA: 45s - loss: 1.4038 - regression_loss: 1.2127 - classification_loss: 0.1911 308/500 [=================>............] 
- ETA: 45s - loss: 1.4067 - regression_loss: 1.2151 - classification_loss: 0.1916 309/500 [=================>............] - ETA: 44s - loss: 1.4084 - regression_loss: 1.2165 - classification_loss: 0.1919 310/500 [=================>............] - ETA: 44s - loss: 1.4093 - regression_loss: 1.2175 - classification_loss: 0.1918 311/500 [=================>............] - ETA: 44s - loss: 1.4072 - regression_loss: 1.2157 - classification_loss: 0.1915 312/500 [=================>............] - ETA: 44s - loss: 1.4083 - regression_loss: 1.2170 - classification_loss: 0.1912 313/500 [=================>............] - ETA: 43s - loss: 1.4085 - regression_loss: 1.2172 - classification_loss: 0.1913 314/500 [=================>............] - ETA: 43s - loss: 1.4079 - regression_loss: 1.2170 - classification_loss: 0.1909 315/500 [=================>............] - ETA: 43s - loss: 1.4118 - regression_loss: 1.2209 - classification_loss: 0.1909 316/500 [=================>............] - ETA: 43s - loss: 1.4091 - regression_loss: 1.2185 - classification_loss: 0.1905 317/500 [==================>...........] - ETA: 43s - loss: 1.4089 - regression_loss: 1.2184 - classification_loss: 0.1905 318/500 [==================>...........] - ETA: 42s - loss: 1.4063 - regression_loss: 1.2161 - classification_loss: 0.1902 319/500 [==================>...........] - ETA: 42s - loss: 1.4066 - regression_loss: 1.2161 - classification_loss: 0.1904 320/500 [==================>...........] - ETA: 42s - loss: 1.4077 - regression_loss: 1.2170 - classification_loss: 0.1907 321/500 [==================>...........] - ETA: 42s - loss: 1.4067 - regression_loss: 1.2162 - classification_loss: 0.1904 322/500 [==================>...........] - ETA: 41s - loss: 1.4094 - regression_loss: 1.2182 - classification_loss: 0.1911 323/500 [==================>...........] - ETA: 41s - loss: 1.4093 - regression_loss: 1.2182 - classification_loss: 0.1911 324/500 [==================>...........] 
- ETA: 41s - loss: 1.4110 - regression_loss: 1.2194 - classification_loss: 0.1916 325/500 [==================>...........] - ETA: 41s - loss: 1.4103 - regression_loss: 1.2190 - classification_loss: 0.1913 326/500 [==================>...........] - ETA: 40s - loss: 1.4095 - regression_loss: 1.2184 - classification_loss: 0.1911 327/500 [==================>...........] - ETA: 40s - loss: 1.4110 - regression_loss: 1.2191 - classification_loss: 0.1919 328/500 [==================>...........] - ETA: 40s - loss: 1.4126 - regression_loss: 1.2203 - classification_loss: 0.1924 329/500 [==================>...........] - ETA: 40s - loss: 1.4119 - regression_loss: 1.2196 - classification_loss: 0.1924 330/500 [==================>...........] - ETA: 40s - loss: 1.4125 - regression_loss: 1.2201 - classification_loss: 0.1924 331/500 [==================>...........] - ETA: 39s - loss: 1.4123 - regression_loss: 1.2200 - classification_loss: 0.1924 332/500 [==================>...........] - ETA: 39s - loss: 1.4130 - regression_loss: 1.2206 - classification_loss: 0.1924 333/500 [==================>...........] - ETA: 39s - loss: 1.4136 - regression_loss: 1.2213 - classification_loss: 0.1923 334/500 [===================>..........] - ETA: 39s - loss: 1.4147 - regression_loss: 1.2224 - classification_loss: 0.1924 335/500 [===================>..........] - ETA: 38s - loss: 1.4153 - regression_loss: 1.2229 - classification_loss: 0.1924 336/500 [===================>..........] - ETA: 38s - loss: 1.4145 - regression_loss: 1.2224 - classification_loss: 0.1922 337/500 [===================>..........] - ETA: 38s - loss: 1.4122 - regression_loss: 1.2202 - classification_loss: 0.1919 338/500 [===================>..........] - ETA: 38s - loss: 1.4127 - regression_loss: 1.2208 - classification_loss: 0.1919 339/500 [===================>..........] - ETA: 37s - loss: 1.4135 - regression_loss: 1.2215 - classification_loss: 0.1919 340/500 [===================>..........] 
- ETA: 37s - loss: 1.4128 - regression_loss: 1.2211 - classification_loss: 0.1918 341/500 [===================>..........] - ETA: 37s - loss: 1.4154 - regression_loss: 1.2229 - classification_loss: 0.1925 342/500 [===================>..........] - ETA: 37s - loss: 1.4164 - regression_loss: 1.2236 - classification_loss: 0.1928 343/500 [===================>..........] - ETA: 36s - loss: 1.4163 - regression_loss: 1.2236 - classification_loss: 0.1927 344/500 [===================>..........] - ETA: 36s - loss: 1.4179 - regression_loss: 1.2250 - classification_loss: 0.1928 345/500 [===================>..........] - ETA: 36s - loss: 1.4165 - regression_loss: 1.2239 - classification_loss: 0.1926 346/500 [===================>..........] - ETA: 36s - loss: 1.4163 - regression_loss: 1.2238 - classification_loss: 0.1926 347/500 [===================>..........] - ETA: 35s - loss: 1.4149 - regression_loss: 1.2226 - classification_loss: 0.1923 348/500 [===================>..........] - ETA: 35s - loss: 1.4157 - regression_loss: 1.2233 - classification_loss: 0.1924 349/500 [===================>..........] - ETA: 35s - loss: 1.4191 - regression_loss: 1.2260 - classification_loss: 0.1932 350/500 [====================>.........] - ETA: 35s - loss: 1.4183 - regression_loss: 1.2252 - classification_loss: 0.1931 351/500 [====================>.........] - ETA: 35s - loss: 1.4164 - regression_loss: 1.2236 - classification_loss: 0.1928 352/500 [====================>.........] - ETA: 34s - loss: 1.4160 - regression_loss: 1.2233 - classification_loss: 0.1927 353/500 [====================>.........] - ETA: 34s - loss: 1.4166 - regression_loss: 1.2238 - classification_loss: 0.1928 354/500 [====================>.........] - ETA: 34s - loss: 1.4183 - regression_loss: 1.2249 - classification_loss: 0.1934 355/500 [====================>.........] - ETA: 34s - loss: 1.4190 - regression_loss: 1.2254 - classification_loss: 0.1936 356/500 [====================>.........] 
- ETA: 33s - loss: 1.4175 - regression_loss: 1.2241 - classification_loss: 0.1935 357/500 [====================>.........] - ETA: 33s - loss: 1.4165 - regression_loss: 1.2233 - classification_loss: 0.1932 358/500 [====================>.........] - ETA: 33s - loss: 1.4155 - regression_loss: 1.2225 - classification_loss: 0.1930 359/500 [====================>.........] - ETA: 33s - loss: 1.4160 - regression_loss: 1.2228 - classification_loss: 0.1932 360/500 [====================>.........] - ETA: 32s - loss: 1.4161 - regression_loss: 1.2230 - classification_loss: 0.1931 361/500 [====================>.........] - ETA: 32s - loss: 1.4163 - regression_loss: 1.2231 - classification_loss: 0.1932 362/500 [====================>.........] - ETA: 32s - loss: 1.4140 - regression_loss: 1.2211 - classification_loss: 0.1929 363/500 [====================>.........] - ETA: 32s - loss: 1.4122 - regression_loss: 1.2196 - classification_loss: 0.1926 364/500 [====================>.........] - ETA: 31s - loss: 1.4132 - regression_loss: 1.2205 - classification_loss: 0.1926 365/500 [====================>.........] - ETA: 31s - loss: 1.4128 - regression_loss: 1.2202 - classification_loss: 0.1926 366/500 [====================>.........] - ETA: 31s - loss: 1.4118 - regression_loss: 1.2194 - classification_loss: 0.1924 367/500 [=====================>........] - ETA: 31s - loss: 1.4131 - regression_loss: 1.2206 - classification_loss: 0.1925 368/500 [=====================>........] - ETA: 31s - loss: 1.4120 - regression_loss: 1.2196 - classification_loss: 0.1923 369/500 [=====================>........] - ETA: 30s - loss: 1.4133 - regression_loss: 1.2209 - classification_loss: 0.1924 370/500 [=====================>........] - ETA: 30s - loss: 1.4128 - regression_loss: 1.2205 - classification_loss: 0.1923 371/500 [=====================>........] - ETA: 30s - loss: 1.4129 - regression_loss: 1.2207 - classification_loss: 0.1922 372/500 [=====================>........] 
- ETA: 30s - loss: 1.4144 - regression_loss: 1.2220 - classification_loss: 0.1924 373/500 [=====================>........] - ETA: 29s - loss: 1.4131 - regression_loss: 1.2210 - classification_loss: 0.1921 374/500 [=====================>........] - ETA: 29s - loss: 1.4142 - regression_loss: 1.2219 - classification_loss: 0.1924 375/500 [=====================>........] - ETA: 29s - loss: 1.4125 - regression_loss: 1.2204 - classification_loss: 0.1921 376/500 [=====================>........] - ETA: 29s - loss: 1.4142 - regression_loss: 1.2220 - classification_loss: 0.1922 377/500 [=====================>........] - ETA: 28s - loss: 1.4142 - regression_loss: 1.2220 - classification_loss: 0.1922 378/500 [=====================>........] - ETA: 28s - loss: 1.4138 - regression_loss: 1.2218 - classification_loss: 0.1920 379/500 [=====================>........] - ETA: 28s - loss: 1.4126 - regression_loss: 1.2207 - classification_loss: 0.1919 380/500 [=====================>........] - ETA: 28s - loss: 1.4122 - regression_loss: 1.2204 - classification_loss: 0.1919 381/500 [=====================>........] - ETA: 27s - loss: 1.4115 - regression_loss: 1.2197 - classification_loss: 0.1918 382/500 [=====================>........] - ETA: 27s - loss: 1.4118 - regression_loss: 1.2200 - classification_loss: 0.1919 383/500 [=====================>........] - ETA: 27s - loss: 1.4108 - regression_loss: 1.2192 - classification_loss: 0.1916 384/500 [======================>.......] - ETA: 27s - loss: 1.4132 - regression_loss: 1.2214 - classification_loss: 0.1918 385/500 [======================>.......] - ETA: 27s - loss: 1.4145 - regression_loss: 1.2224 - classification_loss: 0.1922 386/500 [======================>.......] - ETA: 26s - loss: 1.4138 - regression_loss: 1.2217 - classification_loss: 0.1921 387/500 [======================>.......] - ETA: 26s - loss: 1.4141 - regression_loss: 1.2220 - classification_loss: 0.1921 388/500 [======================>.......] 
- ETA: 26s - loss: 1.4147 - regression_loss: 1.2225 - classification_loss: 0.1922 389/500 [======================>.......] - ETA: 26s - loss: 1.4137 - regression_loss: 1.2217 - classification_loss: 0.1920 390/500 [======================>.......] - ETA: 25s - loss: 1.4139 - regression_loss: 1.2218 - classification_loss: 0.1921 391/500 [======================>.......] - ETA: 25s - loss: 1.4146 - regression_loss: 1.2222 - classification_loss: 0.1924 392/500 [======================>.......] - ETA: 25s - loss: 1.4166 - regression_loss: 1.2236 - classification_loss: 0.1930 393/500 [======================>.......] - ETA: 25s - loss: 1.4147 - regression_loss: 1.2219 - classification_loss: 0.1928 394/500 [======================>.......] - ETA: 24s - loss: 1.4148 - regression_loss: 1.2222 - classification_loss: 0.1926 395/500 [======================>.......] - ETA: 24s - loss: 1.4145 - regression_loss: 1.2221 - classification_loss: 0.1924 396/500 [======================>.......] - ETA: 24s - loss: 1.4140 - regression_loss: 1.2216 - classification_loss: 0.1923 397/500 [======================>.......] - ETA: 24s - loss: 1.4131 - regression_loss: 1.2210 - classification_loss: 0.1922 398/500 [======================>.......] - ETA: 24s - loss: 1.4112 - regression_loss: 1.2194 - classification_loss: 0.1918 399/500 [======================>.......] - ETA: 23s - loss: 1.4106 - regression_loss: 1.2189 - classification_loss: 0.1917 400/500 [=======================>......] - ETA: 23s - loss: 1.4104 - regression_loss: 1.2189 - classification_loss: 0.1915 401/500 [=======================>......] - ETA: 23s - loss: 1.4094 - regression_loss: 1.2181 - classification_loss: 0.1913 402/500 [=======================>......] - ETA: 23s - loss: 1.4103 - regression_loss: 1.2190 - classification_loss: 0.1913 403/500 [=======================>......] - ETA: 22s - loss: 1.4094 - regression_loss: 1.2184 - classification_loss: 0.1911 404/500 [=======================>......] 
- ETA: 22s - loss: 1.4088 - regression_loss: 1.2178 - classification_loss: 0.1910 405/500 [=======================>......] - ETA: 22s - loss: 1.4073 - regression_loss: 1.2166 - classification_loss: 0.1907 406/500 [=======================>......] - ETA: 22s - loss: 1.4081 - regression_loss: 1.2171 - classification_loss: 0.1910 407/500 [=======================>......] - ETA: 21s - loss: 1.4101 - regression_loss: 1.2188 - classification_loss: 0.1913 408/500 [=======================>......] - ETA: 21s - loss: 1.4106 - regression_loss: 1.2195 - classification_loss: 0.1911 409/500 [=======================>......] - ETA: 21s - loss: 1.4089 - regression_loss: 1.2180 - classification_loss: 0.1908 410/500 [=======================>......] - ETA: 21s - loss: 1.4107 - regression_loss: 1.2193 - classification_loss: 0.1914 411/500 [=======================>......] - ETA: 20s - loss: 1.4101 - regression_loss: 1.2189 - classification_loss: 0.1912 412/500 [=======================>......] - ETA: 20s - loss: 1.4086 - regression_loss: 1.2159 - classification_loss: 0.1927 413/500 [=======================>......] - ETA: 20s - loss: 1.4071 - regression_loss: 1.2147 - classification_loss: 0.1924 414/500 [=======================>......] - ETA: 20s - loss: 1.4081 - regression_loss: 1.2155 - classification_loss: 0.1926 415/500 [=======================>......] - ETA: 20s - loss: 1.4082 - regression_loss: 1.2158 - classification_loss: 0.1925 416/500 [=======================>......] - ETA: 19s - loss: 1.4070 - regression_loss: 1.2148 - classification_loss: 0.1921 417/500 [========================>.....] - ETA: 19s - loss: 1.4078 - regression_loss: 1.2156 - classification_loss: 0.1922 418/500 [========================>.....] - ETA: 19s - loss: 1.4063 - regression_loss: 1.2142 - classification_loss: 0.1921 419/500 [========================>.....] - ETA: 19s - loss: 1.4068 - regression_loss: 1.2146 - classification_loss: 0.1922 420/500 [========================>.....] 
- ETA: 18s - loss: 1.4080 - regression_loss: 1.2157 - classification_loss: 0.1922 421/500 [========================>.....] - ETA: 18s - loss: 1.4057 - regression_loss: 1.2139 - classification_loss: 0.1919 422/500 [========================>.....] - ETA: 18s - loss: 1.4043 - regression_loss: 1.2126 - classification_loss: 0.1916 423/500 [========================>.....] - ETA: 18s - loss: 1.4037 - regression_loss: 1.2123 - classification_loss: 0.1914 424/500 [========================>.....] - ETA: 17s - loss: 1.4039 - regression_loss: 1.2126 - classification_loss: 0.1912 425/500 [========================>.....] - ETA: 17s - loss: 1.4037 - regression_loss: 1.2126 - classification_loss: 0.1912 426/500 [========================>.....] - ETA: 17s - loss: 1.4048 - regression_loss: 1.2135 - classification_loss: 0.1913 427/500 [========================>.....] - ETA: 17s - loss: 1.4041 - regression_loss: 1.2129 - classification_loss: 0.1912 428/500 [========================>.....] - ETA: 16s - loss: 1.4063 - regression_loss: 1.2150 - classification_loss: 0.1913 429/500 [========================>.....] - ETA: 16s - loss: 1.4079 - regression_loss: 1.2161 - classification_loss: 0.1918 430/500 [========================>.....] - ETA: 16s - loss: 1.4089 - regression_loss: 1.2171 - classification_loss: 0.1918 431/500 [========================>.....] - ETA: 16s - loss: 1.4087 - regression_loss: 1.2169 - classification_loss: 0.1918 432/500 [========================>.....] - ETA: 16s - loss: 1.4083 - regression_loss: 1.2166 - classification_loss: 0.1917 433/500 [========================>.....] - ETA: 15s - loss: 1.4105 - regression_loss: 1.2183 - classification_loss: 0.1921 434/500 [=========================>....] - ETA: 15s - loss: 1.4095 - regression_loss: 1.2176 - classification_loss: 0.1919 435/500 [=========================>....] - ETA: 15s - loss: 1.4085 - regression_loss: 1.2168 - classification_loss: 0.1917 436/500 [=========================>....] 
- ETA: 15s - loss: 1.4066 - regression_loss: 1.2151 - classification_loss: 0.1914 437/500 [=========================>....] - ETA: 14s - loss: 1.4067 - regression_loss: 1.2153 - classification_loss: 0.1914 438/500 [=========================>....] - ETA: 14s - loss: 1.4054 - regression_loss: 1.2143 - classification_loss: 0.1911 439/500 [=========================>....] - ETA: 14s - loss: 1.4052 - regression_loss: 1.2142 - classification_loss: 0.1910 440/500 [=========================>....] - ETA: 14s - loss: 1.4065 - regression_loss: 1.2154 - classification_loss: 0.1911 441/500 [=========================>....] - ETA: 13s - loss: 1.4075 - regression_loss: 1.2160 - classification_loss: 0.1915 442/500 [=========================>....] - ETA: 13s - loss: 1.4076 - regression_loss: 1.2161 - classification_loss: 0.1915 443/500 [=========================>....] - ETA: 13s - loss: 1.4063 - regression_loss: 1.2150 - classification_loss: 0.1913 444/500 [=========================>....] - ETA: 13s - loss: 1.4062 - regression_loss: 1.2149 - classification_loss: 0.1912 445/500 [=========================>....] - ETA: 12s - loss: 1.4070 - regression_loss: 1.2156 - classification_loss: 0.1914 446/500 [=========================>....] - ETA: 12s - loss: 1.4059 - regression_loss: 1.2147 - classification_loss: 0.1912 447/500 [=========================>....] - ETA: 12s - loss: 1.4076 - regression_loss: 1.2162 - classification_loss: 0.1913 448/500 [=========================>....] - ETA: 12s - loss: 1.4057 - regression_loss: 1.2146 - classification_loss: 0.1910 449/500 [=========================>....] - ETA: 11s - loss: 1.4060 - regression_loss: 1.2150 - classification_loss: 0.1911 450/500 [==========================>...] - ETA: 11s - loss: 1.4053 - regression_loss: 1.2143 - classification_loss: 0.1910 451/500 [==========================>...] - ETA: 11s - loss: 1.4052 - regression_loss: 1.2142 - classification_loss: 0.1910 452/500 [==========================>...] 
- ETA: 11s - loss: 1.4054 - regression_loss: 1.2143 - classification_loss: 0.1910
[per-batch progress-bar redraws for batches 453-499 of epoch 13 elided; running loss held near 1.40 (regression ~1.21, classification ~0.19) throughout]
500/500 [==============================] - 118s 235ms/step - loss: 1.4050 - regression_loss: 1.2145 - classification_loss: 0.1905
326 instances of class plum with average precision: 0.7957
mAP: 0.7957
Epoch 00013: saving model to ./training/snapshots/resnet50_pascal_13.h5
Epoch 14/150
1/500 [..............................] - ETA: 1:50 - loss: 1.8661 - regression_loss: 1.5854 - classification_loss: 0.2808
[per-batch progress-bar redraws for batches 2-285 of epoch 14 elided; running loss fell from 1.87 to ~1.44 (regression ~1.25, classification ~0.20)]
286/500 [================>.............] 
- ETA: 50s - loss: 1.4442 - regression_loss: 1.2452 - classification_loss: 0.1991 287/500 [================>.............] - ETA: 50s - loss: 1.4444 - regression_loss: 1.2455 - classification_loss: 0.1989 288/500 [================>.............] - ETA: 49s - loss: 1.4472 - regression_loss: 1.2474 - classification_loss: 0.1998 289/500 [================>.............] - ETA: 49s - loss: 1.4498 - regression_loss: 1.2495 - classification_loss: 0.2003 290/500 [================>.............] - ETA: 49s - loss: 1.4498 - regression_loss: 1.2497 - classification_loss: 0.2001 291/500 [================>.............] - ETA: 49s - loss: 1.4496 - regression_loss: 1.2495 - classification_loss: 0.2000 292/500 [================>.............] - ETA: 48s - loss: 1.4494 - regression_loss: 1.2497 - classification_loss: 0.1997 293/500 [================>.............] - ETA: 48s - loss: 1.4475 - regression_loss: 1.2481 - classification_loss: 0.1993 294/500 [================>.............] - ETA: 48s - loss: 1.4464 - regression_loss: 1.2474 - classification_loss: 0.1990 295/500 [================>.............] - ETA: 48s - loss: 1.4469 - regression_loss: 1.2476 - classification_loss: 0.1993 296/500 [================>.............] - ETA: 47s - loss: 1.4477 - regression_loss: 1.2484 - classification_loss: 0.1993 297/500 [================>.............] - ETA: 47s - loss: 1.4485 - regression_loss: 1.2490 - classification_loss: 0.1996 298/500 [================>.............] - ETA: 47s - loss: 1.4523 - regression_loss: 1.2524 - classification_loss: 0.1999 299/500 [================>.............] - ETA: 47s - loss: 1.4518 - regression_loss: 1.2521 - classification_loss: 0.1998 300/500 [=================>............] - ETA: 46s - loss: 1.4512 - regression_loss: 1.2518 - classification_loss: 0.1994 301/500 [=================>............] - ETA: 46s - loss: 1.4489 - regression_loss: 1.2498 - classification_loss: 0.1991 302/500 [=================>............] 
- ETA: 46s - loss: 1.4483 - regression_loss: 1.2494 - classification_loss: 0.1990 303/500 [=================>............] - ETA: 46s - loss: 1.4472 - regression_loss: 1.2487 - classification_loss: 0.1985 304/500 [=================>............] - ETA: 45s - loss: 1.4443 - regression_loss: 1.2463 - classification_loss: 0.1980 305/500 [=================>............] - ETA: 45s - loss: 1.4445 - regression_loss: 1.2466 - classification_loss: 0.1979 306/500 [=================>............] - ETA: 45s - loss: 1.4443 - regression_loss: 1.2465 - classification_loss: 0.1978 307/500 [=================>............] - ETA: 45s - loss: 1.4442 - regression_loss: 1.2466 - classification_loss: 0.1976 308/500 [=================>............] - ETA: 45s - loss: 1.4406 - regression_loss: 1.2435 - classification_loss: 0.1971 309/500 [=================>............] - ETA: 44s - loss: 1.4400 - regression_loss: 1.2431 - classification_loss: 0.1969 310/500 [=================>............] - ETA: 44s - loss: 1.4399 - regression_loss: 1.2432 - classification_loss: 0.1968 311/500 [=================>............] - ETA: 44s - loss: 1.4398 - regression_loss: 1.2434 - classification_loss: 0.1964 312/500 [=================>............] - ETA: 44s - loss: 1.4382 - regression_loss: 1.2422 - classification_loss: 0.1960 313/500 [=================>............] - ETA: 43s - loss: 1.4375 - regression_loss: 1.2418 - classification_loss: 0.1957 314/500 [=================>............] - ETA: 43s - loss: 1.4372 - regression_loss: 1.2417 - classification_loss: 0.1955 315/500 [=================>............] - ETA: 43s - loss: 1.4375 - regression_loss: 1.2418 - classification_loss: 0.1957 316/500 [=================>............] - ETA: 43s - loss: 1.4363 - regression_loss: 1.2410 - classification_loss: 0.1954 317/500 [==================>...........] - ETA: 42s - loss: 1.4366 - regression_loss: 1.2409 - classification_loss: 0.1957 318/500 [==================>...........] 
- ETA: 42s - loss: 1.4372 - regression_loss: 1.2411 - classification_loss: 0.1961 319/500 [==================>...........] - ETA: 42s - loss: 1.4393 - regression_loss: 1.2433 - classification_loss: 0.1960 320/500 [==================>...........] - ETA: 42s - loss: 1.4374 - regression_loss: 1.2417 - classification_loss: 0.1956 321/500 [==================>...........] - ETA: 41s - loss: 1.4365 - regression_loss: 1.2410 - classification_loss: 0.1954 322/500 [==================>...........] - ETA: 41s - loss: 1.4349 - regression_loss: 1.2398 - classification_loss: 0.1951 323/500 [==================>...........] - ETA: 41s - loss: 1.4333 - regression_loss: 1.2385 - classification_loss: 0.1948 324/500 [==================>...........] - ETA: 41s - loss: 1.4333 - regression_loss: 1.2383 - classification_loss: 0.1949 325/500 [==================>...........] - ETA: 41s - loss: 1.4315 - regression_loss: 1.2369 - classification_loss: 0.1946 326/500 [==================>...........] - ETA: 40s - loss: 1.4311 - regression_loss: 1.2365 - classification_loss: 0.1946 327/500 [==================>...........] - ETA: 40s - loss: 1.4322 - regression_loss: 1.2374 - classification_loss: 0.1949 328/500 [==================>...........] - ETA: 40s - loss: 1.4326 - regression_loss: 1.2376 - classification_loss: 0.1950 329/500 [==================>...........] - ETA: 40s - loss: 1.4309 - regression_loss: 1.2361 - classification_loss: 0.1948 330/500 [==================>...........] - ETA: 39s - loss: 1.4315 - regression_loss: 1.2365 - classification_loss: 0.1949 331/500 [==================>...........] - ETA: 39s - loss: 1.4302 - regression_loss: 1.2355 - classification_loss: 0.1946 332/500 [==================>...........] - ETA: 39s - loss: 1.4284 - regression_loss: 1.2341 - classification_loss: 0.1942 333/500 [==================>...........] - ETA: 39s - loss: 1.4310 - regression_loss: 1.2363 - classification_loss: 0.1947 334/500 [===================>..........] 
- ETA: 38s - loss: 1.4316 - regression_loss: 1.2367 - classification_loss: 0.1949 335/500 [===================>..........] - ETA: 38s - loss: 1.4311 - regression_loss: 1.2363 - classification_loss: 0.1948 336/500 [===================>..........] - ETA: 38s - loss: 1.4292 - regression_loss: 1.2347 - classification_loss: 0.1945 337/500 [===================>..........] - ETA: 38s - loss: 1.4285 - regression_loss: 1.2340 - classification_loss: 0.1945 338/500 [===================>..........] - ETA: 37s - loss: 1.4298 - regression_loss: 1.2353 - classification_loss: 0.1945 339/500 [===================>..........] - ETA: 37s - loss: 1.4299 - regression_loss: 1.2354 - classification_loss: 0.1945 340/500 [===================>..........] - ETA: 37s - loss: 1.4287 - regression_loss: 1.2344 - classification_loss: 0.1943 341/500 [===================>..........] - ETA: 37s - loss: 1.4284 - regression_loss: 1.2343 - classification_loss: 0.1941 342/500 [===================>..........] - ETA: 37s - loss: 1.4260 - regression_loss: 1.2323 - classification_loss: 0.1937 343/500 [===================>..........] - ETA: 36s - loss: 1.4249 - regression_loss: 1.2313 - classification_loss: 0.1935 344/500 [===================>..........] - ETA: 36s - loss: 1.4240 - regression_loss: 1.2304 - classification_loss: 0.1935 345/500 [===================>..........] - ETA: 36s - loss: 1.4235 - regression_loss: 1.2303 - classification_loss: 0.1933 346/500 [===================>..........] - ETA: 36s - loss: 1.4215 - regression_loss: 1.2284 - classification_loss: 0.1931 347/500 [===================>..........] - ETA: 35s - loss: 1.4223 - regression_loss: 1.2291 - classification_loss: 0.1932 348/500 [===================>..........] - ETA: 35s - loss: 1.4222 - regression_loss: 1.2291 - classification_loss: 0.1931 349/500 [===================>..........] - ETA: 35s - loss: 1.4209 - regression_loss: 1.2281 - classification_loss: 0.1929 350/500 [====================>.........] 
- ETA: 35s - loss: 1.4200 - regression_loss: 1.2274 - classification_loss: 0.1926 351/500 [====================>.........] - ETA: 34s - loss: 1.4203 - regression_loss: 1.2277 - classification_loss: 0.1926 352/500 [====================>.........] - ETA: 34s - loss: 1.4183 - regression_loss: 1.2260 - classification_loss: 0.1923 353/500 [====================>.........] - ETA: 34s - loss: 1.4177 - regression_loss: 1.2255 - classification_loss: 0.1922 354/500 [====================>.........] - ETA: 34s - loss: 1.4158 - regression_loss: 1.2240 - classification_loss: 0.1918 355/500 [====================>.........] - ETA: 34s - loss: 1.4128 - regression_loss: 1.2213 - classification_loss: 0.1916 356/500 [====================>.........] - ETA: 33s - loss: 1.4130 - regression_loss: 1.2214 - classification_loss: 0.1916 357/500 [====================>.........] - ETA: 33s - loss: 1.4133 - regression_loss: 1.2215 - classification_loss: 0.1919 358/500 [====================>.........] - ETA: 33s - loss: 1.4130 - regression_loss: 1.2211 - classification_loss: 0.1919 359/500 [====================>.........] - ETA: 33s - loss: 1.4147 - regression_loss: 1.2229 - classification_loss: 0.1918 360/500 [====================>.........] - ETA: 32s - loss: 1.4145 - regression_loss: 1.2228 - classification_loss: 0.1917 361/500 [====================>.........] - ETA: 32s - loss: 1.4154 - regression_loss: 1.2236 - classification_loss: 0.1918 362/500 [====================>.........] - ETA: 32s - loss: 1.4162 - regression_loss: 1.2245 - classification_loss: 0.1917 363/500 [====================>.........] - ETA: 32s - loss: 1.4159 - regression_loss: 1.2244 - classification_loss: 0.1915 364/500 [====================>.........] - ETA: 31s - loss: 1.4163 - regression_loss: 1.2249 - classification_loss: 0.1914 365/500 [====================>.........] - ETA: 31s - loss: 1.4155 - regression_loss: 1.2241 - classification_loss: 0.1914 366/500 [====================>.........] 
- ETA: 31s - loss: 1.4155 - regression_loss: 1.2243 - classification_loss: 0.1912 367/500 [=====================>........] - ETA: 31s - loss: 1.4140 - regression_loss: 1.2230 - classification_loss: 0.1911 368/500 [=====================>........] - ETA: 30s - loss: 1.4148 - regression_loss: 1.2237 - classification_loss: 0.1911 369/500 [=====================>........] - ETA: 30s - loss: 1.4137 - regression_loss: 1.2228 - classification_loss: 0.1909 370/500 [=====================>........] - ETA: 30s - loss: 1.4139 - regression_loss: 1.2225 - classification_loss: 0.1914 371/500 [=====================>........] - ETA: 30s - loss: 1.4139 - regression_loss: 1.2223 - classification_loss: 0.1917 372/500 [=====================>........] - ETA: 30s - loss: 1.4122 - regression_loss: 1.2208 - classification_loss: 0.1914 373/500 [=====================>........] - ETA: 29s - loss: 1.4110 - regression_loss: 1.2199 - classification_loss: 0.1911 374/500 [=====================>........] - ETA: 29s - loss: 1.4110 - regression_loss: 1.2200 - classification_loss: 0.1910 375/500 [=====================>........] - ETA: 29s - loss: 1.4124 - regression_loss: 1.2212 - classification_loss: 0.1912 376/500 [=====================>........] - ETA: 29s - loss: 1.4112 - regression_loss: 1.2202 - classification_loss: 0.1910 377/500 [=====================>........] - ETA: 28s - loss: 1.4097 - regression_loss: 1.2189 - classification_loss: 0.1908 378/500 [=====================>........] - ETA: 28s - loss: 1.4080 - regression_loss: 1.2174 - classification_loss: 0.1906 379/500 [=====================>........] - ETA: 28s - loss: 1.4082 - regression_loss: 1.2177 - classification_loss: 0.1905 380/500 [=====================>........] - ETA: 28s - loss: 1.4080 - regression_loss: 1.2176 - classification_loss: 0.1904 381/500 [=====================>........] - ETA: 27s - loss: 1.4065 - regression_loss: 1.2164 - classification_loss: 0.1901 382/500 [=====================>........] 
- ETA: 27s - loss: 1.4061 - regression_loss: 1.2162 - classification_loss: 0.1899 383/500 [=====================>........] - ETA: 27s - loss: 1.4054 - regression_loss: 1.2156 - classification_loss: 0.1899 384/500 [======================>.......] - ETA: 27s - loss: 1.4082 - regression_loss: 1.2180 - classification_loss: 0.1902 385/500 [======================>.......] - ETA: 26s - loss: 1.4087 - regression_loss: 1.2184 - classification_loss: 0.1902 386/500 [======================>.......] - ETA: 26s - loss: 1.4082 - regression_loss: 1.2179 - classification_loss: 0.1903 387/500 [======================>.......] - ETA: 26s - loss: 1.4083 - regression_loss: 1.2179 - classification_loss: 0.1904 388/500 [======================>.......] - ETA: 26s - loss: 1.4076 - regression_loss: 1.2171 - classification_loss: 0.1905 389/500 [======================>.......] - ETA: 26s - loss: 1.4067 - regression_loss: 1.2163 - classification_loss: 0.1903 390/500 [======================>.......] - ETA: 25s - loss: 1.4073 - regression_loss: 1.2170 - classification_loss: 0.1903 391/500 [======================>.......] - ETA: 25s - loss: 1.4067 - regression_loss: 1.2166 - classification_loss: 0.1901 392/500 [======================>.......] - ETA: 25s - loss: 1.4055 - regression_loss: 1.2156 - classification_loss: 0.1898 393/500 [======================>.......] - ETA: 25s - loss: 1.4059 - regression_loss: 1.2161 - classification_loss: 0.1898 394/500 [======================>.......] - ETA: 24s - loss: 1.4083 - regression_loss: 1.2180 - classification_loss: 0.1903 395/500 [======================>.......] - ETA: 24s - loss: 1.4076 - regression_loss: 1.2174 - classification_loss: 0.1902 396/500 [======================>.......] - ETA: 24s - loss: 1.4071 - regression_loss: 1.2171 - classification_loss: 0.1901 397/500 [======================>.......] - ETA: 24s - loss: 1.4059 - regression_loss: 1.2161 - classification_loss: 0.1897 398/500 [======================>.......] 
- ETA: 23s - loss: 1.4060 - regression_loss: 1.2160 - classification_loss: 0.1900 399/500 [======================>.......] - ETA: 23s - loss: 1.4033 - regression_loss: 1.2136 - classification_loss: 0.1897 400/500 [=======================>......] - ETA: 23s - loss: 1.4021 - regression_loss: 1.2125 - classification_loss: 0.1896 401/500 [=======================>......] - ETA: 23s - loss: 1.4018 - regression_loss: 1.2122 - classification_loss: 0.1896 402/500 [=======================>......] - ETA: 22s - loss: 1.4045 - regression_loss: 1.2146 - classification_loss: 0.1899 403/500 [=======================>......] - ETA: 22s - loss: 1.4043 - regression_loss: 1.2143 - classification_loss: 0.1899 404/500 [=======================>......] - ETA: 22s - loss: 1.4034 - regression_loss: 1.2138 - classification_loss: 0.1896 405/500 [=======================>......] - ETA: 22s - loss: 1.4036 - regression_loss: 1.2141 - classification_loss: 0.1895 406/500 [=======================>......] - ETA: 22s - loss: 1.4035 - regression_loss: 1.2140 - classification_loss: 0.1894 407/500 [=======================>......] - ETA: 21s - loss: 1.4042 - regression_loss: 1.2147 - classification_loss: 0.1895 408/500 [=======================>......] - ETA: 21s - loss: 1.4030 - regression_loss: 1.2136 - classification_loss: 0.1893 409/500 [=======================>......] - ETA: 21s - loss: 1.4012 - regression_loss: 1.2121 - classification_loss: 0.1891 410/500 [=======================>......] - ETA: 21s - loss: 1.4006 - regression_loss: 1.2115 - classification_loss: 0.1891 411/500 [=======================>......] - ETA: 20s - loss: 1.4018 - regression_loss: 1.2126 - classification_loss: 0.1891 412/500 [=======================>......] - ETA: 20s - loss: 1.4023 - regression_loss: 1.2131 - classification_loss: 0.1892 413/500 [=======================>......] - ETA: 20s - loss: 1.3998 - regression_loss: 1.2109 - classification_loss: 0.1889 414/500 [=======================>......] 
- ETA: 20s - loss: 1.3998 - regression_loss: 1.2111 - classification_loss: 0.1887 415/500 [=======================>......] - ETA: 19s - loss: 1.4018 - regression_loss: 1.2127 - classification_loss: 0.1891 416/500 [=======================>......] - ETA: 19s - loss: 1.4007 - regression_loss: 1.2117 - classification_loss: 0.1890 417/500 [========================>.....] - ETA: 19s - loss: 1.4009 - regression_loss: 1.2119 - classification_loss: 0.1890 418/500 [========================>.....] - ETA: 19s - loss: 1.3999 - regression_loss: 1.2113 - classification_loss: 0.1887 419/500 [========================>.....] - ETA: 19s - loss: 1.3987 - regression_loss: 1.2103 - classification_loss: 0.1884 420/500 [========================>.....] - ETA: 18s - loss: 1.3975 - regression_loss: 1.2094 - classification_loss: 0.1882 421/500 [========================>.....] - ETA: 18s - loss: 1.3971 - regression_loss: 1.2091 - classification_loss: 0.1880 422/500 [========================>.....] - ETA: 18s - loss: 1.3965 - regression_loss: 1.2088 - classification_loss: 0.1877 423/500 [========================>.....] - ETA: 18s - loss: 1.3968 - regression_loss: 1.2089 - classification_loss: 0.1878 424/500 [========================>.....] - ETA: 17s - loss: 1.3973 - regression_loss: 1.2093 - classification_loss: 0.1880 425/500 [========================>.....] - ETA: 17s - loss: 1.3975 - regression_loss: 1.2096 - classification_loss: 0.1880 426/500 [========================>.....] - ETA: 17s - loss: 1.3987 - regression_loss: 1.2105 - classification_loss: 0.1882 427/500 [========================>.....] - ETA: 17s - loss: 1.3980 - regression_loss: 1.2100 - classification_loss: 0.1880 428/500 [========================>.....] - ETA: 16s - loss: 1.3976 - regression_loss: 1.2096 - classification_loss: 0.1880 429/500 [========================>.....] - ETA: 16s - loss: 1.3967 - regression_loss: 1.2088 - classification_loss: 0.1879 430/500 [========================>.....] 
- ETA: 16s - loss: 1.3969 - regression_loss: 1.2091 - classification_loss: 0.1879 431/500 [========================>.....] - ETA: 16s - loss: 1.3967 - regression_loss: 1.2090 - classification_loss: 0.1877 432/500 [========================>.....] - ETA: 15s - loss: 1.3953 - regression_loss: 1.2079 - classification_loss: 0.1874 433/500 [========================>.....] - ETA: 15s - loss: 1.3953 - regression_loss: 1.2079 - classification_loss: 0.1873 434/500 [=========================>....] - ETA: 15s - loss: 1.3944 - regression_loss: 1.2073 - classification_loss: 0.1871 435/500 [=========================>....] - ETA: 15s - loss: 1.3951 - regression_loss: 1.2079 - classification_loss: 0.1872 436/500 [=========================>....] - ETA: 15s - loss: 1.3970 - regression_loss: 1.2096 - classification_loss: 0.1874 437/500 [=========================>....] - ETA: 14s - loss: 1.3972 - regression_loss: 1.2098 - classification_loss: 0.1874 438/500 [=========================>....] - ETA: 14s - loss: 1.3954 - regression_loss: 1.2083 - classification_loss: 0.1871 439/500 [=========================>....] - ETA: 14s - loss: 1.3939 - regression_loss: 1.2070 - classification_loss: 0.1868 440/500 [=========================>....] - ETA: 14s - loss: 1.3942 - regression_loss: 1.2075 - classification_loss: 0.1868 441/500 [=========================>....] - ETA: 13s - loss: 1.3934 - regression_loss: 1.2068 - classification_loss: 0.1866 442/500 [=========================>....] - ETA: 13s - loss: 1.3945 - regression_loss: 1.2074 - classification_loss: 0.1871 443/500 [=========================>....] - ETA: 13s - loss: 1.3928 - regression_loss: 1.2060 - classification_loss: 0.1868 444/500 [=========================>....] - ETA: 13s - loss: 1.3926 - regression_loss: 1.2059 - classification_loss: 0.1867 445/500 [=========================>....] - ETA: 12s - loss: 1.3919 - regression_loss: 1.2053 - classification_loss: 0.1866 446/500 [=========================>....] 
- ETA: 12s - loss: 1.3934 - regression_loss: 1.2065 - classification_loss: 0.1869 447/500 [=========================>....] - ETA: 12s - loss: 1.3935 - regression_loss: 1.2065 - classification_loss: 0.1870 448/500 [=========================>....] - ETA: 12s - loss: 1.3929 - regression_loss: 1.2060 - classification_loss: 0.1868 449/500 [=========================>....] - ETA: 11s - loss: 1.3928 - regression_loss: 1.2060 - classification_loss: 0.1868 450/500 [==========================>...] - ETA: 11s - loss: 1.3935 - regression_loss: 1.2066 - classification_loss: 0.1868 451/500 [==========================>...] - ETA: 11s - loss: 1.3924 - regression_loss: 1.2058 - classification_loss: 0.1866 452/500 [==========================>...] - ETA: 11s - loss: 1.3912 - regression_loss: 1.2048 - classification_loss: 0.1863 453/500 [==========================>...] - ETA: 11s - loss: 1.3908 - regression_loss: 1.2046 - classification_loss: 0.1863 454/500 [==========================>...] - ETA: 10s - loss: 1.3916 - regression_loss: 1.2051 - classification_loss: 0.1865 455/500 [==========================>...] - ETA: 10s - loss: 1.3933 - regression_loss: 1.2066 - classification_loss: 0.1866 456/500 [==========================>...] - ETA: 10s - loss: 1.3934 - regression_loss: 1.2067 - classification_loss: 0.1866 457/500 [==========================>...] - ETA: 10s - loss: 1.3925 - regression_loss: 1.2061 - classification_loss: 0.1864 458/500 [==========================>...] - ETA: 9s - loss: 1.3924 - regression_loss: 1.2060 - classification_loss: 0.1864  459/500 [==========================>...] - ETA: 9s - loss: 1.3933 - regression_loss: 1.2068 - classification_loss: 0.1864 460/500 [==========================>...] - ETA: 9s - loss: 1.3934 - regression_loss: 1.2069 - classification_loss: 0.1864 461/500 [==========================>...] - ETA: 9s - loss: 1.3926 - regression_loss: 1.2064 - classification_loss: 0.1863 462/500 [==========================>...] 
- ETA: 8s - loss: 1.3924 - regression_loss: 1.2063 - classification_loss: 0.1861 463/500 [==========================>...] - ETA: 8s - loss: 1.3925 - regression_loss: 1.2064 - classification_loss: 0.1861 464/500 [==========================>...] - ETA: 8s - loss: 1.3921 - regression_loss: 1.2061 - classification_loss: 0.1860 465/500 [==========================>...] - ETA: 8s - loss: 1.3912 - regression_loss: 1.2055 - classification_loss: 0.1857 466/500 [==========================>...] - ETA: 7s - loss: 1.3921 - regression_loss: 1.2061 - classification_loss: 0.1859 467/500 [===========================>..] - ETA: 7s - loss: 1.3919 - regression_loss: 1.2061 - classification_loss: 0.1858 468/500 [===========================>..] - ETA: 7s - loss: 1.3926 - regression_loss: 1.2066 - classification_loss: 0.1860 469/500 [===========================>..] - ETA: 7s - loss: 1.3924 - regression_loss: 1.2065 - classification_loss: 0.1859 470/500 [===========================>..] - ETA: 7s - loss: 1.3923 - regression_loss: 1.2065 - classification_loss: 0.1858 471/500 [===========================>..] - ETA: 6s - loss: 1.3911 - regression_loss: 1.2054 - classification_loss: 0.1856 472/500 [===========================>..] - ETA: 6s - loss: 1.3906 - regression_loss: 1.2051 - classification_loss: 0.1855 473/500 [===========================>..] - ETA: 6s - loss: 1.3905 - regression_loss: 1.2048 - classification_loss: 0.1856 474/500 [===========================>..] - ETA: 6s - loss: 1.3915 - regression_loss: 1.2053 - classification_loss: 0.1862 475/500 [===========================>..] - ETA: 5s - loss: 1.3914 - regression_loss: 1.2053 - classification_loss: 0.1861 476/500 [===========================>..] - ETA: 5s - loss: 1.3906 - regression_loss: 1.2047 - classification_loss: 0.1859 477/500 [===========================>..] - ETA: 5s - loss: 1.3905 - regression_loss: 1.2046 - classification_loss: 0.1859 478/500 [===========================>..] 
- ETA: 5s - loss: 1.3924 - regression_loss: 1.2060 - classification_loss: 0.1865 479/500 [===========================>..] - ETA: 4s - loss: 1.3931 - regression_loss: 1.2065 - classification_loss: 0.1865 480/500 [===========================>..] - ETA: 4s - loss: 1.3934 - regression_loss: 1.2068 - classification_loss: 0.1865 481/500 [===========================>..] - ETA: 4s - loss: 1.3943 - regression_loss: 1.2077 - classification_loss: 0.1867 482/500 [===========================>..] - ETA: 4s - loss: 1.3940 - regression_loss: 1.2074 - classification_loss: 0.1865 483/500 [===========================>..] - ETA: 3s - loss: 1.3939 - regression_loss: 1.2074 - classification_loss: 0.1865 484/500 [============================>.] - ETA: 3s - loss: 1.3937 - regression_loss: 1.2074 - classification_loss: 0.1864 485/500 [============================>.] - ETA: 3s - loss: 1.3934 - regression_loss: 1.2072 - classification_loss: 0.1863 486/500 [============================>.] - ETA: 3s - loss: 1.3928 - regression_loss: 1.2066 - classification_loss: 0.1862 487/500 [============================>.] - ETA: 3s - loss: 1.3932 - regression_loss: 1.2070 - classification_loss: 0.1862 488/500 [============================>.] - ETA: 2s - loss: 1.3932 - regression_loss: 1.2071 - classification_loss: 0.1861 489/500 [============================>.] - ETA: 2s - loss: 1.3936 - regression_loss: 1.2075 - classification_loss: 0.1861 490/500 [============================>.] - ETA: 2s - loss: 1.3923 - regression_loss: 1.2064 - classification_loss: 0.1859 491/500 [============================>.] - ETA: 2s - loss: 1.3925 - regression_loss: 1.2067 - classification_loss: 0.1858 492/500 [============================>.] - ETA: 1s - loss: 1.3909 - regression_loss: 1.2053 - classification_loss: 0.1856 493/500 [============================>.] - ETA: 1s - loss: 1.3913 - regression_loss: 1.2058 - classification_loss: 0.1856 494/500 [============================>.] 
[Epoch 14, steps 494-499: per-step progress output trimmed; loss converging to ~1.39]
500/500 [==============================] - 117s 235ms/step - loss: 1.3913 - regression_loss: 1.2060 - classification_loss: 0.1854
326 instances of class plum with average precision: 0.8236
mAP: 0.8236
Epoch 00014: saving model to ./training/snapshots/resnet50_pascal_14.h5
Epoch 15/150
[Epoch 15, steps 1-329: per-step progress output trimmed; running loss fluctuated between ~1.25 and ~1.44, reaching ~1.33 at step 328 (regression_loss ~1.14, classification_loss ~0.186)]
- ETA: 40s - loss: 1.3287 - regression_loss: 1.1429 - classification_loss: 0.1857 330/500 [==================>...........] - ETA: 40s - loss: 1.3291 - regression_loss: 1.1433 - classification_loss: 0.1858 331/500 [==================>...........] - ETA: 39s - loss: 1.3286 - regression_loss: 1.1430 - classification_loss: 0.1856 332/500 [==================>...........] - ETA: 39s - loss: 1.3291 - regression_loss: 1.1435 - classification_loss: 0.1857 333/500 [==================>...........] - ETA: 39s - loss: 1.3277 - regression_loss: 1.1423 - classification_loss: 0.1854 334/500 [===================>..........] - ETA: 39s - loss: 1.3264 - regression_loss: 1.1413 - classification_loss: 0.1850 335/500 [===================>..........] - ETA: 38s - loss: 1.3253 - regression_loss: 1.1405 - classification_loss: 0.1848 336/500 [===================>..........] - ETA: 38s - loss: 1.3257 - regression_loss: 1.1409 - classification_loss: 0.1849 337/500 [===================>..........] - ETA: 38s - loss: 1.3231 - regression_loss: 1.1387 - classification_loss: 0.1844 338/500 [===================>..........] - ETA: 38s - loss: 1.3221 - regression_loss: 1.1378 - classification_loss: 0.1843 339/500 [===================>..........] - ETA: 37s - loss: 1.3212 - regression_loss: 1.1372 - classification_loss: 0.1840 340/500 [===================>..........] - ETA: 37s - loss: 1.3221 - regression_loss: 1.1383 - classification_loss: 0.1838 341/500 [===================>..........] - ETA: 37s - loss: 1.3239 - regression_loss: 1.1399 - classification_loss: 0.1840 342/500 [===================>..........] - ETA: 37s - loss: 1.3208 - regression_loss: 1.1366 - classification_loss: 0.1842 343/500 [===================>..........] - ETA: 36s - loss: 1.3207 - regression_loss: 1.1365 - classification_loss: 0.1842 344/500 [===================>..........] - ETA: 36s - loss: 1.3198 - regression_loss: 1.1359 - classification_loss: 0.1839 345/500 [===================>..........] 
- ETA: 36s - loss: 1.3212 - regression_loss: 1.1373 - classification_loss: 0.1839 346/500 [===================>..........] - ETA: 36s - loss: 1.3208 - regression_loss: 1.1371 - classification_loss: 0.1837 347/500 [===================>..........] - ETA: 36s - loss: 1.3220 - regression_loss: 1.1381 - classification_loss: 0.1839 348/500 [===================>..........] - ETA: 35s - loss: 1.3223 - regression_loss: 1.1386 - classification_loss: 0.1837 349/500 [===================>..........] - ETA: 35s - loss: 1.3201 - regression_loss: 1.1367 - classification_loss: 0.1834 350/500 [====================>.........] - ETA: 35s - loss: 1.3218 - regression_loss: 1.1383 - classification_loss: 0.1835 351/500 [====================>.........] - ETA: 35s - loss: 1.3222 - regression_loss: 1.1386 - classification_loss: 0.1836 352/500 [====================>.........] - ETA: 34s - loss: 1.3213 - regression_loss: 1.1380 - classification_loss: 0.1833 353/500 [====================>.........] - ETA: 34s - loss: 1.3206 - regression_loss: 1.1374 - classification_loss: 0.1832 354/500 [====================>.........] - ETA: 34s - loss: 1.3222 - regression_loss: 1.1389 - classification_loss: 0.1833 355/500 [====================>.........] - ETA: 34s - loss: 1.3201 - regression_loss: 1.1371 - classification_loss: 0.1829 356/500 [====================>.........] - ETA: 33s - loss: 1.3211 - regression_loss: 1.1382 - classification_loss: 0.1829 357/500 [====================>.........] - ETA: 33s - loss: 1.3214 - regression_loss: 1.1386 - classification_loss: 0.1829 358/500 [====================>.........] - ETA: 33s - loss: 1.3218 - regression_loss: 1.1390 - classification_loss: 0.1828 359/500 [====================>.........] - ETA: 33s - loss: 1.3219 - regression_loss: 1.1392 - classification_loss: 0.1827 360/500 [====================>.........] - ETA: 33s - loss: 1.3245 - regression_loss: 1.1414 - classification_loss: 0.1831 361/500 [====================>.........] 
- ETA: 32s - loss: 1.3238 - regression_loss: 1.1409 - classification_loss: 0.1829 362/500 [====================>.........] - ETA: 32s - loss: 1.3222 - regression_loss: 1.1397 - classification_loss: 0.1826 363/500 [====================>.........] - ETA: 32s - loss: 1.3207 - regression_loss: 1.1384 - classification_loss: 0.1823 364/500 [====================>.........] - ETA: 32s - loss: 1.3205 - regression_loss: 1.1383 - classification_loss: 0.1822 365/500 [====================>.........] - ETA: 31s - loss: 1.3195 - regression_loss: 1.1375 - classification_loss: 0.1820 366/500 [====================>.........] - ETA: 31s - loss: 1.3195 - regression_loss: 1.1376 - classification_loss: 0.1819 367/500 [=====================>........] - ETA: 31s - loss: 1.3221 - regression_loss: 1.1393 - classification_loss: 0.1828 368/500 [=====================>........] - ETA: 31s - loss: 1.3215 - regression_loss: 1.1388 - classification_loss: 0.1827 369/500 [=====================>........] - ETA: 30s - loss: 1.3246 - regression_loss: 1.1414 - classification_loss: 0.1832 370/500 [=====================>........] - ETA: 30s - loss: 1.3237 - regression_loss: 1.1407 - classification_loss: 0.1830 371/500 [=====================>........] - ETA: 30s - loss: 1.3239 - regression_loss: 1.1409 - classification_loss: 0.1830 372/500 [=====================>........] - ETA: 30s - loss: 1.3251 - regression_loss: 1.1419 - classification_loss: 0.1832 373/500 [=====================>........] - ETA: 29s - loss: 1.3231 - regression_loss: 1.1402 - classification_loss: 0.1829 374/500 [=====================>........] - ETA: 29s - loss: 1.3239 - regression_loss: 1.1408 - classification_loss: 0.1831 375/500 [=====================>........] - ETA: 29s - loss: 1.3225 - regression_loss: 1.1397 - classification_loss: 0.1828 376/500 [=====================>........] - ETA: 29s - loss: 1.3229 - regression_loss: 1.1399 - classification_loss: 0.1830 377/500 [=====================>........] 
- ETA: 28s - loss: 1.3230 - regression_loss: 1.1401 - classification_loss: 0.1829 378/500 [=====================>........] - ETA: 28s - loss: 1.3221 - regression_loss: 1.1393 - classification_loss: 0.1829 379/500 [=====================>........] - ETA: 28s - loss: 1.3225 - regression_loss: 1.1398 - classification_loss: 0.1828 380/500 [=====================>........] - ETA: 28s - loss: 1.3227 - regression_loss: 1.1397 - classification_loss: 0.1830 381/500 [=====================>........] - ETA: 28s - loss: 1.3223 - regression_loss: 1.1393 - classification_loss: 0.1829 382/500 [=====================>........] - ETA: 27s - loss: 1.3210 - regression_loss: 1.1383 - classification_loss: 0.1826 383/500 [=====================>........] - ETA: 27s - loss: 1.3193 - regression_loss: 1.1369 - classification_loss: 0.1824 384/500 [======================>.......] - ETA: 27s - loss: 1.3169 - regression_loss: 1.1349 - classification_loss: 0.1821 385/500 [======================>.......] - ETA: 27s - loss: 1.3175 - regression_loss: 1.1354 - classification_loss: 0.1821 386/500 [======================>.......] - ETA: 26s - loss: 1.3166 - regression_loss: 1.1348 - classification_loss: 0.1819 387/500 [======================>.......] - ETA: 26s - loss: 1.3151 - regression_loss: 1.1334 - classification_loss: 0.1817 388/500 [======================>.......] - ETA: 26s - loss: 1.3164 - regression_loss: 1.1343 - classification_loss: 0.1821 389/500 [======================>.......] - ETA: 26s - loss: 1.3189 - regression_loss: 1.1366 - classification_loss: 0.1823 390/500 [======================>.......] - ETA: 25s - loss: 1.3206 - regression_loss: 1.1382 - classification_loss: 0.1824 391/500 [======================>.......] - ETA: 25s - loss: 1.3209 - regression_loss: 1.1387 - classification_loss: 0.1822 392/500 [======================>.......] - ETA: 25s - loss: 1.3213 - regression_loss: 1.1390 - classification_loss: 0.1823 393/500 [======================>.......] 
- ETA: 25s - loss: 1.3195 - regression_loss: 1.1376 - classification_loss: 0.1819 394/500 [======================>.......] - ETA: 24s - loss: 1.3202 - regression_loss: 1.1385 - classification_loss: 0.1817 395/500 [======================>.......] - ETA: 24s - loss: 1.3216 - regression_loss: 1.1399 - classification_loss: 0.1818 396/500 [======================>.......] - ETA: 24s - loss: 1.3233 - regression_loss: 1.1414 - classification_loss: 0.1819 397/500 [======================>.......] - ETA: 24s - loss: 1.3233 - regression_loss: 1.1415 - classification_loss: 0.1818 398/500 [======================>.......] - ETA: 23s - loss: 1.3231 - regression_loss: 1.1415 - classification_loss: 0.1816 399/500 [======================>.......] - ETA: 23s - loss: 1.3243 - regression_loss: 1.1428 - classification_loss: 0.1814 400/500 [=======================>......] - ETA: 23s - loss: 1.3297 - regression_loss: 1.1445 - classification_loss: 0.1852 401/500 [=======================>......] - ETA: 23s - loss: 1.3307 - regression_loss: 1.1452 - classification_loss: 0.1855 402/500 [=======================>......] - ETA: 23s - loss: 1.3312 - regression_loss: 1.1457 - classification_loss: 0.1855 403/500 [=======================>......] - ETA: 22s - loss: 1.3308 - regression_loss: 1.1455 - classification_loss: 0.1853 404/500 [=======================>......] - ETA: 22s - loss: 1.3301 - regression_loss: 1.1450 - classification_loss: 0.1851 405/500 [=======================>......] - ETA: 22s - loss: 1.3302 - regression_loss: 1.1451 - classification_loss: 0.1851 406/500 [=======================>......] - ETA: 22s - loss: 1.3294 - regression_loss: 1.1445 - classification_loss: 0.1849 407/500 [=======================>......] - ETA: 21s - loss: 1.3290 - regression_loss: 1.1442 - classification_loss: 0.1848 408/500 [=======================>......] - ETA: 21s - loss: 1.3292 - regression_loss: 1.1445 - classification_loss: 0.1847 409/500 [=======================>......] 
- ETA: 21s - loss: 1.3288 - regression_loss: 1.1443 - classification_loss: 0.1845 410/500 [=======================>......] - ETA: 21s - loss: 1.3277 - regression_loss: 1.1434 - classification_loss: 0.1843 411/500 [=======================>......] - ETA: 20s - loss: 1.3269 - regression_loss: 1.1426 - classification_loss: 0.1843 412/500 [=======================>......] - ETA: 20s - loss: 1.3289 - regression_loss: 1.1444 - classification_loss: 0.1845 413/500 [=======================>......] - ETA: 20s - loss: 1.3289 - regression_loss: 1.1444 - classification_loss: 0.1845 414/500 [=======================>......] - ETA: 20s - loss: 1.3294 - regression_loss: 1.1447 - classification_loss: 0.1847 415/500 [=======================>......] - ETA: 19s - loss: 1.3299 - regression_loss: 1.1452 - classification_loss: 0.1847 416/500 [=======================>......] - ETA: 19s - loss: 1.3303 - regression_loss: 1.1455 - classification_loss: 0.1847 417/500 [========================>.....] - ETA: 19s - loss: 1.3297 - regression_loss: 1.1449 - classification_loss: 0.1848 418/500 [========================>.....] - ETA: 19s - loss: 1.3276 - regression_loss: 1.1431 - classification_loss: 0.1846 419/500 [========================>.....] - ETA: 19s - loss: 1.3260 - regression_loss: 1.1417 - classification_loss: 0.1843 420/500 [========================>.....] - ETA: 18s - loss: 1.3241 - regression_loss: 1.1402 - classification_loss: 0.1839 421/500 [========================>.....] - ETA: 18s - loss: 1.3263 - regression_loss: 1.1417 - classification_loss: 0.1846 422/500 [========================>.....] - ETA: 18s - loss: 1.3253 - regression_loss: 1.1410 - classification_loss: 0.1843 423/500 [========================>.....] - ETA: 18s - loss: 1.3238 - regression_loss: 1.1397 - classification_loss: 0.1841 424/500 [========================>.....] - ETA: 17s - loss: 1.3253 - regression_loss: 1.1407 - classification_loss: 0.1846 425/500 [========================>.....] 
- ETA: 17s - loss: 1.3267 - regression_loss: 1.1417 - classification_loss: 0.1849 426/500 [========================>.....] - ETA: 17s - loss: 1.3258 - regression_loss: 1.1410 - classification_loss: 0.1848 427/500 [========================>.....] - ETA: 17s - loss: 1.3259 - regression_loss: 1.1411 - classification_loss: 0.1847 428/500 [========================>.....] - ETA: 16s - loss: 1.3275 - regression_loss: 1.1423 - classification_loss: 0.1851 429/500 [========================>.....] - ETA: 16s - loss: 1.3270 - regression_loss: 1.1419 - classification_loss: 0.1851 430/500 [========================>.....] - ETA: 16s - loss: 1.3268 - regression_loss: 1.1418 - classification_loss: 0.1850 431/500 [========================>.....] - ETA: 16s - loss: 1.3261 - regression_loss: 1.1414 - classification_loss: 0.1848 432/500 [========================>.....] - ETA: 15s - loss: 1.3261 - regression_loss: 1.1415 - classification_loss: 0.1846 433/500 [========================>.....] - ETA: 15s - loss: 1.3255 - regression_loss: 1.1410 - classification_loss: 0.1845 434/500 [=========================>....] - ETA: 15s - loss: 1.3263 - regression_loss: 1.1415 - classification_loss: 0.1848 435/500 [=========================>....] - ETA: 15s - loss: 1.3261 - regression_loss: 1.1414 - classification_loss: 0.1847 436/500 [=========================>....] - ETA: 15s - loss: 1.3247 - regression_loss: 1.1402 - classification_loss: 0.1845 437/500 [=========================>....] - ETA: 14s - loss: 1.3233 - regression_loss: 1.1390 - classification_loss: 0.1843 438/500 [=========================>....] - ETA: 14s - loss: 1.3254 - regression_loss: 1.1406 - classification_loss: 0.1848 439/500 [=========================>....] - ETA: 14s - loss: 1.3241 - regression_loss: 1.1396 - classification_loss: 0.1845 440/500 [=========================>....] - ETA: 14s - loss: 1.3231 - regression_loss: 1.1388 - classification_loss: 0.1844 441/500 [=========================>....] 
- ETA: 13s - loss: 1.3211 - regression_loss: 1.1370 - classification_loss: 0.1841 442/500 [=========================>....] - ETA: 13s - loss: 1.3221 - regression_loss: 1.1380 - classification_loss: 0.1841 443/500 [=========================>....] - ETA: 13s - loss: 1.3212 - regression_loss: 1.1373 - classification_loss: 0.1839 444/500 [=========================>....] - ETA: 13s - loss: 1.3199 - regression_loss: 1.1361 - classification_loss: 0.1838 445/500 [=========================>....] - ETA: 12s - loss: 1.3191 - regression_loss: 1.1354 - classification_loss: 0.1837 446/500 [=========================>....] - ETA: 12s - loss: 1.3279 - regression_loss: 1.1386 - classification_loss: 0.1892 447/500 [=========================>....] - ETA: 12s - loss: 1.3271 - regression_loss: 1.1380 - classification_loss: 0.1891 448/500 [=========================>....] - ETA: 12s - loss: 1.3267 - regression_loss: 1.1377 - classification_loss: 0.1890 449/500 [=========================>....] - ETA: 11s - loss: 1.3301 - regression_loss: 1.1405 - classification_loss: 0.1896 450/500 [==========================>...] - ETA: 11s - loss: 1.3287 - regression_loss: 1.1393 - classification_loss: 0.1894 451/500 [==========================>...] - ETA: 11s - loss: 1.3305 - regression_loss: 1.1406 - classification_loss: 0.1899 452/500 [==========================>...] - ETA: 11s - loss: 1.3300 - regression_loss: 1.1403 - classification_loss: 0.1897 453/500 [==========================>...] - ETA: 11s - loss: 1.3297 - regression_loss: 1.1401 - classification_loss: 0.1896 454/500 [==========================>...] - ETA: 10s - loss: 1.3289 - regression_loss: 1.1395 - classification_loss: 0.1895 455/500 [==========================>...] - ETA: 10s - loss: 1.3283 - regression_loss: 1.1390 - classification_loss: 0.1893 456/500 [==========================>...] - ETA: 10s - loss: 1.3293 - regression_loss: 1.1399 - classification_loss: 0.1894 457/500 [==========================>...] 
- ETA: 10s - loss: 1.3311 - regression_loss: 1.1413 - classification_loss: 0.1898 458/500 [==========================>...] - ETA: 9s - loss: 1.3309 - regression_loss: 1.1412 - classification_loss: 0.1897  459/500 [==========================>...] - ETA: 9s - loss: 1.3305 - regression_loss: 1.1409 - classification_loss: 0.1896 460/500 [==========================>...] - ETA: 9s - loss: 1.3312 - regression_loss: 1.1416 - classification_loss: 0.1896 461/500 [==========================>...] - ETA: 9s - loss: 1.3297 - regression_loss: 1.1403 - classification_loss: 0.1894 462/500 [==========================>...] - ETA: 8s - loss: 1.3294 - regression_loss: 1.1401 - classification_loss: 0.1893 463/500 [==========================>...] - ETA: 8s - loss: 1.3285 - regression_loss: 1.1393 - classification_loss: 0.1891 464/500 [==========================>...] - ETA: 8s - loss: 1.3280 - regression_loss: 1.1390 - classification_loss: 0.1890 465/500 [==========================>...] - ETA: 8s - loss: 1.3276 - regression_loss: 1.1386 - classification_loss: 0.1890 466/500 [==========================>...] - ETA: 7s - loss: 1.3276 - regression_loss: 1.1386 - classification_loss: 0.1890 467/500 [===========================>..] - ETA: 7s - loss: 1.3272 - regression_loss: 1.1383 - classification_loss: 0.1890 468/500 [===========================>..] - ETA: 7s - loss: 1.3273 - regression_loss: 1.1383 - classification_loss: 0.1889 469/500 [===========================>..] - ETA: 7s - loss: 1.3279 - regression_loss: 1.1389 - classification_loss: 0.1890 470/500 [===========================>..] - ETA: 7s - loss: 1.3265 - regression_loss: 1.1378 - classification_loss: 0.1887 471/500 [===========================>..] - ETA: 6s - loss: 1.3272 - regression_loss: 1.1382 - classification_loss: 0.1890 472/500 [===========================>..] - ETA: 6s - loss: 1.3248 - regression_loss: 1.1358 - classification_loss: 0.1890 473/500 [===========================>..] 
- ETA: 6s - loss: 1.3244 - regression_loss: 1.1355 - classification_loss: 0.1889 474/500 [===========================>..] - ETA: 6s - loss: 1.3253 - regression_loss: 1.1364 - classification_loss: 0.1890 475/500 [===========================>..] - ETA: 5s - loss: 1.3251 - regression_loss: 1.1363 - classification_loss: 0.1889 476/500 [===========================>..] - ETA: 5s - loss: 1.3257 - regression_loss: 1.1369 - classification_loss: 0.1888 477/500 [===========================>..] - ETA: 5s - loss: 1.3242 - regression_loss: 1.1356 - classification_loss: 0.1885 478/500 [===========================>..] - ETA: 5s - loss: 1.3226 - regression_loss: 1.1342 - classification_loss: 0.1884 479/500 [===========================>..] - ETA: 4s - loss: 1.3223 - regression_loss: 1.1341 - classification_loss: 0.1882 480/500 [===========================>..] - ETA: 4s - loss: 1.3219 - regression_loss: 1.1338 - classification_loss: 0.1881 481/500 [===========================>..] - ETA: 4s - loss: 1.3214 - regression_loss: 1.1335 - classification_loss: 0.1879 482/500 [===========================>..] - ETA: 4s - loss: 1.3213 - regression_loss: 1.1335 - classification_loss: 0.1878 483/500 [===========================>..] - ETA: 3s - loss: 1.3211 - regression_loss: 1.1335 - classification_loss: 0.1876 484/500 [============================>.] - ETA: 3s - loss: 1.3211 - regression_loss: 1.1336 - classification_loss: 0.1875 485/500 [============================>.] - ETA: 3s - loss: 1.3215 - regression_loss: 1.1339 - classification_loss: 0.1875 486/500 [============================>.] - ETA: 3s - loss: 1.3205 - regression_loss: 1.1331 - classification_loss: 0.1874 487/500 [============================>.] - ETA: 3s - loss: 1.3228 - regression_loss: 1.1355 - classification_loss: 0.1873 488/500 [============================>.] - ETA: 2s - loss: 1.3235 - regression_loss: 1.1362 - classification_loss: 0.1873 489/500 [============================>.] 
- ETA: 2s - loss: 1.3250 - regression_loss: 1.1376 - classification_loss: 0.1874 490/500 [============================>.] - ETA: 2s - loss: 1.3259 - regression_loss: 1.1385 - classification_loss: 0.1874 491/500 [============================>.] - ETA: 2s - loss: 1.3269 - regression_loss: 1.1395 - classification_loss: 0.1874 492/500 [============================>.] - ETA: 1s - loss: 1.3276 - regression_loss: 1.1402 - classification_loss: 0.1874 493/500 [============================>.] - ETA: 1s - loss: 1.3280 - regression_loss: 1.1406 - classification_loss: 0.1874 494/500 [============================>.] - ETA: 1s - loss: 1.3286 - regression_loss: 1.1411 - classification_loss: 0.1875 495/500 [============================>.] - ETA: 1s - loss: 1.3289 - regression_loss: 1.1415 - classification_loss: 0.1874 496/500 [============================>.] - ETA: 0s - loss: 1.3299 - regression_loss: 1.1422 - classification_loss: 0.1877 497/500 [============================>.] - ETA: 0s - loss: 1.3302 - regression_loss: 1.1425 - classification_loss: 0.1877 498/500 [============================>.] - ETA: 0s - loss: 1.3316 - regression_loss: 1.1438 - classification_loss: 0.1878 499/500 [============================>.] - ETA: 0s - loss: 1.3312 - regression_loss: 1.1435 - classification_loss: 0.1877 500/500 [==============================] - 118s 235ms/step - loss: 1.3307 - regression_loss: 1.1432 - classification_loss: 0.1875 326 instances of class plum with average precision: 0.8464 mAP: 0.8464 Epoch 00015: saving model to ./training/snapshots/resnet50_pascal_15.h5 Epoch 16/150 1/500 [..............................] - ETA: 1:46 - loss: 1.8254 - regression_loss: 1.5888 - classification_loss: 0.2367 2/500 [..............................] - ETA: 1:50 - loss: 2.0290 - regression_loss: 1.7061 - classification_loss: 0.3229 3/500 [..............................] - ETA: 1:49 - loss: 1.5590 - regression_loss: 1.3116 - classification_loss: 0.2473 4/500 [..............................] 
- ETA: 1:51 - loss: 1.3878 - regression_loss: 1.1841 - classification_loss: 0.2037 5/500 [..............................] - ETA: 1:50 - loss: 1.3527 - regression_loss: 1.1637 - classification_loss: 0.1890 6/500 [..............................] - ETA: 1:49 - loss: 1.3471 - regression_loss: 1.1581 - classification_loss: 0.1890 7/500 [..............................] - ETA: 1:48 - loss: 1.2645 - regression_loss: 1.0811 - classification_loss: 0.1834 8/500 [..............................] - ETA: 1:50 - loss: 1.2686 - regression_loss: 1.0859 - classification_loss: 0.1827 9/500 [..............................] - ETA: 1:50 - loss: 1.2533 - regression_loss: 1.0775 - classification_loss: 0.1758 10/500 [..............................] - ETA: 1:49 - loss: 1.3362 - regression_loss: 1.1482 - classification_loss: 0.1880 11/500 [..............................] - ETA: 1:49 - loss: 1.3341 - regression_loss: 1.1508 - classification_loss: 0.1833 12/500 [..............................] - ETA: 1:49 - loss: 1.3676 - regression_loss: 1.1763 - classification_loss: 0.1913 13/500 [..............................] - ETA: 1:50 - loss: 1.3448 - regression_loss: 1.1602 - classification_loss: 0.1845 14/500 [..............................] - ETA: 1:50 - loss: 1.3208 - regression_loss: 1.1417 - classification_loss: 0.1791 15/500 [..............................] - ETA: 1:50 - loss: 1.2630 - regression_loss: 1.0916 - classification_loss: 0.1714 16/500 [..............................] - ETA: 1:50 - loss: 1.2575 - regression_loss: 1.0883 - classification_loss: 0.1692 17/500 [>.............................] - ETA: 1:50 - loss: 1.2116 - regression_loss: 1.0502 - classification_loss: 0.1614 18/500 [>.............................] - ETA: 1:50 - loss: 1.2630 - regression_loss: 1.0900 - classification_loss: 0.1730 19/500 [>.............................] - ETA: 1:50 - loss: 1.2832 - regression_loss: 1.1087 - classification_loss: 0.1746 20/500 [>.............................] 
- ETA: 1:50 - loss: 1.3246 - regression_loss: 1.1438 - classification_loss: 0.1809 21/500 [>.............................] - ETA: 1:50 - loss: 1.3272 - regression_loss: 1.1461 - classification_loss: 0.1811 22/500 [>.............................] - ETA: 1:50 - loss: 1.3501 - regression_loss: 1.1636 - classification_loss: 0.1865 23/500 [>.............................] - ETA: 1:50 - loss: 1.3254 - regression_loss: 1.1440 - classification_loss: 0.1814 24/500 [>.............................] - ETA: 1:50 - loss: 1.3553 - regression_loss: 1.1683 - classification_loss: 0.1870 25/500 [>.............................] - ETA: 1:50 - loss: 1.3456 - regression_loss: 1.1613 - classification_loss: 0.1844 26/500 [>.............................] - ETA: 1:50 - loss: 1.3515 - regression_loss: 1.1642 - classification_loss: 0.1873 27/500 [>.............................] - ETA: 1:50 - loss: 1.3487 - regression_loss: 1.1598 - classification_loss: 0.1889 28/500 [>.............................] - ETA: 1:49 - loss: 1.3455 - regression_loss: 1.1569 - classification_loss: 0.1887 29/500 [>.............................] - ETA: 1:49 - loss: 1.3337 - regression_loss: 1.1462 - classification_loss: 0.1875 30/500 [>.............................] - ETA: 1:49 - loss: 1.3422 - regression_loss: 1.1562 - classification_loss: 0.1860 31/500 [>.............................] - ETA: 1:49 - loss: 1.3550 - regression_loss: 1.1654 - classification_loss: 0.1896 32/500 [>.............................] - ETA: 1:49 - loss: 1.3330 - regression_loss: 1.1480 - classification_loss: 0.1850 33/500 [>.............................] - ETA: 1:48 - loss: 1.3292 - regression_loss: 1.1456 - classification_loss: 0.1836 34/500 [=>............................] - ETA: 1:48 - loss: 1.3405 - regression_loss: 1.1547 - classification_loss: 0.1859 35/500 [=>............................] - ETA: 1:48 - loss: 1.3451 - regression_loss: 1.1580 - classification_loss: 0.1872 36/500 [=>............................] 
[Epoch 1 training progress, steps ~36-372 of 500. Per-batch Keras progress-bar updates condensed; each update reported ETA, loss, regression_loss, and classification_loss.]

  step ~36/500 - ETA: 1:48 - loss: 1.3566 - regression_loss: 1.1657 - classification_loss: 0.1908
  step 100/500 - ETA: 1:34 - loss: 1.3028 - regression_loss: 1.1270 - classification_loss: 0.1759
  step 200/500 - ETA: 1:11 - loss: 1.2852 - regression_loss: 1.1123 - classification_loss: 0.1729
  step 300/500 - ETA:   47s - loss: 1.2923 - regression_loss: 1.1146 - classification_loss: 0.1777
  step 371/500 - ETA:   30s - loss: 1.2985 - regression_loss: 1.1230 - classification_loss: 0.1755

[Overall trend: total loss drifted from ~1.36 down to ~1.30 over these steps, with regression_loss ~1.17 -> ~1.12 and classification_loss ~0.19 -> ~0.18. Transient upticks around steps 121, 216, and 224 (classification_loss briefly rising to ~0.180-0.183) recovered within a few steps.]
- ETA: 30s - loss: 1.2977 - regression_loss: 1.1223 - classification_loss: 0.1753 373/500 [=====================>........] - ETA: 30s - loss: 1.2972 - regression_loss: 1.1220 - classification_loss: 0.1752 374/500 [=====================>........] - ETA: 29s - loss: 1.2970 - regression_loss: 1.1220 - classification_loss: 0.1750 375/500 [=====================>........] - ETA: 29s - loss: 1.2971 - regression_loss: 1.1218 - classification_loss: 0.1753 376/500 [=====================>........] - ETA: 29s - loss: 1.2985 - regression_loss: 1.1231 - classification_loss: 0.1754 377/500 [=====================>........] - ETA: 29s - loss: 1.2986 - regression_loss: 1.1233 - classification_loss: 0.1753 378/500 [=====================>........] - ETA: 28s - loss: 1.2971 - regression_loss: 1.1222 - classification_loss: 0.1749 379/500 [=====================>........] - ETA: 28s - loss: 1.2977 - regression_loss: 1.1228 - classification_loss: 0.1749 380/500 [=====================>........] - ETA: 28s - loss: 1.2981 - regression_loss: 1.1232 - classification_loss: 0.1749 381/500 [=====================>........] - ETA: 28s - loss: 1.3012 - regression_loss: 1.1255 - classification_loss: 0.1756 382/500 [=====================>........] - ETA: 27s - loss: 1.3008 - regression_loss: 1.1252 - classification_loss: 0.1756 383/500 [=====================>........] - ETA: 27s - loss: 1.3007 - regression_loss: 1.1252 - classification_loss: 0.1755 384/500 [======================>.......] - ETA: 27s - loss: 1.3017 - regression_loss: 1.1260 - classification_loss: 0.1757 385/500 [======================>.......] - ETA: 27s - loss: 1.3012 - regression_loss: 1.1255 - classification_loss: 0.1757 386/500 [======================>.......] - ETA: 27s - loss: 1.3009 - regression_loss: 1.1254 - classification_loss: 0.1755 387/500 [======================>.......] - ETA: 26s - loss: 1.3004 - regression_loss: 1.1250 - classification_loss: 0.1755 388/500 [======================>.......] 
- ETA: 26s - loss: 1.2986 - regression_loss: 1.1233 - classification_loss: 0.1754 389/500 [======================>.......] - ETA: 26s - loss: 1.2979 - regression_loss: 1.1228 - classification_loss: 0.1751 390/500 [======================>.......] - ETA: 26s - loss: 1.2980 - regression_loss: 1.1229 - classification_loss: 0.1751 391/500 [======================>.......] - ETA: 25s - loss: 1.2985 - regression_loss: 1.1234 - classification_loss: 0.1751 392/500 [======================>.......] - ETA: 25s - loss: 1.2982 - regression_loss: 1.1232 - classification_loss: 0.1750 393/500 [======================>.......] - ETA: 25s - loss: 1.3000 - regression_loss: 1.1247 - classification_loss: 0.1754 394/500 [======================>.......] - ETA: 25s - loss: 1.3010 - regression_loss: 1.1258 - classification_loss: 0.1752 395/500 [======================>.......] - ETA: 24s - loss: 1.3034 - regression_loss: 1.1278 - classification_loss: 0.1757 396/500 [======================>.......] - ETA: 24s - loss: 1.3025 - regression_loss: 1.1271 - classification_loss: 0.1754 397/500 [======================>.......] - ETA: 24s - loss: 1.3039 - regression_loss: 1.1282 - classification_loss: 0.1757 398/500 [======================>.......] - ETA: 24s - loss: 1.3027 - regression_loss: 1.1271 - classification_loss: 0.1756 399/500 [======================>.......] - ETA: 23s - loss: 1.3054 - regression_loss: 1.1290 - classification_loss: 0.1764 400/500 [=======================>......] - ETA: 23s - loss: 1.3058 - regression_loss: 1.1292 - classification_loss: 0.1766 401/500 [=======================>......] - ETA: 23s - loss: 1.3040 - regression_loss: 1.1278 - classification_loss: 0.1762 402/500 [=======================>......] - ETA: 23s - loss: 1.3029 - regression_loss: 1.1270 - classification_loss: 0.1760 403/500 [=======================>......] - ETA: 22s - loss: 1.3023 - regression_loss: 1.1265 - classification_loss: 0.1758 404/500 [=======================>......] 
- ETA: 22s - loss: 1.3037 - regression_loss: 1.1278 - classification_loss: 0.1759 405/500 [=======================>......] - ETA: 22s - loss: 1.3093 - regression_loss: 1.1316 - classification_loss: 0.1777 406/500 [=======================>......] - ETA: 22s - loss: 1.3078 - regression_loss: 1.1302 - classification_loss: 0.1776 407/500 [=======================>......] - ETA: 22s - loss: 1.3083 - regression_loss: 1.1306 - classification_loss: 0.1776 408/500 [=======================>......] - ETA: 21s - loss: 1.3080 - regression_loss: 1.1304 - classification_loss: 0.1776 409/500 [=======================>......] - ETA: 21s - loss: 1.3082 - regression_loss: 1.1308 - classification_loss: 0.1774 410/500 [=======================>......] - ETA: 21s - loss: 1.3063 - regression_loss: 1.1292 - classification_loss: 0.1771 411/500 [=======================>......] - ETA: 21s - loss: 1.3055 - regression_loss: 1.1286 - classification_loss: 0.1769 412/500 [=======================>......] - ETA: 20s - loss: 1.3040 - regression_loss: 1.1274 - classification_loss: 0.1766 413/500 [=======================>......] - ETA: 20s - loss: 1.3036 - regression_loss: 1.1271 - classification_loss: 0.1765 414/500 [=======================>......] - ETA: 20s - loss: 1.3051 - regression_loss: 1.1281 - classification_loss: 0.1770 415/500 [=======================>......] - ETA: 20s - loss: 1.3042 - regression_loss: 1.1274 - classification_loss: 0.1769 416/500 [=======================>......] - ETA: 19s - loss: 1.3039 - regression_loss: 1.1272 - classification_loss: 0.1767 417/500 [========================>.....] - ETA: 19s - loss: 1.3026 - regression_loss: 1.1261 - classification_loss: 0.1764 418/500 [========================>.....] - ETA: 19s - loss: 1.3020 - regression_loss: 1.1256 - classification_loss: 0.1763 419/500 [========================>.....] - ETA: 19s - loss: 1.3036 - regression_loss: 1.1272 - classification_loss: 0.1764 420/500 [========================>.....] 
- ETA: 18s - loss: 1.3038 - regression_loss: 1.1274 - classification_loss: 0.1764 421/500 [========================>.....] - ETA: 18s - loss: 1.3032 - regression_loss: 1.1271 - classification_loss: 0.1761 422/500 [========================>.....] - ETA: 18s - loss: 1.3029 - regression_loss: 1.1269 - classification_loss: 0.1759 423/500 [========================>.....] - ETA: 18s - loss: 1.3027 - regression_loss: 1.1269 - classification_loss: 0.1758 424/500 [========================>.....] - ETA: 18s - loss: 1.3031 - regression_loss: 1.1271 - classification_loss: 0.1760 425/500 [========================>.....] - ETA: 17s - loss: 1.3028 - regression_loss: 1.1270 - classification_loss: 0.1758 426/500 [========================>.....] - ETA: 17s - loss: 1.3023 - regression_loss: 1.1266 - classification_loss: 0.1757 427/500 [========================>.....] - ETA: 17s - loss: 1.3019 - regression_loss: 1.1264 - classification_loss: 0.1755 428/500 [========================>.....] - ETA: 17s - loss: 1.3024 - regression_loss: 1.1267 - classification_loss: 0.1757 429/500 [========================>.....] - ETA: 16s - loss: 1.3019 - regression_loss: 1.1263 - classification_loss: 0.1756 430/500 [========================>.....] - ETA: 16s - loss: 1.3022 - regression_loss: 1.1265 - classification_loss: 0.1757 431/500 [========================>.....] - ETA: 16s - loss: 1.3018 - regression_loss: 1.1263 - classification_loss: 0.1755 432/500 [========================>.....] - ETA: 16s - loss: 1.3012 - regression_loss: 1.1259 - classification_loss: 0.1753 433/500 [========================>.....] - ETA: 15s - loss: 1.3018 - regression_loss: 1.1266 - classification_loss: 0.1752 434/500 [=========================>....] - ETA: 15s - loss: 1.3011 - regression_loss: 1.1260 - classification_loss: 0.1750 435/500 [=========================>....] - ETA: 15s - loss: 1.3017 - regression_loss: 1.1267 - classification_loss: 0.1751 436/500 [=========================>....] 
- ETA: 15s - loss: 1.3009 - regression_loss: 1.1261 - classification_loss: 0.1748 437/500 [=========================>....] - ETA: 14s - loss: 1.2998 - regression_loss: 1.1252 - classification_loss: 0.1746 438/500 [=========================>....] - ETA: 14s - loss: 1.3000 - regression_loss: 1.1253 - classification_loss: 0.1747 439/500 [=========================>....] - ETA: 14s - loss: 1.2993 - regression_loss: 1.1247 - classification_loss: 0.1746 440/500 [=========================>....] - ETA: 14s - loss: 1.2981 - regression_loss: 1.1238 - classification_loss: 0.1743 441/500 [=========================>....] - ETA: 13s - loss: 1.2965 - regression_loss: 1.1225 - classification_loss: 0.1740 442/500 [=========================>....] - ETA: 13s - loss: 1.2976 - regression_loss: 1.1232 - classification_loss: 0.1744 443/500 [=========================>....] - ETA: 13s - loss: 1.2973 - regression_loss: 1.1230 - classification_loss: 0.1743 444/500 [=========================>....] - ETA: 13s - loss: 1.2993 - regression_loss: 1.1247 - classification_loss: 0.1746 445/500 [=========================>....] - ETA: 13s - loss: 1.2989 - regression_loss: 1.1245 - classification_loss: 0.1743 446/500 [=========================>....] - ETA: 12s - loss: 1.2987 - regression_loss: 1.1244 - classification_loss: 0.1743 447/500 [=========================>....] - ETA: 12s - loss: 1.2991 - regression_loss: 1.1247 - classification_loss: 0.1743 448/500 [=========================>....] - ETA: 12s - loss: 1.2984 - regression_loss: 1.1242 - classification_loss: 0.1742 449/500 [=========================>....] - ETA: 12s - loss: 1.2981 - regression_loss: 1.1238 - classification_loss: 0.1743 450/500 [==========================>...] - ETA: 11s - loss: 1.2987 - regression_loss: 1.1243 - classification_loss: 0.1744 451/500 [==========================>...] - ETA: 11s - loss: 1.2986 - regression_loss: 1.1244 - classification_loss: 0.1742 452/500 [==========================>...] 
- ETA: 11s - loss: 1.2972 - regression_loss: 1.1232 - classification_loss: 0.1740 453/500 [==========================>...] - ETA: 11s - loss: 1.2962 - regression_loss: 1.1224 - classification_loss: 0.1738 454/500 [==========================>...] - ETA: 10s - loss: 1.2969 - regression_loss: 1.1230 - classification_loss: 0.1740 455/500 [==========================>...] - ETA: 10s - loss: 1.2966 - regression_loss: 1.1227 - classification_loss: 0.1739 456/500 [==========================>...] - ETA: 10s - loss: 1.2956 - regression_loss: 1.1219 - classification_loss: 0.1737 457/500 [==========================>...] - ETA: 10s - loss: 1.2956 - regression_loss: 1.1218 - classification_loss: 0.1738 458/500 [==========================>...] - ETA: 9s - loss: 1.2941 - regression_loss: 1.1205 - classification_loss: 0.1736  459/500 [==========================>...] - ETA: 9s - loss: 1.2937 - regression_loss: 1.1201 - classification_loss: 0.1736 460/500 [==========================>...] - ETA: 9s - loss: 1.2934 - regression_loss: 1.1199 - classification_loss: 0.1735 461/500 [==========================>...] - ETA: 9s - loss: 1.2914 - regression_loss: 1.1182 - classification_loss: 0.1732 462/500 [==========================>...] - ETA: 9s - loss: 1.2916 - regression_loss: 1.1183 - classification_loss: 0.1733 463/500 [==========================>...] - ETA: 8s - loss: 1.2907 - regression_loss: 1.1176 - classification_loss: 0.1731 464/500 [==========================>...] - ETA: 8s - loss: 1.2906 - regression_loss: 1.1176 - classification_loss: 0.1730 465/500 [==========================>...] - ETA: 8s - loss: 1.2909 - regression_loss: 1.1179 - classification_loss: 0.1730 466/500 [==========================>...] - ETA: 8s - loss: 1.2909 - regression_loss: 1.1180 - classification_loss: 0.1729 467/500 [===========================>..] - ETA: 7s - loss: 1.2905 - regression_loss: 1.1177 - classification_loss: 0.1728 468/500 [===========================>..] 
- ETA: 7s - loss: 1.2906 - regression_loss: 1.1180 - classification_loss: 0.1727 469/500 [===========================>..] - ETA: 7s - loss: 1.2912 - regression_loss: 1.1185 - classification_loss: 0.1727 470/500 [===========================>..] - ETA: 7s - loss: 1.2916 - regression_loss: 1.1187 - classification_loss: 0.1729 471/500 [===========================>..] - ETA: 6s - loss: 1.2893 - regression_loss: 1.1163 - classification_loss: 0.1729 472/500 [===========================>..] - ETA: 6s - loss: 1.2884 - regression_loss: 1.1156 - classification_loss: 0.1728 473/500 [===========================>..] - ETA: 6s - loss: 1.2882 - regression_loss: 1.1155 - classification_loss: 0.1727 474/500 [===========================>..] - ETA: 6s - loss: 1.2872 - regression_loss: 1.1147 - classification_loss: 0.1725 475/500 [===========================>..] - ETA: 5s - loss: 1.2877 - regression_loss: 1.1152 - classification_loss: 0.1725 476/500 [===========================>..] - ETA: 5s - loss: 1.2876 - regression_loss: 1.1152 - classification_loss: 0.1725 477/500 [===========================>..] - ETA: 5s - loss: 1.2873 - regression_loss: 1.1150 - classification_loss: 0.1723 478/500 [===========================>..] - ETA: 5s - loss: 1.2876 - regression_loss: 1.1153 - classification_loss: 0.1724 479/500 [===========================>..] - ETA: 4s - loss: 1.2864 - regression_loss: 1.1142 - classification_loss: 0.1721 480/500 [===========================>..] - ETA: 4s - loss: 1.2863 - regression_loss: 1.1143 - classification_loss: 0.1720 481/500 [===========================>..] - ETA: 4s - loss: 1.2858 - regression_loss: 1.1139 - classification_loss: 0.1719 482/500 [===========================>..] - ETA: 4s - loss: 1.2851 - regression_loss: 1.1133 - classification_loss: 0.1718 483/500 [===========================>..] - ETA: 4s - loss: 1.2850 - regression_loss: 1.1132 - classification_loss: 0.1718 484/500 [============================>.] 
- ETA: 3s - loss: 1.2849 - regression_loss: 1.1132 - classification_loss: 0.1717 485/500 [============================>.] - ETA: 3s - loss: 1.2883 - regression_loss: 1.1158 - classification_loss: 0.1725 486/500 [============================>.] - ETA: 3s - loss: 1.2892 - regression_loss: 1.1164 - classification_loss: 0.1727 487/500 [============================>.] - ETA: 3s - loss: 1.2877 - regression_loss: 1.1152 - classification_loss: 0.1725 488/500 [============================>.] - ETA: 2s - loss: 1.2869 - regression_loss: 1.1147 - classification_loss: 0.1723 489/500 [============================>.] - ETA: 2s - loss: 1.2871 - regression_loss: 1.1149 - classification_loss: 0.1722 490/500 [============================>.] - ETA: 2s - loss: 1.2874 - regression_loss: 1.1152 - classification_loss: 0.1722 491/500 [============================>.] - ETA: 2s - loss: 1.2862 - regression_loss: 1.1141 - classification_loss: 0.1721 492/500 [============================>.] - ETA: 1s - loss: 1.2859 - regression_loss: 1.1139 - classification_loss: 0.1720 493/500 [============================>.] - ETA: 1s - loss: 1.2861 - regression_loss: 1.1142 - classification_loss: 0.1719 494/500 [============================>.] - ETA: 1s - loss: 1.2843 - regression_loss: 1.1126 - classification_loss: 0.1716 495/500 [============================>.] - ETA: 1s - loss: 1.2840 - regression_loss: 1.1125 - classification_loss: 0.1715 496/500 [============================>.] - ETA: 0s - loss: 1.2841 - regression_loss: 1.1126 - classification_loss: 0.1715 497/500 [============================>.] - ETA: 0s - loss: 1.2847 - regression_loss: 1.1132 - classification_loss: 0.1714 498/500 [============================>.] - ETA: 0s - loss: 1.2848 - regression_loss: 1.1133 - classification_loss: 0.1715 499/500 [============================>.] 
- ETA: 0s - loss: 1.2866 - regression_loss: 1.1146 - classification_loss: 0.1720 500/500 [==============================] - 118s 237ms/step - loss: 1.2863 - regression_loss: 1.1144 - classification_loss: 0.1719 326 instances of class plum with average precision: 0.8413 mAP: 0.8413 Epoch 00016: saving model to ./training/snapshots/resnet50_pascal_16.h5 Epoch 17/150 1/500 [..............................] - ETA: 1:55 - loss: 2.4893 - regression_loss: 2.1585 - classification_loss: 0.3308 2/500 [..............................] - ETA: 1:54 - loss: 1.8845 - regression_loss: 1.6278 - classification_loss: 0.2567 3/500 [..............................] - ETA: 1:52 - loss: 1.5766 - regression_loss: 1.3733 - classification_loss: 0.2033 4/500 [..............................] - ETA: 1:56 - loss: 1.5080 - regression_loss: 1.3258 - classification_loss: 0.1822 5/500 [..............................] - ETA: 1:56 - loss: 1.3630 - regression_loss: 1.1977 - classification_loss: 0.1653 6/500 [..............................] - ETA: 1:57 - loss: 1.2993 - regression_loss: 1.1485 - classification_loss: 0.1508 7/500 [..............................] - ETA: 1:56 - loss: 1.4126 - regression_loss: 1.2544 - classification_loss: 0.1582 8/500 [..............................] - ETA: 1:55 - loss: 1.4844 - regression_loss: 1.3101 - classification_loss: 0.1743 9/500 [..............................] - ETA: 1:54 - loss: 1.4259 - regression_loss: 1.2470 - classification_loss: 0.1789 10/500 [..............................] - ETA: 1:54 - loss: 1.3398 - regression_loss: 1.1733 - classification_loss: 0.1664 11/500 [..............................] - ETA: 1:54 - loss: 1.4152 - regression_loss: 1.2248 - classification_loss: 0.1905 12/500 [..............................] - ETA: 1:53 - loss: 1.3988 - regression_loss: 1.2109 - classification_loss: 0.1879 13/500 [..............................] - ETA: 1:53 - loss: 1.3795 - regression_loss: 1.1943 - classification_loss: 0.1851 14/500 [..............................] 
- ETA: 1:53 - loss: 1.4316 - regression_loss: 1.2242 - classification_loss: 0.2073 15/500 [..............................] - ETA: 1:52 - loss: 1.4605 - regression_loss: 1.2468 - classification_loss: 0.2137 16/500 [..............................] - ETA: 1:53 - loss: 1.4494 - regression_loss: 1.2407 - classification_loss: 0.2087 17/500 [>.............................] - ETA: 1:52 - loss: 1.4766 - regression_loss: 1.2673 - classification_loss: 0.2093 18/500 [>.............................] - ETA: 1:52 - loss: 1.4478 - regression_loss: 1.2454 - classification_loss: 0.2023 19/500 [>.............................] - ETA: 1:52 - loss: 1.3942 - regression_loss: 1.1987 - classification_loss: 0.1955 20/500 [>.............................] - ETA: 1:52 - loss: 1.3472 - regression_loss: 1.1594 - classification_loss: 0.1878 21/500 [>.............................] - ETA: 1:51 - loss: 1.3475 - regression_loss: 1.1596 - classification_loss: 0.1879 22/500 [>.............................] - ETA: 1:52 - loss: 1.3400 - regression_loss: 1.1557 - classification_loss: 0.1844 23/500 [>.............................] - ETA: 1:51 - loss: 1.3708 - regression_loss: 1.1822 - classification_loss: 0.1886 24/500 [>.............................] - ETA: 1:51 - loss: 1.3405 - regression_loss: 1.1575 - classification_loss: 0.1829 25/500 [>.............................] - ETA: 1:51 - loss: 1.3173 - regression_loss: 1.1379 - classification_loss: 0.1794 26/500 [>.............................] - ETA: 1:50 - loss: 1.3243 - regression_loss: 1.1438 - classification_loss: 0.1805 27/500 [>.............................] - ETA: 1:50 - loss: 1.3320 - regression_loss: 1.1488 - classification_loss: 0.1832 28/500 [>.............................] - ETA: 1:50 - loss: 1.3285 - regression_loss: 1.1494 - classification_loss: 0.1791 29/500 [>.............................] - ETA: 1:50 - loss: 1.3164 - regression_loss: 1.1402 - classification_loss: 0.1762 30/500 [>.............................] 
- ETA: 1:50 - loss: 1.3274 - regression_loss: 1.1489 - classification_loss: 0.1785 31/500 [>.............................] - ETA: 1:49 - loss: 1.3267 - regression_loss: 1.1484 - classification_loss: 0.1783 32/500 [>.............................] - ETA: 1:49 - loss: 1.3358 - regression_loss: 1.1537 - classification_loss: 0.1822 33/500 [>.............................] - ETA: 1:48 - loss: 1.3226 - regression_loss: 1.1429 - classification_loss: 0.1797 34/500 [=>............................] - ETA: 1:48 - loss: 1.3388 - regression_loss: 1.1546 - classification_loss: 0.1842 35/500 [=>............................] - ETA: 1:48 - loss: 1.3300 - regression_loss: 1.1480 - classification_loss: 0.1821 36/500 [=>............................] - ETA: 1:48 - loss: 1.3206 - regression_loss: 1.1417 - classification_loss: 0.1789 37/500 [=>............................] - ETA: 1:48 - loss: 1.3017 - regression_loss: 1.1261 - classification_loss: 0.1755 38/500 [=>............................] - ETA: 1:47 - loss: 1.3188 - regression_loss: 1.1387 - classification_loss: 0.1801 39/500 [=>............................] - ETA: 1:47 - loss: 1.3188 - regression_loss: 1.1400 - classification_loss: 0.1788 40/500 [=>............................] - ETA: 1:47 - loss: 1.2974 - regression_loss: 1.1221 - classification_loss: 0.1753 41/500 [=>............................] - ETA: 1:47 - loss: 1.2933 - regression_loss: 1.1190 - classification_loss: 0.1743 42/500 [=>............................] - ETA: 1:46 - loss: 1.2778 - regression_loss: 1.1054 - classification_loss: 0.1724 43/500 [=>............................] - ETA: 1:46 - loss: 1.2716 - regression_loss: 1.1010 - classification_loss: 0.1706 44/500 [=>............................] - ETA: 1:46 - loss: 1.2729 - regression_loss: 1.1012 - classification_loss: 0.1717 45/500 [=>............................] - ETA: 1:46 - loss: 1.2672 - regression_loss: 1.0954 - classification_loss: 0.1718 46/500 [=>............................] 
- ETA: 1:46 - loss: 1.2539 - regression_loss: 1.0839 - classification_loss: 0.1700 47/500 [=>............................] - ETA: 1:46 - loss: 1.2619 - regression_loss: 1.0852 - classification_loss: 0.1767 48/500 [=>............................] - ETA: 1:46 - loss: 1.2563 - regression_loss: 1.0815 - classification_loss: 0.1748 49/500 [=>............................] - ETA: 1:45 - loss: 1.2452 - regression_loss: 1.0721 - classification_loss: 0.1731 50/500 [==>...........................] - ETA: 1:45 - loss: 1.2457 - regression_loss: 1.0736 - classification_loss: 0.1721 51/500 [==>...........................] - ETA: 1:45 - loss: 1.2481 - regression_loss: 1.0748 - classification_loss: 0.1733 52/500 [==>...........................] - ETA: 1:44 - loss: 1.2430 - regression_loss: 1.0707 - classification_loss: 0.1722 53/500 [==>...........................] - ETA: 1:44 - loss: 1.2409 - regression_loss: 1.0693 - classification_loss: 0.1716 54/500 [==>...........................] - ETA: 1:44 - loss: 1.2596 - regression_loss: 1.0847 - classification_loss: 0.1749 55/500 [==>...........................] - ETA: 1:43 - loss: 1.2593 - regression_loss: 1.0859 - classification_loss: 0.1734 56/500 [==>...........................] - ETA: 1:43 - loss: 1.2549 - regression_loss: 1.0828 - classification_loss: 0.1721 57/500 [==>...........................] - ETA: 1:43 - loss: 1.2488 - regression_loss: 1.0783 - classification_loss: 0.1705 58/500 [==>...........................] - ETA: 1:43 - loss: 1.2492 - regression_loss: 1.0779 - classification_loss: 0.1713 59/500 [==>...........................] - ETA: 1:43 - loss: 1.2441 - regression_loss: 1.0748 - classification_loss: 0.1693 60/500 [==>...........................] - ETA: 1:43 - loss: 1.2419 - regression_loss: 1.0736 - classification_loss: 0.1683 61/500 [==>...........................] - ETA: 1:42 - loss: 1.2577 - regression_loss: 1.0896 - classification_loss: 0.1682 62/500 [==>...........................] 
- ETA: 1:42 - loss: 1.2746 - regression_loss: 1.1031 - classification_loss: 0.1715 63/500 [==>...........................] - ETA: 1:42 - loss: 1.2724 - regression_loss: 1.1016 - classification_loss: 0.1707 64/500 [==>...........................] - ETA: 1:42 - loss: 1.2748 - regression_loss: 1.1042 - classification_loss: 0.1706 65/500 [==>...........................] - ETA: 1:41 - loss: 1.2808 - regression_loss: 1.1101 - classification_loss: 0.1706 66/500 [==>...........................] - ETA: 1:41 - loss: 1.2816 - regression_loss: 1.1100 - classification_loss: 0.1716 67/500 [===>..........................] - ETA: 1:41 - loss: 1.2797 - regression_loss: 1.1081 - classification_loss: 0.1716 68/500 [===>..........................] - ETA: 1:41 - loss: 1.2819 - regression_loss: 1.1105 - classification_loss: 0.1714 69/500 [===>..........................] - ETA: 1:40 - loss: 1.2663 - regression_loss: 1.0944 - classification_loss: 0.1719 70/500 [===>..........................] - ETA: 1:40 - loss: 1.2650 - regression_loss: 1.0943 - classification_loss: 0.1707 71/500 [===>..........................] - ETA: 1:40 - loss: 1.2653 - regression_loss: 1.0949 - classification_loss: 0.1704 72/500 [===>..........................] - ETA: 1:40 - loss: 1.2663 - regression_loss: 1.0955 - classification_loss: 0.1708 73/500 [===>..........................] - ETA: 1:39 - loss: 1.2553 - regression_loss: 1.0861 - classification_loss: 0.1691 74/500 [===>..........................] - ETA: 1:39 - loss: 1.2520 - regression_loss: 1.0837 - classification_loss: 0.1682 75/500 [===>..........................] - ETA: 1:39 - loss: 1.2543 - regression_loss: 1.0861 - classification_loss: 0.1681 76/500 [===>..........................] - ETA: 1:39 - loss: 1.2638 - regression_loss: 1.0941 - classification_loss: 0.1697 77/500 [===>..........................] - ETA: 1:39 - loss: 1.2647 - regression_loss: 1.0952 - classification_loss: 0.1695 78/500 [===>..........................] 
- ETA: 1:38 - loss: 1.2635 - regression_loss: 1.0945 - classification_loss: 0.1689 79/500 [===>..........................] - ETA: 1:38 - loss: 1.2616 - regression_loss: 1.0933 - classification_loss: 0.1683
[per-batch Keras progress-bar updates for steps 80-412 omitted; loss fluctuated between ~1.24 and ~1.28, with regression_loss ~1.09-1.12 and classification_loss ~0.157-0.169]
413/500 [=======================>......] - ETA: 20s - loss: 1.2700 - regression_loss: 1.1103 - classification_loss: 0.1597 414/500 [=======================>......]
- ETA: 20s - loss: 1.2701 - regression_loss: 1.1103 - classification_loss: 0.1598 415/500 [=======================>......] - ETA: 20s - loss: 1.2695 - regression_loss: 1.1096 - classification_loss: 0.1598 416/500 [=======================>......] - ETA: 19s - loss: 1.2683 - regression_loss: 1.1086 - classification_loss: 0.1596 417/500 [========================>.....] - ETA: 19s - loss: 1.2680 - regression_loss: 1.1084 - classification_loss: 0.1595 418/500 [========================>.....] - ETA: 19s - loss: 1.2657 - regression_loss: 1.1064 - classification_loss: 0.1593 419/500 [========================>.....] - ETA: 19s - loss: 1.2648 - regression_loss: 1.1055 - classification_loss: 0.1593 420/500 [========================>.....] - ETA: 18s - loss: 1.2647 - regression_loss: 1.1055 - classification_loss: 0.1592 421/500 [========================>.....] - ETA: 18s - loss: 1.2639 - regression_loss: 1.1049 - classification_loss: 0.1590 422/500 [========================>.....] - ETA: 18s - loss: 1.2631 - regression_loss: 1.1037 - classification_loss: 0.1594 423/500 [========================>.....] - ETA: 18s - loss: 1.2623 - regression_loss: 1.1031 - classification_loss: 0.1592 424/500 [========================>.....] - ETA: 17s - loss: 1.2635 - regression_loss: 1.1041 - classification_loss: 0.1594 425/500 [========================>.....] - ETA: 17s - loss: 1.2645 - regression_loss: 1.1048 - classification_loss: 0.1596 426/500 [========================>.....] - ETA: 17s - loss: 1.2690 - regression_loss: 1.1088 - classification_loss: 0.1602 427/500 [========================>.....] - ETA: 17s - loss: 1.2683 - regression_loss: 1.1082 - classification_loss: 0.1601 428/500 [========================>.....] - ETA: 16s - loss: 1.2683 - regression_loss: 1.1083 - classification_loss: 0.1601 429/500 [========================>.....] - ETA: 16s - loss: 1.2696 - regression_loss: 1.1094 - classification_loss: 0.1603 430/500 [========================>.....] 
- ETA: 16s - loss: 1.2694 - regression_loss: 1.1091 - classification_loss: 0.1603 431/500 [========================>.....] - ETA: 16s - loss: 1.2693 - regression_loss: 1.1091 - classification_loss: 0.1602 432/500 [========================>.....] - ETA: 16s - loss: 1.2696 - regression_loss: 1.1094 - classification_loss: 0.1601 433/500 [========================>.....] - ETA: 15s - loss: 1.2702 - regression_loss: 1.1098 - classification_loss: 0.1605 434/500 [=========================>....] - ETA: 15s - loss: 1.2695 - regression_loss: 1.1091 - classification_loss: 0.1604 435/500 [=========================>....] - ETA: 15s - loss: 1.2690 - regression_loss: 1.1087 - classification_loss: 0.1603 436/500 [=========================>....] - ETA: 15s - loss: 1.2701 - regression_loss: 1.1097 - classification_loss: 0.1604 437/500 [=========================>....] - ETA: 14s - loss: 1.2713 - regression_loss: 1.1108 - classification_loss: 0.1606 438/500 [=========================>....] - ETA: 14s - loss: 1.2715 - regression_loss: 1.1109 - classification_loss: 0.1606 439/500 [=========================>....] - ETA: 14s - loss: 1.2706 - regression_loss: 1.1102 - classification_loss: 0.1604 440/500 [=========================>....] - ETA: 14s - loss: 1.2695 - regression_loss: 1.1092 - classification_loss: 0.1602 441/500 [=========================>....] - ETA: 13s - loss: 1.2710 - regression_loss: 1.1106 - classification_loss: 0.1605 442/500 [=========================>....] - ETA: 13s - loss: 1.2733 - regression_loss: 1.1122 - classification_loss: 0.1611 443/500 [=========================>....] - ETA: 13s - loss: 1.2723 - regression_loss: 1.1114 - classification_loss: 0.1609 444/500 [=========================>....] - ETA: 13s - loss: 1.2714 - regression_loss: 1.1106 - classification_loss: 0.1608 445/500 [=========================>....] - ETA: 12s - loss: 1.2715 - regression_loss: 1.1108 - classification_loss: 0.1607 446/500 [=========================>....] 
- ETA: 12s - loss: 1.2717 - regression_loss: 1.1110 - classification_loss: 0.1608 447/500 [=========================>....] - ETA: 12s - loss: 1.2709 - regression_loss: 1.1102 - classification_loss: 0.1607 448/500 [=========================>....] - ETA: 12s - loss: 1.2685 - regression_loss: 1.1081 - classification_loss: 0.1604 449/500 [=========================>....] - ETA: 12s - loss: 1.2684 - regression_loss: 1.1080 - classification_loss: 0.1604 450/500 [==========================>...] - ETA: 11s - loss: 1.2683 - regression_loss: 1.1080 - classification_loss: 0.1603 451/500 [==========================>...] - ETA: 11s - loss: 1.2692 - regression_loss: 1.1087 - classification_loss: 0.1605 452/500 [==========================>...] - ETA: 11s - loss: 1.2686 - regression_loss: 1.1081 - classification_loss: 0.1605 453/500 [==========================>...] - ETA: 11s - loss: 1.2696 - regression_loss: 1.1087 - classification_loss: 0.1609 454/500 [==========================>...] - ETA: 10s - loss: 1.2756 - regression_loss: 1.1105 - classification_loss: 0.1652 455/500 [==========================>...] - ETA: 10s - loss: 1.2766 - regression_loss: 1.1112 - classification_loss: 0.1654 456/500 [==========================>...] - ETA: 10s - loss: 1.2764 - regression_loss: 1.1111 - classification_loss: 0.1653 457/500 [==========================>...] - ETA: 10s - loss: 1.2778 - regression_loss: 1.1122 - classification_loss: 0.1655 458/500 [==========================>...] - ETA: 9s - loss: 1.2772 - regression_loss: 1.1118 - classification_loss: 0.1654  459/500 [==========================>...] - ETA: 9s - loss: 1.2769 - regression_loss: 1.1115 - classification_loss: 0.1654 460/500 [==========================>...] - ETA: 9s - loss: 1.2768 - regression_loss: 1.1114 - classification_loss: 0.1655 461/500 [==========================>...] - ETA: 9s - loss: 1.2756 - regression_loss: 1.1104 - classification_loss: 0.1652 462/500 [==========================>...] 
- ETA: 8s - loss: 1.2753 - regression_loss: 1.1102 - classification_loss: 0.1651 463/500 [==========================>...] - ETA: 8s - loss: 1.2750 - regression_loss: 1.1100 - classification_loss: 0.1650 464/500 [==========================>...] - ETA: 8s - loss: 1.2749 - regression_loss: 1.1099 - classification_loss: 0.1650 465/500 [==========================>...] - ETA: 8s - loss: 1.2748 - regression_loss: 1.1099 - classification_loss: 0.1650 466/500 [==========================>...] - ETA: 8s - loss: 1.2751 - regression_loss: 1.1103 - classification_loss: 0.1648 467/500 [===========================>..] - ETA: 7s - loss: 1.2738 - regression_loss: 1.1092 - classification_loss: 0.1646 468/500 [===========================>..] - ETA: 7s - loss: 1.2736 - regression_loss: 1.1090 - classification_loss: 0.1647 469/500 [===========================>..] - ETA: 7s - loss: 1.2732 - regression_loss: 1.1084 - classification_loss: 0.1648 470/500 [===========================>..] - ETA: 7s - loss: 1.2725 - regression_loss: 1.1079 - classification_loss: 0.1646 471/500 [===========================>..] - ETA: 6s - loss: 1.2716 - regression_loss: 1.1071 - classification_loss: 0.1645 472/500 [===========================>..] - ETA: 6s - loss: 1.2721 - regression_loss: 1.1074 - classification_loss: 0.1647 473/500 [===========================>..] - ETA: 6s - loss: 1.2721 - regression_loss: 1.1076 - classification_loss: 0.1645 474/500 [===========================>..] - ETA: 6s - loss: 1.2739 - regression_loss: 1.1091 - classification_loss: 0.1649 475/500 [===========================>..] - ETA: 5s - loss: 1.2743 - regression_loss: 1.1095 - classification_loss: 0.1648 476/500 [===========================>..] - ETA: 5s - loss: 1.2740 - regression_loss: 1.1093 - classification_loss: 0.1647 477/500 [===========================>..] - ETA: 5s - loss: 1.2739 - regression_loss: 1.1093 - classification_loss: 0.1646 478/500 [===========================>..] 
- ETA: 5s - loss: 1.2742 - regression_loss: 1.1098 - classification_loss: 0.1644 479/500 [===========================>..] - ETA: 4s - loss: 1.2729 - regression_loss: 1.1088 - classification_loss: 0.1642 480/500 [===========================>..] - ETA: 4s - loss: 1.2723 - regression_loss: 1.1082 - classification_loss: 0.1641 481/500 [===========================>..] - ETA: 4s - loss: 1.2764 - regression_loss: 1.1117 - classification_loss: 0.1647 482/500 [===========================>..] - ETA: 4s - loss: 1.2745 - regression_loss: 1.1100 - classification_loss: 0.1645 483/500 [===========================>..] - ETA: 4s - loss: 1.2728 - regression_loss: 1.1085 - classification_loss: 0.1642 484/500 [============================>.] - ETA: 3s - loss: 1.2720 - regression_loss: 1.1079 - classification_loss: 0.1641 485/500 [============================>.] - ETA: 3s - loss: 1.2706 - regression_loss: 1.1067 - classification_loss: 0.1639 486/500 [============================>.] - ETA: 3s - loss: 1.2714 - regression_loss: 1.1073 - classification_loss: 0.1641 487/500 [============================>.] - ETA: 3s - loss: 1.2722 - regression_loss: 1.1078 - classification_loss: 0.1644 488/500 [============================>.] - ETA: 2s - loss: 1.2719 - regression_loss: 1.1076 - classification_loss: 0.1643 489/500 [============================>.] - ETA: 2s - loss: 1.2728 - regression_loss: 1.1084 - classification_loss: 0.1644 490/500 [============================>.] - ETA: 2s - loss: 1.2715 - regression_loss: 1.1074 - classification_loss: 0.1641 491/500 [============================>.] - ETA: 2s - loss: 1.2710 - regression_loss: 1.1070 - classification_loss: 0.1640 492/500 [============================>.] - ETA: 1s - loss: 1.2709 - regression_loss: 1.1068 - classification_loss: 0.1641 493/500 [============================>.] - ETA: 1s - loss: 1.2709 - regression_loss: 1.1069 - classification_loss: 0.1640 494/500 [============================>.] 
- ETA: 1s - loss: 1.2709 - regression_loss: 1.1070 - classification_loss: 0.1639 495/500 [============================>.] - ETA: 1s - loss: 1.2711 - regression_loss: 1.1073 - classification_loss: 0.1638 496/500 [============================>.] - ETA: 0s - loss: 1.2707 - regression_loss: 1.1070 - classification_loss: 0.1637 497/500 [============================>.] - ETA: 0s - loss: 1.2708 - regression_loss: 1.1071 - classification_loss: 0.1637 498/500 [============================>.] - ETA: 0s - loss: 1.2719 - regression_loss: 1.1081 - classification_loss: 0.1638 499/500 [============================>.] - ETA: 0s - loss: 1.2719 - regression_loss: 1.1083 - classification_loss: 0.1636 500/500 [==============================] - 118s 236ms/step - loss: 1.2720 - regression_loss: 1.1087 - classification_loss: 0.1633 326 instances of class plum with average precision: 0.8214 mAP: 0.8214 Epoch 00017: saving model to ./training/snapshots/resnet50_pascal_17.h5 Epoch 18/150 1/500 [..............................] - ETA: 1:45 - loss: 0.1414 - regression_loss: 0.0000e+00 - classification_loss: 0.1414 2/500 [..............................] - ETA: 1:52 - loss: 0.4661 - regression_loss: 0.3491 - classification_loss: 0.1170  3/500 [..............................] - ETA: 1:53 - loss: 0.7934 - regression_loss: 0.6305 - classification_loss: 0.1629 4/500 [..............................] - ETA: 1:52 - loss: 1.0303 - regression_loss: 0.8321 - classification_loss: 0.1982 5/500 [..............................] - ETA: 1:54 - loss: 1.0532 - regression_loss: 0.8732 - classification_loss: 0.1801 6/500 [..............................] - ETA: 1:54 - loss: 1.1476 - regression_loss: 0.9639 - classification_loss: 0.1838 7/500 [..............................] - ETA: 1:54 - loss: 1.2620 - regression_loss: 1.0823 - classification_loss: 0.1796 8/500 [..............................] - ETA: 1:54 - loss: 1.2408 - regression_loss: 1.0668 - classification_loss: 0.1740 9/500 [..............................] 
- ETA: 1:53 - loss: 1.2736 - regression_loss: 1.0973 - classification_loss: 0.1762 10/500 [..............................] - ETA: 1:53 - loss: 1.2411 - regression_loss: 1.0697 - classification_loss: 0.1714 11/500 [..............................] - ETA: 1:52 - loss: 1.2653 - regression_loss: 1.0939 - classification_loss: 0.1714 12/500 [..............................] - ETA: 1:53 - loss: 1.2560 - regression_loss: 1.0878 - classification_loss: 0.1682 13/500 [..............................] - ETA: 1:52 - loss: 1.2154 - regression_loss: 1.0571 - classification_loss: 0.1582 14/500 [..............................] - ETA: 1:52 - loss: 1.2738 - regression_loss: 1.1128 - classification_loss: 0.1609 15/500 [..............................] - ETA: 1:52 - loss: 1.2789 - regression_loss: 1.1173 - classification_loss: 0.1617 16/500 [..............................] - ETA: 1:52 - loss: 1.2600 - regression_loss: 1.1021 - classification_loss: 0.1579 17/500 [>.............................] - ETA: 1:52 - loss: 1.2684 - regression_loss: 1.1116 - classification_loss: 0.1568 18/500 [>.............................] - ETA: 1:52 - loss: 1.2330 - regression_loss: 1.0805 - classification_loss: 0.1525 19/500 [>.............................] - ETA: 1:52 - loss: 1.2104 - regression_loss: 1.0614 - classification_loss: 0.1489 20/500 [>.............................] - ETA: 1:52 - loss: 1.2140 - regression_loss: 1.0655 - classification_loss: 0.1484 21/500 [>.............................] - ETA: 1:52 - loss: 1.2219 - regression_loss: 1.0727 - classification_loss: 0.1492 22/500 [>.............................] - ETA: 1:51 - loss: 1.2059 - regression_loss: 1.0598 - classification_loss: 0.1461 23/500 [>.............................] - ETA: 1:51 - loss: 1.2021 - regression_loss: 1.0584 - classification_loss: 0.1436 24/500 [>.............................] - ETA: 1:51 - loss: 1.1735 - regression_loss: 1.0320 - classification_loss: 0.1415 25/500 [>.............................] 
- ETA: 1:50 - loss: 1.1941 - regression_loss: 1.0498 - classification_loss: 0.1443 26/500 [>.............................] - ETA: 1:50 - loss: 1.2346 - regression_loss: 1.0850 - classification_loss: 0.1496 27/500 [>.............................] - ETA: 1:50 - loss: 1.2470 - regression_loss: 1.0968 - classification_loss: 0.1502 28/500 [>.............................] - ETA: 1:50 - loss: 1.2256 - regression_loss: 1.0783 - classification_loss: 0.1473 29/500 [>.............................] - ETA: 1:50 - loss: 1.2133 - regression_loss: 1.0689 - classification_loss: 0.1444 30/500 [>.............................] - ETA: 1:50 - loss: 1.1991 - regression_loss: 1.0577 - classification_loss: 0.1414 31/500 [>.............................] - ETA: 1:50 - loss: 1.2108 - regression_loss: 1.0661 - classification_loss: 0.1447 32/500 [>.............................] - ETA: 1:50 - loss: 1.2292 - regression_loss: 1.0822 - classification_loss: 0.1470 33/500 [>.............................] - ETA: 1:49 - loss: 1.2395 - regression_loss: 1.0912 - classification_loss: 0.1483 34/500 [=>............................] - ETA: 1:49 - loss: 1.2424 - regression_loss: 1.0933 - classification_loss: 0.1491 35/500 [=>............................] - ETA: 1:49 - loss: 1.2525 - regression_loss: 1.1016 - classification_loss: 0.1509 36/500 [=>............................] - ETA: 1:48 - loss: 1.2570 - regression_loss: 1.1063 - classification_loss: 0.1507 37/500 [=>............................] - ETA: 1:48 - loss: 1.2309 - regression_loss: 1.0836 - classification_loss: 0.1473 38/500 [=>............................] - ETA: 1:48 - loss: 1.2265 - regression_loss: 1.0810 - classification_loss: 0.1455 39/500 [=>............................] - ETA: 1:47 - loss: 1.2166 - regression_loss: 1.0731 - classification_loss: 0.1435 40/500 [=>............................] - ETA: 1:47 - loss: 1.2013 - regression_loss: 1.0603 - classification_loss: 0.1410 41/500 [=>............................] 
- ETA: 1:47 - loss: 1.2009 - regression_loss: 1.0603 - classification_loss: 0.1406 42/500 [=>............................] - ETA: 1:46 - loss: 1.2022 - regression_loss: 1.0617 - classification_loss: 0.1405 43/500 [=>............................] - ETA: 1:46 - loss: 1.1949 - regression_loss: 1.0549 - classification_loss: 0.1400 44/500 [=>............................] - ETA: 1:46 - loss: 1.1844 - regression_loss: 1.0458 - classification_loss: 0.1386 45/500 [=>............................] - ETA: 1:45 - loss: 1.1812 - regression_loss: 1.0433 - classification_loss: 0.1380 46/500 [=>............................] - ETA: 1:45 - loss: 1.1783 - regression_loss: 1.0415 - classification_loss: 0.1368 47/500 [=>............................] - ETA: 1:45 - loss: 1.1766 - regression_loss: 1.0403 - classification_loss: 0.1363 48/500 [=>............................] - ETA: 1:45 - loss: 1.1632 - regression_loss: 1.0286 - classification_loss: 0.1346 49/500 [=>............................] - ETA: 1:44 - loss: 1.1711 - regression_loss: 1.0342 - classification_loss: 0.1369 50/500 [==>...........................] - ETA: 1:44 - loss: 1.1722 - regression_loss: 1.0356 - classification_loss: 0.1366 51/500 [==>...........................] - ETA: 1:43 - loss: 1.1600 - regression_loss: 1.0246 - classification_loss: 0.1354 52/500 [==>...........................] - ETA: 1:43 - loss: 1.1624 - regression_loss: 1.0269 - classification_loss: 0.1355 53/500 [==>...........................] - ETA: 1:43 - loss: 1.1702 - regression_loss: 1.0324 - classification_loss: 0.1378 54/500 [==>...........................] - ETA: 1:43 - loss: 1.1722 - regression_loss: 1.0341 - classification_loss: 0.1381 55/500 [==>...........................] - ETA: 1:42 - loss: 1.1848 - regression_loss: 1.0443 - classification_loss: 0.1405 56/500 [==>...........................] - ETA: 1:42 - loss: 1.1823 - regression_loss: 1.0420 - classification_loss: 0.1403 57/500 [==>...........................] 
- ETA: 1:42 - loss: 1.1768 - regression_loss: 1.0372 - classification_loss: 0.1396 58/500 [==>...........................] - ETA: 1:42 - loss: 1.1773 - regression_loss: 1.0380 - classification_loss: 0.1393 59/500 [==>...........................] - ETA: 1:42 - loss: 1.1951 - regression_loss: 1.0505 - classification_loss: 0.1446 60/500 [==>...........................] - ETA: 1:41 - loss: 1.1967 - regression_loss: 1.0518 - classification_loss: 0.1449 61/500 [==>...........................] - ETA: 1:41 - loss: 1.1964 - regression_loss: 1.0521 - classification_loss: 0.1444 62/500 [==>...........................] - ETA: 1:41 - loss: 1.1932 - regression_loss: 1.0492 - classification_loss: 0.1440 63/500 [==>...........................] - ETA: 1:41 - loss: 1.1916 - regression_loss: 1.0478 - classification_loss: 0.1437 64/500 [==>...........................] - ETA: 1:40 - loss: 1.1965 - regression_loss: 1.0511 - classification_loss: 0.1454 65/500 [==>...........................] - ETA: 1:40 - loss: 1.2016 - regression_loss: 1.0560 - classification_loss: 0.1456 66/500 [==>...........................] - ETA: 1:40 - loss: 1.2028 - regression_loss: 1.0565 - classification_loss: 0.1463 67/500 [===>..........................] - ETA: 1:40 - loss: 1.2016 - regression_loss: 1.0556 - classification_loss: 0.1460 68/500 [===>..........................] - ETA: 1:39 - loss: 1.2014 - regression_loss: 1.0560 - classification_loss: 0.1454 69/500 [===>..........................] - ETA: 1:39 - loss: 1.2044 - regression_loss: 1.0586 - classification_loss: 0.1458 70/500 [===>..........................] - ETA: 1:39 - loss: 1.2063 - regression_loss: 1.0598 - classification_loss: 0.1466 71/500 [===>..........................] - ETA: 1:39 - loss: 1.2035 - regression_loss: 1.0574 - classification_loss: 0.1461 72/500 [===>..........................] - ETA: 1:39 - loss: 1.2015 - regression_loss: 1.0559 - classification_loss: 0.1456 73/500 [===>..........................] 
- ETA: 1:38 - loss: 1.1973 - regression_loss: 1.0527 - classification_loss: 0.1446 74/500 [===>..........................] - ETA: 1:38 - loss: 1.1900 - regression_loss: 1.0467 - classification_loss: 0.1434 75/500 [===>..........................] - ETA: 1:38 - loss: 1.1821 - regression_loss: 1.0402 - classification_loss: 0.1419 76/500 [===>..........................] - ETA: 1:38 - loss: 1.1855 - regression_loss: 1.0439 - classification_loss: 0.1417 77/500 [===>..........................] - ETA: 1:37 - loss: 1.1913 - regression_loss: 1.0485 - classification_loss: 0.1428 78/500 [===>..........................] - ETA: 1:37 - loss: 1.1998 - regression_loss: 1.0568 - classification_loss: 0.1430 79/500 [===>..........................] - ETA: 1:37 - loss: 1.1967 - regression_loss: 1.0545 - classification_loss: 0.1422 80/500 [===>..........................] - ETA: 1:37 - loss: 1.2138 - regression_loss: 1.0696 - classification_loss: 0.1441 81/500 [===>..........................] - ETA: 1:37 - loss: 1.2071 - regression_loss: 1.0641 - classification_loss: 0.1430 82/500 [===>..........................] - ETA: 1:37 - loss: 1.2129 - regression_loss: 1.0686 - classification_loss: 0.1444 83/500 [===>..........................] - ETA: 1:36 - loss: 1.2290 - regression_loss: 1.0822 - classification_loss: 0.1467 84/500 [====>.........................] - ETA: 1:36 - loss: 1.2276 - regression_loss: 1.0815 - classification_loss: 0.1462 85/500 [====>.........................] - ETA: 1:36 - loss: 1.2333 - regression_loss: 1.0864 - classification_loss: 0.1469 86/500 [====>.........................] - ETA: 1:36 - loss: 1.2392 - regression_loss: 1.0913 - classification_loss: 0.1480 87/500 [====>.........................] - ETA: 1:35 - loss: 1.2450 - regression_loss: 1.0961 - classification_loss: 0.1489 88/500 [====>.........................] - ETA: 1:35 - loss: 1.2406 - regression_loss: 1.0914 - classification_loss: 0.1492 89/500 [====>.........................] 
- ETA: 1:35 - loss: 1.2428 - regression_loss: 1.0921 - classification_loss: 0.1507 90/500 [====>.........................] - ETA: 1:35 - loss: 1.2437 - regression_loss: 1.0938 - classification_loss: 0.1499 91/500 [====>.........................] - ETA: 1:35 - loss: 1.2405 - regression_loss: 1.0912 - classification_loss: 0.1494 92/500 [====>.........................] - ETA: 1:34 - loss: 1.2431 - regression_loss: 1.0934 - classification_loss: 0.1497 93/500 [====>.........................] - ETA: 1:34 - loss: 1.2536 - regression_loss: 1.1023 - classification_loss: 0.1513 94/500 [====>.........................] - ETA: 1:34 - loss: 1.2526 - regression_loss: 1.1017 - classification_loss: 0.1509 95/500 [====>.........................] - ETA: 1:34 - loss: 1.2472 - regression_loss: 1.0972 - classification_loss: 0.1500 96/500 [====>.........................] - ETA: 1:33 - loss: 1.2491 - regression_loss: 1.0990 - classification_loss: 0.1500 97/500 [====>.........................] - ETA: 1:33 - loss: 1.2716 - regression_loss: 1.1174 - classification_loss: 0.1542 98/500 [====>.........................] - ETA: 1:33 - loss: 1.2701 - regression_loss: 1.1163 - classification_loss: 0.1538 99/500 [====>.........................] - ETA: 1:33 - loss: 1.2702 - regression_loss: 1.1163 - classification_loss: 0.1539 100/500 [=====>........................] - ETA: 1:33 - loss: 1.2612 - regression_loss: 1.1085 - classification_loss: 0.1527 101/500 [=====>........................] - ETA: 1:32 - loss: 1.2576 - regression_loss: 1.1052 - classification_loss: 0.1524 102/500 [=====>........................] - ETA: 1:32 - loss: 1.2519 - regression_loss: 1.1002 - classification_loss: 0.1516 103/500 [=====>........................] - ETA: 1:32 - loss: 1.2483 - regression_loss: 1.0975 - classification_loss: 0.1508 104/500 [=====>........................] - ETA: 1:32 - loss: 1.2436 - regression_loss: 1.0922 - classification_loss: 0.1514 105/500 [=====>........................] 
- ETA: 1:32 - loss: 1.2543 - regression_loss: 1.1007 - classification_loss: 0.1536 106/500 [=====>........................] - ETA: 1:31 - loss: 1.2568 - regression_loss: 1.1031 - classification_loss: 0.1537 107/500 [=====>........................] - ETA: 1:31 - loss: 1.2566 - regression_loss: 1.1029 - classification_loss: 0.1537 108/500 [=====>........................] - ETA: 1:31 - loss: 1.2560 - regression_loss: 1.1022 - classification_loss: 0.1539 109/500 [=====>........................] - ETA: 1:31 - loss: 1.2555 - regression_loss: 1.1015 - classification_loss: 0.1540 110/500 [=====>........................] - ETA: 1:30 - loss: 1.2561 - regression_loss: 1.1021 - classification_loss: 0.1540 111/500 [=====>........................] - ETA: 1:30 - loss: 1.2522 - regression_loss: 1.0986 - classification_loss: 0.1536 112/500 [=====>........................] - ETA: 1:30 - loss: 1.2521 - regression_loss: 1.0982 - classification_loss: 0.1539 113/500 [=====>........................] - ETA: 1:30 - loss: 1.2667 - regression_loss: 1.1097 - classification_loss: 0.1570 114/500 [=====>........................] - ETA: 1:29 - loss: 1.2615 - regression_loss: 1.1052 - classification_loss: 0.1563 115/500 [=====>........................] - ETA: 1:29 - loss: 1.2737 - regression_loss: 1.1156 - classification_loss: 0.1580 116/500 [=====>........................] - ETA: 1:29 - loss: 1.2722 - regression_loss: 1.1144 - classification_loss: 0.1578 117/500 [======>.......................] - ETA: 1:29 - loss: 1.2811 - regression_loss: 1.1206 - classification_loss: 0.1605 118/500 [======>.......................] - ETA: 1:29 - loss: 1.2775 - regression_loss: 1.1176 - classification_loss: 0.1599 119/500 [======>.......................] - ETA: 1:28 - loss: 1.2779 - regression_loss: 1.1179 - classification_loss: 0.1600 120/500 [======>.......................] - ETA: 1:28 - loss: 1.2774 - regression_loss: 1.1177 - classification_loss: 0.1597 121/500 [======>.......................] 
- ETA: 1:28 - loss: 1.2723 - regression_loss: 1.1131 - classification_loss: 0.1592
[flattened carriage-return progress-bar updates for steps 122-456 condensed; representative checkpoints retained below. Running loss declined from ~1.27 to ~1.21 over the span.]
122/500 [======>.......................] - ETA: 1:28 - loss: 1.2702 - regression_loss: 1.1116 - classification_loss: 0.1587
150/500 [========>.....................] - ETA: 1:21 - loss: 1.2575 - regression_loss: 1.0997 - classification_loss: 0.1578
200/500 [===========>..................] - ETA: 1:10 - loss: 1.2497 - regression_loss: 1.0890 - classification_loss: 0.1608
250/500 [==============>...............] - ETA: 58s - loss: 1.2334 - regression_loss: 1.0775 - classification_loss: 0.1559
300/500 [=================>............] - ETA: 46s - loss: 1.2391 - regression_loss: 1.0814 - classification_loss: 0.1578
350/500 [====================>.........] - ETA: 35s - loss: 1.2185 - regression_loss: 1.0626 - classification_loss: 0.1559
400/500 [=======================>......] - ETA: 23s - loss: 1.2093 - regression_loss: 1.0544 - classification_loss: 0.1549
450/500 [==========================>...] - ETA: 11s - loss: 1.2084 - regression_loss: 1.0530 - classification_loss: 0.1554
457/500 [==========================>...] 
- ETA: 10s - loss: 1.2090 - regression_loss: 1.0529 - classification_loss: 0.1561 458/500 [==========================>...] - ETA: 9s - loss: 1.2083 - regression_loss: 1.0523 - classification_loss: 0.1560  459/500 [==========================>...] - ETA: 9s - loss: 1.2097 - regression_loss: 1.0535 - classification_loss: 0.1563 460/500 [==========================>...] - ETA: 9s - loss: 1.2099 - regression_loss: 1.0536 - classification_loss: 0.1563 461/500 [==========================>...] - ETA: 9s - loss: 1.2102 - regression_loss: 1.0537 - classification_loss: 0.1565 462/500 [==========================>...] - ETA: 8s - loss: 1.2110 - regression_loss: 1.0542 - classification_loss: 0.1568 463/500 [==========================>...] - ETA: 8s - loss: 1.2103 - regression_loss: 1.0537 - classification_loss: 0.1566 464/500 [==========================>...] - ETA: 8s - loss: 1.2111 - regression_loss: 1.0546 - classification_loss: 0.1566 465/500 [==========================>...] - ETA: 8s - loss: 1.2130 - regression_loss: 1.0558 - classification_loss: 0.1572 466/500 [==========================>...] - ETA: 7s - loss: 1.2143 - regression_loss: 1.0570 - classification_loss: 0.1573 467/500 [===========================>..] - ETA: 7s - loss: 1.2160 - regression_loss: 1.0586 - classification_loss: 0.1574 468/500 [===========================>..] - ETA: 7s - loss: 1.2175 - regression_loss: 1.0599 - classification_loss: 0.1576 469/500 [===========================>..] - ETA: 7s - loss: 1.2183 - regression_loss: 1.0605 - classification_loss: 0.1577 470/500 [===========================>..] - ETA: 7s - loss: 1.2198 - regression_loss: 1.0614 - classification_loss: 0.1584 471/500 [===========================>..] - ETA: 6s - loss: 1.2219 - regression_loss: 1.0634 - classification_loss: 0.1585 472/500 [===========================>..] - ETA: 6s - loss: 1.2217 - regression_loss: 1.0630 - classification_loss: 0.1587 473/500 [===========================>..] 
- ETA: 6s - loss: 1.2210 - regression_loss: 1.0624 - classification_loss: 0.1586 474/500 [===========================>..] - ETA: 6s - loss: 1.2194 - regression_loss: 1.0610 - classification_loss: 0.1583 475/500 [===========================>..] - ETA: 5s - loss: 1.2217 - regression_loss: 1.0629 - classification_loss: 0.1588 476/500 [===========================>..] - ETA: 5s - loss: 1.2220 - regression_loss: 1.0631 - classification_loss: 0.1589 477/500 [===========================>..] - ETA: 5s - loss: 1.2209 - regression_loss: 1.0620 - classification_loss: 0.1589 478/500 [===========================>..] - ETA: 5s - loss: 1.2204 - regression_loss: 1.0617 - classification_loss: 0.1588 479/500 [===========================>..] - ETA: 4s - loss: 1.2190 - regression_loss: 1.0605 - classification_loss: 0.1586 480/500 [===========================>..] - ETA: 4s - loss: 1.2184 - regression_loss: 1.0600 - classification_loss: 0.1584 481/500 [===========================>..] - ETA: 4s - loss: 1.2171 - regression_loss: 1.0589 - classification_loss: 0.1582 482/500 [===========================>..] - ETA: 4s - loss: 1.2168 - regression_loss: 1.0585 - classification_loss: 0.1582 483/500 [===========================>..] - ETA: 3s - loss: 1.2174 - regression_loss: 1.0592 - classification_loss: 0.1582 484/500 [============================>.] - ETA: 3s - loss: 1.2166 - regression_loss: 1.0586 - classification_loss: 0.1580 485/500 [============================>.] - ETA: 3s - loss: 1.2148 - regression_loss: 1.0570 - classification_loss: 0.1578 486/500 [============================>.] - ETA: 3s - loss: 1.2149 - regression_loss: 1.0569 - classification_loss: 0.1580 487/500 [============================>.] - ETA: 3s - loss: 1.2161 - regression_loss: 1.0579 - classification_loss: 0.1582 488/500 [============================>.] - ETA: 2s - loss: 1.2156 - regression_loss: 1.0575 - classification_loss: 0.1581 489/500 [============================>.] 
- ETA: 2s - loss: 1.2160 - regression_loss: 1.0577 - classification_loss: 0.1582 490/500 [============================>.] - ETA: 2s - loss: 1.2160 - regression_loss: 1.0578 - classification_loss: 0.1583 491/500 [============================>.] - ETA: 2s - loss: 1.2180 - regression_loss: 1.0593 - classification_loss: 0.1586 492/500 [============================>.] - ETA: 1s - loss: 1.2185 - regression_loss: 1.0599 - classification_loss: 0.1586 493/500 [============================>.] - ETA: 1s - loss: 1.2180 - regression_loss: 1.0592 - classification_loss: 0.1588 494/500 [============================>.] - ETA: 1s - loss: 1.2174 - regression_loss: 1.0587 - classification_loss: 0.1587 495/500 [============================>.] - ETA: 1s - loss: 1.2176 - regression_loss: 1.0590 - classification_loss: 0.1587 496/500 [============================>.] - ETA: 0s - loss: 1.2179 - regression_loss: 1.0593 - classification_loss: 0.1586 497/500 [============================>.] - ETA: 0s - loss: 1.2169 - regression_loss: 1.0585 - classification_loss: 0.1584 498/500 [============================>.] - ETA: 0s - loss: 1.2168 - regression_loss: 1.0586 - classification_loss: 0.1583 499/500 [============================>.] - ETA: 0s - loss: 1.2172 - regression_loss: 1.0587 - classification_loss: 0.1584 500/500 [==============================] - 117s 234ms/step - loss: 1.2159 - regression_loss: 1.0576 - classification_loss: 0.1583 326 instances of class plum with average precision: 0.7973 mAP: 0.7973 Epoch 00018: saving model to ./training/snapshots/resnet50_pascal_18.h5 Epoch 19/150 1/500 [..............................] - ETA: 1:43 - loss: 1.2043 - regression_loss: 1.0509 - classification_loss: 0.1535 2/500 [..............................] - ETA: 1:44 - loss: 1.2549 - regression_loss: 1.0916 - classification_loss: 0.1633 3/500 [..............................] - ETA: 1:50 - loss: 1.1062 - regression_loss: 0.9690 - classification_loss: 0.1371 4/500 [..............................] 
- ETA: 1:52 - loss: 1.4992 - regression_loss: 1.2984 - classification_loss: 0.2009 5/500 [..............................] - ETA: 1:55 - loss: 1.4650 - regression_loss: 1.2762 - classification_loss: 0.1888 6/500 [..............................] - ETA: 1:55 - loss: 1.4442 - regression_loss: 1.2625 - classification_loss: 0.1817 7/500 [..............................] - ETA: 1:57 - loss: 1.3933 - regression_loss: 1.2229 - classification_loss: 0.1704 8/500 [..............................] - ETA: 1:56 - loss: 1.3727 - regression_loss: 1.2032 - classification_loss: 0.1695 9/500 [..............................] - ETA: 1:56 - loss: 1.2728 - regression_loss: 1.1154 - classification_loss: 0.1574 10/500 [..............................] - ETA: 1:57 - loss: 1.2435 - regression_loss: 1.0880 - classification_loss: 0.1554 11/500 [..............................] - ETA: 1:56 - loss: 1.1919 - regression_loss: 1.0455 - classification_loss: 0.1464 12/500 [..............................] - ETA: 1:56 - loss: 1.3039 - regression_loss: 1.1265 - classification_loss: 0.1774 13/500 [..............................] - ETA: 1:56 - loss: 1.3623 - regression_loss: 1.1677 - classification_loss: 0.1945 14/500 [..............................] - ETA: 1:56 - loss: 1.3612 - regression_loss: 1.1727 - classification_loss: 0.1885 15/500 [..............................] - ETA: 1:56 - loss: 1.3707 - regression_loss: 1.1829 - classification_loss: 0.1878 16/500 [..............................] - ETA: 1:55 - loss: 1.3859 - regression_loss: 1.1952 - classification_loss: 0.1907 17/500 [>.............................] - ETA: 1:55 - loss: 1.3773 - regression_loss: 1.1898 - classification_loss: 0.1875 18/500 [>.............................] - ETA: 1:56 - loss: 1.3670 - regression_loss: 1.1828 - classification_loss: 0.1842 19/500 [>.............................] - ETA: 1:55 - loss: 1.3585 - regression_loss: 1.1763 - classification_loss: 0.1822 20/500 [>.............................] 
- ETA: 1:54 - loss: 1.3905 - regression_loss: 1.2016 - classification_loss: 0.1889 21/500 [>.............................] - ETA: 1:54 - loss: 1.3734 - regression_loss: 1.1865 - classification_loss: 0.1869 22/500 [>.............................] - ETA: 1:54 - loss: 1.3580 - regression_loss: 1.1737 - classification_loss: 0.1843 23/500 [>.............................] - ETA: 1:54 - loss: 1.3362 - regression_loss: 1.1568 - classification_loss: 0.1795 24/500 [>.............................] - ETA: 1:53 - loss: 1.3428 - regression_loss: 1.1638 - classification_loss: 0.1790 25/500 [>.............................] - ETA: 1:53 - loss: 1.3448 - regression_loss: 1.1660 - classification_loss: 0.1788 26/500 [>.............................] - ETA: 1:53 - loss: 1.3324 - regression_loss: 1.1569 - classification_loss: 0.1755 27/500 [>.............................] - ETA: 1:52 - loss: 1.3370 - regression_loss: 1.1616 - classification_loss: 0.1754 28/500 [>.............................] - ETA: 1:52 - loss: 1.3441 - regression_loss: 1.1667 - classification_loss: 0.1774 29/500 [>.............................] - ETA: 1:51 - loss: 1.3410 - regression_loss: 1.1639 - classification_loss: 0.1770 30/500 [>.............................] - ETA: 1:51 - loss: 1.3419 - regression_loss: 1.1642 - classification_loss: 0.1776 31/500 [>.............................] - ETA: 1:51 - loss: 1.3357 - regression_loss: 1.1592 - classification_loss: 0.1765 32/500 [>.............................] - ETA: 1:51 - loss: 1.3376 - regression_loss: 1.1619 - classification_loss: 0.1757 33/500 [>.............................] - ETA: 1:51 - loss: 1.3027 - regression_loss: 1.1267 - classification_loss: 0.1761 34/500 [=>............................] - ETA: 1:50 - loss: 1.3036 - regression_loss: 1.1269 - classification_loss: 0.1767 35/500 [=>............................] - ETA: 1:50 - loss: 1.2979 - regression_loss: 1.1209 - classification_loss: 0.1770 36/500 [=>............................] 
- ETA: 1:50 - loss: 1.2995 - regression_loss: 1.1232 - classification_loss: 0.1763 37/500 [=>............................] - ETA: 1:50 - loss: 1.3075 - regression_loss: 1.1324 - classification_loss: 0.1751 38/500 [=>............................] - ETA: 1:49 - loss: 1.3230 - regression_loss: 1.1418 - classification_loss: 0.1812 39/500 [=>............................] - ETA: 1:49 - loss: 1.3354 - regression_loss: 1.1547 - classification_loss: 0.1806 40/500 [=>............................] - ETA: 1:48 - loss: 1.3352 - regression_loss: 1.1566 - classification_loss: 0.1787 41/500 [=>............................] - ETA: 1:48 - loss: 1.3430 - regression_loss: 1.1628 - classification_loss: 0.1802 42/500 [=>............................] - ETA: 1:48 - loss: 1.3533 - regression_loss: 1.1706 - classification_loss: 0.1827 43/500 [=>............................] - ETA: 1:48 - loss: 1.3684 - regression_loss: 1.1834 - classification_loss: 0.1850 44/500 [=>............................] - ETA: 1:47 - loss: 1.3599 - regression_loss: 1.1761 - classification_loss: 0.1838 45/500 [=>............................] - ETA: 1:47 - loss: 1.3555 - regression_loss: 1.1732 - classification_loss: 0.1823 46/500 [=>............................] - ETA: 1:47 - loss: 1.3457 - regression_loss: 1.1654 - classification_loss: 0.1803 47/500 [=>............................] - ETA: 1:47 - loss: 1.3475 - regression_loss: 1.1637 - classification_loss: 0.1837 48/500 [=>............................] - ETA: 1:47 - loss: 1.3560 - regression_loss: 1.1718 - classification_loss: 0.1842 49/500 [=>............................] - ETA: 1:46 - loss: 1.3502 - regression_loss: 1.1678 - classification_loss: 0.1824 50/500 [==>...........................] - ETA: 1:46 - loss: 1.3442 - regression_loss: 1.1630 - classification_loss: 0.1812 51/500 [==>...........................] - ETA: 1:46 - loss: 1.3273 - regression_loss: 1.1479 - classification_loss: 0.1793 52/500 [==>...........................] 
- ETA: 1:46 - loss: 1.3092 - regression_loss: 1.1328 - classification_loss: 0.1763 53/500 [==>...........................] - ETA: 1:45 - loss: 1.3008 - regression_loss: 1.1249 - classification_loss: 0.1759 54/500 [==>...........................] - ETA: 1:45 - loss: 1.2967 - regression_loss: 1.1225 - classification_loss: 0.1742 55/500 [==>...........................] - ETA: 1:45 - loss: 1.2911 - regression_loss: 1.1176 - classification_loss: 0.1735 56/500 [==>...........................] - ETA: 1:45 - loss: 1.2721 - regression_loss: 1.1013 - classification_loss: 0.1708 57/500 [==>...........................] - ETA: 1:44 - loss: 1.2668 - regression_loss: 1.0975 - classification_loss: 0.1693 58/500 [==>...........................] - ETA: 1:44 - loss: 1.2605 - regression_loss: 1.0925 - classification_loss: 0.1681 59/500 [==>...........................] - ETA: 1:44 - loss: 1.2505 - regression_loss: 1.0843 - classification_loss: 0.1662 60/500 [==>...........................] - ETA: 1:44 - loss: 1.2430 - regression_loss: 1.0783 - classification_loss: 0.1647 61/500 [==>...........................] - ETA: 1:44 - loss: 1.2392 - regression_loss: 1.0751 - classification_loss: 0.1641 62/500 [==>...........................] - ETA: 1:43 - loss: 1.2285 - regression_loss: 1.0662 - classification_loss: 0.1623 63/500 [==>...........................] - ETA: 1:43 - loss: 1.2308 - regression_loss: 1.0689 - classification_loss: 0.1618 64/500 [==>...........................] - ETA: 1:43 - loss: 1.2338 - regression_loss: 1.0718 - classification_loss: 0.1620 65/500 [==>...........................] - ETA: 1:43 - loss: 1.2253 - regression_loss: 1.0648 - classification_loss: 0.1605 66/500 [==>...........................] - ETA: 1:42 - loss: 1.2177 - regression_loss: 1.0586 - classification_loss: 0.1591 67/500 [===>..........................] - ETA: 1:42 - loss: 1.2253 - regression_loss: 1.0654 - classification_loss: 0.1600 68/500 [===>..........................] 
- ETA: 1:42 - loss: 1.2409 - regression_loss: 1.0800 - classification_loss: 0.1609 69/500 [===>..........................] - ETA: 1:42 - loss: 1.2386 - regression_loss: 1.0784 - classification_loss: 0.1602 70/500 [===>..........................] - ETA: 1:42 - loss: 1.2380 - regression_loss: 1.0783 - classification_loss: 0.1596 71/500 [===>..........................] - ETA: 1:41 - loss: 1.2312 - regression_loss: 1.0727 - classification_loss: 0.1584 72/500 [===>..........................] - ETA: 1:41 - loss: 1.2224 - regression_loss: 1.0650 - classification_loss: 0.1574 73/500 [===>..........................] - ETA: 1:41 - loss: 1.2302 - regression_loss: 1.0708 - classification_loss: 0.1594 74/500 [===>..........................] - ETA: 1:41 - loss: 1.2177 - regression_loss: 1.0596 - classification_loss: 0.1581 75/500 [===>..........................] - ETA: 1:40 - loss: 1.2164 - regression_loss: 1.0583 - classification_loss: 0.1581 76/500 [===>..........................] - ETA: 1:40 - loss: 1.2158 - regression_loss: 1.0582 - classification_loss: 0.1576 77/500 [===>..........................] - ETA: 1:40 - loss: 1.2115 - regression_loss: 1.0551 - classification_loss: 0.1564 78/500 [===>..........................] - ETA: 1:40 - loss: 1.2088 - regression_loss: 1.0524 - classification_loss: 0.1564 79/500 [===>..........................] - ETA: 1:39 - loss: 1.2015 - regression_loss: 1.0466 - classification_loss: 0.1549 80/500 [===>..........................] - ETA: 1:39 - loss: 1.2008 - regression_loss: 1.0468 - classification_loss: 0.1540 81/500 [===>..........................] - ETA: 1:39 - loss: 1.1986 - regression_loss: 1.0444 - classification_loss: 0.1542 82/500 [===>..........................] - ETA: 1:39 - loss: 1.1979 - regression_loss: 1.0434 - classification_loss: 0.1545 83/500 [===>..........................] - ETA: 1:39 - loss: 1.1895 - regression_loss: 1.0360 - classification_loss: 0.1535 84/500 [====>.........................] 
- ETA: 1:38 - loss: 1.1936 - regression_loss: 1.0401 - classification_loss: 0.1535 85/500 [====>.........................] - ETA: 1:38 - loss: 1.1887 - regression_loss: 1.0360 - classification_loss: 0.1527 86/500 [====>.........................] - ETA: 1:38 - loss: 1.1853 - regression_loss: 1.0324 - classification_loss: 0.1529 87/500 [====>.........................] - ETA: 1:37 - loss: 1.1870 - regression_loss: 1.0342 - classification_loss: 0.1528 88/500 [====>.........................] - ETA: 1:37 - loss: 1.1801 - regression_loss: 1.0284 - classification_loss: 0.1518 89/500 [====>.........................] - ETA: 1:37 - loss: 1.1828 - regression_loss: 1.0312 - classification_loss: 0.1516 90/500 [====>.........................] - ETA: 1:37 - loss: 1.1811 - regression_loss: 1.0300 - classification_loss: 0.1511 91/500 [====>.........................] - ETA: 1:37 - loss: 1.1851 - regression_loss: 1.0328 - classification_loss: 0.1523 92/500 [====>.........................] - ETA: 1:36 - loss: 1.1812 - regression_loss: 1.0297 - classification_loss: 0.1515 93/500 [====>.........................] - ETA: 1:36 - loss: 1.1821 - regression_loss: 1.0307 - classification_loss: 0.1514 94/500 [====>.........................] - ETA: 1:36 - loss: 1.1827 - regression_loss: 1.0313 - classification_loss: 0.1515 95/500 [====>.........................] - ETA: 1:35 - loss: 1.1766 - regression_loss: 1.0262 - classification_loss: 0.1504 96/500 [====>.........................] - ETA: 1:35 - loss: 1.1768 - regression_loss: 1.0261 - classification_loss: 0.1507 97/500 [====>.........................] - ETA: 1:35 - loss: 1.1753 - regression_loss: 1.0252 - classification_loss: 0.1500 98/500 [====>.........................] - ETA: 1:35 - loss: 1.1718 - regression_loss: 1.0221 - classification_loss: 0.1497 99/500 [====>.........................] - ETA: 1:35 - loss: 1.1689 - regression_loss: 1.0197 - classification_loss: 0.1492 100/500 [=====>........................] 
- ETA: 1:35 - loss: 1.1805 - regression_loss: 1.0301 - classification_loss: 0.1504 101/500 [=====>........................] - ETA: 1:34 - loss: 1.1818 - regression_loss: 1.0313 - classification_loss: 0.1505 102/500 [=====>........................] - ETA: 1:34 - loss: 1.1800 - regression_loss: 1.0300 - classification_loss: 0.1500 103/500 [=====>........................] - ETA: 1:34 - loss: 1.1810 - regression_loss: 1.0304 - classification_loss: 0.1506 104/500 [=====>........................] - ETA: 1:33 - loss: 1.1854 - regression_loss: 1.0339 - classification_loss: 0.1515 105/500 [=====>........................] - ETA: 1:33 - loss: 1.1853 - regression_loss: 1.0333 - classification_loss: 0.1520 106/500 [=====>........................] - ETA: 1:33 - loss: 1.1852 - regression_loss: 1.0333 - classification_loss: 0.1519 107/500 [=====>........................] - ETA: 1:33 - loss: 1.1852 - regression_loss: 1.0330 - classification_loss: 0.1522 108/500 [=====>........................] - ETA: 1:33 - loss: 1.1909 - regression_loss: 1.0386 - classification_loss: 0.1523 109/500 [=====>........................] - ETA: 1:32 - loss: 1.1905 - regression_loss: 1.0386 - classification_loss: 0.1519 110/500 [=====>........................] - ETA: 1:32 - loss: 1.1953 - regression_loss: 1.0420 - classification_loss: 0.1534 111/500 [=====>........................] - ETA: 1:32 - loss: 1.1935 - regression_loss: 1.0403 - classification_loss: 0.1532 112/500 [=====>........................] - ETA: 1:31 - loss: 1.1908 - regression_loss: 1.0361 - classification_loss: 0.1547 113/500 [=====>........................] - ETA: 1:31 - loss: 1.1982 - regression_loss: 1.0423 - classification_loss: 0.1560 114/500 [=====>........................] - ETA: 1:31 - loss: 1.1970 - regression_loss: 1.0414 - classification_loss: 0.1556 115/500 [=====>........................] - ETA: 1:31 - loss: 1.2040 - regression_loss: 1.0469 - classification_loss: 0.1571 116/500 [=====>........................] 
- ETA: 1:30 - loss: 1.2044 - regression_loss: 1.0466 - classification_loss: 0.1578 117/500 [======>.......................] - ETA: 1:30 - loss: 1.1982 - regression_loss: 1.0414 - classification_loss: 0.1568 118/500 [======>.......................] - ETA: 1:30 - loss: 1.2013 - regression_loss: 1.0439 - classification_loss: 0.1574 119/500 [======>.......................] - ETA: 1:30 - loss: 1.2021 - regression_loss: 1.0447 - classification_loss: 0.1574 120/500 [======>.......................] - ETA: 1:29 - loss: 1.2002 - regression_loss: 1.0429 - classification_loss: 0.1572 121/500 [======>.......................] - ETA: 1:29 - loss: 1.2025 - regression_loss: 1.0450 - classification_loss: 0.1575 122/500 [======>.......................] - ETA: 1:29 - loss: 1.2017 - regression_loss: 1.0443 - classification_loss: 0.1574 123/500 [======>.......................] - ETA: 1:28 - loss: 1.2024 - regression_loss: 1.0450 - classification_loss: 0.1574 124/500 [======>.......................] - ETA: 1:28 - loss: 1.1991 - regression_loss: 1.0424 - classification_loss: 0.1567 125/500 [======>.......................] - ETA: 1:28 - loss: 1.1973 - regression_loss: 1.0410 - classification_loss: 0.1563 126/500 [======>.......................] - ETA: 1:28 - loss: 1.1978 - regression_loss: 1.0415 - classification_loss: 0.1562 127/500 [======>.......................] - ETA: 1:27 - loss: 1.1988 - regression_loss: 1.0427 - classification_loss: 0.1561 128/500 [======>.......................] - ETA: 1:27 - loss: 1.2079 - regression_loss: 1.0492 - classification_loss: 0.1587 129/500 [======>.......................] - ETA: 1:27 - loss: 1.2115 - regression_loss: 1.0518 - classification_loss: 0.1597 130/500 [======>.......................] - ETA: 1:27 - loss: 1.2070 - regression_loss: 1.0481 - classification_loss: 0.1589 131/500 [======>.......................] - ETA: 1:26 - loss: 1.2040 - regression_loss: 1.0459 - classification_loss: 0.1581 132/500 [======>.......................] 
- ETA: 1:26 - loss: 1.2036 - regression_loss: 1.0457 - classification_loss: 0.1579 133/500 [======>.......................] - ETA: 1:26 - loss: 1.2021 - regression_loss: 1.0445 - classification_loss: 0.1576 134/500 [=======>......................] - ETA: 1:26 - loss: 1.2016 - regression_loss: 1.0443 - classification_loss: 0.1573 135/500 [=======>......................] - ETA: 1:25 - loss: 1.2005 - regression_loss: 1.0435 - classification_loss: 0.1570 136/500 [=======>......................] - ETA: 1:25 - loss: 1.1988 - regression_loss: 1.0422 - classification_loss: 0.1566 137/500 [=======>......................] - ETA: 1:25 - loss: 1.1994 - regression_loss: 1.0426 - classification_loss: 0.1568 138/500 [=======>......................] - ETA: 1:25 - loss: 1.1964 - regression_loss: 1.0404 - classification_loss: 0.1560 139/500 [=======>......................] - ETA: 1:24 - loss: 1.1989 - regression_loss: 1.0429 - classification_loss: 0.1560 140/500 [=======>......................] - ETA: 1:24 - loss: 1.1991 - regression_loss: 1.0430 - classification_loss: 0.1561 141/500 [=======>......................] - ETA: 1:24 - loss: 1.1960 - regression_loss: 1.0404 - classification_loss: 0.1555 142/500 [=======>......................] - ETA: 1:24 - loss: 1.1948 - regression_loss: 1.0396 - classification_loss: 0.1552 143/500 [=======>......................] - ETA: 1:24 - loss: 1.1994 - regression_loss: 1.0434 - classification_loss: 0.1559 144/500 [=======>......................] - ETA: 1:23 - loss: 1.2024 - regression_loss: 1.0456 - classification_loss: 0.1568 145/500 [=======>......................] - ETA: 1:23 - loss: 1.1975 - regression_loss: 1.0415 - classification_loss: 0.1560 146/500 [=======>......................] - ETA: 1:23 - loss: 1.2015 - regression_loss: 1.0442 - classification_loss: 0.1573 147/500 [=======>......................] - ETA: 1:23 - loss: 1.1971 - regression_loss: 1.0404 - classification_loss: 0.1567 148/500 [=======>......................] 
- ETA: 1:22 - loss: 1.2021 - regression_loss: 1.0443 - classification_loss: 0.1579 149/500 [=======>......................] - ETA: 1:22 - loss: 1.2080 - regression_loss: 1.0494 - classification_loss: 0.1586 150/500 [========>.....................] - ETA: 1:22 - loss: 1.2054 - regression_loss: 1.0473 - classification_loss: 0.1581 151/500 [========>.....................] - ETA: 1:22 - loss: 1.2052 - regression_loss: 1.0476 - classification_loss: 0.1576 152/500 [========>.....................] - ETA: 1:22 - loss: 1.2047 - regression_loss: 1.0472 - classification_loss: 0.1575 153/500 [========>.....................] - ETA: 1:21 - loss: 1.2012 - regression_loss: 1.0442 - classification_loss: 0.1569 154/500 [========>.....................] - ETA: 1:21 - loss: 1.1984 - regression_loss: 1.0417 - classification_loss: 0.1566 155/500 [========>.....................] - ETA: 1:21 - loss: 1.1963 - regression_loss: 1.0399 - classification_loss: 0.1564 156/500 [========>.....................] - ETA: 1:20 - loss: 1.1956 - regression_loss: 1.0394 - classification_loss: 0.1562 157/500 [========>.....................] - ETA: 1:20 - loss: 1.1969 - regression_loss: 1.0407 - classification_loss: 0.1562 158/500 [========>.....................] - ETA: 1:20 - loss: 1.1974 - regression_loss: 1.0415 - classification_loss: 0.1559 159/500 [========>.....................] - ETA: 1:20 - loss: 1.1983 - regression_loss: 1.0423 - classification_loss: 0.1560 160/500 [========>.....................] - ETA: 1:20 - loss: 1.2026 - regression_loss: 1.0462 - classification_loss: 0.1565 161/500 [========>.....................] - ETA: 1:19 - loss: 1.2003 - regression_loss: 1.0444 - classification_loss: 0.1559 162/500 [========>.....................] - ETA: 1:19 - loss: 1.2003 - regression_loss: 1.0447 - classification_loss: 0.1556 163/500 [========>.....................] - ETA: 1:19 - loss: 1.2023 - regression_loss: 1.0468 - classification_loss: 0.1556 164/500 [========>.....................] 
- ETA: 1:19 - loss: 1.1972 - regression_loss: 1.0423 - classification_loss: 0.1549
165/500 [========>.....................] - ETA: 1:18 - loss: 1.1973 - regression_loss: 1.0426 - classification_loss: 0.1547
200/500 [===========>..................] - ETA: 1:10 - loss: 1.2084 - regression_loss: 1.0522 - classification_loss: 0.1562
250/500 [==============>...............] - ETA: 58s - loss: 1.2247 - regression_loss: 1.0683 - classification_loss: 0.1564
300/500 [=================>............] - ETA: 47s - loss: 1.2070 - regression_loss: 1.0527 - classification_loss: 0.1543
350/500 [====================>.........] - ETA: 35s - loss: 1.2121 - regression_loss: 1.0579 - classification_loss: 0.1542
400/500 [=======================>......] - ETA: 23s - loss: 1.2114 - regression_loss: 1.0549 - classification_loss: 0.1565
450/500 [==========================>...] - ETA: 11s - loss: 1.2089 - regression_loss: 1.0524 - classification_loss: 0.1565
498/500 [============================>.] - ETA: 0s - loss: 1.2058 - regression_loss: 1.0501 - classification_loss: 0.1558
499/500 [============================>.]
- ETA: 0s - loss: 1.2056 - regression_loss: 1.0500 - classification_loss: 0.1556 500/500 [==============================] - 118s 235ms/step - loss: 1.2062 - regression_loss: 1.0506 - classification_loss: 0.1556 326 instances of class plum with average precision: 0.8545 mAP: 0.8545 Epoch 00019: saving model to ./training/snapshots/resnet50_pascal_19.h5 Epoch 20/150 1/500 [..............................] - ETA: 1:43 - loss: 1.0175 - regression_loss: 0.8571 - classification_loss: 0.1604 2/500 [..............................] - ETA: 1:57 - loss: 0.8132 - regression_loss: 0.6998 - classification_loss: 0.1134 3/500 [..............................] - ETA: 1:59 - loss: 0.9499 - regression_loss: 0.8184 - classification_loss: 0.1315 4/500 [..............................] - ETA: 2:00 - loss: 1.1011 - regression_loss: 0.9511 - classification_loss: 0.1500 5/500 [..............................] - ETA: 1:57 - loss: 1.0890 - regression_loss: 0.9485 - classification_loss: 0.1405 6/500 [..............................] - ETA: 1:55 - loss: 1.0883 - regression_loss: 0.9499 - classification_loss: 0.1384 7/500 [..............................] - ETA: 1:56 - loss: 1.0791 - regression_loss: 0.9411 - classification_loss: 0.1380 8/500 [..............................] - ETA: 1:55 - loss: 1.0212 - regression_loss: 0.8939 - classification_loss: 0.1273 9/500 [..............................] - ETA: 1:55 - loss: 1.0458 - regression_loss: 0.9159 - classification_loss: 0.1299 10/500 [..............................] - ETA: 1:56 - loss: 1.0492 - regression_loss: 0.9209 - classification_loss: 0.1283 11/500 [..............................] - ETA: 1:56 - loss: 1.0856 - regression_loss: 0.9550 - classification_loss: 0.1306 12/500 [..............................] - ETA: 1:56 - loss: 1.0762 - regression_loss: 0.9516 - classification_loss: 0.1246 13/500 [..............................] - ETA: 1:55 - loss: 1.0405 - regression_loss: 0.9209 - classification_loss: 0.1196 14/500 [..............................] 
- ETA: 1:54 - loss: 1.0519 - regression_loss: 0.9305 - classification_loss: 0.1214 15/500 [..............................] - ETA: 1:54 - loss: 1.0342 - regression_loss: 0.9161 - classification_loss: 0.1182 16/500 [..............................] - ETA: 1:54 - loss: 1.0978 - regression_loss: 0.9618 - classification_loss: 0.1361 17/500 [>.............................] - ETA: 1:53 - loss: 1.0879 - regression_loss: 0.9544 - classification_loss: 0.1335 18/500 [>.............................] - ETA: 1:53 - loss: 1.1222 - regression_loss: 0.9846 - classification_loss: 0.1376 19/500 [>.............................] - ETA: 1:53 - loss: 1.0866 - regression_loss: 0.9549 - classification_loss: 0.1316 20/500 [>.............................] - ETA: 1:52 - loss: 1.1141 - regression_loss: 0.9781 - classification_loss: 0.1359 21/500 [>.............................] - ETA: 1:52 - loss: 1.1214 - regression_loss: 0.9848 - classification_loss: 0.1366 22/500 [>.............................] - ETA: 1:52 - loss: 1.1361 - regression_loss: 0.9979 - classification_loss: 0.1381 23/500 [>.............................] - ETA: 1:52 - loss: 1.1330 - regression_loss: 0.9943 - classification_loss: 0.1387 24/500 [>.............................] - ETA: 1:51 - loss: 1.1275 - regression_loss: 0.9906 - classification_loss: 0.1369 25/500 [>.............................] - ETA: 1:51 - loss: 1.1183 - regression_loss: 0.9835 - classification_loss: 0.1348 26/500 [>.............................] - ETA: 1:51 - loss: 1.1288 - regression_loss: 0.9917 - classification_loss: 0.1371 27/500 [>.............................] - ETA: 1:51 - loss: 1.1343 - regression_loss: 0.9927 - classification_loss: 0.1416 28/500 [>.............................] - ETA: 1:51 - loss: 1.1316 - regression_loss: 0.9901 - classification_loss: 0.1415 29/500 [>.............................] - ETA: 1:51 - loss: 1.1469 - regression_loss: 1.0042 - classification_loss: 0.1427 30/500 [>.............................] 
- ETA: 1:50 - loss: 1.1141 - regression_loss: 0.9707 - classification_loss: 0.1434 31/500 [>.............................] - ETA: 1:50 - loss: 1.1474 - regression_loss: 1.0007 - classification_loss: 0.1466 32/500 [>.............................] - ETA: 1:50 - loss: 1.1467 - regression_loss: 0.9988 - classification_loss: 0.1479 33/500 [>.............................] - ETA: 1:49 - loss: 1.1395 - regression_loss: 0.9931 - classification_loss: 0.1465 34/500 [=>............................] - ETA: 1:49 - loss: 1.1468 - regression_loss: 1.0000 - classification_loss: 0.1467 35/500 [=>............................] - ETA: 1:49 - loss: 1.1574 - regression_loss: 1.0095 - classification_loss: 0.1479 36/500 [=>............................] - ETA: 1:49 - loss: 1.1402 - regression_loss: 0.9942 - classification_loss: 0.1460 37/500 [=>............................] - ETA: 1:49 - loss: 1.1484 - regression_loss: 1.0020 - classification_loss: 0.1464 38/500 [=>............................] - ETA: 1:48 - loss: 1.2095 - regression_loss: 1.0457 - classification_loss: 0.1638 39/500 [=>............................] - ETA: 1:48 - loss: 1.2252 - regression_loss: 1.0602 - classification_loss: 0.1650 40/500 [=>............................] - ETA: 1:47 - loss: 1.2258 - regression_loss: 1.0606 - classification_loss: 0.1652 41/500 [=>............................] - ETA: 1:47 - loss: 1.2069 - regression_loss: 1.0445 - classification_loss: 0.1624 42/500 [=>............................] - ETA: 1:47 - loss: 1.1970 - regression_loss: 1.0366 - classification_loss: 0.1604 43/500 [=>............................] - ETA: 1:47 - loss: 1.2004 - regression_loss: 1.0402 - classification_loss: 0.1602 44/500 [=>............................] - ETA: 1:47 - loss: 1.2166 - regression_loss: 1.0546 - classification_loss: 0.1619 45/500 [=>............................] - ETA: 1:46 - loss: 1.2060 - regression_loss: 1.0454 - classification_loss: 0.1605 46/500 [=>............................] 
- ETA: 1:46 - loss: 1.1975 - regression_loss: 1.0396 - classification_loss: 0.1579 47/500 [=>............................] - ETA: 1:46 - loss: 1.2108 - regression_loss: 1.0511 - classification_loss: 0.1596 48/500 [=>............................] - ETA: 1:46 - loss: 1.1996 - regression_loss: 1.0425 - classification_loss: 0.1571 49/500 [=>............................] - ETA: 1:46 - loss: 1.2004 - regression_loss: 1.0441 - classification_loss: 0.1564 50/500 [==>...........................] - ETA: 1:45 - loss: 1.2062 - regression_loss: 1.0480 - classification_loss: 0.1582 51/500 [==>...........................] - ETA: 1:45 - loss: 1.1979 - regression_loss: 1.0411 - classification_loss: 0.1567 52/500 [==>...........................] - ETA: 1:45 - loss: 1.1898 - regression_loss: 1.0340 - classification_loss: 0.1558 53/500 [==>...........................] - ETA: 1:44 - loss: 1.1997 - regression_loss: 1.0432 - classification_loss: 0.1565 54/500 [==>...........................] - ETA: 1:44 - loss: 1.2005 - regression_loss: 1.0437 - classification_loss: 0.1569 55/500 [==>...........................] - ETA: 1:44 - loss: 1.1961 - regression_loss: 1.0412 - classification_loss: 0.1549 56/500 [==>...........................] - ETA: 1:44 - loss: 1.1933 - regression_loss: 1.0397 - classification_loss: 0.1535 57/500 [==>...........................] - ETA: 1:44 - loss: 1.1912 - regression_loss: 1.0386 - classification_loss: 0.1526 58/500 [==>...........................] - ETA: 1:44 - loss: 1.1893 - regression_loss: 1.0382 - classification_loss: 0.1511 59/500 [==>...........................] - ETA: 1:43 - loss: 1.1866 - regression_loss: 1.0352 - classification_loss: 0.1514 60/500 [==>...........................] - ETA: 1:43 - loss: 1.1818 - regression_loss: 1.0319 - classification_loss: 0.1500 61/500 [==>...........................] - ETA: 1:43 - loss: 1.1814 - regression_loss: 1.0319 - classification_loss: 0.1495 62/500 [==>...........................] 
- ETA: 1:43 - loss: 1.1796 - regression_loss: 1.0300 - classification_loss: 0.1497 63/500 [==>...........................] - ETA: 1:42 - loss: 1.1722 - regression_loss: 1.0239 - classification_loss: 0.1483 64/500 [==>...........................] - ETA: 1:42 - loss: 1.1724 - regression_loss: 1.0245 - classification_loss: 0.1479 65/500 [==>...........................] - ETA: 1:42 - loss: 1.1696 - regression_loss: 1.0219 - classification_loss: 0.1477 66/500 [==>...........................] - ETA: 1:42 - loss: 1.1653 - regression_loss: 1.0186 - classification_loss: 0.1467 67/500 [===>..........................] - ETA: 1:41 - loss: 1.1715 - regression_loss: 1.0250 - classification_loss: 0.1465 68/500 [===>..........................] - ETA: 1:41 - loss: 1.1681 - regression_loss: 1.0223 - classification_loss: 0.1458 69/500 [===>..........................] - ETA: 1:41 - loss: 1.1667 - regression_loss: 1.0195 - classification_loss: 0.1472 70/500 [===>..........................] - ETA: 1:41 - loss: 1.1730 - regression_loss: 1.0259 - classification_loss: 0.1471 71/500 [===>..........................] - ETA: 1:40 - loss: 1.1818 - regression_loss: 1.0340 - classification_loss: 0.1478 72/500 [===>..........................] - ETA: 1:40 - loss: 1.1816 - regression_loss: 1.0338 - classification_loss: 0.1479 73/500 [===>..........................] - ETA: 1:40 - loss: 1.1813 - regression_loss: 1.0336 - classification_loss: 0.1477 74/500 [===>..........................] - ETA: 1:40 - loss: 1.1742 - regression_loss: 1.0276 - classification_loss: 0.1467 75/500 [===>..........................] - ETA: 1:39 - loss: 1.1728 - regression_loss: 1.0267 - classification_loss: 0.1462 76/500 [===>..........................] - ETA: 1:39 - loss: 1.1769 - regression_loss: 1.0315 - classification_loss: 0.1454 77/500 [===>..........................] - ETA: 1:39 - loss: 1.1732 - regression_loss: 1.0287 - classification_loss: 0.1445 78/500 [===>..........................] 
- ETA: 1:39 - loss: 1.1758 - regression_loss: 1.0312 - classification_loss: 0.1446 79/500 [===>..........................] - ETA: 1:38 - loss: 1.1731 - regression_loss: 1.0294 - classification_loss: 0.1437 80/500 [===>..........................] - ETA: 1:38 - loss: 1.1690 - regression_loss: 1.0254 - classification_loss: 0.1435 81/500 [===>..........................] - ETA: 1:38 - loss: 1.1630 - regression_loss: 1.0207 - classification_loss: 0.1422 82/500 [===>..........................] - ETA: 1:38 - loss: 1.1745 - regression_loss: 1.0294 - classification_loss: 0.1451 83/500 [===>..........................] - ETA: 1:37 - loss: 1.1677 - regression_loss: 1.0240 - classification_loss: 0.1437 84/500 [====>.........................] - ETA: 1:37 - loss: 1.1628 - regression_loss: 1.0196 - classification_loss: 0.1432 85/500 [====>.........................] - ETA: 1:37 - loss: 1.1645 - regression_loss: 1.0206 - classification_loss: 0.1439 86/500 [====>.........................] - ETA: 1:37 - loss: 1.1630 - regression_loss: 1.0195 - classification_loss: 0.1436 87/500 [====>.........................] - ETA: 1:36 - loss: 1.1589 - regression_loss: 1.0165 - classification_loss: 0.1424 88/500 [====>.........................] - ETA: 1:36 - loss: 1.1542 - regression_loss: 1.0126 - classification_loss: 0.1416 89/500 [====>.........................] - ETA: 1:36 - loss: 1.1593 - regression_loss: 1.0167 - classification_loss: 0.1426 90/500 [====>.........................] - ETA: 1:36 - loss: 1.1612 - regression_loss: 1.0187 - classification_loss: 0.1425 91/500 [====>.........................] - ETA: 1:35 - loss: 1.1564 - regression_loss: 1.0148 - classification_loss: 0.1416 92/500 [====>.........................] - ETA: 1:35 - loss: 1.1604 - regression_loss: 1.0182 - classification_loss: 0.1422 93/500 [====>.........................] - ETA: 1:35 - loss: 1.1686 - regression_loss: 1.0257 - classification_loss: 0.1429 94/500 [====>.........................] 
- ETA: 1:35 - loss: 1.1688 - regression_loss: 1.0261 - classification_loss: 0.1426 95/500 [====>.........................] - ETA: 1:34 - loss: 1.1654 - regression_loss: 1.0233 - classification_loss: 0.1422 96/500 [====>.........................] - ETA: 1:34 - loss: 1.1622 - regression_loss: 1.0207 - classification_loss: 0.1416 97/500 [====>.........................] - ETA: 1:34 - loss: 1.1667 - regression_loss: 1.0248 - classification_loss: 0.1419 98/500 [====>.........................] - ETA: 1:34 - loss: 1.1681 - regression_loss: 1.0268 - classification_loss: 0.1413 99/500 [====>.........................] - ETA: 1:33 - loss: 1.1609 - regression_loss: 1.0206 - classification_loss: 0.1403 100/500 [=====>........................] - ETA: 1:33 - loss: 1.1586 - regression_loss: 1.0187 - classification_loss: 0.1399 101/500 [=====>........................] - ETA: 1:33 - loss: 1.1514 - regression_loss: 1.0124 - classification_loss: 0.1390 102/500 [=====>........................] - ETA: 1:33 - loss: 1.1597 - regression_loss: 1.0190 - classification_loss: 0.1407 103/500 [=====>........................] - ETA: 1:33 - loss: 1.1536 - regression_loss: 1.0138 - classification_loss: 0.1397 104/500 [=====>........................] - ETA: 1:32 - loss: 1.1544 - regression_loss: 1.0139 - classification_loss: 0.1405 105/500 [=====>........................] - ETA: 1:32 - loss: 1.1579 - regression_loss: 1.0172 - classification_loss: 0.1407 106/500 [=====>........................] - ETA: 1:32 - loss: 1.1574 - regression_loss: 1.0172 - classification_loss: 0.1402 107/500 [=====>........................] - ETA: 1:31 - loss: 1.1543 - regression_loss: 1.0144 - classification_loss: 0.1398 108/500 [=====>........................] - ETA: 1:31 - loss: 1.1518 - regression_loss: 1.0123 - classification_loss: 0.1395 109/500 [=====>........................] - ETA: 1:31 - loss: 1.1456 - regression_loss: 1.0070 - classification_loss: 0.1387 110/500 [=====>........................] 
- ETA: 1:31 - loss: 1.1358 - regression_loss: 0.9978 - classification_loss: 0.1380 111/500 [=====>........................] - ETA: 1:31 - loss: 1.1418 - regression_loss: 1.0029 - classification_loss: 0.1389 112/500 [=====>........................] - ETA: 1:30 - loss: 1.1379 - regression_loss: 0.9997 - classification_loss: 0.1381 113/500 [=====>........................] - ETA: 1:30 - loss: 1.1431 - regression_loss: 1.0038 - classification_loss: 0.1393 114/500 [=====>........................] - ETA: 1:30 - loss: 1.1449 - regression_loss: 1.0055 - classification_loss: 0.1394 115/500 [=====>........................] - ETA: 1:30 - loss: 1.1512 - regression_loss: 1.0105 - classification_loss: 0.1407 116/500 [=====>........................] - ETA: 1:30 - loss: 1.1509 - regression_loss: 1.0100 - classification_loss: 0.1410 117/500 [======>.......................] - ETA: 1:29 - loss: 1.1542 - regression_loss: 1.0132 - classification_loss: 0.1410 118/500 [======>.......................] - ETA: 1:29 - loss: 1.1606 - regression_loss: 1.0186 - classification_loss: 0.1420 119/500 [======>.......................] - ETA: 1:29 - loss: 1.1609 - regression_loss: 1.0190 - classification_loss: 0.1420 120/500 [======>.......................] - ETA: 1:29 - loss: 1.1618 - regression_loss: 1.0194 - classification_loss: 0.1424 121/500 [======>.......................] - ETA: 1:28 - loss: 1.1624 - regression_loss: 1.0202 - classification_loss: 0.1421 122/500 [======>.......................] - ETA: 1:28 - loss: 1.1656 - regression_loss: 1.0233 - classification_loss: 0.1423 123/500 [======>.......................] - ETA: 1:28 - loss: 1.1635 - regression_loss: 1.0221 - classification_loss: 0.1414 124/500 [======>.......................] - ETA: 1:28 - loss: 1.1634 - regression_loss: 1.0218 - classification_loss: 0.1416 125/500 [======>.......................] - ETA: 1:27 - loss: 1.1629 - regression_loss: 1.0214 - classification_loss: 0.1415 126/500 [======>.......................] 
- ETA: 1:27 - loss: 1.1680 - regression_loss: 1.0256 - classification_loss: 0.1423 127/500 [======>.......................] - ETA: 1:27 - loss: 1.1686 - regression_loss: 1.0264 - classification_loss: 0.1422 128/500 [======>.......................] - ETA: 1:27 - loss: 1.1678 - regression_loss: 1.0261 - classification_loss: 0.1417 129/500 [======>.......................] - ETA: 1:26 - loss: 1.1702 - regression_loss: 1.0277 - classification_loss: 0.1425 130/500 [======>.......................] - ETA: 1:26 - loss: 1.1693 - regression_loss: 1.0270 - classification_loss: 0.1423 131/500 [======>.......................] - ETA: 1:26 - loss: 1.1631 - regression_loss: 1.0217 - classification_loss: 0.1414 132/500 [======>.......................] - ETA: 1:26 - loss: 1.1635 - regression_loss: 1.0219 - classification_loss: 0.1416 133/500 [======>.......................] - ETA: 1:25 - loss: 1.1616 - regression_loss: 1.0200 - classification_loss: 0.1416 134/500 [=======>......................] - ETA: 1:25 - loss: 1.1569 - regression_loss: 1.0159 - classification_loss: 0.1410 135/500 [=======>......................] - ETA: 1:25 - loss: 1.1543 - regression_loss: 1.0132 - classification_loss: 0.1410 136/500 [=======>......................] - ETA: 1:25 - loss: 1.1553 - regression_loss: 1.0143 - classification_loss: 0.1411 137/500 [=======>......................] - ETA: 1:25 - loss: 1.1526 - regression_loss: 1.0110 - classification_loss: 0.1416 138/500 [=======>......................] - ETA: 1:24 - loss: 1.1517 - regression_loss: 1.0103 - classification_loss: 0.1414 139/500 [=======>......................] - ETA: 1:24 - loss: 1.1528 - regression_loss: 1.0115 - classification_loss: 0.1413 140/500 [=======>......................] - ETA: 1:24 - loss: 1.1511 - regression_loss: 1.0101 - classification_loss: 0.1409 141/500 [=======>......................] - ETA: 1:24 - loss: 1.1540 - regression_loss: 1.0125 - classification_loss: 0.1415 142/500 [=======>......................] 
- ETA: 1:24 - loss: 1.1558 - regression_loss: 1.0141 - classification_loss: 0.1418 143/500 [=======>......................] - ETA: 1:23 - loss: 1.1566 - regression_loss: 1.0145 - classification_loss: 0.1421 144/500 [=======>......................] - ETA: 1:23 - loss: 1.1565 - regression_loss: 1.0146 - classification_loss: 0.1419 145/500 [=======>......................] - ETA: 1:23 - loss: 1.1557 - regression_loss: 1.0140 - classification_loss: 0.1417 146/500 [=======>......................] - ETA: 1:23 - loss: 1.1537 - regression_loss: 1.0124 - classification_loss: 0.1413 147/500 [=======>......................] - ETA: 1:22 - loss: 1.1532 - regression_loss: 1.0120 - classification_loss: 0.1412 148/500 [=======>......................] - ETA: 1:22 - loss: 1.1548 - regression_loss: 1.0133 - classification_loss: 0.1414 149/500 [=======>......................] - ETA: 1:22 - loss: 1.1525 - regression_loss: 1.0114 - classification_loss: 0.1411 150/500 [========>.....................] - ETA: 1:22 - loss: 1.1587 - regression_loss: 1.0170 - classification_loss: 0.1417 151/500 [========>.....................] - ETA: 1:21 - loss: 1.1607 - regression_loss: 1.0187 - classification_loss: 0.1420 152/500 [========>.....................] - ETA: 1:21 - loss: 1.1608 - regression_loss: 1.0188 - classification_loss: 0.1420 153/500 [========>.....................] - ETA: 1:21 - loss: 1.1649 - regression_loss: 1.0217 - classification_loss: 0.1432 154/500 [========>.....................] - ETA: 1:21 - loss: 1.1654 - regression_loss: 1.0222 - classification_loss: 0.1432 155/500 [========>.....................] - ETA: 1:21 - loss: 1.1646 - regression_loss: 1.0211 - classification_loss: 0.1435 156/500 [========>.....................] - ETA: 1:20 - loss: 1.1677 - regression_loss: 1.0240 - classification_loss: 0.1438 157/500 [========>.....................] - ETA: 1:20 - loss: 1.1667 - regression_loss: 1.0231 - classification_loss: 0.1436 158/500 [========>.....................] 
- ETA: 1:20 - loss: 1.1672 - regression_loss: 1.0237 - classification_loss: 0.1435 159/500 [========>.....................] - ETA: 1:20 - loss: 1.1690 - regression_loss: 1.0250 - classification_loss: 0.1440 160/500 [========>.....................] - ETA: 1:19 - loss: 1.1674 - regression_loss: 1.0237 - classification_loss: 0.1436 161/500 [========>.....................] - ETA: 1:19 - loss: 1.1649 - regression_loss: 1.0215 - classification_loss: 0.1433 162/500 [========>.....................] - ETA: 1:19 - loss: 1.1669 - regression_loss: 1.0232 - classification_loss: 0.1437 163/500 [========>.....................] - ETA: 1:19 - loss: 1.1686 - regression_loss: 1.0243 - classification_loss: 0.1443 164/500 [========>.....................] - ETA: 1:18 - loss: 1.1709 - regression_loss: 1.0260 - classification_loss: 0.1449 165/500 [========>.....................] - ETA: 1:18 - loss: 1.1670 - regression_loss: 1.0229 - classification_loss: 0.1441 166/500 [========>.....................] - ETA: 1:18 - loss: 1.1634 - regression_loss: 1.0198 - classification_loss: 0.1436 167/500 [=========>....................] - ETA: 1:18 - loss: 1.1647 - regression_loss: 1.0209 - classification_loss: 0.1438 168/500 [=========>....................] - ETA: 1:18 - loss: 1.1637 - regression_loss: 1.0196 - classification_loss: 0.1441 169/500 [=========>....................] - ETA: 1:17 - loss: 1.1609 - regression_loss: 1.0170 - classification_loss: 0.1439 170/500 [=========>....................] - ETA: 1:17 - loss: 1.1641 - regression_loss: 1.0190 - classification_loss: 0.1450 171/500 [=========>....................] - ETA: 1:17 - loss: 1.1645 - regression_loss: 1.0194 - classification_loss: 0.1451 172/500 [=========>....................] - ETA: 1:17 - loss: 1.1623 - regression_loss: 1.0177 - classification_loss: 0.1446 173/500 [=========>....................] - ETA: 1:16 - loss: 1.1614 - regression_loss: 1.0170 - classification_loss: 0.1444 174/500 [=========>....................] 
- ETA: 1:16 - loss: 1.1649 - regression_loss: 1.0204 - classification_loss: 0.1446 175/500 [=========>....................] - ETA: 1:16 - loss: 1.1681 - regression_loss: 1.0227 - classification_loss: 0.1454 176/500 [=========>....................] - ETA: 1:16 - loss: 1.1643 - regression_loss: 1.0195 - classification_loss: 0.1448 177/500 [=========>....................] - ETA: 1:15 - loss: 1.1644 - regression_loss: 1.0198 - classification_loss: 0.1446 178/500 [=========>....................] - ETA: 1:15 - loss: 1.1670 - regression_loss: 1.0215 - classification_loss: 0.1456 179/500 [=========>....................] - ETA: 1:15 - loss: 1.1647 - regression_loss: 1.0196 - classification_loss: 0.1450 180/500 [=========>....................] - ETA: 1:15 - loss: 1.1598 - regression_loss: 1.0154 - classification_loss: 0.1444 181/500 [=========>....................] - ETA: 1:14 - loss: 1.1589 - regression_loss: 1.0148 - classification_loss: 0.1441 182/500 [=========>....................] - ETA: 1:14 - loss: 1.1589 - regression_loss: 1.0150 - classification_loss: 0.1439 183/500 [=========>....................] - ETA: 1:14 - loss: 1.1562 - regression_loss: 1.0127 - classification_loss: 0.1435 184/500 [==========>...................] - ETA: 1:13 - loss: 1.1544 - regression_loss: 1.0114 - classification_loss: 0.1430 185/500 [==========>...................] - ETA: 1:13 - loss: 1.1616 - regression_loss: 1.0167 - classification_loss: 0.1449 186/500 [==========>...................] - ETA: 1:13 - loss: 1.1615 - regression_loss: 1.0168 - classification_loss: 0.1448 187/500 [==========>...................] - ETA: 1:13 - loss: 1.1605 - regression_loss: 1.0159 - classification_loss: 0.1446 188/500 [==========>...................] - ETA: 1:13 - loss: 1.1638 - regression_loss: 1.0189 - classification_loss: 0.1449 189/500 [==========>...................] - ETA: 1:12 - loss: 1.1638 - regression_loss: 1.0190 - classification_loss: 0.1449 190/500 [==========>...................] 
- ETA: 1:12 - loss: 1.1627 - regression_loss: 1.0184 - classification_loss: 0.1443 191/500 [==========>...................] - ETA: 1:12 - loss: 1.1614 - regression_loss: 1.0172 - classification_loss: 0.1442 192/500 [==========>...................] - ETA: 1:12 - loss: 1.1612 - regression_loss: 1.0172 - classification_loss: 0.1440 193/500 [==========>...................] - ETA: 1:11 - loss: 1.1581 - regression_loss: 1.0145 - classification_loss: 0.1436 194/500 [==========>...................] - ETA: 1:11 - loss: 1.1574 - regression_loss: 1.0138 - classification_loss: 0.1435 195/500 [==========>...................] - ETA: 1:11 - loss: 1.1578 - regression_loss: 1.0144 - classification_loss: 0.1434 196/500 [==========>...................] - ETA: 1:11 - loss: 1.1581 - regression_loss: 1.0147 - classification_loss: 0.1434 197/500 [==========>...................] - ETA: 1:10 - loss: 1.1581 - regression_loss: 1.0148 - classification_loss: 0.1433 198/500 [==========>...................] - ETA: 1:10 - loss: 1.1542 - regression_loss: 1.0115 - classification_loss: 0.1428 199/500 [==========>...................] - ETA: 1:10 - loss: 1.1595 - regression_loss: 1.0162 - classification_loss: 0.1434 200/500 [===========>..................] - ETA: 1:10 - loss: 1.1586 - regression_loss: 1.0152 - classification_loss: 0.1434 201/500 [===========>..................] - ETA: 1:09 - loss: 1.1602 - regression_loss: 1.0166 - classification_loss: 0.1436 202/500 [===========>..................] - ETA: 1:09 - loss: 1.1629 - regression_loss: 1.0189 - classification_loss: 0.1440 203/500 [===========>..................] - ETA: 1:09 - loss: 1.1608 - regression_loss: 1.0172 - classification_loss: 0.1435 204/500 [===========>..................] - ETA: 1:09 - loss: 1.1575 - regression_loss: 1.0146 - classification_loss: 0.1429 205/500 [===========>..................] - ETA: 1:08 - loss: 1.1550 - regression_loss: 1.0125 - classification_loss: 0.1425 206/500 [===========>..................] 
[per-batch progress-bar updates for steps 207/500 through 494/500 of epoch 20 elided; loss held in the 1.15-1.19 range (regression_loss ~1.02, classification_loss ~0.14-0.15)]
[steps 495/500 through 499/500 elided]
500/500 [==============================] - 117s 234ms/step - loss: 1.1760 - regression_loss: 1.0275 - classification_loss: 0.1485
326 instances of class plum with average precision: 0.8396
mAP: 0.8396
Epoch 00020: saving model to ./training/snapshots/resnet50_pascal_20.h5
Epoch 21/150
[per-batch progress-bar updates for steps 1/500 through 41/500 of epoch 21 elided; loss settled near 1.15 (regression_loss ~1.01, classification_loss ~0.14)]
- ETA: 1:47 - loss: 1.1418 - regression_loss: 1.0060 - classification_loss: 0.1358 42/500 [=>............................] - ETA: 1:47 - loss: 1.1468 - regression_loss: 1.0100 - classification_loss: 0.1368 43/500 [=>............................] - ETA: 1:46 - loss: 1.1539 - regression_loss: 1.0179 - classification_loss: 0.1360 44/500 [=>............................] - ETA: 1:46 - loss: 1.1579 - regression_loss: 1.0210 - classification_loss: 0.1369 45/500 [=>............................] - ETA: 1:46 - loss: 1.1529 - regression_loss: 1.0171 - classification_loss: 0.1358 46/500 [=>............................] - ETA: 1:46 - loss: 1.1599 - regression_loss: 1.0223 - classification_loss: 0.1376 47/500 [=>............................] - ETA: 1:46 - loss: 1.1556 - regression_loss: 1.0191 - classification_loss: 0.1366 48/500 [=>............................] - ETA: 1:46 - loss: 1.1627 - regression_loss: 1.0267 - classification_loss: 0.1359 49/500 [=>............................] - ETA: 1:45 - loss: 1.1695 - regression_loss: 1.0333 - classification_loss: 0.1361 50/500 [==>...........................] - ETA: 1:45 - loss: 1.1614 - regression_loss: 1.0263 - classification_loss: 0.1350 51/500 [==>...........................] - ETA: 1:45 - loss: 1.1828 - regression_loss: 1.0409 - classification_loss: 0.1419 52/500 [==>...........................] - ETA: 1:44 - loss: 1.1869 - regression_loss: 1.0446 - classification_loss: 0.1424 53/500 [==>...........................] - ETA: 1:44 - loss: 1.1791 - regression_loss: 1.0388 - classification_loss: 0.1404 54/500 [==>...........................] - ETA: 1:44 - loss: 1.1903 - regression_loss: 1.0471 - classification_loss: 0.1432 55/500 [==>...........................] - ETA: 1:44 - loss: 1.1916 - regression_loss: 1.0485 - classification_loss: 0.1431 56/500 [==>...........................] - ETA: 1:44 - loss: 1.2034 - regression_loss: 1.0578 - classification_loss: 0.1456 57/500 [==>...........................] 
- ETA: 1:44 - loss: 1.1966 - regression_loss: 1.0526 - classification_loss: 0.1440 58/500 [==>...........................] - ETA: 1:44 - loss: 1.1939 - regression_loss: 1.0470 - classification_loss: 0.1470 59/500 [==>...........................] - ETA: 1:43 - loss: 1.1917 - regression_loss: 1.0443 - classification_loss: 0.1474 60/500 [==>...........................] - ETA: 1:43 - loss: 1.1887 - regression_loss: 1.0424 - classification_loss: 0.1463 61/500 [==>...........................] - ETA: 1:43 - loss: 1.1921 - regression_loss: 1.0462 - classification_loss: 0.1459 62/500 [==>...........................] - ETA: 1:42 - loss: 1.2081 - regression_loss: 1.0597 - classification_loss: 0.1484 63/500 [==>...........................] - ETA: 1:42 - loss: 1.2047 - regression_loss: 1.0572 - classification_loss: 0.1475 64/500 [==>...........................] - ETA: 1:42 - loss: 1.2083 - regression_loss: 1.0592 - classification_loss: 0.1491 65/500 [==>...........................] - ETA: 1:42 - loss: 1.2000 - regression_loss: 1.0525 - classification_loss: 0.1475 66/500 [==>...........................] - ETA: 1:41 - loss: 1.1994 - regression_loss: 1.0515 - classification_loss: 0.1479 67/500 [===>..........................] - ETA: 1:41 - loss: 1.1969 - regression_loss: 1.0496 - classification_loss: 0.1473 68/500 [===>..........................] - ETA: 1:41 - loss: 1.1961 - regression_loss: 1.0492 - classification_loss: 0.1470 69/500 [===>..........................] - ETA: 1:41 - loss: 1.1858 - regression_loss: 1.0406 - classification_loss: 0.1453 70/500 [===>..........................] - ETA: 1:41 - loss: 1.1912 - regression_loss: 1.0447 - classification_loss: 0.1465 71/500 [===>..........................] - ETA: 1:40 - loss: 1.1894 - regression_loss: 1.0428 - classification_loss: 0.1467 72/500 [===>..........................] - ETA: 1:40 - loss: 1.1790 - regression_loss: 1.0334 - classification_loss: 0.1456 73/500 [===>..........................] 
- ETA: 1:40 - loss: 1.1741 - regression_loss: 1.0301 - classification_loss: 0.1441 74/500 [===>..........................] - ETA: 1:40 - loss: 1.1726 - regression_loss: 1.0298 - classification_loss: 0.1428 75/500 [===>..........................] - ETA: 1:40 - loss: 1.1685 - regression_loss: 1.0265 - classification_loss: 0.1420 76/500 [===>..........................] - ETA: 1:39 - loss: 1.1744 - regression_loss: 1.0313 - classification_loss: 0.1431 77/500 [===>..........................] - ETA: 1:39 - loss: 1.1756 - regression_loss: 1.0317 - classification_loss: 0.1438 78/500 [===>..........................] - ETA: 1:39 - loss: 1.1655 - regression_loss: 1.0229 - classification_loss: 0.1426 79/500 [===>..........................] - ETA: 1:39 - loss: 1.1625 - regression_loss: 1.0201 - classification_loss: 0.1424 80/500 [===>..........................] - ETA: 1:39 - loss: 1.1550 - regression_loss: 1.0137 - classification_loss: 0.1413 81/500 [===>..........................] - ETA: 1:38 - loss: 1.1552 - regression_loss: 1.0143 - classification_loss: 0.1409 82/500 [===>..........................] - ETA: 1:38 - loss: 1.1453 - regression_loss: 1.0055 - classification_loss: 0.1398 83/500 [===>..........................] - ETA: 1:38 - loss: 1.1394 - regression_loss: 1.0006 - classification_loss: 0.1388 84/500 [====>.........................] - ETA: 1:37 - loss: 1.1438 - regression_loss: 1.0042 - classification_loss: 0.1396 85/500 [====>.........................] - ETA: 1:37 - loss: 1.1404 - regression_loss: 1.0017 - classification_loss: 0.1387 86/500 [====>.........................] - ETA: 1:37 - loss: 1.1452 - regression_loss: 1.0059 - classification_loss: 0.1393 87/500 [====>.........................] - ETA: 1:37 - loss: 1.1398 - regression_loss: 1.0018 - classification_loss: 0.1381 88/500 [====>.........................] - ETA: 1:37 - loss: 1.1411 - regression_loss: 1.0031 - classification_loss: 0.1380 89/500 [====>.........................] 
- ETA: 1:36 - loss: 1.1422 - regression_loss: 1.0035 - classification_loss: 0.1388 90/500 [====>.........................] - ETA: 1:36 - loss: 1.1405 - regression_loss: 1.0013 - classification_loss: 0.1392 91/500 [====>.........................] - ETA: 1:36 - loss: 1.1390 - regression_loss: 0.9996 - classification_loss: 0.1394 92/500 [====>.........................] - ETA: 1:36 - loss: 1.1500 - regression_loss: 1.0081 - classification_loss: 0.1419 93/500 [====>.........................] - ETA: 1:35 - loss: 1.1484 - regression_loss: 1.0069 - classification_loss: 0.1414 94/500 [====>.........................] - ETA: 1:35 - loss: 1.1455 - regression_loss: 1.0051 - classification_loss: 0.1404 95/500 [====>.........................] - ETA: 1:35 - loss: 1.1427 - regression_loss: 1.0024 - classification_loss: 0.1403 96/500 [====>.........................] - ETA: 1:35 - loss: 1.1421 - regression_loss: 1.0020 - classification_loss: 0.1402 97/500 [====>.........................] - ETA: 1:34 - loss: 1.1372 - regression_loss: 0.9978 - classification_loss: 0.1394 98/500 [====>.........................] - ETA: 1:34 - loss: 1.1478 - regression_loss: 1.0065 - classification_loss: 0.1413 99/500 [====>.........................] - ETA: 1:34 - loss: 1.1522 - regression_loss: 1.0105 - classification_loss: 0.1416 100/500 [=====>........................] - ETA: 1:34 - loss: 1.1502 - regression_loss: 1.0088 - classification_loss: 0.1414 101/500 [=====>........................] - ETA: 1:33 - loss: 1.1432 - regression_loss: 1.0027 - classification_loss: 0.1405 102/500 [=====>........................] - ETA: 1:33 - loss: 1.1438 - regression_loss: 1.0039 - classification_loss: 0.1399 103/500 [=====>........................] - ETA: 1:33 - loss: 1.1486 - regression_loss: 1.0078 - classification_loss: 0.1408 104/500 [=====>........................] - ETA: 1:33 - loss: 1.1474 - regression_loss: 1.0069 - classification_loss: 0.1405 105/500 [=====>........................] 
- ETA: 1:32 - loss: 1.1443 - regression_loss: 1.0042 - classification_loss: 0.1401 106/500 [=====>........................] - ETA: 1:32 - loss: 1.1427 - regression_loss: 1.0029 - classification_loss: 0.1398 107/500 [=====>........................] - ETA: 1:32 - loss: 1.1370 - regression_loss: 0.9981 - classification_loss: 0.1389 108/500 [=====>........................] - ETA: 1:32 - loss: 1.1362 - regression_loss: 0.9975 - classification_loss: 0.1388 109/500 [=====>........................] - ETA: 1:31 - loss: 1.1338 - regression_loss: 0.9955 - classification_loss: 0.1384 110/500 [=====>........................] - ETA: 1:31 - loss: 1.1342 - regression_loss: 0.9963 - classification_loss: 0.1380 111/500 [=====>........................] - ETA: 1:31 - loss: 1.1322 - regression_loss: 0.9947 - classification_loss: 0.1375 112/500 [=====>........................] - ETA: 1:31 - loss: 1.1569 - regression_loss: 1.0068 - classification_loss: 0.1501 113/500 [=====>........................] - ETA: 1:31 - loss: 1.1593 - regression_loss: 1.0092 - classification_loss: 0.1501 114/500 [=====>........................] - ETA: 1:30 - loss: 1.1653 - regression_loss: 1.0147 - classification_loss: 0.1506 115/500 [=====>........................] - ETA: 1:30 - loss: 1.1644 - regression_loss: 1.0142 - classification_loss: 0.1502 116/500 [=====>........................] - ETA: 1:30 - loss: 1.1621 - regression_loss: 1.0127 - classification_loss: 0.1493 117/500 [======>.......................] - ETA: 1:30 - loss: 1.1621 - regression_loss: 1.0122 - classification_loss: 0.1499 118/500 [======>.......................] - ETA: 1:29 - loss: 1.1585 - regression_loss: 1.0093 - classification_loss: 0.1492 119/500 [======>.......................] - ETA: 1:29 - loss: 1.1536 - regression_loss: 1.0051 - classification_loss: 0.1485 120/500 [======>.......................] - ETA: 1:29 - loss: 1.1472 - regression_loss: 0.9995 - classification_loss: 0.1476 121/500 [======>.......................] 
- ETA: 1:29 - loss: 1.1482 - regression_loss: 1.0006 - classification_loss: 0.1476 122/500 [======>.......................] - ETA: 1:29 - loss: 1.1427 - regression_loss: 0.9958 - classification_loss: 0.1468 123/500 [======>.......................] - ETA: 1:28 - loss: 1.1449 - regression_loss: 0.9979 - classification_loss: 0.1470 124/500 [======>.......................] - ETA: 1:28 - loss: 1.1452 - regression_loss: 0.9983 - classification_loss: 0.1469 125/500 [======>.......................] - ETA: 1:28 - loss: 1.1446 - regression_loss: 0.9980 - classification_loss: 0.1466 126/500 [======>.......................] - ETA: 1:28 - loss: 1.1453 - regression_loss: 0.9987 - classification_loss: 0.1467 127/500 [======>.......................] - ETA: 1:27 - loss: 1.1457 - regression_loss: 0.9991 - classification_loss: 0.1466 128/500 [======>.......................] - ETA: 1:27 - loss: 1.1455 - regression_loss: 0.9989 - classification_loss: 0.1465 129/500 [======>.......................] - ETA: 1:27 - loss: 1.1467 - regression_loss: 1.0003 - classification_loss: 0.1464 130/500 [======>.......................] - ETA: 1:26 - loss: 1.1508 - regression_loss: 1.0041 - classification_loss: 0.1467 131/500 [======>.......................] - ETA: 1:26 - loss: 1.1533 - regression_loss: 1.0064 - classification_loss: 0.1469 132/500 [======>.......................] - ETA: 1:26 - loss: 1.1518 - regression_loss: 1.0053 - classification_loss: 0.1465 133/500 [======>.......................] - ETA: 1:26 - loss: 1.1503 - regression_loss: 1.0044 - classification_loss: 0.1459 134/500 [=======>......................] - ETA: 1:26 - loss: 1.1521 - regression_loss: 1.0066 - classification_loss: 0.1455 135/500 [=======>......................] - ETA: 1:25 - loss: 1.1465 - regression_loss: 1.0017 - classification_loss: 0.1448 136/500 [=======>......................] - ETA: 1:25 - loss: 1.1446 - regression_loss: 1.0004 - classification_loss: 0.1443 137/500 [=======>......................] 
- ETA: 1:25 - loss: 1.1406 - regression_loss: 0.9969 - classification_loss: 0.1436 138/500 [=======>......................] - ETA: 1:25 - loss: 1.1388 - regression_loss: 0.9956 - classification_loss: 0.1432 139/500 [=======>......................] - ETA: 1:24 - loss: 1.1372 - regression_loss: 0.9942 - classification_loss: 0.1430 140/500 [=======>......................] - ETA: 1:24 - loss: 1.1320 - regression_loss: 0.9897 - classification_loss: 0.1423 141/500 [=======>......................] - ETA: 1:24 - loss: 1.1293 - regression_loss: 0.9874 - classification_loss: 0.1419 142/500 [=======>......................] - ETA: 1:24 - loss: 1.1276 - regression_loss: 0.9860 - classification_loss: 0.1416 143/500 [=======>......................] - ETA: 1:24 - loss: 1.1254 - regression_loss: 0.9833 - classification_loss: 0.1421 144/500 [=======>......................] - ETA: 1:23 - loss: 1.1247 - regression_loss: 0.9828 - classification_loss: 0.1420 145/500 [=======>......................] - ETA: 1:23 - loss: 1.1251 - regression_loss: 0.9835 - classification_loss: 0.1417 146/500 [=======>......................] - ETA: 1:23 - loss: 1.1232 - regression_loss: 0.9816 - classification_loss: 0.1416 147/500 [=======>......................] - ETA: 1:23 - loss: 1.1236 - regression_loss: 0.9819 - classification_loss: 0.1417 148/500 [=======>......................] - ETA: 1:22 - loss: 1.1219 - regression_loss: 0.9809 - classification_loss: 0.1411 149/500 [=======>......................] - ETA: 1:22 - loss: 1.1242 - regression_loss: 0.9826 - classification_loss: 0.1416 150/500 [========>.....................] - ETA: 1:22 - loss: 1.1243 - regression_loss: 0.9827 - classification_loss: 0.1417 151/500 [========>.....................] - ETA: 1:22 - loss: 1.1222 - regression_loss: 0.9809 - classification_loss: 0.1412 152/500 [========>.....................] - ETA: 1:21 - loss: 1.1235 - regression_loss: 0.9818 - classification_loss: 0.1417 153/500 [========>.....................] 
- ETA: 1:21 - loss: 1.1243 - regression_loss: 0.9828 - classification_loss: 0.1415 154/500 [========>.....................] - ETA: 1:21 - loss: 1.1323 - regression_loss: 0.9906 - classification_loss: 0.1417 155/500 [========>.....................] - ETA: 1:21 - loss: 1.1340 - regression_loss: 0.9919 - classification_loss: 0.1421 156/500 [========>.....................] - ETA: 1:21 - loss: 1.1335 - regression_loss: 0.9912 - classification_loss: 0.1424 157/500 [========>.....................] - ETA: 1:20 - loss: 1.1365 - regression_loss: 0.9939 - classification_loss: 0.1427 158/500 [========>.....................] - ETA: 1:20 - loss: 1.1358 - regression_loss: 0.9934 - classification_loss: 0.1424 159/500 [========>.....................] - ETA: 1:20 - loss: 1.1349 - regression_loss: 0.9929 - classification_loss: 0.1420 160/500 [========>.....................] - ETA: 1:20 - loss: 1.1335 - regression_loss: 0.9909 - classification_loss: 0.1425 161/500 [========>.....................] - ETA: 1:19 - loss: 1.1330 - regression_loss: 0.9907 - classification_loss: 0.1423 162/500 [========>.....................] - ETA: 1:19 - loss: 1.1355 - regression_loss: 0.9931 - classification_loss: 0.1424 163/500 [========>.....................] - ETA: 1:19 - loss: 1.1340 - regression_loss: 0.9918 - classification_loss: 0.1422 164/500 [========>.....................] - ETA: 1:19 - loss: 1.1312 - regression_loss: 0.9895 - classification_loss: 0.1417 165/500 [========>.....................] - ETA: 1:19 - loss: 1.1304 - regression_loss: 0.9891 - classification_loss: 0.1413 166/500 [========>.....................] - ETA: 1:18 - loss: 1.1275 - regression_loss: 0.9864 - classification_loss: 0.1411 167/500 [=========>....................] - ETA: 1:18 - loss: 1.1298 - regression_loss: 0.9884 - classification_loss: 0.1414 168/500 [=========>....................] - ETA: 1:18 - loss: 1.1463 - regression_loss: 0.9987 - classification_loss: 0.1476 169/500 [=========>....................] 
- ETA: 1:18 - loss: 1.1488 - regression_loss: 1.0007 - classification_loss: 0.1481 170/500 [=========>....................] - ETA: 1:17 - loss: 1.1487 - regression_loss: 1.0009 - classification_loss: 0.1478 171/500 [=========>....................] - ETA: 1:17 - loss: 1.1481 - regression_loss: 1.0001 - classification_loss: 0.1481 172/500 [=========>....................] - ETA: 1:17 - loss: 1.1484 - regression_loss: 1.0004 - classification_loss: 0.1480 173/500 [=========>....................] - ETA: 1:17 - loss: 1.1486 - regression_loss: 1.0010 - classification_loss: 0.1476 174/500 [=========>....................] - ETA: 1:17 - loss: 1.1483 - regression_loss: 1.0008 - classification_loss: 0.1474 175/500 [=========>....................] - ETA: 1:16 - loss: 1.1458 - regression_loss: 0.9989 - classification_loss: 0.1469 176/500 [=========>....................] - ETA: 1:16 - loss: 1.1470 - regression_loss: 0.9997 - classification_loss: 0.1473 177/500 [=========>....................] - ETA: 1:16 - loss: 1.1475 - regression_loss: 1.0002 - classification_loss: 0.1473 178/500 [=========>....................] - ETA: 1:16 - loss: 1.1493 - regression_loss: 1.0019 - classification_loss: 0.1474 179/500 [=========>....................] - ETA: 1:15 - loss: 1.1497 - regression_loss: 1.0025 - classification_loss: 0.1473 180/500 [=========>....................] - ETA: 1:15 - loss: 1.1471 - regression_loss: 1.0003 - classification_loss: 0.1468 181/500 [=========>....................] - ETA: 1:15 - loss: 1.1558 - regression_loss: 1.0075 - classification_loss: 0.1483 182/500 [=========>....................] - ETA: 1:15 - loss: 1.1582 - regression_loss: 1.0096 - classification_loss: 0.1487 183/500 [=========>....................] - ETA: 1:14 - loss: 1.1603 - regression_loss: 1.0113 - classification_loss: 0.1490 184/500 [==========>...................] - ETA: 1:14 - loss: 1.1584 - regression_loss: 1.0099 - classification_loss: 0.1485 185/500 [==========>...................] 
- ETA: 1:14 - loss: 1.1603 - regression_loss: 1.0123 - classification_loss: 0.1480 186/500 [==========>...................] - ETA: 1:14 - loss: 1.1609 - regression_loss: 1.0129 - classification_loss: 0.1480 187/500 [==========>...................] - ETA: 1:13 - loss: 1.1578 - regression_loss: 1.0102 - classification_loss: 0.1476 188/500 [==========>...................] - ETA: 1:13 - loss: 1.1579 - regression_loss: 1.0103 - classification_loss: 0.1476 189/500 [==========>...................] - ETA: 1:13 - loss: 1.1535 - regression_loss: 1.0064 - classification_loss: 0.1471 190/500 [==========>...................] - ETA: 1:13 - loss: 1.1544 - regression_loss: 1.0068 - classification_loss: 0.1477 191/500 [==========>...................] - ETA: 1:13 - loss: 1.1538 - regression_loss: 1.0063 - classification_loss: 0.1475 192/500 [==========>...................] - ETA: 1:12 - loss: 1.1544 - regression_loss: 1.0070 - classification_loss: 0.1474 193/500 [==========>...................] - ETA: 1:12 - loss: 1.1575 - regression_loss: 1.0098 - classification_loss: 0.1477 194/500 [==========>...................] - ETA: 1:12 - loss: 1.1538 - regression_loss: 1.0067 - classification_loss: 0.1471 195/500 [==========>...................] - ETA: 1:12 - loss: 1.1513 - regression_loss: 1.0046 - classification_loss: 0.1467 196/500 [==========>...................] - ETA: 1:11 - loss: 1.1551 - regression_loss: 1.0077 - classification_loss: 0.1475 197/500 [==========>...................] - ETA: 1:11 - loss: 1.1535 - regression_loss: 1.0064 - classification_loss: 0.1471 198/500 [==========>...................] - ETA: 1:11 - loss: 1.1530 - regression_loss: 1.0059 - classification_loss: 0.1471 199/500 [==========>...................] - ETA: 1:11 - loss: 1.1532 - regression_loss: 1.0060 - classification_loss: 0.1472 200/500 [===========>..................] - ETA: 1:10 - loss: 1.1518 - regression_loss: 1.0049 - classification_loss: 0.1470 201/500 [===========>..................] 
- ETA: 1:10 - loss: 1.1511 - regression_loss: 1.0041 - classification_loss: 0.1470 202/500 [===========>..................] - ETA: 1:10 - loss: 1.1501 - regression_loss: 1.0034 - classification_loss: 0.1468 203/500 [===========>..................] - ETA: 1:10 - loss: 1.1487 - regression_loss: 1.0022 - classification_loss: 0.1465 204/500 [===========>..................] - ETA: 1:09 - loss: 1.1467 - regression_loss: 1.0005 - classification_loss: 0.1461 205/500 [===========>..................] - ETA: 1:09 - loss: 1.1452 - regression_loss: 0.9994 - classification_loss: 0.1457 206/500 [===========>..................] - ETA: 1:09 - loss: 1.1426 - regression_loss: 0.9972 - classification_loss: 0.1454 207/500 [===========>..................] - ETA: 1:09 - loss: 1.1434 - regression_loss: 0.9979 - classification_loss: 0.1455 208/500 [===========>..................] - ETA: 1:09 - loss: 1.1434 - regression_loss: 0.9979 - classification_loss: 0.1455 209/500 [===========>..................] - ETA: 1:08 - loss: 1.1462 - regression_loss: 1.0003 - classification_loss: 0.1459 210/500 [===========>..................] - ETA: 1:08 - loss: 1.1460 - regression_loss: 1.0002 - classification_loss: 0.1458 211/500 [===========>..................] - ETA: 1:08 - loss: 1.1469 - regression_loss: 1.0010 - classification_loss: 0.1458 212/500 [===========>..................] - ETA: 1:07 - loss: 1.1470 - regression_loss: 1.0011 - classification_loss: 0.1459 213/500 [===========>..................] - ETA: 1:07 - loss: 1.1459 - regression_loss: 1.0003 - classification_loss: 0.1456 214/500 [===========>..................] - ETA: 1:07 - loss: 1.1421 - regression_loss: 0.9969 - classification_loss: 0.1452 215/500 [===========>..................] - ETA: 1:07 - loss: 1.1402 - regression_loss: 0.9955 - classification_loss: 0.1447 216/500 [===========>..................] - ETA: 1:07 - loss: 1.1388 - regression_loss: 0.9944 - classification_loss: 0.1444 217/500 [============>.................] 
- ETA: 1:06 - loss: 1.1393 - regression_loss: 0.9949 - classification_loss: 0.1444 218/500 [============>.................] - ETA: 1:06 - loss: 1.1369 - regression_loss: 0.9929 - classification_loss: 0.1440 219/500 [============>.................] - ETA: 1:06 - loss: 1.1367 - regression_loss: 0.9928 - classification_loss: 0.1439 220/500 [============>.................] - ETA: 1:06 - loss: 1.1372 - regression_loss: 0.9935 - classification_loss: 0.1437 221/500 [============>.................] - ETA: 1:05 - loss: 1.1372 - regression_loss: 0.9935 - classification_loss: 0.1437 222/500 [============>.................] - ETA: 1:05 - loss: 1.1395 - regression_loss: 0.9956 - classification_loss: 0.1439 223/500 [============>.................] - ETA: 1:05 - loss: 1.1366 - regression_loss: 0.9931 - classification_loss: 0.1435 224/500 [============>.................] - ETA: 1:05 - loss: 1.1357 - regression_loss: 0.9923 - classification_loss: 0.1434 225/500 [============>.................] - ETA: 1:04 - loss: 1.1363 - regression_loss: 0.9932 - classification_loss: 0.1431 226/500 [============>.................] - ETA: 1:04 - loss: 1.1339 - regression_loss: 0.9912 - classification_loss: 0.1427 227/500 [============>.................] - ETA: 1:04 - loss: 1.1363 - regression_loss: 0.9937 - classification_loss: 0.1426 228/500 [============>.................] - ETA: 1:04 - loss: 1.1358 - regression_loss: 0.9934 - classification_loss: 0.1425 229/500 [============>.................] - ETA: 1:03 - loss: 1.1361 - regression_loss: 0.9939 - classification_loss: 0.1422 230/500 [============>.................] - ETA: 1:03 - loss: 1.1332 - regression_loss: 0.9913 - classification_loss: 0.1419 231/500 [============>.................] - ETA: 1:03 - loss: 1.1333 - regression_loss: 0.9915 - classification_loss: 0.1418 232/500 [============>.................] - ETA: 1:03 - loss: 1.1351 - regression_loss: 0.9934 - classification_loss: 0.1418 233/500 [============>.................] 
- ETA: 1:03 - loss: 1.1372 - regression_loss: 0.9953 - classification_loss: 0.1420 234/500 [=============>................] - ETA: 1:02 - loss: 1.1412 - regression_loss: 0.9985 - classification_loss: 0.1427 235/500 [=============>................] - ETA: 1:02 - loss: 1.1405 - regression_loss: 0.9980 - classification_loss: 0.1425 236/500 [=============>................] - ETA: 1:02 - loss: 1.1437 - regression_loss: 1.0006 - classification_loss: 0.1430 237/500 [=============>................] - ETA: 1:02 - loss: 1.1449 - regression_loss: 1.0016 - classification_loss: 0.1433 238/500 [=============>................] - ETA: 1:01 - loss: 1.1424 - regression_loss: 0.9994 - classification_loss: 0.1429 239/500 [=============>................] - ETA: 1:01 - loss: 1.1416 - regression_loss: 0.9990 - classification_loss: 0.1427 240/500 [=============>................] - ETA: 1:01 - loss: 1.1425 - regression_loss: 1.0001 - classification_loss: 0.1424 241/500 [=============>................] - ETA: 1:01 - loss: 1.1442 - regression_loss: 1.0012 - classification_loss: 0.1429 242/500 [=============>................] - ETA: 1:00 - loss: 1.1456 - regression_loss: 1.0025 - classification_loss: 0.1431 243/500 [=============>................] - ETA: 1:00 - loss: 1.1459 - regression_loss: 1.0030 - classification_loss: 0.1430 244/500 [=============>................] - ETA: 1:00 - loss: 1.1496 - regression_loss: 1.0057 - classification_loss: 0.1439 245/500 [=============>................] - ETA: 1:00 - loss: 1.1494 - regression_loss: 1.0058 - classification_loss: 0.1436 246/500 [=============>................] - ETA: 59s - loss: 1.1481 - regression_loss: 1.0047 - classification_loss: 0.1435  247/500 [=============>................] - ETA: 59s - loss: 1.1494 - regression_loss: 1.0057 - classification_loss: 0.1436 248/500 [=============>................] - ETA: 59s - loss: 1.1495 - regression_loss: 1.0059 - classification_loss: 0.1436 249/500 [=============>................] 
[Epoch 21, steps 250-499: per-step progress-bar updates trimmed; running loss stayed between ~1.15 and ~1.19]
500/500 [==============================] - 118s 236ms/step - loss: 1.1628 - regression_loss: 1.0155 - classification_loss: 0.1473
326 instances of class plum with average precision: 0.8494
mAP: 0.8494
Epoch 00021: saving model to ./training/snapshots/resnet50_pascal_21.h5
Epoch 22/150
[Epoch 22, steps 1-84: per-step progress-bar updates trimmed; running loss settled near ~1.13 after early fluctuation between ~1.09 and ~1.47]
- ETA: 1:37 - loss: 1.1245 - regression_loss: 0.9975 - classification_loss: 0.1270 85/500 [====>.........................] - ETA: 1:37 - loss: 1.1192 - regression_loss: 0.9933 - classification_loss: 0.1259 86/500 [====>.........................] - ETA: 1:37 - loss: 1.1215 - regression_loss: 0.9950 - classification_loss: 0.1265 87/500 [====>.........................] - ETA: 1:36 - loss: 1.1152 - regression_loss: 0.9893 - classification_loss: 0.1260 88/500 [====>.........................] - ETA: 1:36 - loss: 1.1171 - regression_loss: 0.9907 - classification_loss: 0.1264 89/500 [====>.........................] - ETA: 1:36 - loss: 1.1196 - regression_loss: 0.9928 - classification_loss: 0.1268 90/500 [====>.........................] - ETA: 1:36 - loss: 1.1191 - regression_loss: 0.9925 - classification_loss: 0.1266 91/500 [====>.........................] - ETA: 1:35 - loss: 1.1256 - regression_loss: 0.9993 - classification_loss: 0.1263 92/500 [====>.........................] - ETA: 1:35 - loss: 1.1193 - regression_loss: 0.9941 - classification_loss: 0.1253 93/500 [====>.........................] - ETA: 1:35 - loss: 1.1171 - regression_loss: 0.9923 - classification_loss: 0.1248 94/500 [====>.........................] - ETA: 1:35 - loss: 1.1119 - regression_loss: 0.9878 - classification_loss: 0.1241 95/500 [====>.........................] - ETA: 1:34 - loss: 1.1126 - regression_loss: 0.9888 - classification_loss: 0.1238 96/500 [====>.........................] - ETA: 1:34 - loss: 1.1107 - regression_loss: 0.9875 - classification_loss: 0.1232 97/500 [====>.........................] - ETA: 1:34 - loss: 1.1130 - regression_loss: 0.9890 - classification_loss: 0.1239 98/500 [====>.........................] - ETA: 1:34 - loss: 1.1183 - regression_loss: 0.9937 - classification_loss: 0.1246 99/500 [====>.........................] - ETA: 1:33 - loss: 1.1180 - regression_loss: 0.9935 - classification_loss: 0.1244 100/500 [=====>........................] 
- ETA: 1:33 - loss: 1.1113 - regression_loss: 0.9836 - classification_loss: 0.1277 101/500 [=====>........................] - ETA: 1:33 - loss: 1.1158 - regression_loss: 0.9873 - classification_loss: 0.1285 102/500 [=====>........................] - ETA: 1:33 - loss: 1.1199 - regression_loss: 0.9904 - classification_loss: 0.1295 103/500 [=====>........................] - ETA: 1:33 - loss: 1.1168 - regression_loss: 0.9878 - classification_loss: 0.1290 104/500 [=====>........................] - ETA: 1:32 - loss: 1.1140 - regression_loss: 0.9853 - classification_loss: 0.1288 105/500 [=====>........................] - ETA: 1:32 - loss: 1.1132 - regression_loss: 0.9849 - classification_loss: 0.1282 106/500 [=====>........................] - ETA: 1:32 - loss: 1.1141 - regression_loss: 0.9853 - classification_loss: 0.1289 107/500 [=====>........................] - ETA: 1:32 - loss: 1.1077 - regression_loss: 0.9796 - classification_loss: 0.1281 108/500 [=====>........................] - ETA: 1:31 - loss: 1.1047 - regression_loss: 0.9769 - classification_loss: 0.1278 109/500 [=====>........................] - ETA: 1:31 - loss: 1.1107 - regression_loss: 0.9825 - classification_loss: 0.1282 110/500 [=====>........................] - ETA: 1:31 - loss: 1.1119 - regression_loss: 0.9835 - classification_loss: 0.1284 111/500 [=====>........................] - ETA: 1:31 - loss: 1.1119 - regression_loss: 0.9837 - classification_loss: 0.1282 112/500 [=====>........................] - ETA: 1:30 - loss: 1.1104 - regression_loss: 0.9828 - classification_loss: 0.1276 113/500 [=====>........................] - ETA: 1:30 - loss: 1.1121 - regression_loss: 0.9842 - classification_loss: 0.1279 114/500 [=====>........................] - ETA: 1:30 - loss: 1.1153 - regression_loss: 0.9865 - classification_loss: 0.1287 115/500 [=====>........................] - ETA: 1:30 - loss: 1.1094 - regression_loss: 0.9813 - classification_loss: 0.1281 116/500 [=====>........................] 
- ETA: 1:30 - loss: 1.1145 - regression_loss: 0.9855 - classification_loss: 0.1290 117/500 [======>.......................] - ETA: 1:29 - loss: 1.1156 - regression_loss: 0.9865 - classification_loss: 0.1292 118/500 [======>.......................] - ETA: 1:29 - loss: 1.1161 - regression_loss: 0.9870 - classification_loss: 0.1291 119/500 [======>.......................] - ETA: 1:29 - loss: 1.1173 - regression_loss: 0.9881 - classification_loss: 0.1292 120/500 [======>.......................] - ETA: 1:29 - loss: 1.1145 - regression_loss: 0.9859 - classification_loss: 0.1286 121/500 [======>.......................] - ETA: 1:28 - loss: 1.1189 - regression_loss: 0.9897 - classification_loss: 0.1292 122/500 [======>.......................] - ETA: 1:28 - loss: 1.1185 - regression_loss: 0.9899 - classification_loss: 0.1285 123/500 [======>.......................] - ETA: 1:28 - loss: 1.1184 - regression_loss: 0.9900 - classification_loss: 0.1284 124/500 [======>.......................] - ETA: 1:27 - loss: 1.1251 - regression_loss: 0.9948 - classification_loss: 0.1303 125/500 [======>.......................] - ETA: 1:27 - loss: 1.1391 - regression_loss: 1.0060 - classification_loss: 0.1331 126/500 [======>.......................] - ETA: 1:27 - loss: 1.1390 - regression_loss: 1.0062 - classification_loss: 0.1328 127/500 [======>.......................] - ETA: 1:27 - loss: 1.1358 - regression_loss: 1.0036 - classification_loss: 0.1322 128/500 [======>.......................] - ETA: 1:27 - loss: 1.1356 - regression_loss: 1.0025 - classification_loss: 0.1331 129/500 [======>.......................] - ETA: 1:26 - loss: 1.1399 - regression_loss: 1.0061 - classification_loss: 0.1338 130/500 [======>.......................] - ETA: 1:26 - loss: 1.1402 - regression_loss: 1.0063 - classification_loss: 0.1339 131/500 [======>.......................] - ETA: 1:26 - loss: 1.1404 - regression_loss: 1.0067 - classification_loss: 0.1337 132/500 [======>.......................] 
- ETA: 1:26 - loss: 1.1377 - regression_loss: 1.0040 - classification_loss: 0.1336 133/500 [======>.......................] - ETA: 1:25 - loss: 1.1432 - regression_loss: 1.0090 - classification_loss: 0.1342 134/500 [=======>......................] - ETA: 1:25 - loss: 1.1406 - regression_loss: 1.0067 - classification_loss: 0.1339 135/500 [=======>......................] - ETA: 1:25 - loss: 1.1414 - regression_loss: 1.0075 - classification_loss: 0.1339 136/500 [=======>......................] - ETA: 1:25 - loss: 1.1405 - regression_loss: 1.0068 - classification_loss: 0.1337 137/500 [=======>......................] - ETA: 1:24 - loss: 1.1357 - regression_loss: 1.0026 - classification_loss: 0.1332 138/500 [=======>......................] - ETA: 1:24 - loss: 1.1330 - regression_loss: 1.0000 - classification_loss: 0.1330 139/500 [=======>......................] - ETA: 1:24 - loss: 1.1305 - regression_loss: 0.9979 - classification_loss: 0.1326 140/500 [=======>......................] - ETA: 1:24 - loss: 1.1307 - regression_loss: 0.9979 - classification_loss: 0.1328 141/500 [=======>......................] - ETA: 1:23 - loss: 1.1305 - regression_loss: 0.9971 - classification_loss: 0.1334 142/500 [=======>......................] - ETA: 1:23 - loss: 1.1285 - regression_loss: 0.9952 - classification_loss: 0.1333 143/500 [=======>......................] - ETA: 1:23 - loss: 1.1324 - regression_loss: 0.9990 - classification_loss: 0.1334 144/500 [=======>......................] - ETA: 1:23 - loss: 1.1314 - regression_loss: 0.9981 - classification_loss: 0.1333 145/500 [=======>......................] - ETA: 1:22 - loss: 1.1298 - regression_loss: 0.9969 - classification_loss: 0.1329 146/500 [=======>......................] - ETA: 1:22 - loss: 1.1284 - regression_loss: 0.9958 - classification_loss: 0.1326 147/500 [=======>......................] - ETA: 1:22 - loss: 1.1302 - regression_loss: 0.9972 - classification_loss: 0.1330 148/500 [=======>......................] 
- ETA: 1:22 - loss: 1.1304 - regression_loss: 0.9970 - classification_loss: 0.1334 149/500 [=======>......................] - ETA: 1:21 - loss: 1.1309 - regression_loss: 0.9976 - classification_loss: 0.1334 150/500 [========>.....................] - ETA: 1:21 - loss: 1.1336 - regression_loss: 1.0001 - classification_loss: 0.1335 151/500 [========>.....................] - ETA: 1:21 - loss: 1.1314 - regression_loss: 0.9978 - classification_loss: 0.1336 152/500 [========>.....................] - ETA: 1:21 - loss: 1.1283 - regression_loss: 0.9952 - classification_loss: 0.1332 153/500 [========>.....................] - ETA: 1:21 - loss: 1.1256 - regression_loss: 0.9931 - classification_loss: 0.1326 154/500 [========>.....................] - ETA: 1:20 - loss: 1.1270 - regression_loss: 0.9942 - classification_loss: 0.1328 155/500 [========>.....................] - ETA: 1:20 - loss: 1.1317 - regression_loss: 0.9982 - classification_loss: 0.1335 156/500 [========>.....................] - ETA: 1:20 - loss: 1.1300 - regression_loss: 0.9970 - classification_loss: 0.1330 157/500 [========>.....................] - ETA: 1:20 - loss: 1.1321 - regression_loss: 0.9990 - classification_loss: 0.1330 158/500 [========>.....................] - ETA: 1:19 - loss: 1.1311 - regression_loss: 0.9983 - classification_loss: 0.1328 159/500 [========>.....................] - ETA: 1:19 - loss: 1.1351 - regression_loss: 1.0014 - classification_loss: 0.1337 160/500 [========>.....................] - ETA: 1:19 - loss: 1.1360 - regression_loss: 1.0024 - classification_loss: 0.1336 161/500 [========>.....................] - ETA: 1:19 - loss: 1.1439 - regression_loss: 1.0086 - classification_loss: 0.1354 162/500 [========>.....................] - ETA: 1:19 - loss: 1.1435 - regression_loss: 1.0081 - classification_loss: 0.1354 163/500 [========>.....................] - ETA: 1:18 - loss: 1.1393 - regression_loss: 1.0046 - classification_loss: 0.1347 164/500 [========>.....................] 
- ETA: 1:18 - loss: 1.1346 - regression_loss: 1.0005 - classification_loss: 0.1342 165/500 [========>.....................] - ETA: 1:18 - loss: 1.1326 - regression_loss: 0.9987 - classification_loss: 0.1339 166/500 [========>.....................] - ETA: 1:18 - loss: 1.1333 - regression_loss: 0.9992 - classification_loss: 0.1341 167/500 [=========>....................] - ETA: 1:17 - loss: 1.1335 - regression_loss: 0.9994 - classification_loss: 0.1341 168/500 [=========>....................] - ETA: 1:17 - loss: 1.1315 - regression_loss: 0.9977 - classification_loss: 0.1338 169/500 [=========>....................] - ETA: 1:17 - loss: 1.1358 - regression_loss: 1.0007 - classification_loss: 0.1351 170/500 [=========>....................] - ETA: 1:17 - loss: 1.1370 - regression_loss: 1.0017 - classification_loss: 0.1353 171/500 [=========>....................] - ETA: 1:17 - loss: 1.1394 - regression_loss: 1.0038 - classification_loss: 0.1357 172/500 [=========>....................] - ETA: 1:16 - loss: 1.1389 - regression_loss: 1.0037 - classification_loss: 0.1353 173/500 [=========>....................] - ETA: 1:16 - loss: 1.1380 - regression_loss: 1.0031 - classification_loss: 0.1349 174/500 [=========>....................] - ETA: 1:16 - loss: 1.1361 - regression_loss: 1.0016 - classification_loss: 0.1345 175/500 [=========>....................] - ETA: 1:16 - loss: 1.1331 - regression_loss: 0.9990 - classification_loss: 0.1342 176/500 [=========>....................] - ETA: 1:15 - loss: 1.1368 - regression_loss: 1.0016 - classification_loss: 0.1352 177/500 [=========>....................] - ETA: 1:15 - loss: 1.1328 - regression_loss: 0.9979 - classification_loss: 0.1349 178/500 [=========>....................] - ETA: 1:15 - loss: 1.1359 - regression_loss: 1.0004 - classification_loss: 0.1356 179/500 [=========>....................] - ETA: 1:15 - loss: 1.1358 - regression_loss: 1.0000 - classification_loss: 0.1357 180/500 [=========>....................] 
- ETA: 1:14 - loss: 1.1348 - regression_loss: 0.9991 - classification_loss: 0.1357 181/500 [=========>....................] - ETA: 1:14 - loss: 1.1389 - regression_loss: 1.0033 - classification_loss: 0.1356 182/500 [=========>....................] - ETA: 1:14 - loss: 1.1400 - regression_loss: 1.0046 - classification_loss: 0.1355 183/500 [=========>....................] - ETA: 1:14 - loss: 1.1403 - regression_loss: 1.0050 - classification_loss: 0.1353 184/500 [==========>...................] - ETA: 1:13 - loss: 1.1383 - regression_loss: 1.0027 - classification_loss: 0.1356 185/500 [==========>...................] - ETA: 1:13 - loss: 1.1399 - regression_loss: 1.0036 - classification_loss: 0.1363 186/500 [==========>...................] - ETA: 1:13 - loss: 1.1379 - regression_loss: 1.0019 - classification_loss: 0.1360 187/500 [==========>...................] - ETA: 1:13 - loss: 1.1361 - regression_loss: 1.0004 - classification_loss: 0.1356 188/500 [==========>...................] - ETA: 1:13 - loss: 1.1365 - regression_loss: 1.0005 - classification_loss: 0.1360 189/500 [==========>...................] - ETA: 1:12 - loss: 1.1344 - regression_loss: 0.9989 - classification_loss: 0.1356 190/500 [==========>...................] - ETA: 1:12 - loss: 1.1329 - regression_loss: 0.9975 - classification_loss: 0.1354 191/500 [==========>...................] - ETA: 1:12 - loss: 1.1325 - regression_loss: 0.9974 - classification_loss: 0.1351 192/500 [==========>...................] - ETA: 1:12 - loss: 1.1355 - regression_loss: 1.0001 - classification_loss: 0.1355 193/500 [==========>...................] - ETA: 1:11 - loss: 1.1365 - regression_loss: 1.0009 - classification_loss: 0.1356 194/500 [==========>...................] - ETA: 1:11 - loss: 1.1371 - regression_loss: 1.0014 - classification_loss: 0.1357 195/500 [==========>...................] - ETA: 1:11 - loss: 1.1346 - regression_loss: 0.9991 - classification_loss: 0.1355 196/500 [==========>...................] 
- ETA: 1:11 - loss: 1.1324 - regression_loss: 0.9973 - classification_loss: 0.1351 197/500 [==========>...................] - ETA: 1:10 - loss: 1.1282 - regression_loss: 0.9937 - classification_loss: 0.1345 198/500 [==========>...................] - ETA: 1:10 - loss: 1.1295 - regression_loss: 0.9949 - classification_loss: 0.1346 199/500 [==========>...................] - ETA: 1:10 - loss: 1.1285 - regression_loss: 0.9943 - classification_loss: 0.1343 200/500 [===========>..................] - ETA: 1:10 - loss: 1.1256 - regression_loss: 0.9919 - classification_loss: 0.1337 201/500 [===========>..................] - ETA: 1:09 - loss: 1.1245 - regression_loss: 0.9910 - classification_loss: 0.1335 202/500 [===========>..................] - ETA: 1:09 - loss: 1.1239 - regression_loss: 0.9903 - classification_loss: 0.1336 203/500 [===========>..................] - ETA: 1:09 - loss: 1.1262 - regression_loss: 0.9921 - classification_loss: 0.1341 204/500 [===========>..................] - ETA: 1:09 - loss: 1.1275 - regression_loss: 0.9931 - classification_loss: 0.1344 205/500 [===========>..................] - ETA: 1:09 - loss: 1.1264 - regression_loss: 0.9919 - classification_loss: 0.1345 206/500 [===========>..................] - ETA: 1:08 - loss: 1.1257 - regression_loss: 0.9912 - classification_loss: 0.1345 207/500 [===========>..................] - ETA: 1:08 - loss: 1.1257 - regression_loss: 0.9912 - classification_loss: 0.1346 208/500 [===========>..................] - ETA: 1:08 - loss: 1.1285 - regression_loss: 0.9937 - classification_loss: 0.1347 209/500 [===========>..................] - ETA: 1:08 - loss: 1.1282 - regression_loss: 0.9938 - classification_loss: 0.1344 210/500 [===========>..................] - ETA: 1:07 - loss: 1.1276 - regression_loss: 0.9934 - classification_loss: 0.1343 211/500 [===========>..................] - ETA: 1:07 - loss: 1.1280 - regression_loss: 0.9937 - classification_loss: 0.1343 212/500 [===========>..................] 
- ETA: 1:07 - loss: 1.1279 - regression_loss: 0.9936 - classification_loss: 0.1342 213/500 [===========>..................] - ETA: 1:07 - loss: 1.1283 - regression_loss: 0.9942 - classification_loss: 0.1341 214/500 [===========>..................] - ETA: 1:07 - loss: 1.1329 - regression_loss: 0.9982 - classification_loss: 0.1346 215/500 [===========>..................] - ETA: 1:06 - loss: 1.1293 - regression_loss: 0.9952 - classification_loss: 0.1342 216/500 [===========>..................] - ETA: 1:06 - loss: 1.1297 - regression_loss: 0.9958 - classification_loss: 0.1339 217/500 [============>.................] - ETA: 1:06 - loss: 1.1291 - regression_loss: 0.9951 - classification_loss: 0.1340 218/500 [============>.................] - ETA: 1:06 - loss: 1.1299 - regression_loss: 0.9960 - classification_loss: 0.1339 219/500 [============>.................] - ETA: 1:05 - loss: 1.1283 - regression_loss: 0.9946 - classification_loss: 0.1338 220/500 [============>.................] - ETA: 1:05 - loss: 1.1304 - regression_loss: 0.9963 - classification_loss: 0.1340 221/500 [============>.................] - ETA: 1:05 - loss: 1.1282 - regression_loss: 0.9944 - classification_loss: 0.1337 222/500 [============>.................] - ETA: 1:05 - loss: 1.1262 - regression_loss: 0.9926 - classification_loss: 0.1336 223/500 [============>.................] - ETA: 1:04 - loss: 1.1229 - regression_loss: 0.9898 - classification_loss: 0.1331 224/500 [============>.................] - ETA: 1:04 - loss: 1.1347 - regression_loss: 0.9970 - classification_loss: 0.1377 225/500 [============>.................] - ETA: 1:04 - loss: 1.1335 - regression_loss: 0.9961 - classification_loss: 0.1374 226/500 [============>.................] - ETA: 1:04 - loss: 1.1329 - regression_loss: 0.9959 - classification_loss: 0.1370 227/500 [============>.................] - ETA: 1:04 - loss: 1.1326 - regression_loss: 0.9956 - classification_loss: 0.1370 228/500 [============>.................] 
- ETA: 1:03 - loss: 1.1323 - regression_loss: 0.9956 - classification_loss: 0.1367 229/500 [============>.................] - ETA: 1:03 - loss: 1.1306 - regression_loss: 0.9941 - classification_loss: 0.1365 230/500 [============>.................] - ETA: 1:03 - loss: 1.1327 - regression_loss: 0.9957 - classification_loss: 0.1369 231/500 [============>.................] - ETA: 1:03 - loss: 1.1300 - regression_loss: 0.9935 - classification_loss: 0.1365 232/500 [============>.................] - ETA: 1:02 - loss: 1.1298 - regression_loss: 0.9936 - classification_loss: 0.1363 233/500 [============>.................] - ETA: 1:02 - loss: 1.1295 - regression_loss: 0.9933 - classification_loss: 0.1362 234/500 [=============>................] - ETA: 1:02 - loss: 1.1285 - regression_loss: 0.9925 - classification_loss: 0.1361 235/500 [=============>................] - ETA: 1:02 - loss: 1.1283 - regression_loss: 0.9922 - classification_loss: 0.1361 236/500 [=============>................] - ETA: 1:01 - loss: 1.1303 - regression_loss: 0.9936 - classification_loss: 0.1367 237/500 [=============>................] - ETA: 1:01 - loss: 1.1293 - regression_loss: 0.9928 - classification_loss: 0.1365 238/500 [=============>................] - ETA: 1:01 - loss: 1.1275 - regression_loss: 0.9911 - classification_loss: 0.1364 239/500 [=============>................] - ETA: 1:01 - loss: 1.1284 - regression_loss: 0.9920 - classification_loss: 0.1364 240/500 [=============>................] - ETA: 1:00 - loss: 1.1310 - regression_loss: 0.9938 - classification_loss: 0.1373 241/500 [=============>................] - ETA: 1:00 - loss: 1.1343 - regression_loss: 0.9957 - classification_loss: 0.1386 242/500 [=============>................] - ETA: 1:00 - loss: 1.1342 - regression_loss: 0.9960 - classification_loss: 0.1383 243/500 [=============>................] - ETA: 1:00 - loss: 1.1327 - regression_loss: 0.9947 - classification_loss: 0.1380 244/500 [=============>................] 
- ETA: 1:00 - loss: 1.1321 - regression_loss: 0.9942 - classification_loss: 0.1378 245/500 [=============>................] - ETA: 59s - loss: 1.1319 - regression_loss: 0.9942 - classification_loss: 0.1377  246/500 [=============>................] - ETA: 59s - loss: 1.1339 - regression_loss: 0.9957 - classification_loss: 0.1382 247/500 [=============>................] - ETA: 59s - loss: 1.1327 - regression_loss: 0.9948 - classification_loss: 0.1380 248/500 [=============>................] - ETA: 59s - loss: 1.1366 - regression_loss: 0.9977 - classification_loss: 0.1389 249/500 [=============>................] - ETA: 58s - loss: 1.1356 - regression_loss: 0.9968 - classification_loss: 0.1389 250/500 [==============>...............] - ETA: 58s - loss: 1.1341 - regression_loss: 0.9955 - classification_loss: 0.1386 251/500 [==============>...............] - ETA: 58s - loss: 1.1343 - regression_loss: 0.9959 - classification_loss: 0.1383 252/500 [==============>...............] - ETA: 58s - loss: 1.1356 - regression_loss: 0.9969 - classification_loss: 0.1386 253/500 [==============>...............] - ETA: 57s - loss: 1.1380 - regression_loss: 0.9992 - classification_loss: 0.1388 254/500 [==============>...............] - ETA: 57s - loss: 1.1364 - regression_loss: 0.9980 - classification_loss: 0.1384 255/500 [==============>...............] - ETA: 57s - loss: 1.1340 - regression_loss: 0.9960 - classification_loss: 0.1380 256/500 [==============>...............] - ETA: 57s - loss: 1.1328 - regression_loss: 0.9948 - classification_loss: 0.1380 257/500 [==============>...............] - ETA: 56s - loss: 1.1311 - regression_loss: 0.9935 - classification_loss: 0.1376 258/500 [==============>...............] - ETA: 56s - loss: 1.1325 - regression_loss: 0.9943 - classification_loss: 0.1382 259/500 [==============>...............] - ETA: 56s - loss: 1.1305 - regression_loss: 0.9926 - classification_loss: 0.1378 260/500 [==============>...............] 
- ETA: 56s - loss: 1.1317 - regression_loss: 0.9935 - classification_loss: 0.1382 261/500 [==============>...............] - ETA: 56s - loss: 1.1332 - regression_loss: 0.9946 - classification_loss: 0.1386 262/500 [==============>...............] - ETA: 55s - loss: 1.1345 - regression_loss: 0.9958 - classification_loss: 0.1387 263/500 [==============>...............] - ETA: 55s - loss: 1.1328 - regression_loss: 0.9945 - classification_loss: 0.1383 264/500 [==============>...............] - ETA: 55s - loss: 1.1331 - regression_loss: 0.9948 - classification_loss: 0.1383 265/500 [==============>...............] - ETA: 55s - loss: 1.1331 - regression_loss: 0.9950 - classification_loss: 0.1380 266/500 [==============>...............] - ETA: 54s - loss: 1.1338 - regression_loss: 0.9957 - classification_loss: 0.1380 267/500 [===============>..............] - ETA: 54s - loss: 1.1356 - regression_loss: 0.9976 - classification_loss: 0.1380 268/500 [===============>..............] - ETA: 54s - loss: 1.1373 - regression_loss: 0.9988 - classification_loss: 0.1385 269/500 [===============>..............] - ETA: 54s - loss: 1.1369 - regression_loss: 0.9984 - classification_loss: 0.1385 270/500 [===============>..............] - ETA: 54s - loss: 1.1375 - regression_loss: 0.9989 - classification_loss: 0.1385 271/500 [===============>..............] - ETA: 53s - loss: 1.1374 - regression_loss: 0.9991 - classification_loss: 0.1383 272/500 [===============>..............] - ETA: 53s - loss: 1.1378 - regression_loss: 0.9996 - classification_loss: 0.1382 273/500 [===============>..............] - ETA: 53s - loss: 1.1392 - regression_loss: 1.0007 - classification_loss: 0.1386 274/500 [===============>..............] - ETA: 53s - loss: 1.1383 - regression_loss: 0.9999 - classification_loss: 0.1384 275/500 [===============>..............] - ETA: 52s - loss: 1.1371 - regression_loss: 0.9990 - classification_loss: 0.1381 276/500 [===============>..............] 
- ETA: 52s - loss: 1.1353 - regression_loss: 0.9976 - classification_loss: 0.1377 277/500 [===============>..............] - ETA: 52s - loss: 1.1329 - regression_loss: 0.9955 - classification_loss: 0.1374 278/500 [===============>..............] - ETA: 52s - loss: 1.1328 - regression_loss: 0.9919 - classification_loss: 0.1409 279/500 [===============>..............] - ETA: 51s - loss: 1.1324 - regression_loss: 0.9917 - classification_loss: 0.1407 280/500 [===============>..............] - ETA: 51s - loss: 1.1334 - regression_loss: 0.9923 - classification_loss: 0.1411 281/500 [===============>..............] - ETA: 51s - loss: 1.1357 - regression_loss: 0.9948 - classification_loss: 0.1409 282/500 [===============>..............] - ETA: 51s - loss: 1.1369 - regression_loss: 0.9960 - classification_loss: 0.1409 283/500 [===============>..............] - ETA: 50s - loss: 1.1408 - regression_loss: 0.9991 - classification_loss: 0.1417 284/500 [================>.............] - ETA: 50s - loss: 1.1422 - regression_loss: 1.0004 - classification_loss: 0.1418 285/500 [================>.............] - ETA: 50s - loss: 1.1421 - regression_loss: 1.0004 - classification_loss: 0.1417 286/500 [================>.............] - ETA: 50s - loss: 1.1409 - regression_loss: 0.9995 - classification_loss: 0.1415 287/500 [================>.............] - ETA: 49s - loss: 1.1416 - regression_loss: 1.0002 - classification_loss: 0.1414 288/500 [================>.............] - ETA: 49s - loss: 1.1416 - regression_loss: 1.0004 - classification_loss: 0.1413 289/500 [================>.............] - ETA: 49s - loss: 1.1407 - regression_loss: 0.9997 - classification_loss: 0.1411 290/500 [================>.............] - ETA: 49s - loss: 1.1408 - regression_loss: 0.9997 - classification_loss: 0.1411 291/500 [================>.............] - ETA: 49s - loss: 1.1398 - regression_loss: 0.9989 - classification_loss: 0.1410 292/500 [================>.............] 
[per-step progress for epoch 22, steps 293-499, elided; loss hovered around 1.13-1.16, with regression_loss ~1.00 and classification_loss ~0.14]
500/500 [==============================] - 117s 234ms/step - loss: 1.1306 - regression_loss: 0.9924 - classification_loss: 0.1382
326 instances of class plum with average precision: 0.8609
mAP: 0.8609
Epoch 00022: saving model to ./training/snapshots/resnet50_pascal_22.h5
Epoch 23/150
[per-step progress for epoch 23, steps 1-126, elided; loss ~1.11 with regression_loss ~0.98 and classification_loss ~0.13 after 125 steps]
- ETA: 1:27 - loss: 1.1081 - regression_loss: 0.9765 - classification_loss: 0.1316 127/500 [======>.......................] - ETA: 1:27 - loss: 1.1134 - regression_loss: 0.9808 - classification_loss: 0.1326 128/500 [======>.......................] - ETA: 1:26 - loss: 1.1133 - regression_loss: 0.9810 - classification_loss: 0.1322 129/500 [======>.......................] - ETA: 1:26 - loss: 1.1109 - regression_loss: 0.9788 - classification_loss: 0.1320 130/500 [======>.......................] - ETA: 1:26 - loss: 1.1163 - regression_loss: 0.9837 - classification_loss: 0.1326 131/500 [======>.......................] - ETA: 1:26 - loss: 1.1254 - regression_loss: 0.9911 - classification_loss: 0.1343 132/500 [======>.......................] - ETA: 1:25 - loss: 1.1273 - regression_loss: 0.9922 - classification_loss: 0.1351 133/500 [======>.......................] - ETA: 1:25 - loss: 1.1243 - regression_loss: 0.9897 - classification_loss: 0.1346 134/500 [=======>......................] - ETA: 1:25 - loss: 1.1250 - regression_loss: 0.9899 - classification_loss: 0.1350 135/500 [=======>......................] - ETA: 1:25 - loss: 1.1234 - regression_loss: 0.9889 - classification_loss: 0.1345 136/500 [=======>......................] - ETA: 1:25 - loss: 1.1264 - regression_loss: 0.9921 - classification_loss: 0.1344 137/500 [=======>......................] - ETA: 1:24 - loss: 1.1242 - regression_loss: 0.9903 - classification_loss: 0.1339 138/500 [=======>......................] - ETA: 1:24 - loss: 1.1212 - regression_loss: 0.9879 - classification_loss: 0.1333 139/500 [=======>......................] - ETA: 1:24 - loss: 1.1224 - regression_loss: 0.9886 - classification_loss: 0.1338 140/500 [=======>......................] - ETA: 1:24 - loss: 1.1248 - regression_loss: 0.9908 - classification_loss: 0.1340 141/500 [=======>......................] - ETA: 1:23 - loss: 1.1330 - regression_loss: 0.9980 - classification_loss: 0.1350 142/500 [=======>......................] 
- ETA: 1:23 - loss: 1.1357 - regression_loss: 0.9999 - classification_loss: 0.1359 143/500 [=======>......................] - ETA: 1:23 - loss: 1.1333 - regression_loss: 0.9979 - classification_loss: 0.1354 144/500 [=======>......................] - ETA: 1:23 - loss: 1.1329 - regression_loss: 0.9980 - classification_loss: 0.1349 145/500 [=======>......................] - ETA: 1:22 - loss: 1.1329 - regression_loss: 0.9979 - classification_loss: 0.1350 146/500 [=======>......................] - ETA: 1:22 - loss: 1.1362 - regression_loss: 1.0007 - classification_loss: 0.1355 147/500 [=======>......................] - ETA: 1:22 - loss: 1.1399 - regression_loss: 1.0040 - classification_loss: 0.1359 148/500 [=======>......................] - ETA: 1:22 - loss: 1.1429 - regression_loss: 1.0067 - classification_loss: 0.1362 149/500 [=======>......................] - ETA: 1:22 - loss: 1.1542 - regression_loss: 1.0161 - classification_loss: 0.1381 150/500 [========>.....................] - ETA: 1:21 - loss: 1.1544 - regression_loss: 1.0160 - classification_loss: 0.1384 151/500 [========>.....................] - ETA: 1:21 - loss: 1.1560 - regression_loss: 1.0172 - classification_loss: 0.1388 152/500 [========>.....................] - ETA: 1:21 - loss: 1.1509 - regression_loss: 1.0128 - classification_loss: 0.1380 153/500 [========>.....................] - ETA: 1:21 - loss: 1.1503 - regression_loss: 1.0124 - classification_loss: 0.1379 154/500 [========>.....................] - ETA: 1:20 - loss: 1.1440 - regression_loss: 1.0067 - classification_loss: 0.1373 155/500 [========>.....................] - ETA: 1:20 - loss: 1.1419 - regression_loss: 1.0050 - classification_loss: 0.1369 156/500 [========>.....................] - ETA: 1:20 - loss: 1.1433 - regression_loss: 1.0056 - classification_loss: 0.1377 157/500 [========>.....................] - ETA: 1:20 - loss: 1.1412 - regression_loss: 1.0034 - classification_loss: 0.1378 158/500 [========>.....................] 
- ETA: 1:20 - loss: 1.1482 - regression_loss: 1.0080 - classification_loss: 0.1403 159/500 [========>.....................] - ETA: 1:19 - loss: 1.1496 - regression_loss: 1.0093 - classification_loss: 0.1403 160/500 [========>.....................] - ETA: 1:19 - loss: 1.1476 - regression_loss: 1.0078 - classification_loss: 0.1398 161/500 [========>.....................] - ETA: 1:19 - loss: 1.1457 - regression_loss: 1.0060 - classification_loss: 0.1397 162/500 [========>.....................] - ETA: 1:19 - loss: 1.1454 - regression_loss: 1.0058 - classification_loss: 0.1395 163/500 [========>.....................] - ETA: 1:18 - loss: 1.1512 - regression_loss: 1.0114 - classification_loss: 0.1398 164/500 [========>.....................] - ETA: 1:18 - loss: 1.1468 - regression_loss: 1.0076 - classification_loss: 0.1392 165/500 [========>.....................] - ETA: 1:18 - loss: 1.1501 - regression_loss: 1.0099 - classification_loss: 0.1402 166/500 [========>.....................] - ETA: 1:18 - loss: 1.1583 - regression_loss: 1.0160 - classification_loss: 0.1423 167/500 [=========>....................] - ETA: 1:17 - loss: 1.1577 - regression_loss: 1.0156 - classification_loss: 0.1422 168/500 [=========>....................] - ETA: 1:17 - loss: 1.1547 - regression_loss: 1.0129 - classification_loss: 0.1418 169/500 [=========>....................] - ETA: 1:17 - loss: 1.1545 - regression_loss: 1.0128 - classification_loss: 0.1417 170/500 [=========>....................] - ETA: 1:17 - loss: 1.1521 - regression_loss: 1.0110 - classification_loss: 0.1411 171/500 [=========>....................] - ETA: 1:16 - loss: 1.1535 - regression_loss: 1.0120 - classification_loss: 0.1415 172/500 [=========>....................] - ETA: 1:16 - loss: 1.1545 - regression_loss: 1.0131 - classification_loss: 0.1413 173/500 [=========>....................] - ETA: 1:16 - loss: 1.1501 - regression_loss: 1.0093 - classification_loss: 0.1409 174/500 [=========>....................] 
- ETA: 1:16 - loss: 1.1489 - regression_loss: 1.0081 - classification_loss: 0.1408 175/500 [=========>....................] - ETA: 1:16 - loss: 1.1493 - regression_loss: 1.0087 - classification_loss: 0.1406 176/500 [=========>....................] - ETA: 1:15 - loss: 1.1473 - regression_loss: 1.0073 - classification_loss: 0.1400 177/500 [=========>....................] - ETA: 1:15 - loss: 1.1492 - regression_loss: 1.0092 - classification_loss: 0.1401 178/500 [=========>....................] - ETA: 1:15 - loss: 1.1494 - regression_loss: 1.0095 - classification_loss: 0.1399 179/500 [=========>....................] - ETA: 1:15 - loss: 1.1470 - regression_loss: 1.0074 - classification_loss: 0.1396 180/500 [=========>....................] - ETA: 1:14 - loss: 1.1497 - regression_loss: 1.0093 - classification_loss: 0.1404 181/500 [=========>....................] - ETA: 1:14 - loss: 1.1482 - regression_loss: 1.0082 - classification_loss: 0.1400 182/500 [=========>....................] - ETA: 1:14 - loss: 1.1467 - regression_loss: 1.0069 - classification_loss: 0.1398 183/500 [=========>....................] - ETA: 1:14 - loss: 1.1462 - regression_loss: 1.0065 - classification_loss: 0.1397 184/500 [==========>...................] - ETA: 1:13 - loss: 1.1445 - regression_loss: 1.0053 - classification_loss: 0.1392 185/500 [==========>...................] - ETA: 1:13 - loss: 1.1436 - regression_loss: 1.0045 - classification_loss: 0.1391 186/500 [==========>...................] - ETA: 1:13 - loss: 1.1393 - regression_loss: 1.0009 - classification_loss: 0.1384 187/500 [==========>...................] - ETA: 1:13 - loss: 1.1366 - regression_loss: 0.9987 - classification_loss: 0.1379 188/500 [==========>...................] - ETA: 1:12 - loss: 1.1369 - regression_loss: 0.9990 - classification_loss: 0.1378 189/500 [==========>...................] - ETA: 1:12 - loss: 1.1384 - regression_loss: 1.0003 - classification_loss: 0.1381 190/500 [==========>...................] 
- ETA: 1:12 - loss: 1.1378 - regression_loss: 0.9999 - classification_loss: 0.1379 191/500 [==========>...................] - ETA: 1:12 - loss: 1.1393 - regression_loss: 1.0012 - classification_loss: 0.1381 192/500 [==========>...................] - ETA: 1:12 - loss: 1.1396 - regression_loss: 1.0015 - classification_loss: 0.1381 193/500 [==========>...................] - ETA: 1:11 - loss: 1.1371 - regression_loss: 0.9996 - classification_loss: 0.1375 194/500 [==========>...................] - ETA: 1:11 - loss: 1.1378 - regression_loss: 1.0003 - classification_loss: 0.1375 195/500 [==========>...................] - ETA: 1:11 - loss: 1.1403 - regression_loss: 1.0022 - classification_loss: 0.1382 196/500 [==========>...................] - ETA: 1:11 - loss: 1.1393 - regression_loss: 1.0015 - classification_loss: 0.1378 197/500 [==========>...................] - ETA: 1:10 - loss: 1.1368 - regression_loss: 0.9994 - classification_loss: 0.1375 198/500 [==========>...................] - ETA: 1:10 - loss: 1.1383 - regression_loss: 1.0011 - classification_loss: 0.1372 199/500 [==========>...................] - ETA: 1:10 - loss: 1.1386 - regression_loss: 1.0014 - classification_loss: 0.1372 200/500 [===========>..................] - ETA: 1:10 - loss: 1.1396 - regression_loss: 1.0021 - classification_loss: 0.1375 201/500 [===========>..................] - ETA: 1:09 - loss: 1.1392 - regression_loss: 1.0016 - classification_loss: 0.1376 202/500 [===========>..................] - ETA: 1:09 - loss: 1.1392 - regression_loss: 1.0017 - classification_loss: 0.1375 203/500 [===========>..................] - ETA: 1:09 - loss: 1.1390 - regression_loss: 1.0014 - classification_loss: 0.1375 204/500 [===========>..................] - ETA: 1:09 - loss: 1.1366 - regression_loss: 0.9990 - classification_loss: 0.1375 205/500 [===========>..................] - ETA: 1:08 - loss: 1.1382 - regression_loss: 1.0006 - classification_loss: 0.1377 206/500 [===========>..................] 
- ETA: 1:08 - loss: 1.1382 - regression_loss: 1.0005 - classification_loss: 0.1377 207/500 [===========>..................] - ETA: 1:08 - loss: 1.1394 - regression_loss: 1.0017 - classification_loss: 0.1377 208/500 [===========>..................] - ETA: 1:08 - loss: 1.1396 - regression_loss: 1.0018 - classification_loss: 0.1378 209/500 [===========>..................] - ETA: 1:07 - loss: 1.1439 - regression_loss: 1.0050 - classification_loss: 0.1389 210/500 [===========>..................] - ETA: 1:07 - loss: 1.1479 - regression_loss: 1.0090 - classification_loss: 0.1389 211/500 [===========>..................] - ETA: 1:07 - loss: 1.1463 - regression_loss: 1.0079 - classification_loss: 0.1384 212/500 [===========>..................] - ETA: 1:07 - loss: 1.1468 - regression_loss: 1.0085 - classification_loss: 0.1383 213/500 [===========>..................] - ETA: 1:07 - loss: 1.1476 - regression_loss: 1.0089 - classification_loss: 0.1387 214/500 [===========>..................] - ETA: 1:06 - loss: 1.1444 - regression_loss: 1.0063 - classification_loss: 0.1381 215/500 [===========>..................] - ETA: 1:06 - loss: 1.1454 - regression_loss: 1.0072 - classification_loss: 0.1382 216/500 [===========>..................] - ETA: 1:06 - loss: 1.1441 - regression_loss: 1.0061 - classification_loss: 0.1379 217/500 [============>.................] - ETA: 1:06 - loss: 1.1443 - regression_loss: 1.0064 - classification_loss: 0.1379 218/500 [============>.................] - ETA: 1:05 - loss: 1.1421 - regression_loss: 1.0046 - classification_loss: 0.1375 219/500 [============>.................] - ETA: 1:05 - loss: 1.1399 - regression_loss: 1.0024 - classification_loss: 0.1376 220/500 [============>.................] - ETA: 1:05 - loss: 1.1392 - regression_loss: 1.0018 - classification_loss: 0.1375 221/500 [============>.................] - ETA: 1:05 - loss: 1.1388 - regression_loss: 1.0015 - classification_loss: 0.1373 222/500 [============>.................] 
- ETA: 1:05 - loss: 1.1367 - regression_loss: 0.9997 - classification_loss: 0.1370 223/500 [============>.................] - ETA: 1:04 - loss: 1.1358 - regression_loss: 0.9990 - classification_loss: 0.1368 224/500 [============>.................] - ETA: 1:04 - loss: 1.1359 - regression_loss: 0.9993 - classification_loss: 0.1367 225/500 [============>.................] - ETA: 1:04 - loss: 1.1333 - regression_loss: 0.9971 - classification_loss: 0.1362 226/500 [============>.................] - ETA: 1:04 - loss: 1.1330 - regression_loss: 0.9969 - classification_loss: 0.1361 227/500 [============>.................] - ETA: 1:03 - loss: 1.1332 - regression_loss: 0.9972 - classification_loss: 0.1360 228/500 [============>.................] - ETA: 1:03 - loss: 1.1317 - regression_loss: 0.9961 - classification_loss: 0.1357 229/500 [============>.................] - ETA: 1:03 - loss: 1.1294 - regression_loss: 0.9941 - classification_loss: 0.1353 230/500 [============>.................] - ETA: 1:03 - loss: 1.1300 - regression_loss: 0.9946 - classification_loss: 0.1353 231/500 [============>.................] - ETA: 1:02 - loss: 1.1283 - regression_loss: 0.9932 - classification_loss: 0.1351 232/500 [============>.................] - ETA: 1:02 - loss: 1.1263 - regression_loss: 0.9916 - classification_loss: 0.1347 233/500 [============>.................] - ETA: 1:02 - loss: 1.1297 - regression_loss: 0.9947 - classification_loss: 0.1350 234/500 [=============>................] - ETA: 1:02 - loss: 1.1302 - regression_loss: 0.9947 - classification_loss: 0.1355 235/500 [=============>................] - ETA: 1:01 - loss: 1.1356 - regression_loss: 0.9990 - classification_loss: 0.1365 236/500 [=============>................] - ETA: 1:01 - loss: 1.1361 - regression_loss: 0.9995 - classification_loss: 0.1365 237/500 [=============>................] - ETA: 1:01 - loss: 1.1349 - regression_loss: 0.9988 - classification_loss: 0.1362 238/500 [=============>................] 
- ETA: 1:01 - loss: 1.1371 - regression_loss: 1.0003 - classification_loss: 0.1368 239/500 [=============>................] - ETA: 1:01 - loss: 1.1373 - regression_loss: 1.0005 - classification_loss: 0.1368 240/500 [=============>................] - ETA: 1:00 - loss: 1.1354 - regression_loss: 0.9989 - classification_loss: 0.1365 241/500 [=============>................] - ETA: 1:00 - loss: 1.1366 - regression_loss: 0.9999 - classification_loss: 0.1367 242/500 [=============>................] - ETA: 1:00 - loss: 1.1361 - regression_loss: 0.9997 - classification_loss: 0.1364 243/500 [=============>................] - ETA: 1:00 - loss: 1.1362 - regression_loss: 0.9999 - classification_loss: 0.1363 244/500 [=============>................] - ETA: 59s - loss: 1.1374 - regression_loss: 1.0011 - classification_loss: 0.1364  245/500 [=============>................] - ETA: 59s - loss: 1.1350 - regression_loss: 0.9989 - classification_loss: 0.1361 246/500 [=============>................] - ETA: 59s - loss: 1.1323 - regression_loss: 0.9967 - classification_loss: 0.1357 247/500 [=============>................] - ETA: 59s - loss: 1.1328 - regression_loss: 0.9972 - classification_loss: 0.1356 248/500 [=============>................] - ETA: 59s - loss: 1.1348 - regression_loss: 0.9992 - classification_loss: 0.1356 249/500 [=============>................] - ETA: 58s - loss: 1.1348 - regression_loss: 0.9992 - classification_loss: 0.1355 250/500 [==============>...............] - ETA: 58s - loss: 1.1375 - regression_loss: 1.0009 - classification_loss: 0.1367 251/500 [==============>...............] - ETA: 58s - loss: 1.1364 - regression_loss: 0.9999 - classification_loss: 0.1365 252/500 [==============>...............] - ETA: 58s - loss: 1.1347 - regression_loss: 0.9985 - classification_loss: 0.1362 253/500 [==============>...............] - ETA: 57s - loss: 1.1338 - regression_loss: 0.9979 - classification_loss: 0.1359 254/500 [==============>...............] 
- ETA: 57s - loss: 1.1338 - regression_loss: 0.9980 - classification_loss: 0.1359 255/500 [==============>...............] - ETA: 57s - loss: 1.1364 - regression_loss: 1.0001 - classification_loss: 0.1363 256/500 [==============>...............] - ETA: 57s - loss: 1.1355 - regression_loss: 0.9995 - classification_loss: 0.1359 257/500 [==============>...............] - ETA: 56s - loss: 1.1355 - regression_loss: 0.9996 - classification_loss: 0.1359 258/500 [==============>...............] - ETA: 56s - loss: 1.1350 - regression_loss: 0.9993 - classification_loss: 0.1357 259/500 [==============>...............] - ETA: 56s - loss: 1.1339 - regression_loss: 0.9985 - classification_loss: 0.1354 260/500 [==============>...............] - ETA: 56s - loss: 1.1321 - regression_loss: 0.9969 - classification_loss: 0.1352 261/500 [==============>...............] - ETA: 56s - loss: 1.1322 - regression_loss: 0.9970 - classification_loss: 0.1351 262/500 [==============>...............] - ETA: 55s - loss: 1.1327 - regression_loss: 0.9976 - classification_loss: 0.1351 263/500 [==============>...............] - ETA: 55s - loss: 1.1325 - regression_loss: 0.9972 - classification_loss: 0.1353 264/500 [==============>...............] - ETA: 55s - loss: 1.1347 - regression_loss: 0.9991 - classification_loss: 0.1355 265/500 [==============>...............] - ETA: 55s - loss: 1.1338 - regression_loss: 0.9985 - classification_loss: 0.1353 266/500 [==============>...............] - ETA: 54s - loss: 1.1314 - regression_loss: 0.9965 - classification_loss: 0.1349 267/500 [===============>..............] - ETA: 54s - loss: 1.1318 - regression_loss: 0.9969 - classification_loss: 0.1349 268/500 [===============>..............] - ETA: 54s - loss: 1.1369 - regression_loss: 1.0011 - classification_loss: 0.1358 269/500 [===============>..............] - ETA: 54s - loss: 1.1363 - regression_loss: 0.9996 - classification_loss: 0.1367 270/500 [===============>..............] 
- ETA: 53s - loss: 1.1341 - regression_loss: 0.9977 - classification_loss: 0.1364 271/500 [===============>..............] - ETA: 53s - loss: 1.1338 - regression_loss: 0.9977 - classification_loss: 0.1361 272/500 [===============>..............] - ETA: 53s - loss: 1.1330 - regression_loss: 0.9970 - classification_loss: 0.1360 273/500 [===============>..............] - ETA: 53s - loss: 1.1323 - regression_loss: 0.9965 - classification_loss: 0.1358 274/500 [===============>..............] - ETA: 52s - loss: 1.1328 - regression_loss: 0.9970 - classification_loss: 0.1358 275/500 [===============>..............] - ETA: 52s - loss: 1.1359 - regression_loss: 0.9993 - classification_loss: 0.1367 276/500 [===============>..............] - ETA: 52s - loss: 1.1336 - regression_loss: 0.9974 - classification_loss: 0.1363 277/500 [===============>..............] - ETA: 52s - loss: 1.1319 - regression_loss: 0.9958 - classification_loss: 0.1360 278/500 [===============>..............] - ETA: 52s - loss: 1.1328 - regression_loss: 0.9964 - classification_loss: 0.1364 279/500 [===============>..............] - ETA: 51s - loss: 1.1324 - regression_loss: 0.9960 - classification_loss: 0.1364 280/500 [===============>..............] - ETA: 51s - loss: 1.1358 - regression_loss: 0.9993 - classification_loss: 0.1365 281/500 [===============>..............] - ETA: 51s - loss: 1.1350 - regression_loss: 0.9986 - classification_loss: 0.1364 282/500 [===============>..............] - ETA: 51s - loss: 1.1352 - regression_loss: 0.9987 - classification_loss: 0.1364 283/500 [===============>..............] - ETA: 50s - loss: 1.1358 - regression_loss: 0.9994 - classification_loss: 0.1364 284/500 [================>.............] - ETA: 50s - loss: 1.1372 - regression_loss: 1.0006 - classification_loss: 0.1366 285/500 [================>.............] - ETA: 50s - loss: 1.1374 - regression_loss: 1.0009 - classification_loss: 0.1365 286/500 [================>.............] 
- ETA: 50s - loss: 1.1359 - regression_loss: 0.9997 - classification_loss: 0.1362 287/500 [================>.............] - ETA: 49s - loss: 1.1345 - regression_loss: 0.9986 - classification_loss: 0.1359 288/500 [================>.............] - ETA: 49s - loss: 1.1334 - regression_loss: 0.9978 - classification_loss: 0.1357 289/500 [================>.............] - ETA: 49s - loss: 1.1322 - regression_loss: 0.9968 - classification_loss: 0.1354 290/500 [================>.............] - ETA: 49s - loss: 1.1322 - regression_loss: 0.9970 - classification_loss: 0.1353 291/500 [================>.............] - ETA: 48s - loss: 1.1334 - regression_loss: 0.9979 - classification_loss: 0.1355 292/500 [================>.............] - ETA: 48s - loss: 1.1326 - regression_loss: 0.9972 - classification_loss: 0.1354 293/500 [================>.............] - ETA: 48s - loss: 1.1301 - regression_loss: 0.9950 - classification_loss: 0.1350 294/500 [================>.............] - ETA: 48s - loss: 1.1296 - regression_loss: 0.9946 - classification_loss: 0.1349 295/500 [================>.............] - ETA: 48s - loss: 1.1277 - regression_loss: 0.9931 - classification_loss: 0.1346 296/500 [================>.............] - ETA: 47s - loss: 1.1277 - regression_loss: 0.9932 - classification_loss: 0.1345 297/500 [================>.............] - ETA: 47s - loss: 1.1276 - regression_loss: 0.9930 - classification_loss: 0.1346 298/500 [================>.............] - ETA: 47s - loss: 1.1260 - regression_loss: 0.9916 - classification_loss: 0.1344 299/500 [================>.............] - ETA: 47s - loss: 1.1254 - regression_loss: 0.9912 - classification_loss: 0.1341 300/500 [=================>............] - ETA: 46s - loss: 1.1231 - regression_loss: 0.9893 - classification_loss: 0.1338 301/500 [=================>............] - ETA: 46s - loss: 1.1229 - regression_loss: 0.9891 - classification_loss: 0.1338 302/500 [=================>............] 
- ETA: 46s - loss: 1.1228 - regression_loss: 0.9887 - classification_loss: 0.1341 303/500 [=================>............] - ETA: 46s - loss: 1.1216 - regression_loss: 0.9877 - classification_loss: 0.1339 304/500 [=================>............] - ETA: 45s - loss: 1.1228 - regression_loss: 0.9887 - classification_loss: 0.1340 305/500 [=================>............] - ETA: 45s - loss: 1.1238 - regression_loss: 0.9901 - classification_loss: 0.1337 306/500 [=================>............] - ETA: 45s - loss: 1.1226 - regression_loss: 0.9891 - classification_loss: 0.1335 307/500 [=================>............] - ETA: 45s - loss: 1.1241 - regression_loss: 0.9904 - classification_loss: 0.1337 308/500 [=================>............] - ETA: 44s - loss: 1.1234 - regression_loss: 0.9898 - classification_loss: 0.1336 309/500 [=================>............] - ETA: 44s - loss: 1.1223 - regression_loss: 0.9889 - classification_loss: 0.1334 310/500 [=================>............] - ETA: 44s - loss: 1.1206 - regression_loss: 0.9874 - classification_loss: 0.1332 311/500 [=================>............] - ETA: 44s - loss: 1.1210 - regression_loss: 0.9878 - classification_loss: 0.1332 312/500 [=================>............] - ETA: 44s - loss: 1.1194 - regression_loss: 0.9865 - classification_loss: 0.1329 313/500 [=================>............] - ETA: 43s - loss: 1.1194 - regression_loss: 0.9865 - classification_loss: 0.1328 314/500 [=================>............] - ETA: 43s - loss: 1.1208 - regression_loss: 0.9877 - classification_loss: 0.1331 315/500 [=================>............] - ETA: 43s - loss: 1.1211 - regression_loss: 0.9880 - classification_loss: 0.1331 316/500 [=================>............] - ETA: 43s - loss: 1.1188 - regression_loss: 0.9859 - classification_loss: 0.1329 317/500 [==================>...........] - ETA: 42s - loss: 1.1197 - regression_loss: 0.9868 - classification_loss: 0.1329 318/500 [==================>...........] 
- ETA: 42s - loss: 1.1207 - regression_loss: 0.9880 - classification_loss: 0.1328 319/500 [==================>...........] - ETA: 42s - loss: 1.1204 - regression_loss: 0.9875 - classification_loss: 0.1329 320/500 [==================>...........] - ETA: 42s - loss: 1.1170 - regression_loss: 0.9844 - classification_loss: 0.1326 321/500 [==================>...........] - ETA: 41s - loss: 1.1185 - regression_loss: 0.9854 - classification_loss: 0.1331 322/500 [==================>...........] - ETA: 41s - loss: 1.1196 - regression_loss: 0.9866 - classification_loss: 0.1330 323/500 [==================>...........] - ETA: 41s - loss: 1.1196 - regression_loss: 0.9866 - classification_loss: 0.1330 324/500 [==================>...........] - ETA: 41s - loss: 1.1189 - regression_loss: 0.9861 - classification_loss: 0.1328 325/500 [==================>...........] - ETA: 40s - loss: 1.1184 - regression_loss: 0.9857 - classification_loss: 0.1326 326/500 [==================>...........] - ETA: 40s - loss: 1.1187 - regression_loss: 0.9858 - classification_loss: 0.1329 327/500 [==================>...........] - ETA: 40s - loss: 1.1202 - regression_loss: 0.9874 - classification_loss: 0.1328 328/500 [==================>...........] - ETA: 40s - loss: 1.1185 - regression_loss: 0.9859 - classification_loss: 0.1326 329/500 [==================>...........] - ETA: 40s - loss: 1.1174 - regression_loss: 0.9850 - classification_loss: 0.1324 330/500 [==================>...........] - ETA: 39s - loss: 1.1187 - regression_loss: 0.9858 - classification_loss: 0.1329 331/500 [==================>...........] - ETA: 39s - loss: 1.1201 - regression_loss: 0.9865 - classification_loss: 0.1336 332/500 [==================>...........] - ETA: 39s - loss: 1.1212 - regression_loss: 0.9871 - classification_loss: 0.1341 333/500 [==================>...........] - ETA: 39s - loss: 1.1219 - regression_loss: 0.9881 - classification_loss: 0.1339 334/500 [===================>..........] 
[... per-batch progress output trimmed: batches 335–499 of epoch 23, with loss fluctuating around 1.12, regression_loss around 0.98, and classification_loss around 0.134 ...]
500/500 [==============================] - 117s 234ms/step - loss: 1.1185 - regression_loss: 0.9844 - classification_loss: 0.1341
326 instances of class plum with average precision: 0.8273
mAP: 0.8273
Epoch 00023: saving model to ./training/snapshots/resnet50_pascal_23.h5
Epoch 24/150
[... per-batch progress output trimmed: batches 1–169 of epoch 24, with loss settling from roughly 1.20 toward 1.08 ...]
- ETA: 1:17 - loss: 1.0765 - regression_loss: 0.9515 - classification_loss: 0.1249 170/500 [=========>....................] - ETA: 1:17 - loss: 1.0807 - regression_loss: 0.9555 - classification_loss: 0.1252 171/500 [=========>....................] - ETA: 1:16 - loss: 1.0817 - regression_loss: 0.9567 - classification_loss: 0.1250 172/500 [=========>....................] - ETA: 1:16 - loss: 1.0851 - regression_loss: 0.9596 - classification_loss: 0.1255 173/500 [=========>....................] - ETA: 1:16 - loss: 1.0846 - regression_loss: 0.9590 - classification_loss: 0.1256 174/500 [=========>....................] - ETA: 1:16 - loss: 1.0849 - regression_loss: 0.9588 - classification_loss: 0.1261 175/500 [=========>....................] - ETA: 1:15 - loss: 1.0841 - regression_loss: 0.9581 - classification_loss: 0.1260 176/500 [=========>....................] - ETA: 1:15 - loss: 1.0845 - regression_loss: 0.9585 - classification_loss: 0.1260 177/500 [=========>....................] - ETA: 1:15 - loss: 1.0859 - regression_loss: 0.9598 - classification_loss: 0.1261 178/500 [=========>....................] - ETA: 1:15 - loss: 1.0838 - regression_loss: 0.9580 - classification_loss: 0.1258 179/500 [=========>....................] - ETA: 1:15 - loss: 1.0819 - regression_loss: 0.9565 - classification_loss: 0.1254 180/500 [=========>....................] - ETA: 1:14 - loss: 1.0828 - regression_loss: 0.9569 - classification_loss: 0.1259 181/500 [=========>....................] - ETA: 1:14 - loss: 1.0823 - regression_loss: 0.9566 - classification_loss: 0.1257 182/500 [=========>....................] - ETA: 1:15 - loss: 1.0837 - regression_loss: 0.9578 - classification_loss: 0.1259 183/500 [=========>....................] - ETA: 1:15 - loss: 1.0884 - regression_loss: 0.9625 - classification_loss: 0.1259 184/500 [==========>...................] - ETA: 1:14 - loss: 1.0965 - regression_loss: 0.9692 - classification_loss: 0.1273 185/500 [==========>...................] 
- ETA: 1:14 - loss: 1.0939 - regression_loss: 0.9669 - classification_loss: 0.1269 186/500 [==========>...................] - ETA: 1:14 - loss: 1.0937 - regression_loss: 0.9670 - classification_loss: 0.1267 187/500 [==========>...................] - ETA: 1:14 - loss: 1.0932 - regression_loss: 0.9667 - classification_loss: 0.1265 188/500 [==========>...................] - ETA: 1:13 - loss: 1.0951 - regression_loss: 0.9689 - classification_loss: 0.1262 189/500 [==========>...................] - ETA: 1:13 - loss: 1.0940 - regression_loss: 0.9680 - classification_loss: 0.1259 190/500 [==========>...................] - ETA: 1:13 - loss: 1.0931 - regression_loss: 0.9674 - classification_loss: 0.1257 191/500 [==========>...................] - ETA: 1:13 - loss: 1.0922 - regression_loss: 0.9668 - classification_loss: 0.1253 192/500 [==========>...................] - ETA: 1:13 - loss: 1.0888 - regression_loss: 0.9638 - classification_loss: 0.1249 193/500 [==========>...................] - ETA: 1:12 - loss: 1.0906 - regression_loss: 0.9656 - classification_loss: 0.1250 194/500 [==========>...................] - ETA: 1:12 - loss: 1.0902 - regression_loss: 0.9657 - classification_loss: 0.1245 195/500 [==========>...................] - ETA: 1:12 - loss: 1.0879 - regression_loss: 0.9638 - classification_loss: 0.1241 196/500 [==========>...................] - ETA: 1:12 - loss: 1.0909 - regression_loss: 0.9659 - classification_loss: 0.1250 197/500 [==========>...................] - ETA: 1:11 - loss: 1.0886 - regression_loss: 0.9640 - classification_loss: 0.1245 198/500 [==========>...................] - ETA: 1:11 - loss: 1.0946 - regression_loss: 0.9690 - classification_loss: 0.1256 199/500 [==========>...................] - ETA: 1:11 - loss: 1.0919 - regression_loss: 0.9667 - classification_loss: 0.1252 200/500 [===========>..................] - ETA: 1:11 - loss: 1.0923 - regression_loss: 0.9671 - classification_loss: 0.1252 201/500 [===========>..................] 
- ETA: 1:10 - loss: 1.0912 - regression_loss: 0.9663 - classification_loss: 0.1249 202/500 [===========>..................] - ETA: 1:10 - loss: 1.0909 - regression_loss: 0.9661 - classification_loss: 0.1247 203/500 [===========>..................] - ETA: 1:10 - loss: 1.0923 - regression_loss: 0.9675 - classification_loss: 0.1248 204/500 [===========>..................] - ETA: 1:10 - loss: 1.0911 - regression_loss: 0.9665 - classification_loss: 0.1245 205/500 [===========>..................] - ETA: 1:09 - loss: 1.0936 - regression_loss: 0.9691 - classification_loss: 0.1245 206/500 [===========>..................] - ETA: 1:09 - loss: 1.0954 - regression_loss: 0.9708 - classification_loss: 0.1246 207/500 [===========>..................] - ETA: 1:09 - loss: 1.0933 - regression_loss: 0.9691 - classification_loss: 0.1242 208/500 [===========>..................] - ETA: 1:09 - loss: 1.0916 - regression_loss: 0.9677 - classification_loss: 0.1239 209/500 [===========>..................] - ETA: 1:08 - loss: 1.0904 - regression_loss: 0.9670 - classification_loss: 0.1234 210/500 [===========>..................] - ETA: 1:08 - loss: 1.0934 - regression_loss: 0.9690 - classification_loss: 0.1244 211/500 [===========>..................] - ETA: 1:08 - loss: 1.0948 - regression_loss: 0.9704 - classification_loss: 0.1244 212/500 [===========>..................] - ETA: 1:08 - loss: 1.0946 - regression_loss: 0.9703 - classification_loss: 0.1244 213/500 [===========>..................] - ETA: 1:07 - loss: 1.0927 - regression_loss: 0.9687 - classification_loss: 0.1240 214/500 [===========>..................] - ETA: 1:07 - loss: 1.0919 - regression_loss: 0.9681 - classification_loss: 0.1238 215/500 [===========>..................] - ETA: 1:07 - loss: 1.0916 - regression_loss: 0.9677 - classification_loss: 0.1239 216/500 [===========>..................] - ETA: 1:07 - loss: 1.0938 - regression_loss: 0.9694 - classification_loss: 0.1244 217/500 [============>.................] 
- ETA: 1:07 - loss: 1.0920 - regression_loss: 0.9678 - classification_loss: 0.1242 218/500 [============>.................] - ETA: 1:06 - loss: 1.0914 - regression_loss: 0.9676 - classification_loss: 0.1239 219/500 [============>.................] - ETA: 1:06 - loss: 1.0985 - regression_loss: 0.9720 - classification_loss: 0.1265 220/500 [============>.................] - ETA: 1:06 - loss: 1.0964 - regression_loss: 0.9702 - classification_loss: 0.1262 221/500 [============>.................] - ETA: 1:06 - loss: 1.0962 - regression_loss: 0.9700 - classification_loss: 0.1262 222/500 [============>.................] - ETA: 1:05 - loss: 1.1015 - regression_loss: 0.9734 - classification_loss: 0.1282 223/500 [============>.................] - ETA: 1:05 - loss: 1.1033 - regression_loss: 0.9749 - classification_loss: 0.1285 224/500 [============>.................] - ETA: 1:05 - loss: 1.1043 - regression_loss: 0.9759 - classification_loss: 0.1284 225/500 [============>.................] - ETA: 1:05 - loss: 1.1022 - regression_loss: 0.9742 - classification_loss: 0.1279 226/500 [============>.................] - ETA: 1:04 - loss: 1.1032 - regression_loss: 0.9752 - classification_loss: 0.1280 227/500 [============>.................] - ETA: 1:04 - loss: 1.1030 - regression_loss: 0.9753 - classification_loss: 0.1277 228/500 [============>.................] - ETA: 1:04 - loss: 1.1037 - regression_loss: 0.9760 - classification_loss: 0.1277 229/500 [============>.................] - ETA: 1:04 - loss: 1.1032 - regression_loss: 0.9758 - classification_loss: 0.1275 230/500 [============>.................] - ETA: 1:03 - loss: 1.1068 - regression_loss: 0.9787 - classification_loss: 0.1281 231/500 [============>.................] - ETA: 1:03 - loss: 1.1110 - regression_loss: 0.9817 - classification_loss: 0.1294 232/500 [============>.................] - ETA: 1:03 - loss: 1.1103 - regression_loss: 0.9810 - classification_loss: 0.1293 233/500 [============>.................] 
- ETA: 1:03 - loss: 1.1108 - regression_loss: 0.9816 - classification_loss: 0.1292 234/500 [=============>................] - ETA: 1:03 - loss: 1.1098 - regression_loss: 0.9807 - classification_loss: 0.1291 235/500 [=============>................] - ETA: 1:02 - loss: 1.1090 - regression_loss: 0.9801 - classification_loss: 0.1289 236/500 [=============>................] - ETA: 1:02 - loss: 1.1089 - regression_loss: 0.9791 - classification_loss: 0.1298 237/500 [=============>................] - ETA: 1:02 - loss: 1.1067 - regression_loss: 0.9762 - classification_loss: 0.1306 238/500 [=============>................] - ETA: 1:02 - loss: 1.1037 - regression_loss: 0.9735 - classification_loss: 0.1302 239/500 [=============>................] - ETA: 1:01 - loss: 1.1059 - regression_loss: 0.9752 - classification_loss: 0.1306 240/500 [=============>................] - ETA: 1:01 - loss: 1.1062 - regression_loss: 0.9753 - classification_loss: 0.1309 241/500 [=============>................] - ETA: 1:01 - loss: 1.1042 - regression_loss: 0.9734 - classification_loss: 0.1307 242/500 [=============>................] - ETA: 1:01 - loss: 1.1021 - regression_loss: 0.9717 - classification_loss: 0.1304 243/500 [=============>................] - ETA: 1:00 - loss: 1.1019 - regression_loss: 0.9714 - classification_loss: 0.1305 244/500 [=============>................] - ETA: 1:00 - loss: 1.0991 - regression_loss: 0.9690 - classification_loss: 0.1302 245/500 [=============>................] - ETA: 1:00 - loss: 1.0999 - regression_loss: 0.9697 - classification_loss: 0.1302 246/500 [=============>................] - ETA: 1:00 - loss: 1.0995 - regression_loss: 0.9693 - classification_loss: 0.1302 247/500 [=============>................] - ETA: 59s - loss: 1.0968 - regression_loss: 0.9670 - classification_loss: 0.1298  248/500 [=============>................] - ETA: 59s - loss: 1.0962 - regression_loss: 0.9666 - classification_loss: 0.1296 249/500 [=============>................] 
- ETA: 59s - loss: 1.0954 - regression_loss: 0.9660 - classification_loss: 0.1294 250/500 [==============>...............] - ETA: 59s - loss: 1.0969 - regression_loss: 0.9671 - classification_loss: 0.1298 251/500 [==============>...............] - ETA: 58s - loss: 1.0959 - regression_loss: 0.9662 - classification_loss: 0.1297 252/500 [==============>...............] - ETA: 58s - loss: 1.0954 - regression_loss: 0.9658 - classification_loss: 0.1296 253/500 [==============>...............] - ETA: 58s - loss: 1.0937 - regression_loss: 0.9644 - classification_loss: 0.1293 254/500 [==============>...............] - ETA: 58s - loss: 1.0929 - regression_loss: 0.9639 - classification_loss: 0.1290 255/500 [==============>...............] - ETA: 57s - loss: 1.0983 - regression_loss: 0.9685 - classification_loss: 0.1298 256/500 [==============>...............] - ETA: 57s - loss: 1.0964 - regression_loss: 0.9668 - classification_loss: 0.1296 257/500 [==============>...............] - ETA: 57s - loss: 1.0963 - regression_loss: 0.9668 - classification_loss: 0.1295 258/500 [==============>...............] - ETA: 57s - loss: 1.0972 - regression_loss: 0.9676 - classification_loss: 0.1296 259/500 [==============>...............] - ETA: 57s - loss: 1.0973 - regression_loss: 0.9678 - classification_loss: 0.1295 260/500 [==============>...............] - ETA: 56s - loss: 1.1011 - regression_loss: 0.9706 - classification_loss: 0.1305 261/500 [==============>...............] - ETA: 56s - loss: 1.1022 - regression_loss: 0.9714 - classification_loss: 0.1308 262/500 [==============>...............] - ETA: 56s - loss: 1.1004 - regression_loss: 0.9698 - classification_loss: 0.1306 263/500 [==============>...............] - ETA: 56s - loss: 1.1015 - regression_loss: 0.9708 - classification_loss: 0.1307 264/500 [==============>...............] - ETA: 55s - loss: 1.1023 - regression_loss: 0.9715 - classification_loss: 0.1308 265/500 [==============>...............] 
- ETA: 55s - loss: 1.1015 - regression_loss: 0.9708 - classification_loss: 0.1306 266/500 [==============>...............] - ETA: 55s - loss: 1.1019 - regression_loss: 0.9712 - classification_loss: 0.1306 267/500 [===============>..............] - ETA: 55s - loss: 1.1023 - regression_loss: 0.9716 - classification_loss: 0.1307 268/500 [===============>..............] - ETA: 54s - loss: 1.1046 - regression_loss: 0.9738 - classification_loss: 0.1308 269/500 [===============>..............] - ETA: 54s - loss: 1.1059 - regression_loss: 0.9750 - classification_loss: 0.1309 270/500 [===============>..............] - ETA: 54s - loss: 1.1112 - regression_loss: 0.9795 - classification_loss: 0.1317 271/500 [===============>..............] - ETA: 54s - loss: 1.1086 - regression_loss: 0.9772 - classification_loss: 0.1314 272/500 [===============>..............] - ETA: 53s - loss: 1.1095 - regression_loss: 0.9781 - classification_loss: 0.1314 273/500 [===============>..............] - ETA: 53s - loss: 1.1077 - regression_loss: 0.9765 - classification_loss: 0.1312 274/500 [===============>..............] - ETA: 53s - loss: 1.1087 - regression_loss: 0.9775 - classification_loss: 0.1312 275/500 [===============>..............] - ETA: 53s - loss: 1.1083 - regression_loss: 0.9773 - classification_loss: 0.1310 276/500 [===============>..............] - ETA: 53s - loss: 1.1063 - regression_loss: 0.9755 - classification_loss: 0.1308 277/500 [===============>..............] - ETA: 52s - loss: 1.1097 - regression_loss: 0.9780 - classification_loss: 0.1317 278/500 [===============>..............] - ETA: 52s - loss: 1.1119 - regression_loss: 0.9798 - classification_loss: 0.1320 279/500 [===============>..............] - ETA: 52s - loss: 1.1118 - regression_loss: 0.9797 - classification_loss: 0.1321 280/500 [===============>..............] - ETA: 52s - loss: 1.1135 - regression_loss: 0.9811 - classification_loss: 0.1324 281/500 [===============>..............] 
- ETA: 51s - loss: 1.1129 - regression_loss: 0.9806 - classification_loss: 0.1323 282/500 [===============>..............] - ETA: 51s - loss: 1.1134 - regression_loss: 0.9811 - classification_loss: 0.1323 283/500 [===============>..............] - ETA: 51s - loss: 1.1124 - regression_loss: 0.9802 - classification_loss: 0.1322 284/500 [================>.............] - ETA: 51s - loss: 1.1118 - regression_loss: 0.9797 - classification_loss: 0.1322 285/500 [================>.............] - ETA: 50s - loss: 1.1110 - regression_loss: 0.9792 - classification_loss: 0.1319 286/500 [================>.............] - ETA: 50s - loss: 1.1107 - regression_loss: 0.9788 - classification_loss: 0.1319 287/500 [================>.............] - ETA: 50s - loss: 1.1101 - regression_loss: 0.9783 - classification_loss: 0.1318 288/500 [================>.............] - ETA: 50s - loss: 1.1107 - regression_loss: 0.9790 - classification_loss: 0.1317 289/500 [================>.............] - ETA: 49s - loss: 1.1105 - regression_loss: 0.9790 - classification_loss: 0.1315 290/500 [================>.............] - ETA: 49s - loss: 1.1094 - regression_loss: 0.9780 - classification_loss: 0.1314 291/500 [================>.............] - ETA: 49s - loss: 1.1087 - regression_loss: 0.9773 - classification_loss: 0.1315 292/500 [================>.............] - ETA: 49s - loss: 1.1075 - regression_loss: 0.9763 - classification_loss: 0.1312 293/500 [================>.............] - ETA: 48s - loss: 1.1102 - regression_loss: 0.9787 - classification_loss: 0.1314 294/500 [================>.............] - ETA: 48s - loss: 1.1097 - regression_loss: 0.9783 - classification_loss: 0.1314 295/500 [================>.............] - ETA: 48s - loss: 1.1127 - regression_loss: 0.9809 - classification_loss: 0.1317 296/500 [================>.............] - ETA: 48s - loss: 1.1131 - regression_loss: 0.9812 - classification_loss: 0.1319 297/500 [================>.............] 
- ETA: 48s - loss: 1.1127 - regression_loss: 0.9807 - classification_loss: 0.1320 298/500 [================>.............] - ETA: 47s - loss: 1.1102 - regression_loss: 0.9785 - classification_loss: 0.1317 299/500 [================>.............] - ETA: 47s - loss: 1.1086 - regression_loss: 0.9771 - classification_loss: 0.1316 300/500 [=================>............] - ETA: 47s - loss: 1.1083 - regression_loss: 0.9769 - classification_loss: 0.1314 301/500 [=================>............] - ETA: 47s - loss: 1.1083 - regression_loss: 0.9769 - classification_loss: 0.1314 302/500 [=================>............] - ETA: 46s - loss: 1.1075 - regression_loss: 0.9762 - classification_loss: 0.1313 303/500 [=================>............] - ETA: 46s - loss: 1.1086 - regression_loss: 0.9772 - classification_loss: 0.1314 304/500 [=================>............] - ETA: 46s - loss: 1.1093 - regression_loss: 0.9777 - classification_loss: 0.1316 305/500 [=================>............] - ETA: 46s - loss: 1.1086 - regression_loss: 0.9771 - classification_loss: 0.1315 306/500 [=================>............] - ETA: 45s - loss: 1.1099 - regression_loss: 0.9782 - classification_loss: 0.1317 307/500 [=================>............] - ETA: 45s - loss: 1.1086 - regression_loss: 0.9771 - classification_loss: 0.1315 308/500 [=================>............] - ETA: 45s - loss: 1.1071 - regression_loss: 0.9758 - classification_loss: 0.1313 309/500 [=================>............] - ETA: 45s - loss: 1.1054 - regression_loss: 0.9742 - classification_loss: 0.1311 310/500 [=================>............] - ETA: 44s - loss: 1.1050 - regression_loss: 0.9740 - classification_loss: 0.1310 311/500 [=================>............] - ETA: 44s - loss: 1.1085 - regression_loss: 0.9769 - classification_loss: 0.1316 312/500 [=================>............] - ETA: 44s - loss: 1.1086 - regression_loss: 0.9770 - classification_loss: 0.1315 313/500 [=================>............] 
- ETA: 44s - loss: 1.1069 - regression_loss: 0.9756 - classification_loss: 0.1313 314/500 [=================>............] - ETA: 44s - loss: 1.1075 - regression_loss: 0.9761 - classification_loss: 0.1313 315/500 [=================>............] - ETA: 43s - loss: 1.1067 - regression_loss: 0.9754 - classification_loss: 0.1313 316/500 [=================>............] - ETA: 43s - loss: 1.1062 - regression_loss: 0.9748 - classification_loss: 0.1314 317/500 [==================>...........] - ETA: 43s - loss: 1.1037 - regression_loss: 0.9726 - classification_loss: 0.1311 318/500 [==================>...........] - ETA: 43s - loss: 1.1049 - regression_loss: 0.9736 - classification_loss: 0.1312 319/500 [==================>...........] - ETA: 42s - loss: 1.1042 - regression_loss: 0.9731 - classification_loss: 0.1311 320/500 [==================>...........] - ETA: 42s - loss: 1.1043 - regression_loss: 0.9733 - classification_loss: 0.1310 321/500 [==================>...........] - ETA: 42s - loss: 1.1042 - regression_loss: 0.9733 - classification_loss: 0.1309 322/500 [==================>...........] - ETA: 42s - loss: 1.1040 - regression_loss: 0.9731 - classification_loss: 0.1309 323/500 [==================>...........] - ETA: 41s - loss: 1.1043 - regression_loss: 0.9731 - classification_loss: 0.1312 324/500 [==================>...........] - ETA: 41s - loss: 1.1034 - regression_loss: 0.9724 - classification_loss: 0.1310 325/500 [==================>...........] - ETA: 41s - loss: 1.1052 - regression_loss: 0.9741 - classification_loss: 0.1311 326/500 [==================>...........] - ETA: 41s - loss: 1.1045 - regression_loss: 0.9735 - classification_loss: 0.1310 327/500 [==================>...........] - ETA: 40s - loss: 1.1044 - regression_loss: 0.9734 - classification_loss: 0.1310 328/500 [==================>...........] - ETA: 40s - loss: 1.1044 - regression_loss: 0.9734 - classification_loss: 0.1310 329/500 [==================>...........] 
- ETA: 40s - loss: 1.1060 - regression_loss: 0.9749 - classification_loss: 0.1311 330/500 [==================>...........] - ETA: 40s - loss: 1.1050 - regression_loss: 0.9740 - classification_loss: 0.1310 331/500 [==================>...........] - ETA: 39s - loss: 1.1043 - regression_loss: 0.9735 - classification_loss: 0.1308 332/500 [==================>...........] - ETA: 39s - loss: 1.1045 - regression_loss: 0.9736 - classification_loss: 0.1309 333/500 [==================>...........] - ETA: 39s - loss: 1.1042 - regression_loss: 0.9732 - classification_loss: 0.1310 334/500 [===================>..........] - ETA: 39s - loss: 1.1027 - regression_loss: 0.9718 - classification_loss: 0.1308 335/500 [===================>..........] - ETA: 38s - loss: 1.1004 - regression_loss: 0.9699 - classification_loss: 0.1306 336/500 [===================>..........] - ETA: 38s - loss: 1.0991 - regression_loss: 0.9689 - classification_loss: 0.1303 337/500 [===================>..........] - ETA: 38s - loss: 1.0992 - regression_loss: 0.9686 - classification_loss: 0.1306 338/500 [===================>..........] - ETA: 38s - loss: 1.0985 - regression_loss: 0.9676 - classification_loss: 0.1309 339/500 [===================>..........] - ETA: 38s - loss: 1.0985 - regression_loss: 0.9678 - classification_loss: 0.1308 340/500 [===================>..........] - ETA: 37s - loss: 1.0986 - regression_loss: 0.9678 - classification_loss: 0.1308 341/500 [===================>..........] - ETA: 37s - loss: 1.0979 - regression_loss: 0.9672 - classification_loss: 0.1307 342/500 [===================>..........] - ETA: 37s - loss: 1.0985 - regression_loss: 0.9677 - classification_loss: 0.1308 343/500 [===================>..........] - ETA: 37s - loss: 1.0991 - regression_loss: 0.9684 - classification_loss: 0.1307 344/500 [===================>..........] - ETA: 36s - loss: 1.0973 - regression_loss: 0.9669 - classification_loss: 0.1304 345/500 [===================>..........] 
- ETA: 36s - loss: 1.0980 - regression_loss: 0.9676 - classification_loss: 0.1304 346/500 [===================>..........] - ETA: 36s - loss: 1.0995 - regression_loss: 0.9687 - classification_loss: 0.1308 347/500 [===================>..........] - ETA: 36s - loss: 1.1006 - regression_loss: 0.9695 - classification_loss: 0.1311 348/500 [===================>..........] - ETA: 35s - loss: 1.0995 - regression_loss: 0.9686 - classification_loss: 0.1309 349/500 [===================>..........] - ETA: 35s - loss: 1.0984 - regression_loss: 0.9677 - classification_loss: 0.1307 350/500 [====================>.........] - ETA: 35s - loss: 1.0973 - regression_loss: 0.9668 - classification_loss: 0.1305 351/500 [====================>.........] - ETA: 35s - loss: 1.0959 - regression_loss: 0.9656 - classification_loss: 0.1303 352/500 [====================>.........] - ETA: 34s - loss: 1.0959 - regression_loss: 0.9657 - classification_loss: 0.1302 353/500 [====================>.........] - ETA: 34s - loss: 1.0956 - regression_loss: 0.9654 - classification_loss: 0.1302 354/500 [====================>.........] - ETA: 34s - loss: 1.0959 - regression_loss: 0.9653 - classification_loss: 0.1305 355/500 [====================>.........] - ETA: 34s - loss: 1.0960 - regression_loss: 0.9656 - classification_loss: 0.1304 356/500 [====================>.........] - ETA: 33s - loss: 1.0964 - regression_loss: 0.9661 - classification_loss: 0.1303 357/500 [====================>.........] - ETA: 33s - loss: 1.0949 - regression_loss: 0.9647 - classification_loss: 0.1301 358/500 [====================>.........] - ETA: 33s - loss: 1.0960 - regression_loss: 0.9658 - classification_loss: 0.1302 359/500 [====================>.........] - ETA: 33s - loss: 1.0937 - regression_loss: 0.9638 - classification_loss: 0.1299 360/500 [====================>.........] - ETA: 33s - loss: 1.0949 - regression_loss: 0.9646 - classification_loss: 0.1303 361/500 [====================>.........] 
- ETA: 32s - loss: 1.0960 - regression_loss: 0.9653 - classification_loss: 0.1307 362/500 [====================>.........] - ETA: 32s - loss: 1.0949 - regression_loss: 0.9645 - classification_loss: 0.1304 363/500 [====================>.........] - ETA: 32s - loss: 1.0941 - regression_loss: 0.9638 - classification_loss: 0.1303 364/500 [====================>.........] - ETA: 32s - loss: 1.0928 - regression_loss: 0.9627 - classification_loss: 0.1301 365/500 [====================>.........] - ETA: 31s - loss: 1.0927 - regression_loss: 0.9626 - classification_loss: 0.1301 366/500 [====================>.........] - ETA: 31s - loss: 1.0912 - regression_loss: 0.9613 - classification_loss: 0.1299 367/500 [=====================>........] - ETA: 31s - loss: 1.0906 - regression_loss: 0.9608 - classification_loss: 0.1298 368/500 [=====================>........] - ETA: 31s - loss: 1.0902 - regression_loss: 0.9605 - classification_loss: 0.1297 369/500 [=====================>........] - ETA: 30s - loss: 1.0905 - regression_loss: 0.9607 - classification_loss: 0.1298 370/500 [=====================>........] - ETA: 30s - loss: 1.0921 - regression_loss: 0.9621 - classification_loss: 0.1301 371/500 [=====================>........] - ETA: 30s - loss: 1.0933 - regression_loss: 0.9630 - classification_loss: 0.1303 372/500 [=====================>........] - ETA: 30s - loss: 1.0925 - regression_loss: 0.9623 - classification_loss: 0.1302 373/500 [=====================>........] - ETA: 29s - loss: 1.0909 - regression_loss: 0.9609 - classification_loss: 0.1299 374/500 [=====================>........] - ETA: 29s - loss: 1.0916 - regression_loss: 0.9615 - classification_loss: 0.1301 375/500 [=====================>........] - ETA: 29s - loss: 1.0918 - regression_loss: 0.9615 - classification_loss: 0.1302 376/500 [=====================>........] - ETA: 29s - loss: 1.0907 - regression_loss: 0.9608 - classification_loss: 0.1300 377/500 [=====================>........] 
[per-step progress output for epoch 24, steps 378-499 omitted]
500/500 [==============================] - 118s 235ms/step - loss: 1.1030 - regression_loss: 0.9663 - classification_loss: 0.1367
326 instances of class plum with average precision: 0.8346
mAP: 0.8346
Epoch 00024: saving model to ./training/snapshots/resnet50_pascal_24.h5
Epoch 25/150
[per-step progress output for epoch 25, steps 1-212 omitted; running loss ≈ 1.13 (regression_loss ≈ 0.99, classification_loss ≈ 0.14) at step 212/500]
- ETA: 1:07 - loss: 1.1331 - regression_loss: 0.9910 - classification_loss: 0.1421 213/500 [===========>..................] - ETA: 1:07 - loss: 1.1326 - regression_loss: 0.9908 - classification_loss: 0.1418 214/500 [===========>..................] - ETA: 1:07 - loss: 1.1326 - regression_loss: 0.9910 - classification_loss: 0.1415 215/500 [===========>..................] - ETA: 1:07 - loss: 1.1301 - regression_loss: 0.9890 - classification_loss: 0.1411 216/500 [===========>..................] - ETA: 1:06 - loss: 1.1304 - regression_loss: 0.9893 - classification_loss: 0.1411 217/500 [============>.................] - ETA: 1:06 - loss: 1.1296 - regression_loss: 0.9888 - classification_loss: 0.1407 218/500 [============>.................] - ETA: 1:06 - loss: 1.1299 - regression_loss: 0.9891 - classification_loss: 0.1408 219/500 [============>.................] - ETA: 1:06 - loss: 1.1298 - regression_loss: 0.9890 - classification_loss: 0.1409 220/500 [============>.................] - ETA: 1:05 - loss: 1.1287 - regression_loss: 0.9880 - classification_loss: 0.1406 221/500 [============>.................] - ETA: 1:05 - loss: 1.1263 - regression_loss: 0.9862 - classification_loss: 0.1402 222/500 [============>.................] - ETA: 1:05 - loss: 1.1258 - regression_loss: 0.9855 - classification_loss: 0.1403 223/500 [============>.................] - ETA: 1:05 - loss: 1.1216 - regression_loss: 0.9818 - classification_loss: 0.1398 224/500 [============>.................] - ETA: 1:04 - loss: 1.1239 - regression_loss: 0.9837 - classification_loss: 0.1402 225/500 [============>.................] - ETA: 1:04 - loss: 1.1231 - regression_loss: 0.9832 - classification_loss: 0.1399 226/500 [============>.................] - ETA: 1:04 - loss: 1.1231 - regression_loss: 0.9831 - classification_loss: 0.1400 227/500 [============>.................] - ETA: 1:04 - loss: 1.1209 - regression_loss: 0.9813 - classification_loss: 0.1396 228/500 [============>.................] 
- ETA: 1:04 - loss: 1.1219 - regression_loss: 0.9824 - classification_loss: 0.1396 229/500 [============>.................] - ETA: 1:03 - loss: 1.1202 - regression_loss: 0.9809 - classification_loss: 0.1394 230/500 [============>.................] - ETA: 1:03 - loss: 1.1186 - regression_loss: 0.9796 - classification_loss: 0.1390 231/500 [============>.................] - ETA: 1:03 - loss: 1.1198 - regression_loss: 0.9809 - classification_loss: 0.1389 232/500 [============>.................] - ETA: 1:03 - loss: 1.1206 - regression_loss: 0.9814 - classification_loss: 0.1391 233/500 [============>.................] - ETA: 1:02 - loss: 1.1201 - regression_loss: 0.9810 - classification_loss: 0.1391 234/500 [=============>................] - ETA: 1:02 - loss: 1.1193 - regression_loss: 0.9805 - classification_loss: 0.1388 235/500 [=============>................] - ETA: 1:02 - loss: 1.1206 - regression_loss: 0.9818 - classification_loss: 0.1388 236/500 [=============>................] - ETA: 1:02 - loss: 1.1206 - regression_loss: 0.9818 - classification_loss: 0.1388 237/500 [=============>................] - ETA: 1:01 - loss: 1.1231 - regression_loss: 0.9839 - classification_loss: 0.1392 238/500 [=============>................] - ETA: 1:01 - loss: 1.1228 - regression_loss: 0.9838 - classification_loss: 0.1390 239/500 [=============>................] - ETA: 1:01 - loss: 1.1208 - regression_loss: 0.9821 - classification_loss: 0.1387 240/500 [=============>................] - ETA: 1:01 - loss: 1.1212 - regression_loss: 0.9826 - classification_loss: 0.1385 241/500 [=============>................] - ETA: 1:00 - loss: 1.1232 - regression_loss: 0.9838 - classification_loss: 0.1394 242/500 [=============>................] - ETA: 1:00 - loss: 1.1225 - regression_loss: 0.9832 - classification_loss: 0.1393 243/500 [=============>................] - ETA: 1:00 - loss: 1.1239 - regression_loss: 0.9845 - classification_loss: 0.1394 244/500 [=============>................] 
- ETA: 1:00 - loss: 1.1253 - regression_loss: 0.9858 - classification_loss: 0.1396 245/500 [=============>................] - ETA: 1:00 - loss: 1.1238 - regression_loss: 0.9844 - classification_loss: 0.1393 246/500 [=============>................] - ETA: 59s - loss: 1.1247 - regression_loss: 0.9851 - classification_loss: 0.1396  247/500 [=============>................] - ETA: 59s - loss: 1.1247 - regression_loss: 0.9850 - classification_loss: 0.1397 248/500 [=============>................] - ETA: 59s - loss: 1.1237 - regression_loss: 0.9841 - classification_loss: 0.1397 249/500 [=============>................] - ETA: 59s - loss: 1.1235 - regression_loss: 0.9839 - classification_loss: 0.1396 250/500 [==============>...............] - ETA: 58s - loss: 1.1243 - regression_loss: 0.9847 - classification_loss: 0.1396 251/500 [==============>...............] - ETA: 58s - loss: 1.1255 - regression_loss: 0.9854 - classification_loss: 0.1401 252/500 [==============>...............] - ETA: 58s - loss: 1.1254 - regression_loss: 0.9852 - classification_loss: 0.1402 253/500 [==============>...............] - ETA: 58s - loss: 1.1262 - regression_loss: 0.9859 - classification_loss: 0.1403 254/500 [==============>...............] - ETA: 57s - loss: 1.1273 - regression_loss: 0.9870 - classification_loss: 0.1403 255/500 [==============>...............] - ETA: 57s - loss: 1.1281 - regression_loss: 0.9876 - classification_loss: 0.1404 256/500 [==============>...............] - ETA: 57s - loss: 1.1274 - regression_loss: 0.9871 - classification_loss: 0.1404 257/500 [==============>...............] - ETA: 57s - loss: 1.1276 - regression_loss: 0.9872 - classification_loss: 0.1404 258/500 [==============>...............] - ETA: 56s - loss: 1.1277 - regression_loss: 0.9873 - classification_loss: 0.1403 259/500 [==============>...............] - ETA: 56s - loss: 1.1301 - regression_loss: 0.9898 - classification_loss: 0.1404 260/500 [==============>...............] 
- ETA: 56s - loss: 1.1328 - regression_loss: 0.9921 - classification_loss: 0.1407 261/500 [==============>...............] - ETA: 56s - loss: 1.1305 - regression_loss: 0.9901 - classification_loss: 0.1404 262/500 [==============>...............] - ETA: 56s - loss: 1.1310 - regression_loss: 0.9905 - classification_loss: 0.1405 263/500 [==============>...............] - ETA: 55s - loss: 1.1346 - regression_loss: 0.9938 - classification_loss: 0.1408 264/500 [==============>...............] - ETA: 55s - loss: 1.1337 - regression_loss: 0.9931 - classification_loss: 0.1406 265/500 [==============>...............] - ETA: 55s - loss: 1.1324 - regression_loss: 0.9921 - classification_loss: 0.1403 266/500 [==============>...............] - ETA: 55s - loss: 1.1302 - regression_loss: 0.9902 - classification_loss: 0.1400 267/500 [===============>..............] - ETA: 54s - loss: 1.1297 - regression_loss: 0.9900 - classification_loss: 0.1396 268/500 [===============>..............] - ETA: 54s - loss: 1.1306 - regression_loss: 0.9910 - classification_loss: 0.1396 269/500 [===============>..............] - ETA: 54s - loss: 1.1332 - regression_loss: 0.9931 - classification_loss: 0.1401 270/500 [===============>..............] - ETA: 54s - loss: 1.1314 - regression_loss: 0.9917 - classification_loss: 0.1397 271/500 [===============>..............] - ETA: 53s - loss: 1.1321 - regression_loss: 0.9923 - classification_loss: 0.1398 272/500 [===============>..............] - ETA: 53s - loss: 1.1316 - regression_loss: 0.9919 - classification_loss: 0.1397 273/500 [===============>..............] - ETA: 53s - loss: 1.1316 - regression_loss: 0.9921 - classification_loss: 0.1395 274/500 [===============>..............] - ETA: 53s - loss: 1.1301 - regression_loss: 0.9909 - classification_loss: 0.1393 275/500 [===============>..............] - ETA: 52s - loss: 1.1287 - regression_loss: 0.9894 - classification_loss: 0.1392 276/500 [===============>..............] 
- ETA: 52s - loss: 1.1308 - regression_loss: 0.9910 - classification_loss: 0.1398 277/500 [===============>..............] - ETA: 52s - loss: 1.1282 - regression_loss: 0.9888 - classification_loss: 0.1394 278/500 [===============>..............] - ETA: 52s - loss: 1.1261 - regression_loss: 0.9869 - classification_loss: 0.1392 279/500 [===============>..............] - ETA: 52s - loss: 1.1265 - regression_loss: 0.9873 - classification_loss: 0.1392 280/500 [===============>..............] - ETA: 51s - loss: 1.1271 - regression_loss: 0.9881 - classification_loss: 0.1390 281/500 [===============>..............] - ETA: 51s - loss: 1.1263 - regression_loss: 0.9875 - classification_loss: 0.1388 282/500 [===============>..............] - ETA: 51s - loss: 1.1262 - regression_loss: 0.9876 - classification_loss: 0.1387 283/500 [===============>..............] - ETA: 51s - loss: 1.1264 - regression_loss: 0.9878 - classification_loss: 0.1386 284/500 [================>.............] - ETA: 50s - loss: 1.1276 - regression_loss: 0.9889 - classification_loss: 0.1387 285/500 [================>.............] - ETA: 50s - loss: 1.1257 - regression_loss: 0.9874 - classification_loss: 0.1384 286/500 [================>.............] - ETA: 50s - loss: 1.1246 - regression_loss: 0.9865 - classification_loss: 0.1382 287/500 [================>.............] - ETA: 50s - loss: 1.1262 - regression_loss: 0.9877 - classification_loss: 0.1385 288/500 [================>.............] - ETA: 49s - loss: 1.1291 - regression_loss: 0.9898 - classification_loss: 0.1393 289/500 [================>.............] - ETA: 49s - loss: 1.1291 - regression_loss: 0.9899 - classification_loss: 0.1393 290/500 [================>.............] - ETA: 49s - loss: 1.1293 - regression_loss: 0.9902 - classification_loss: 0.1391 291/500 [================>.............] - ETA: 49s - loss: 1.1297 - regression_loss: 0.9907 - classification_loss: 0.1390 292/500 [================>.............] 
- ETA: 48s - loss: 1.1294 - regression_loss: 0.9905 - classification_loss: 0.1388 293/500 [================>.............] - ETA: 48s - loss: 1.1304 - regression_loss: 0.9916 - classification_loss: 0.1388 294/500 [================>.............] - ETA: 48s - loss: 1.1288 - regression_loss: 0.9903 - classification_loss: 0.1385 295/500 [================>.............] - ETA: 48s - loss: 1.1283 - regression_loss: 0.9899 - classification_loss: 0.1384 296/500 [================>.............] - ETA: 48s - loss: 1.1331 - regression_loss: 0.9939 - classification_loss: 0.1392 297/500 [================>.............] - ETA: 47s - loss: 1.1315 - regression_loss: 0.9926 - classification_loss: 0.1389 298/500 [================>.............] - ETA: 47s - loss: 1.1295 - regression_loss: 0.9909 - classification_loss: 0.1386 299/500 [================>.............] - ETA: 47s - loss: 1.1296 - regression_loss: 0.9911 - classification_loss: 0.1385 300/500 [=================>............] - ETA: 47s - loss: 1.1296 - regression_loss: 0.9911 - classification_loss: 0.1384 301/500 [=================>............] - ETA: 46s - loss: 1.1297 - regression_loss: 0.9914 - classification_loss: 0.1383 302/500 [=================>............] - ETA: 46s - loss: 1.1289 - regression_loss: 0.9909 - classification_loss: 0.1380 303/500 [=================>............] - ETA: 46s - loss: 1.1274 - regression_loss: 0.9897 - classification_loss: 0.1377 304/500 [=================>............] - ETA: 46s - loss: 1.1282 - regression_loss: 0.9904 - classification_loss: 0.1378 305/500 [=================>............] - ETA: 45s - loss: 1.1295 - regression_loss: 0.9913 - classification_loss: 0.1382 306/500 [=================>............] - ETA: 45s - loss: 1.1300 - regression_loss: 0.9917 - classification_loss: 0.1383 307/500 [=================>............] - ETA: 45s - loss: 1.1301 - regression_loss: 0.9921 - classification_loss: 0.1380 308/500 [=================>............] 
- ETA: 45s - loss: 1.1316 - regression_loss: 0.9933 - classification_loss: 0.1383 309/500 [=================>............] - ETA: 44s - loss: 1.1300 - regression_loss: 0.9919 - classification_loss: 0.1381 310/500 [=================>............] - ETA: 44s - loss: 1.1279 - regression_loss: 0.9901 - classification_loss: 0.1378 311/500 [=================>............] - ETA: 44s - loss: 1.1265 - regression_loss: 0.9889 - classification_loss: 0.1375 312/500 [=================>............] - ETA: 44s - loss: 1.1253 - regression_loss: 0.9878 - classification_loss: 0.1374 313/500 [=================>............] - ETA: 44s - loss: 1.1236 - regression_loss: 0.9865 - classification_loss: 0.1372 314/500 [=================>............] - ETA: 43s - loss: 1.1225 - regression_loss: 0.9855 - classification_loss: 0.1370 315/500 [=================>............] - ETA: 43s - loss: 1.1256 - regression_loss: 0.9877 - classification_loss: 0.1379 316/500 [=================>............] - ETA: 43s - loss: 1.1266 - regression_loss: 0.9887 - classification_loss: 0.1380 317/500 [==================>...........] - ETA: 43s - loss: 1.1244 - regression_loss: 0.9866 - classification_loss: 0.1378 318/500 [==================>...........] - ETA: 42s - loss: 1.1254 - regression_loss: 0.9876 - classification_loss: 0.1378 319/500 [==================>...........] - ETA: 42s - loss: 1.1256 - regression_loss: 0.9878 - classification_loss: 0.1378 320/500 [==================>...........] - ETA: 42s - loss: 1.1270 - regression_loss: 0.9892 - classification_loss: 0.1379 321/500 [==================>...........] - ETA: 42s - loss: 1.1259 - regression_loss: 0.9882 - classification_loss: 0.1376 322/500 [==================>...........] - ETA: 41s - loss: 1.1242 - regression_loss: 0.9868 - classification_loss: 0.1374 323/500 [==================>...........] - ETA: 41s - loss: 1.1243 - regression_loss: 0.9869 - classification_loss: 0.1374 324/500 [==================>...........] 
- ETA: 41s - loss: 1.1229 - regression_loss: 0.9859 - classification_loss: 0.1370 325/500 [==================>...........] - ETA: 41s - loss: 1.1228 - regression_loss: 0.9859 - classification_loss: 0.1369 326/500 [==================>...........] - ETA: 40s - loss: 1.1211 - regression_loss: 0.9845 - classification_loss: 0.1366 327/500 [==================>...........] - ETA: 40s - loss: 1.1208 - regression_loss: 0.9842 - classification_loss: 0.1365 328/500 [==================>...........] - ETA: 40s - loss: 1.1215 - regression_loss: 0.9849 - classification_loss: 0.1366 329/500 [==================>...........] - ETA: 40s - loss: 1.1193 - regression_loss: 0.9830 - classification_loss: 0.1363 330/500 [==================>...........] - ETA: 40s - loss: 1.1189 - regression_loss: 0.9827 - classification_loss: 0.1362 331/500 [==================>...........] - ETA: 39s - loss: 1.1209 - regression_loss: 0.9845 - classification_loss: 0.1364 332/500 [==================>...........] - ETA: 39s - loss: 1.1209 - regression_loss: 0.9845 - classification_loss: 0.1364 333/500 [==================>...........] - ETA: 39s - loss: 1.1215 - regression_loss: 0.9851 - classification_loss: 0.1364 334/500 [===================>..........] - ETA: 39s - loss: 1.1201 - regression_loss: 0.9838 - classification_loss: 0.1363 335/500 [===================>..........] - ETA: 38s - loss: 1.1199 - regression_loss: 0.9838 - classification_loss: 0.1361 336/500 [===================>..........] - ETA: 38s - loss: 1.1205 - regression_loss: 0.9844 - classification_loss: 0.1361 337/500 [===================>..........] - ETA: 38s - loss: 1.1221 - regression_loss: 0.9857 - classification_loss: 0.1364 338/500 [===================>..........] - ETA: 38s - loss: 1.1198 - regression_loss: 0.9837 - classification_loss: 0.1361 339/500 [===================>..........] - ETA: 37s - loss: 1.1204 - regression_loss: 0.9840 - classification_loss: 0.1364 340/500 [===================>..........] 
- ETA: 37s - loss: 1.1186 - regression_loss: 0.9825 - classification_loss: 0.1362 341/500 [===================>..........] - ETA: 37s - loss: 1.1166 - regression_loss: 0.9806 - classification_loss: 0.1359 342/500 [===================>..........] - ETA: 37s - loss: 1.1157 - regression_loss: 0.9798 - classification_loss: 0.1358 343/500 [===================>..........] - ETA: 36s - loss: 1.1163 - regression_loss: 0.9803 - classification_loss: 0.1361 344/500 [===================>..........] - ETA: 36s - loss: 1.1175 - regression_loss: 0.9812 - classification_loss: 0.1362 345/500 [===================>..........] - ETA: 36s - loss: 1.1165 - regression_loss: 0.9802 - classification_loss: 0.1362 346/500 [===================>..........] - ETA: 36s - loss: 1.1161 - regression_loss: 0.9798 - classification_loss: 0.1363 347/500 [===================>..........] - ETA: 36s - loss: 1.1145 - regression_loss: 0.9785 - classification_loss: 0.1360 348/500 [===================>..........] - ETA: 35s - loss: 1.1140 - regression_loss: 0.9781 - classification_loss: 0.1359 349/500 [===================>..........] - ETA: 35s - loss: 1.1126 - regression_loss: 0.9769 - classification_loss: 0.1356 350/500 [====================>.........] - ETA: 35s - loss: 1.1115 - regression_loss: 0.9761 - classification_loss: 0.1354 351/500 [====================>.........] - ETA: 35s - loss: 1.1118 - regression_loss: 0.9765 - classification_loss: 0.1352 352/500 [====================>.........] - ETA: 34s - loss: 1.1123 - regression_loss: 0.9770 - classification_loss: 0.1352 353/500 [====================>.........] - ETA: 34s - loss: 1.1132 - regression_loss: 0.9778 - classification_loss: 0.1354 354/500 [====================>.........] - ETA: 34s - loss: 1.1130 - regression_loss: 0.9777 - classification_loss: 0.1353 355/500 [====================>.........] - ETA: 34s - loss: 1.1134 - regression_loss: 0.9779 - classification_loss: 0.1354 356/500 [====================>.........] 
- ETA: 33s - loss: 1.1118 - regression_loss: 0.9766 - classification_loss: 0.1351 357/500 [====================>.........] - ETA: 33s - loss: 1.1113 - regression_loss: 0.9763 - classification_loss: 0.1350 358/500 [====================>.........] - ETA: 33s - loss: 1.1130 - regression_loss: 0.9774 - classification_loss: 0.1356 359/500 [====================>.........] - ETA: 33s - loss: 1.1116 - regression_loss: 0.9762 - classification_loss: 0.1354 360/500 [====================>.........] - ETA: 32s - loss: 1.1092 - regression_loss: 0.9742 - classification_loss: 0.1350 361/500 [====================>.........] - ETA: 32s - loss: 1.1082 - regression_loss: 0.9733 - classification_loss: 0.1349 362/500 [====================>.........] - ETA: 32s - loss: 1.1073 - regression_loss: 0.9727 - classification_loss: 0.1346 363/500 [====================>.........] - ETA: 32s - loss: 1.1067 - regression_loss: 0.9720 - classification_loss: 0.1347 364/500 [====================>.........] - ETA: 32s - loss: 1.1075 - regression_loss: 0.9727 - classification_loss: 0.1348 365/500 [====================>.........] - ETA: 31s - loss: 1.1069 - regression_loss: 0.9721 - classification_loss: 0.1347 366/500 [====================>.........] - ETA: 31s - loss: 1.1063 - regression_loss: 0.9717 - classification_loss: 0.1346 367/500 [=====================>........] - ETA: 31s - loss: 1.1068 - regression_loss: 0.9718 - classification_loss: 0.1350 368/500 [=====================>........] - ETA: 31s - loss: 1.1082 - regression_loss: 0.9730 - classification_loss: 0.1353 369/500 [=====================>........] - ETA: 30s - loss: 1.1075 - regression_loss: 0.9724 - classification_loss: 0.1351 370/500 [=====================>........] - ETA: 30s - loss: 1.1068 - regression_loss: 0.9719 - classification_loss: 0.1350 371/500 [=====================>........] - ETA: 30s - loss: 1.1065 - regression_loss: 0.9716 - classification_loss: 0.1349 372/500 [=====================>........] 
- ETA: 30s - loss: 1.1061 - regression_loss: 0.9713 - classification_loss: 0.1348 373/500 [=====================>........] - ETA: 29s - loss: 1.1082 - regression_loss: 0.9727 - classification_loss: 0.1355 374/500 [=====================>........] - ETA: 29s - loss: 1.1076 - regression_loss: 0.9722 - classification_loss: 0.1354 375/500 [=====================>........] - ETA: 29s - loss: 1.1077 - regression_loss: 0.9723 - classification_loss: 0.1354 376/500 [=====================>........] - ETA: 29s - loss: 1.1070 - regression_loss: 0.9717 - classification_loss: 0.1353 377/500 [=====================>........] - ETA: 28s - loss: 1.1072 - regression_loss: 0.9720 - classification_loss: 0.1352 378/500 [=====================>........] - ETA: 28s - loss: 1.1079 - regression_loss: 0.9725 - classification_loss: 0.1354 379/500 [=====================>........] - ETA: 28s - loss: 1.1073 - regression_loss: 0.9720 - classification_loss: 0.1354 380/500 [=====================>........] - ETA: 28s - loss: 1.1056 - regression_loss: 0.9704 - classification_loss: 0.1351 381/500 [=====================>........] - ETA: 28s - loss: 1.1060 - regression_loss: 0.9708 - classification_loss: 0.1352 382/500 [=====================>........] - ETA: 27s - loss: 1.1058 - regression_loss: 0.9707 - classification_loss: 0.1352 383/500 [=====================>........] - ETA: 27s - loss: 1.1049 - regression_loss: 0.9699 - classification_loss: 0.1350 384/500 [======================>.......] - ETA: 27s - loss: 1.1072 - regression_loss: 0.9717 - classification_loss: 0.1355 385/500 [======================>.......] - ETA: 27s - loss: 1.1076 - regression_loss: 0.9720 - classification_loss: 0.1356 386/500 [======================>.......] - ETA: 26s - loss: 1.1065 - regression_loss: 0.9710 - classification_loss: 0.1355 387/500 [======================>.......] - ETA: 26s - loss: 1.1056 - regression_loss: 0.9702 - classification_loss: 0.1353 388/500 [======================>.......] 
- ETA: 26s - loss: 1.1059 - regression_loss: 0.9705 - classification_loss: 0.1354 389/500 [======================>.......] - ETA: 26s - loss: 1.1069 - regression_loss: 0.9714 - classification_loss: 0.1355 390/500 [======================>.......] - ETA: 25s - loss: 1.1049 - regression_loss: 0.9696 - classification_loss: 0.1353 391/500 [======================>.......] - ETA: 25s - loss: 1.1048 - regression_loss: 0.9696 - classification_loss: 0.1353 392/500 [======================>.......] - ETA: 25s - loss: 1.1033 - regression_loss: 0.9682 - classification_loss: 0.1351 393/500 [======================>.......] - ETA: 25s - loss: 1.1014 - regression_loss: 0.9665 - classification_loss: 0.1349 394/500 [======================>.......] - ETA: 24s - loss: 1.1021 - regression_loss: 0.9674 - classification_loss: 0.1347 395/500 [======================>.......] - ETA: 24s - loss: 1.1019 - regression_loss: 0.9672 - classification_loss: 0.1346 396/500 [======================>.......] - ETA: 24s - loss: 1.1015 - regression_loss: 0.9670 - classification_loss: 0.1345 397/500 [======================>.......] - ETA: 24s - loss: 1.1019 - regression_loss: 0.9674 - classification_loss: 0.1345 398/500 [======================>.......] - ETA: 24s - loss: 1.1018 - regression_loss: 0.9674 - classification_loss: 0.1344 399/500 [======================>.......] - ETA: 23s - loss: 1.1020 - regression_loss: 0.9676 - classification_loss: 0.1345 400/500 [=======================>......] - ETA: 23s - loss: 1.1008 - regression_loss: 0.9666 - classification_loss: 0.1342 401/500 [=======================>......] - ETA: 23s - loss: 1.0995 - regression_loss: 0.9655 - classification_loss: 0.1340 402/500 [=======================>......] - ETA: 23s - loss: 1.0975 - regression_loss: 0.9638 - classification_loss: 0.1337 403/500 [=======================>......] - ETA: 22s - loss: 1.0975 - regression_loss: 0.9638 - classification_loss: 0.1337 404/500 [=======================>......] 
- ETA: 22s - loss: 1.0968 - regression_loss: 0.9634 - classification_loss: 0.1335 405/500 [=======================>......] - ETA: 22s - loss: 1.0973 - regression_loss: 0.9638 - classification_loss: 0.1334 406/500 [=======================>......] - ETA: 22s - loss: 1.0970 - regression_loss: 0.9633 - classification_loss: 0.1337 407/500 [=======================>......] - ETA: 21s - loss: 1.0968 - regression_loss: 0.9631 - classification_loss: 0.1337 408/500 [=======================>......] - ETA: 21s - loss: 1.0956 - regression_loss: 0.9621 - classification_loss: 0.1335 409/500 [=======================>......] - ETA: 21s - loss: 1.0953 - regression_loss: 0.9618 - classification_loss: 0.1335 410/500 [=======================>......] - ETA: 21s - loss: 1.0959 - regression_loss: 0.9624 - classification_loss: 0.1334 411/500 [=======================>......] - ETA: 20s - loss: 1.0970 - regression_loss: 0.9635 - classification_loss: 0.1335 412/500 [=======================>......] - ETA: 20s - loss: 1.0967 - regression_loss: 0.9633 - classification_loss: 0.1334 413/500 [=======================>......] - ETA: 20s - loss: 1.0972 - regression_loss: 0.9638 - classification_loss: 0.1334 414/500 [=======================>......] - ETA: 20s - loss: 1.0968 - regression_loss: 0.9635 - classification_loss: 0.1334 415/500 [=======================>......] - ETA: 20s - loss: 1.0972 - regression_loss: 0.9638 - classification_loss: 0.1334 416/500 [=======================>......] - ETA: 19s - loss: 1.0970 - regression_loss: 0.9637 - classification_loss: 0.1333 417/500 [========================>.....] - ETA: 19s - loss: 1.0962 - regression_loss: 0.9630 - classification_loss: 0.1332 418/500 [========================>.....] - ETA: 19s - loss: 1.0950 - regression_loss: 0.9620 - classification_loss: 0.1330 419/500 [========================>.....] - ETA: 19s - loss: 1.0933 - regression_loss: 0.9606 - classification_loss: 0.1327 420/500 [========================>.....] 
[Epoch 25: per-batch progress output (steps 421–499) omitted]
500/500 [==============================] - 118s 235ms/step - loss: 1.0940 - regression_loss: 0.9617 - classification_loss: 0.1323
326 instances of class plum with average precision: 0.8464
mAP: 0.8464
Epoch 00025: saving model to ./training/snapshots/resnet50_pascal_25.h5
Epoch 26/150
[Epoch 26: per-batch progress output (steps 1–13) omitted]
14/500 [..............................]
- ETA: 1:53 - loss: 0.9782 - regression_loss: 0.8683 - classification_loss: 0.1099
[Epoch 26: per-batch progress output (steps 15–253) omitted]
254/500 [==============>...............]
- ETA: 57s - loss: 1.1224 - regression_loss: 0.9900 - classification_loss: 0.1324 255/500 [==============>...............] - ETA: 57s - loss: 1.1218 - regression_loss: 0.9893 - classification_loss: 0.1325 256/500 [==============>...............] - ETA: 57s - loss: 1.1185 - regression_loss: 0.9865 - classification_loss: 0.1320 257/500 [==============>...............] - ETA: 57s - loss: 1.1185 - regression_loss: 0.9864 - classification_loss: 0.1320 258/500 [==============>...............] - ETA: 56s - loss: 1.1274 - regression_loss: 0.9948 - classification_loss: 0.1326 259/500 [==============>...............] - ETA: 56s - loss: 1.1292 - regression_loss: 0.9963 - classification_loss: 0.1328 260/500 [==============>...............] - ETA: 56s - loss: 1.1298 - regression_loss: 0.9966 - classification_loss: 0.1332 261/500 [==============>...............] - ETA: 56s - loss: 1.1283 - regression_loss: 0.9954 - classification_loss: 0.1329 262/500 [==============>...............] - ETA: 56s - loss: 1.1289 - regression_loss: 0.9959 - classification_loss: 0.1330 263/500 [==============>...............] - ETA: 55s - loss: 1.1292 - regression_loss: 0.9962 - classification_loss: 0.1330 264/500 [==============>...............] - ETA: 55s - loss: 1.1297 - regression_loss: 0.9967 - classification_loss: 0.1330 265/500 [==============>...............] - ETA: 55s - loss: 1.1279 - regression_loss: 0.9951 - classification_loss: 0.1327 266/500 [==============>...............] - ETA: 55s - loss: 1.1254 - regression_loss: 0.9928 - classification_loss: 0.1326 267/500 [===============>..............] - ETA: 54s - loss: 1.1219 - regression_loss: 0.9897 - classification_loss: 0.1322 268/500 [===============>..............] - ETA: 54s - loss: 1.1227 - regression_loss: 0.9902 - classification_loss: 0.1325 269/500 [===============>..............] - ETA: 54s - loss: 1.1219 - regression_loss: 0.9894 - classification_loss: 0.1325 270/500 [===============>..............] 
- ETA: 54s - loss: 1.1197 - regression_loss: 0.9875 - classification_loss: 0.1321 271/500 [===============>..............] - ETA: 53s - loss: 1.1199 - regression_loss: 0.9878 - classification_loss: 0.1321 272/500 [===============>..............] - ETA: 53s - loss: 1.1187 - regression_loss: 0.9869 - classification_loss: 0.1318 273/500 [===============>..............] - ETA: 53s - loss: 1.1179 - regression_loss: 0.9863 - classification_loss: 0.1317 274/500 [===============>..............] - ETA: 53s - loss: 1.1185 - regression_loss: 0.9868 - classification_loss: 0.1317 275/500 [===============>..............] - ETA: 53s - loss: 1.1158 - regression_loss: 0.9845 - classification_loss: 0.1313 276/500 [===============>..............] - ETA: 52s - loss: 1.1169 - regression_loss: 0.9856 - classification_loss: 0.1313 277/500 [===============>..............] - ETA: 52s - loss: 1.1185 - regression_loss: 0.9872 - classification_loss: 0.1313 278/500 [===============>..............] - ETA: 52s - loss: 1.1177 - regression_loss: 0.9867 - classification_loss: 0.1310 279/500 [===============>..............] - ETA: 52s - loss: 1.1165 - regression_loss: 0.9858 - classification_loss: 0.1307 280/500 [===============>..............] - ETA: 51s - loss: 1.1166 - regression_loss: 0.9860 - classification_loss: 0.1306 281/500 [===============>..............] - ETA: 51s - loss: 1.1159 - regression_loss: 0.9856 - classification_loss: 0.1303 282/500 [===============>..............] - ETA: 51s - loss: 1.1155 - regression_loss: 0.9853 - classification_loss: 0.1302 283/500 [===============>..............] - ETA: 51s - loss: 1.1143 - regression_loss: 0.9844 - classification_loss: 0.1300 284/500 [================>.............] - ETA: 50s - loss: 1.1139 - regression_loss: 0.9841 - classification_loss: 0.1299 285/500 [================>.............] - ETA: 50s - loss: 1.1147 - regression_loss: 0.9845 - classification_loss: 0.1302 286/500 [================>.............] 
- ETA: 50s - loss: 1.1150 - regression_loss: 0.9847 - classification_loss: 0.1304 287/500 [================>.............] - ETA: 50s - loss: 1.1126 - regression_loss: 0.9812 - classification_loss: 0.1313 288/500 [================>.............] - ETA: 49s - loss: 1.1136 - regression_loss: 0.9821 - classification_loss: 0.1314 289/500 [================>.............] - ETA: 49s - loss: 1.1151 - regression_loss: 0.9832 - classification_loss: 0.1319 290/500 [================>.............] - ETA: 49s - loss: 1.1136 - regression_loss: 0.9818 - classification_loss: 0.1317 291/500 [================>.............] - ETA: 49s - loss: 1.1140 - regression_loss: 0.9823 - classification_loss: 0.1317 292/500 [================>.............] - ETA: 49s - loss: 1.1129 - regression_loss: 0.9814 - classification_loss: 0.1315 293/500 [================>.............] - ETA: 48s - loss: 1.1120 - regression_loss: 0.9807 - classification_loss: 0.1313 294/500 [================>.............] - ETA: 48s - loss: 1.1102 - regression_loss: 0.9792 - classification_loss: 0.1310 295/500 [================>.............] - ETA: 48s - loss: 1.1116 - regression_loss: 0.9805 - classification_loss: 0.1311 296/500 [================>.............] - ETA: 48s - loss: 1.1096 - regression_loss: 0.9788 - classification_loss: 0.1309 297/500 [================>.............] - ETA: 47s - loss: 1.1095 - regression_loss: 0.9787 - classification_loss: 0.1308 298/500 [================>.............] - ETA: 47s - loss: 1.1098 - regression_loss: 0.9790 - classification_loss: 0.1308 299/500 [================>.............] - ETA: 47s - loss: 1.1104 - regression_loss: 0.9794 - classification_loss: 0.1311 300/500 [=================>............] - ETA: 47s - loss: 1.1098 - regression_loss: 0.9789 - classification_loss: 0.1309 301/500 [=================>............] - ETA: 46s - loss: 1.1095 - regression_loss: 0.9786 - classification_loss: 0.1308 302/500 [=================>............] 
- ETA: 46s - loss: 1.1085 - regression_loss: 0.9779 - classification_loss: 0.1306 303/500 [=================>............] - ETA: 46s - loss: 1.1074 - regression_loss: 0.9770 - classification_loss: 0.1304 304/500 [=================>............] - ETA: 46s - loss: 1.1069 - regression_loss: 0.9767 - classification_loss: 0.1302 305/500 [=================>............] - ETA: 45s - loss: 1.1070 - regression_loss: 0.9768 - classification_loss: 0.1303 306/500 [=================>............] - ETA: 45s - loss: 1.1051 - regression_loss: 0.9751 - classification_loss: 0.1300 307/500 [=================>............] - ETA: 45s - loss: 1.1063 - regression_loss: 0.9761 - classification_loss: 0.1302 308/500 [=================>............] - ETA: 45s - loss: 1.1065 - regression_loss: 0.9762 - classification_loss: 0.1304 309/500 [=================>............] - ETA: 45s - loss: 1.1065 - regression_loss: 0.9764 - classification_loss: 0.1301 310/500 [=================>............] - ETA: 44s - loss: 1.1050 - regression_loss: 0.9751 - classification_loss: 0.1299 311/500 [=================>............] - ETA: 44s - loss: 1.1056 - regression_loss: 0.9758 - classification_loss: 0.1298 312/500 [=================>............] - ETA: 44s - loss: 1.1050 - regression_loss: 0.9753 - classification_loss: 0.1297 313/500 [=================>............] - ETA: 44s - loss: 1.1029 - regression_loss: 0.9736 - classification_loss: 0.1294 314/500 [=================>............] - ETA: 43s - loss: 1.1022 - regression_loss: 0.9728 - classification_loss: 0.1294 315/500 [=================>............] - ETA: 43s - loss: 1.1034 - regression_loss: 0.9740 - classification_loss: 0.1295 316/500 [=================>............] - ETA: 43s - loss: 1.1017 - regression_loss: 0.9724 - classification_loss: 0.1292 317/500 [==================>...........] - ETA: 43s - loss: 1.1029 - regression_loss: 0.9734 - classification_loss: 0.1295 318/500 [==================>...........] 
- ETA: 42s - loss: 1.1037 - regression_loss: 0.9741 - classification_loss: 0.1296 319/500 [==================>...........] - ETA: 42s - loss: 1.1046 - regression_loss: 0.9748 - classification_loss: 0.1297 320/500 [==================>...........] - ETA: 42s - loss: 1.1039 - regression_loss: 0.9742 - classification_loss: 0.1297 321/500 [==================>...........] - ETA: 42s - loss: 1.1035 - regression_loss: 0.9739 - classification_loss: 0.1296 322/500 [==================>...........] - ETA: 41s - loss: 1.1017 - regression_loss: 0.9722 - classification_loss: 0.1294 323/500 [==================>...........] - ETA: 41s - loss: 1.1021 - regression_loss: 0.9727 - classification_loss: 0.1294 324/500 [==================>...........] - ETA: 41s - loss: 1.0994 - regression_loss: 0.9703 - classification_loss: 0.1291 325/500 [==================>...........] - ETA: 41s - loss: 1.0961 - regression_loss: 0.9673 - classification_loss: 0.1287 326/500 [==================>...........] - ETA: 40s - loss: 1.0970 - regression_loss: 0.9679 - classification_loss: 0.1291 327/500 [==================>...........] - ETA: 40s - loss: 1.0964 - regression_loss: 0.9675 - classification_loss: 0.1289 328/500 [==================>...........] - ETA: 40s - loss: 1.0950 - regression_loss: 0.9662 - classification_loss: 0.1287 329/500 [==================>...........] - ETA: 40s - loss: 1.0928 - regression_loss: 0.9643 - classification_loss: 0.1284 330/500 [==================>...........] - ETA: 40s - loss: 1.0970 - regression_loss: 0.9676 - classification_loss: 0.1294 331/500 [==================>...........] - ETA: 39s - loss: 1.0948 - regression_loss: 0.9656 - classification_loss: 0.1292 332/500 [==================>...........] - ETA: 39s - loss: 1.0940 - regression_loss: 0.9651 - classification_loss: 0.1289 333/500 [==================>...........] - ETA: 39s - loss: 1.0945 - regression_loss: 0.9657 - classification_loss: 0.1288 334/500 [===================>..........] 
- ETA: 39s - loss: 1.0941 - regression_loss: 0.9655 - classification_loss: 0.1286 335/500 [===================>..........] - ETA: 38s - loss: 1.0938 - regression_loss: 0.9652 - classification_loss: 0.1286 336/500 [===================>..........] - ETA: 38s - loss: 1.0937 - regression_loss: 0.9652 - classification_loss: 0.1285 337/500 [===================>..........] - ETA: 38s - loss: 1.0940 - regression_loss: 0.9656 - classification_loss: 0.1284 338/500 [===================>..........] - ETA: 38s - loss: 1.0926 - regression_loss: 0.9644 - classification_loss: 0.1282 339/500 [===================>..........] - ETA: 37s - loss: 1.0935 - regression_loss: 0.9650 - classification_loss: 0.1284 340/500 [===================>..........] - ETA: 37s - loss: 1.0927 - regression_loss: 0.9644 - classification_loss: 0.1283 341/500 [===================>..........] - ETA: 37s - loss: 1.0926 - regression_loss: 0.9643 - classification_loss: 0.1282 342/500 [===================>..........] - ETA: 37s - loss: 1.0935 - regression_loss: 0.9653 - classification_loss: 0.1282 343/500 [===================>..........] - ETA: 36s - loss: 1.0946 - regression_loss: 0.9662 - classification_loss: 0.1284 344/500 [===================>..........] - ETA: 36s - loss: 1.0946 - regression_loss: 0.9663 - classification_loss: 0.1283 345/500 [===================>..........] - ETA: 36s - loss: 1.0933 - regression_loss: 0.9652 - classification_loss: 0.1281 346/500 [===================>..........] - ETA: 36s - loss: 1.0927 - regression_loss: 0.9649 - classification_loss: 0.1279 347/500 [===================>..........] - ETA: 36s - loss: 1.0929 - regression_loss: 0.9649 - classification_loss: 0.1280 348/500 [===================>..........] - ETA: 35s - loss: 1.0912 - regression_loss: 0.9634 - classification_loss: 0.1278 349/500 [===================>..........] - ETA: 35s - loss: 1.0895 - regression_loss: 0.9620 - classification_loss: 0.1275 350/500 [====================>.........] 
- ETA: 35s - loss: 1.0895 - regression_loss: 0.9620 - classification_loss: 0.1275 351/500 [====================>.........] - ETA: 35s - loss: 1.0911 - regression_loss: 0.9633 - classification_loss: 0.1278 352/500 [====================>.........] - ETA: 34s - loss: 1.0903 - regression_loss: 0.9626 - classification_loss: 0.1276 353/500 [====================>.........] - ETA: 34s - loss: 1.0907 - regression_loss: 0.9631 - classification_loss: 0.1276 354/500 [====================>.........] - ETA: 34s - loss: 1.0917 - regression_loss: 0.9639 - classification_loss: 0.1278 355/500 [====================>.........] - ETA: 34s - loss: 1.0917 - regression_loss: 0.9639 - classification_loss: 0.1278 356/500 [====================>.........] - ETA: 33s - loss: 1.0899 - regression_loss: 0.9624 - classification_loss: 0.1275 357/500 [====================>.........] - ETA: 33s - loss: 1.0891 - regression_loss: 0.9616 - classification_loss: 0.1275 358/500 [====================>.........] - ETA: 33s - loss: 1.0882 - regression_loss: 0.9610 - classification_loss: 0.1272 359/500 [====================>.........] - ETA: 33s - loss: 1.0885 - regression_loss: 0.9613 - classification_loss: 0.1272 360/500 [====================>.........] - ETA: 32s - loss: 1.0871 - regression_loss: 0.9601 - classification_loss: 0.1270 361/500 [====================>.........] - ETA: 32s - loss: 1.0863 - regression_loss: 0.9594 - classification_loss: 0.1269 362/500 [====================>.........] - ETA: 32s - loss: 1.0870 - regression_loss: 0.9600 - classification_loss: 0.1270 363/500 [====================>.........] - ETA: 32s - loss: 1.0876 - regression_loss: 0.9606 - classification_loss: 0.1270 364/500 [====================>.........] - ETA: 31s - loss: 1.0868 - regression_loss: 0.9599 - classification_loss: 0.1269 365/500 [====================>.........] - ETA: 31s - loss: 1.0857 - regression_loss: 0.9590 - classification_loss: 0.1267 366/500 [====================>.........] 
- ETA: 31s - loss: 1.0844 - regression_loss: 0.9580 - classification_loss: 0.1265 367/500 [=====================>........] - ETA: 31s - loss: 1.0830 - regression_loss: 0.9568 - classification_loss: 0.1262 368/500 [=====================>........] - ETA: 31s - loss: 1.0820 - regression_loss: 0.9557 - classification_loss: 0.1262 369/500 [=====================>........] - ETA: 30s - loss: 1.0815 - regression_loss: 0.9554 - classification_loss: 0.1261 370/500 [=====================>........] - ETA: 30s - loss: 1.0818 - regression_loss: 0.9558 - classification_loss: 0.1260 371/500 [=====================>........] - ETA: 30s - loss: 1.0806 - regression_loss: 0.9547 - classification_loss: 0.1259 372/500 [=====================>........] - ETA: 30s - loss: 1.0808 - regression_loss: 0.9550 - classification_loss: 0.1258 373/500 [=====================>........] - ETA: 29s - loss: 1.0825 - regression_loss: 0.9564 - classification_loss: 0.1261 374/500 [=====================>........] - ETA: 29s - loss: 1.0824 - regression_loss: 0.9565 - classification_loss: 0.1259 375/500 [=====================>........] - ETA: 29s - loss: 1.0812 - regression_loss: 0.9556 - classification_loss: 0.1256 376/500 [=====================>........] - ETA: 29s - loss: 1.0799 - regression_loss: 0.9543 - classification_loss: 0.1256 377/500 [=====================>........] - ETA: 28s - loss: 1.0790 - regression_loss: 0.9536 - classification_loss: 0.1254 378/500 [=====================>........] - ETA: 28s - loss: 1.0800 - regression_loss: 0.9546 - classification_loss: 0.1254 379/500 [=====================>........] - ETA: 28s - loss: 1.0791 - regression_loss: 0.9537 - classification_loss: 0.1254 380/500 [=====================>........] - ETA: 28s - loss: 1.0802 - regression_loss: 0.9547 - classification_loss: 0.1255 381/500 [=====================>........] - ETA: 27s - loss: 1.0807 - regression_loss: 0.9551 - classification_loss: 0.1255 382/500 [=====================>........] 
- ETA: 27s - loss: 1.0794 - regression_loss: 0.9541 - classification_loss: 0.1253 383/500 [=====================>........] - ETA: 27s - loss: 1.0808 - regression_loss: 0.9550 - classification_loss: 0.1257 384/500 [======================>.......] - ETA: 27s - loss: 1.0796 - regression_loss: 0.9540 - classification_loss: 0.1256 385/500 [======================>.......] - ETA: 27s - loss: 1.0790 - regression_loss: 0.9535 - classification_loss: 0.1255 386/500 [======================>.......] - ETA: 26s - loss: 1.0799 - regression_loss: 0.9544 - classification_loss: 0.1255 387/500 [======================>.......] - ETA: 26s - loss: 1.0795 - regression_loss: 0.9541 - classification_loss: 0.1254 388/500 [======================>.......] - ETA: 26s - loss: 1.0796 - regression_loss: 0.9542 - classification_loss: 0.1254 389/500 [======================>.......] - ETA: 26s - loss: 1.0789 - regression_loss: 0.9537 - classification_loss: 0.1252 390/500 [======================>.......] - ETA: 25s - loss: 1.0788 - regression_loss: 0.9536 - classification_loss: 0.1252 391/500 [======================>.......] - ETA: 25s - loss: 1.0776 - regression_loss: 0.9526 - classification_loss: 0.1250 392/500 [======================>.......] - ETA: 25s - loss: 1.0760 - regression_loss: 0.9512 - classification_loss: 0.1248 393/500 [======================>.......] - ETA: 25s - loss: 1.0764 - regression_loss: 0.9516 - classification_loss: 0.1248 394/500 [======================>.......] - ETA: 24s - loss: 1.0750 - regression_loss: 0.9504 - classification_loss: 0.1245 395/500 [======================>.......] - ETA: 24s - loss: 1.0749 - regression_loss: 0.9504 - classification_loss: 0.1245 396/500 [======================>.......] - ETA: 24s - loss: 1.0748 - regression_loss: 0.9504 - classification_loss: 0.1244 397/500 [======================>.......] - ETA: 24s - loss: 1.0751 - regression_loss: 0.9506 - classification_loss: 0.1245 398/500 [======================>.......] 
- ETA: 23s - loss: 1.0744 - regression_loss: 0.9499 - classification_loss: 0.1244 399/500 [======================>.......] - ETA: 23s - loss: 1.0730 - regression_loss: 0.9487 - classification_loss: 0.1242 400/500 [=======================>......] - ETA: 23s - loss: 1.0723 - regression_loss: 0.9481 - classification_loss: 0.1242 401/500 [=======================>......] - ETA: 23s - loss: 1.0709 - regression_loss: 0.9470 - classification_loss: 0.1239 402/500 [=======================>......] - ETA: 23s - loss: 1.0698 - regression_loss: 0.9460 - classification_loss: 0.1238 403/500 [=======================>......] - ETA: 22s - loss: 1.0697 - regression_loss: 0.9460 - classification_loss: 0.1237 404/500 [=======================>......] - ETA: 22s - loss: 1.0714 - regression_loss: 0.9477 - classification_loss: 0.1237 405/500 [=======================>......] - ETA: 22s - loss: 1.0741 - regression_loss: 0.9493 - classification_loss: 0.1247 406/500 [=======================>......] - ETA: 22s - loss: 1.0728 - regression_loss: 0.9482 - classification_loss: 0.1246 407/500 [=======================>......] - ETA: 21s - loss: 1.0727 - regression_loss: 0.9482 - classification_loss: 0.1245 408/500 [=======================>......] - ETA: 21s - loss: 1.0720 - regression_loss: 0.9477 - classification_loss: 0.1244 409/500 [=======================>......] - ETA: 21s - loss: 1.0728 - regression_loss: 0.9482 - classification_loss: 0.1245 410/500 [=======================>......] - ETA: 21s - loss: 1.0747 - regression_loss: 0.9497 - classification_loss: 0.1249 411/500 [=======================>......] - ETA: 20s - loss: 1.0745 - regression_loss: 0.9498 - classification_loss: 0.1247 412/500 [=======================>......] - ETA: 20s - loss: 1.0754 - regression_loss: 0.9505 - classification_loss: 0.1249 413/500 [=======================>......] - ETA: 20s - loss: 1.0746 - regression_loss: 0.9499 - classification_loss: 0.1247 414/500 [=======================>......] 
- ETA: 20s - loss: 1.0750 - regression_loss: 0.9503 - classification_loss: 0.1247 415/500 [=======================>......] - ETA: 19s - loss: 1.0756 - regression_loss: 0.9508 - classification_loss: 0.1248 416/500 [=======================>......] - ETA: 19s - loss: 1.0755 - regression_loss: 0.9509 - classification_loss: 0.1247 417/500 [========================>.....] - ETA: 19s - loss: 1.0763 - regression_loss: 0.9516 - classification_loss: 0.1247 418/500 [========================>.....] - ETA: 19s - loss: 1.0757 - regression_loss: 0.9511 - classification_loss: 0.1246 419/500 [========================>.....] - ETA: 19s - loss: 1.0763 - regression_loss: 0.9517 - classification_loss: 0.1246 420/500 [========================>.....] - ETA: 18s - loss: 1.0775 - regression_loss: 0.9527 - classification_loss: 0.1248 421/500 [========================>.....] - ETA: 18s - loss: 1.0770 - regression_loss: 0.9524 - classification_loss: 0.1247 422/500 [========================>.....] - ETA: 18s - loss: 1.0766 - regression_loss: 0.9520 - classification_loss: 0.1245 423/500 [========================>.....] - ETA: 18s - loss: 1.0767 - regression_loss: 0.9523 - classification_loss: 0.1245 424/500 [========================>.....] - ETA: 17s - loss: 1.0760 - regression_loss: 0.9517 - classification_loss: 0.1243 425/500 [========================>.....] - ETA: 17s - loss: 1.0756 - regression_loss: 0.9514 - classification_loss: 0.1243 426/500 [========================>.....] - ETA: 17s - loss: 1.0743 - regression_loss: 0.9502 - classification_loss: 0.1241 427/500 [========================>.....] - ETA: 17s - loss: 1.0745 - regression_loss: 0.9505 - classification_loss: 0.1241 428/500 [========================>.....] - ETA: 16s - loss: 1.0745 - regression_loss: 0.9505 - classification_loss: 0.1240 429/500 [========================>.....] - ETA: 16s - loss: 1.0750 - regression_loss: 0.9510 - classification_loss: 0.1240 430/500 [========================>.....] 
- ETA: 16s - loss: 1.0743 - regression_loss: 0.9505 - classification_loss: 0.1238 431/500 [========================>.....] - ETA: 16s - loss: 1.0750 - regression_loss: 0.9509 - classification_loss: 0.1241 432/500 [========================>.....] - ETA: 15s - loss: 1.0772 - regression_loss: 0.9525 - classification_loss: 0.1246 433/500 [========================>.....] - ETA: 15s - loss: 1.0770 - regression_loss: 0.9525 - classification_loss: 0.1246 434/500 [=========================>....] - ETA: 15s - loss: 1.0777 - regression_loss: 0.9530 - classification_loss: 0.1247 435/500 [=========================>....] - ETA: 15s - loss: 1.0779 - regression_loss: 0.9531 - classification_loss: 0.1248 436/500 [=========================>....] - ETA: 15s - loss: 1.0772 - regression_loss: 0.9525 - classification_loss: 0.1247 437/500 [=========================>....] - ETA: 14s - loss: 1.0768 - regression_loss: 0.9522 - classification_loss: 0.1245 438/500 [=========================>....] - ETA: 14s - loss: 1.0765 - regression_loss: 0.9520 - classification_loss: 0.1245 439/500 [=========================>....] - ETA: 14s - loss: 1.0768 - regression_loss: 0.9524 - classification_loss: 0.1245 440/500 [=========================>....] - ETA: 14s - loss: 1.0769 - regression_loss: 0.9524 - classification_loss: 0.1244 441/500 [=========================>....] - ETA: 13s - loss: 1.0760 - regression_loss: 0.9517 - classification_loss: 0.1243 442/500 [=========================>....] - ETA: 13s - loss: 1.0760 - regression_loss: 0.9519 - classification_loss: 0.1241 443/500 [=========================>....] - ETA: 13s - loss: 1.0758 - regression_loss: 0.9518 - classification_loss: 0.1239 444/500 [=========================>....] - ETA: 13s - loss: 1.0752 - regression_loss: 0.9513 - classification_loss: 0.1239 445/500 [=========================>....] - ETA: 12s - loss: 1.0746 - regression_loss: 0.9508 - classification_loss: 0.1237 446/500 [=========================>....] 
- ETA: 12s - loss: 1.0727 - regression_loss: 0.9492 - classification_loss: 0.1235 447/500 [=========================>....] - ETA: 12s - loss: 1.0739 - regression_loss: 0.9502 - classification_loss: 0.1237 448/500 [=========================>....] - ETA: 12s - loss: 1.0729 - regression_loss: 0.9494 - classification_loss: 0.1235 449/500 [=========================>....] - ETA: 11s - loss: 1.0733 - regression_loss: 0.9498 - classification_loss: 0.1235 450/500 [==========================>...] - ETA: 11s - loss: 1.0740 - regression_loss: 0.9504 - classification_loss: 0.1236 451/500 [==========================>...] - ETA: 11s - loss: 1.0764 - regression_loss: 0.9525 - classification_loss: 0.1239 452/500 [==========================>...] - ETA: 11s - loss: 1.0760 - regression_loss: 0.9522 - classification_loss: 0.1238 453/500 [==========================>...] - ETA: 11s - loss: 1.0763 - regression_loss: 0.9524 - classification_loss: 0.1239 454/500 [==========================>...] - ETA: 10s - loss: 1.0788 - regression_loss: 0.9545 - classification_loss: 0.1242 455/500 [==========================>...] - ETA: 10s - loss: 1.0796 - regression_loss: 0.9554 - classification_loss: 0.1242 456/500 [==========================>...] - ETA: 10s - loss: 1.0785 - regression_loss: 0.9544 - classification_loss: 0.1240 457/500 [==========================>...] - ETA: 10s - loss: 1.0780 - regression_loss: 0.9541 - classification_loss: 0.1239 458/500 [==========================>...] - ETA: 9s - loss: 1.0819 - regression_loss: 0.9579 - classification_loss: 0.1241  459/500 [==========================>...] - ETA: 9s - loss: 1.0815 - regression_loss: 0.9576 - classification_loss: 0.1239 460/500 [==========================>...] - ETA: 9s - loss: 1.0826 - regression_loss: 0.9585 - classification_loss: 0.1242 461/500 [==========================>...] - ETA: 9s - loss: 1.0820 - regression_loss: 0.9581 - classification_loss: 0.1240 462/500 [==========================>...] 
- ETA: 8s - loss: 1.0817 - regression_loss: 0.9579 - classification_loss: 0.1239
500/500 [==============================] - 118s 235ms/step - loss: 1.0822 - regression_loss: 0.9576 - classification_loss: 0.1246
326 instances of class plum with average precision: 0.8141
mAP: 0.8141
Epoch 00026: saving model to ./training/snapshots/resnet50_pascal_26.h5
Epoch 27/150
296/500 [================>.............] - ETA: 47s - loss: 1.0107 - regression_loss: 0.8946 - classification_loss: 0.1160
297/500 [================>.............]
- ETA: 47s - loss: 1.0116 - regression_loss: 0.8956 - classification_loss: 0.1160 298/500 [================>.............] - ETA: 47s - loss: 1.0109 - regression_loss: 0.8951 - classification_loss: 0.1159 299/500 [================>.............] - ETA: 47s - loss: 1.0114 - regression_loss: 0.8956 - classification_loss: 0.1159 300/500 [=================>............] - ETA: 46s - loss: 1.0097 - regression_loss: 0.8940 - classification_loss: 0.1156 301/500 [=================>............] - ETA: 46s - loss: 1.0119 - regression_loss: 0.8962 - classification_loss: 0.1157 302/500 [=================>............] - ETA: 46s - loss: 1.0123 - regression_loss: 0.8965 - classification_loss: 0.1157 303/500 [=================>............] - ETA: 46s - loss: 1.0128 - regression_loss: 0.8969 - classification_loss: 0.1158 304/500 [=================>............] - ETA: 45s - loss: 1.0124 - regression_loss: 0.8967 - classification_loss: 0.1157 305/500 [=================>............] - ETA: 45s - loss: 1.0131 - regression_loss: 0.8975 - classification_loss: 0.1155 306/500 [=================>............] - ETA: 45s - loss: 1.0133 - regression_loss: 0.8980 - classification_loss: 0.1153 307/500 [=================>............] - ETA: 45s - loss: 1.0136 - regression_loss: 0.8983 - classification_loss: 0.1153 308/500 [=================>............] - ETA: 44s - loss: 1.0147 - regression_loss: 0.8988 - classification_loss: 0.1159 309/500 [=================>............] - ETA: 44s - loss: 1.0144 - regression_loss: 0.8983 - classification_loss: 0.1161 310/500 [=================>............] - ETA: 44s - loss: 1.0142 - regression_loss: 0.8982 - classification_loss: 0.1160 311/500 [=================>............] - ETA: 44s - loss: 1.0125 - regression_loss: 0.8968 - classification_loss: 0.1157 312/500 [=================>............] - ETA: 43s - loss: 1.0132 - regression_loss: 0.8974 - classification_loss: 0.1157 313/500 [=================>............] 
- ETA: 43s - loss: 1.0119 - regression_loss: 0.8964 - classification_loss: 0.1155 314/500 [=================>............] - ETA: 43s - loss: 1.0117 - regression_loss: 0.8965 - classification_loss: 0.1152 315/500 [=================>............] - ETA: 43s - loss: 1.0123 - regression_loss: 0.8972 - classification_loss: 0.1151 316/500 [=================>............] - ETA: 43s - loss: 1.0114 - regression_loss: 0.8963 - classification_loss: 0.1150 317/500 [==================>...........] - ETA: 42s - loss: 1.0111 - regression_loss: 0.8960 - classification_loss: 0.1150 318/500 [==================>...........] - ETA: 42s - loss: 1.0104 - regression_loss: 0.8957 - classification_loss: 0.1148 319/500 [==================>...........] - ETA: 42s - loss: 1.0092 - regression_loss: 0.8946 - classification_loss: 0.1146 320/500 [==================>...........] - ETA: 42s - loss: 1.0103 - regression_loss: 0.8956 - classification_loss: 0.1147 321/500 [==================>...........] - ETA: 41s - loss: 1.0109 - regression_loss: 0.8961 - classification_loss: 0.1147 322/500 [==================>...........] - ETA: 41s - loss: 1.0102 - regression_loss: 0.8957 - classification_loss: 0.1145 323/500 [==================>...........] - ETA: 41s - loss: 1.0104 - regression_loss: 0.8958 - classification_loss: 0.1145 324/500 [==================>...........] - ETA: 41s - loss: 1.0106 - regression_loss: 0.8958 - classification_loss: 0.1148 325/500 [==================>...........] - ETA: 40s - loss: 1.0103 - regression_loss: 0.8955 - classification_loss: 0.1148 326/500 [==================>...........] - ETA: 40s - loss: 1.0090 - regression_loss: 0.8944 - classification_loss: 0.1146 327/500 [==================>...........] - ETA: 40s - loss: 1.0075 - regression_loss: 0.8931 - classification_loss: 0.1144 328/500 [==================>...........] - ETA: 40s - loss: 1.0080 - regression_loss: 0.8937 - classification_loss: 0.1143 329/500 [==================>...........] 
- ETA: 39s - loss: 1.0087 - regression_loss: 0.8943 - classification_loss: 0.1144 330/500 [==================>...........] - ETA: 39s - loss: 1.0086 - regression_loss: 0.8943 - classification_loss: 0.1144 331/500 [==================>...........] - ETA: 39s - loss: 1.0077 - regression_loss: 0.8934 - classification_loss: 0.1143 332/500 [==================>...........] - ETA: 39s - loss: 1.0064 - regression_loss: 0.8924 - classification_loss: 0.1140 333/500 [==================>...........] - ETA: 39s - loss: 1.0074 - regression_loss: 0.8928 - classification_loss: 0.1146 334/500 [===================>..........] - ETA: 38s - loss: 1.0074 - regression_loss: 0.8930 - classification_loss: 0.1144 335/500 [===================>..........] - ETA: 38s - loss: 1.0079 - regression_loss: 0.8935 - classification_loss: 0.1145 336/500 [===================>..........] - ETA: 38s - loss: 1.0079 - regression_loss: 0.8934 - classification_loss: 0.1145 337/500 [===================>..........] - ETA: 38s - loss: 1.0070 - regression_loss: 0.8927 - classification_loss: 0.1143 338/500 [===================>..........] - ETA: 37s - loss: 1.0084 - regression_loss: 0.8937 - classification_loss: 0.1147 339/500 [===================>..........] - ETA: 37s - loss: 1.0072 - regression_loss: 0.8926 - classification_loss: 0.1146 340/500 [===================>..........] - ETA: 37s - loss: 1.0075 - regression_loss: 0.8930 - classification_loss: 0.1145 341/500 [===================>..........] - ETA: 37s - loss: 1.0073 - regression_loss: 0.8928 - classification_loss: 0.1145 342/500 [===================>..........] - ETA: 36s - loss: 1.0084 - regression_loss: 0.8938 - classification_loss: 0.1147 343/500 [===================>..........] - ETA: 36s - loss: 1.0061 - regression_loss: 0.8918 - classification_loss: 0.1144 344/500 [===================>..........] - ETA: 36s - loss: 1.0055 - regression_loss: 0.8913 - classification_loss: 0.1142 345/500 [===================>..........] 
- ETA: 36s - loss: 1.0102 - regression_loss: 0.8947 - classification_loss: 0.1155 346/500 [===================>..........] - ETA: 36s - loss: 1.0099 - regression_loss: 0.8946 - classification_loss: 0.1154 347/500 [===================>..........] - ETA: 35s - loss: 1.0097 - regression_loss: 0.8944 - classification_loss: 0.1154 348/500 [===================>..........] - ETA: 35s - loss: 1.0086 - regression_loss: 0.8932 - classification_loss: 0.1154 349/500 [===================>..........] - ETA: 35s - loss: 1.0082 - regression_loss: 0.8928 - classification_loss: 0.1154 350/500 [====================>.........] - ETA: 35s - loss: 1.0066 - regression_loss: 0.8915 - classification_loss: 0.1152 351/500 [====================>.........] - ETA: 34s - loss: 1.0093 - regression_loss: 0.8931 - classification_loss: 0.1162 352/500 [====================>.........] - ETA: 34s - loss: 1.0086 - regression_loss: 0.8925 - classification_loss: 0.1160 353/500 [====================>.........] - ETA: 34s - loss: 1.0090 - regression_loss: 0.8930 - classification_loss: 0.1160 354/500 [====================>.........] - ETA: 34s - loss: 1.0093 - regression_loss: 0.8933 - classification_loss: 0.1160 355/500 [====================>.........] - ETA: 33s - loss: 1.0095 - regression_loss: 0.8935 - classification_loss: 0.1160 356/500 [====================>.........] - ETA: 33s - loss: 1.0105 - regression_loss: 0.8943 - classification_loss: 0.1162 357/500 [====================>.........] - ETA: 33s - loss: 1.0119 - regression_loss: 0.8955 - classification_loss: 0.1164 358/500 [====================>.........] - ETA: 33s - loss: 1.0126 - regression_loss: 0.8963 - classification_loss: 0.1163 359/500 [====================>.........] - ETA: 32s - loss: 1.0118 - regression_loss: 0.8956 - classification_loss: 0.1162 360/500 [====================>.........] - ETA: 32s - loss: 1.0118 - regression_loss: 0.8956 - classification_loss: 0.1162 361/500 [====================>.........] 
- ETA: 32s - loss: 1.0107 - regression_loss: 0.8946 - classification_loss: 0.1161 362/500 [====================>.........] - ETA: 32s - loss: 1.0132 - regression_loss: 0.8965 - classification_loss: 0.1166 363/500 [====================>.........] - ETA: 32s - loss: 1.0128 - regression_loss: 0.8963 - classification_loss: 0.1165 364/500 [====================>.........] - ETA: 31s - loss: 1.0131 - regression_loss: 0.8967 - classification_loss: 0.1164 365/500 [====================>.........] - ETA: 31s - loss: 1.0131 - regression_loss: 0.8968 - classification_loss: 0.1163 366/500 [====================>.........] - ETA: 31s - loss: 1.0142 - regression_loss: 0.8978 - classification_loss: 0.1164 367/500 [=====================>........] - ETA: 31s - loss: 1.0148 - regression_loss: 0.8984 - classification_loss: 0.1164 368/500 [=====================>........] - ETA: 30s - loss: 1.0139 - regression_loss: 0.8977 - classification_loss: 0.1162 369/500 [=====================>........] - ETA: 30s - loss: 1.0166 - regression_loss: 0.8996 - classification_loss: 0.1170 370/500 [=====================>........] - ETA: 30s - loss: 1.0191 - regression_loss: 0.9019 - classification_loss: 0.1172 371/500 [=====================>........] - ETA: 30s - loss: 1.0207 - regression_loss: 0.9032 - classification_loss: 0.1174 372/500 [=====================>........] - ETA: 29s - loss: 1.0219 - regression_loss: 0.9046 - classification_loss: 0.1174 373/500 [=====================>........] - ETA: 29s - loss: 1.0226 - regression_loss: 0.9053 - classification_loss: 0.1173 374/500 [=====================>........] - ETA: 29s - loss: 1.0239 - regression_loss: 0.9064 - classification_loss: 0.1174 375/500 [=====================>........] - ETA: 29s - loss: 1.0229 - regression_loss: 0.9057 - classification_loss: 0.1173 376/500 [=====================>........] - ETA: 28s - loss: 1.0251 - regression_loss: 0.9072 - classification_loss: 0.1179 377/500 [=====================>........] 
- ETA: 28s - loss: 1.0258 - regression_loss: 0.9077 - classification_loss: 0.1181 378/500 [=====================>........] - ETA: 28s - loss: 1.0260 - regression_loss: 0.9079 - classification_loss: 0.1181 379/500 [=====================>........] - ETA: 28s - loss: 1.0261 - regression_loss: 0.9080 - classification_loss: 0.1180 380/500 [=====================>........] - ETA: 28s - loss: 1.0241 - regression_loss: 0.9063 - classification_loss: 0.1178 381/500 [=====================>........] - ETA: 27s - loss: 1.0226 - regression_loss: 0.9050 - classification_loss: 0.1176 382/500 [=====================>........] - ETA: 27s - loss: 1.0225 - regression_loss: 0.9047 - classification_loss: 0.1178 383/500 [=====================>........] - ETA: 27s - loss: 1.0230 - regression_loss: 0.9053 - classification_loss: 0.1177 384/500 [======================>.......] - ETA: 27s - loss: 1.0223 - regression_loss: 0.9047 - classification_loss: 0.1176 385/500 [======================>.......] - ETA: 26s - loss: 1.0209 - regression_loss: 0.9036 - classification_loss: 0.1173 386/500 [======================>.......] - ETA: 26s - loss: 1.0199 - regression_loss: 0.9024 - classification_loss: 0.1175 387/500 [======================>.......] - ETA: 26s - loss: 1.0185 - regression_loss: 0.9011 - classification_loss: 0.1174 388/500 [======================>.......] - ETA: 26s - loss: 1.0167 - regression_loss: 0.8995 - classification_loss: 0.1172 389/500 [======================>.......] - ETA: 25s - loss: 1.0178 - regression_loss: 0.9004 - classification_loss: 0.1174 390/500 [======================>.......] - ETA: 25s - loss: 1.0189 - regression_loss: 0.9014 - classification_loss: 0.1175 391/500 [======================>.......] - ETA: 25s - loss: 1.0200 - regression_loss: 0.9021 - classification_loss: 0.1179 392/500 [======================>.......] - ETA: 25s - loss: 1.0199 - regression_loss: 0.9021 - classification_loss: 0.1179 393/500 [======================>.......] 
- ETA: 25s - loss: 1.0197 - regression_loss: 0.9020 - classification_loss: 0.1178 394/500 [======================>.......] - ETA: 24s - loss: 1.0206 - regression_loss: 0.9026 - classification_loss: 0.1180 395/500 [======================>.......] - ETA: 24s - loss: 1.0198 - regression_loss: 0.9020 - classification_loss: 0.1178 396/500 [======================>.......] - ETA: 24s - loss: 1.0209 - regression_loss: 0.9027 - classification_loss: 0.1181 397/500 [======================>.......] - ETA: 24s - loss: 1.0206 - regression_loss: 0.9026 - classification_loss: 0.1180 398/500 [======================>.......] - ETA: 23s - loss: 1.0208 - regression_loss: 0.9026 - classification_loss: 0.1182 399/500 [======================>.......] - ETA: 23s - loss: 1.0203 - regression_loss: 0.9022 - classification_loss: 0.1181 400/500 [=======================>......] - ETA: 23s - loss: 1.0193 - regression_loss: 0.9014 - classification_loss: 0.1179 401/500 [=======================>......] - ETA: 23s - loss: 1.0202 - regression_loss: 0.9022 - classification_loss: 0.1181 402/500 [=======================>......] - ETA: 22s - loss: 1.0194 - regression_loss: 0.9015 - classification_loss: 0.1179 403/500 [=======================>......] - ETA: 22s - loss: 1.0200 - regression_loss: 0.9019 - classification_loss: 0.1181 404/500 [=======================>......] - ETA: 22s - loss: 1.0188 - regression_loss: 0.9007 - classification_loss: 0.1181 405/500 [=======================>......] - ETA: 22s - loss: 1.0206 - regression_loss: 0.9021 - classification_loss: 0.1185 406/500 [=======================>......] - ETA: 21s - loss: 1.0205 - regression_loss: 0.9020 - classification_loss: 0.1184 407/500 [=======================>......] - ETA: 21s - loss: 1.0203 - regression_loss: 0.9020 - classification_loss: 0.1183 408/500 [=======================>......] - ETA: 21s - loss: 1.0219 - regression_loss: 0.9034 - classification_loss: 0.1185 409/500 [=======================>......] 
- ETA: 21s - loss: 1.0231 - regression_loss: 0.9047 - classification_loss: 0.1185 410/500 [=======================>......] - ETA: 21s - loss: 1.0226 - regression_loss: 0.9043 - classification_loss: 0.1183 411/500 [=======================>......] - ETA: 20s - loss: 1.0217 - regression_loss: 0.9036 - classification_loss: 0.1181 412/500 [=======================>......] - ETA: 20s - loss: 1.0230 - regression_loss: 0.9046 - classification_loss: 0.1184 413/500 [=======================>......] - ETA: 20s - loss: 1.0232 - regression_loss: 0.9048 - classification_loss: 0.1184 414/500 [=======================>......] - ETA: 20s - loss: 1.0251 - regression_loss: 0.9063 - classification_loss: 0.1188 415/500 [=======================>......] - ETA: 19s - loss: 1.0253 - regression_loss: 0.9065 - classification_loss: 0.1188 416/500 [=======================>......] - ETA: 19s - loss: 1.0245 - regression_loss: 0.9058 - classification_loss: 0.1187 417/500 [========================>.....] - ETA: 19s - loss: 1.0237 - regression_loss: 0.9051 - classification_loss: 0.1186 418/500 [========================>.....] - ETA: 19s - loss: 1.0227 - regression_loss: 0.9042 - classification_loss: 0.1185 419/500 [========================>.....] - ETA: 18s - loss: 1.0225 - regression_loss: 0.9041 - classification_loss: 0.1184 420/500 [========================>.....] - ETA: 18s - loss: 1.0229 - regression_loss: 0.9045 - classification_loss: 0.1184 421/500 [========================>.....] - ETA: 18s - loss: 1.0259 - regression_loss: 0.9072 - classification_loss: 0.1187 422/500 [========================>.....] - ETA: 18s - loss: 1.0262 - regression_loss: 0.9076 - classification_loss: 0.1186 423/500 [========================>.....] - ETA: 18s - loss: 1.0251 - regression_loss: 0.9066 - classification_loss: 0.1185 424/500 [========================>.....] - ETA: 17s - loss: 1.0237 - regression_loss: 0.9054 - classification_loss: 0.1183 425/500 [========================>.....] 
- ETA: 17s - loss: 1.0244 - regression_loss: 0.9061 - classification_loss: 0.1182 426/500 [========================>.....] - ETA: 17s - loss: 1.0234 - regression_loss: 0.9053 - classification_loss: 0.1181 427/500 [========================>.....] - ETA: 17s - loss: 1.0251 - regression_loss: 0.9069 - classification_loss: 0.1181 428/500 [========================>.....] - ETA: 16s - loss: 1.0259 - regression_loss: 0.9075 - classification_loss: 0.1183 429/500 [========================>.....] - ETA: 16s - loss: 1.0264 - regression_loss: 0.9080 - classification_loss: 0.1185 430/500 [========================>.....] - ETA: 16s - loss: 1.0274 - regression_loss: 0.9087 - classification_loss: 0.1186 431/500 [========================>.....] - ETA: 16s - loss: 1.0287 - regression_loss: 0.9097 - classification_loss: 0.1190 432/500 [========================>.....] - ETA: 15s - loss: 1.0281 - regression_loss: 0.9092 - classification_loss: 0.1189 433/500 [========================>.....] - ETA: 15s - loss: 1.0267 - regression_loss: 0.9079 - classification_loss: 0.1188 434/500 [=========================>....] - ETA: 15s - loss: 1.0274 - regression_loss: 0.9088 - classification_loss: 0.1187 435/500 [=========================>....] - ETA: 15s - loss: 1.0270 - regression_loss: 0.9084 - classification_loss: 0.1186 436/500 [=========================>....] - ETA: 14s - loss: 1.0267 - regression_loss: 0.9081 - classification_loss: 0.1186 437/500 [=========================>....] - ETA: 14s - loss: 1.0267 - regression_loss: 0.9081 - classification_loss: 0.1185 438/500 [=========================>....] - ETA: 14s - loss: 1.0274 - regression_loss: 0.9089 - classification_loss: 0.1185 439/500 [=========================>....] - ETA: 14s - loss: 1.0273 - regression_loss: 0.9089 - classification_loss: 0.1184 440/500 [=========================>....] - ETA: 14s - loss: 1.0259 - regression_loss: 0.9077 - classification_loss: 0.1182 441/500 [=========================>....] 
- ETA: 13s - loss: 1.0250 - regression_loss: 0.9068 - classification_loss: 0.1181 442/500 [=========================>....] - ETA: 13s - loss: 1.0262 - regression_loss: 0.9082 - classification_loss: 0.1180 443/500 [=========================>....] - ETA: 13s - loss: 1.0257 - regression_loss: 0.9078 - classification_loss: 0.1179 444/500 [=========================>....] - ETA: 13s - loss: 1.0258 - regression_loss: 0.9079 - classification_loss: 0.1178 445/500 [=========================>....] - ETA: 12s - loss: 1.0254 - regression_loss: 0.9077 - classification_loss: 0.1178 446/500 [=========================>....] - ETA: 12s - loss: 1.0245 - regression_loss: 0.9069 - classification_loss: 0.1176 447/500 [=========================>....] - ETA: 12s - loss: 1.0239 - regression_loss: 0.9064 - classification_loss: 0.1175 448/500 [=========================>....] - ETA: 12s - loss: 1.0239 - regression_loss: 0.9064 - classification_loss: 0.1175 449/500 [=========================>....] - ETA: 11s - loss: 1.0245 - regression_loss: 0.9072 - classification_loss: 0.1174 450/500 [==========================>...] - ETA: 11s - loss: 1.0242 - regression_loss: 0.9069 - classification_loss: 0.1173 451/500 [==========================>...] - ETA: 11s - loss: 1.0235 - regression_loss: 0.9064 - classification_loss: 0.1171 452/500 [==========================>...] - ETA: 11s - loss: 1.0230 - regression_loss: 0.9060 - classification_loss: 0.1170 453/500 [==========================>...] - ETA: 11s - loss: 1.0227 - regression_loss: 0.9058 - classification_loss: 0.1169 454/500 [==========================>...] - ETA: 10s - loss: 1.0294 - regression_loss: 0.9083 - classification_loss: 0.1211 455/500 [==========================>...] - ETA: 10s - loss: 1.0285 - regression_loss: 0.9075 - classification_loss: 0.1210 456/500 [==========================>...] - ETA: 10s - loss: 1.0280 - regression_loss: 0.9072 - classification_loss: 0.1209 457/500 [==========================>...] 
- ETA: 10s - loss: 1.0287 - regression_loss: 0.9078 - classification_loss: 0.1209 458/500 [==========================>...] - ETA: 9s - loss: 1.0281 - regression_loss: 0.9074 - classification_loss: 0.1208  459/500 [==========================>...] - ETA: 9s - loss: 1.0289 - regression_loss: 0.9081 - classification_loss: 0.1209 460/500 [==========================>...] - ETA: 9s - loss: 1.0317 - regression_loss: 0.9105 - classification_loss: 0.1212 461/500 [==========================>...] - ETA: 9s - loss: 1.0323 - regression_loss: 0.9110 - classification_loss: 0.1212 462/500 [==========================>...] - ETA: 8s - loss: 1.0335 - regression_loss: 0.9123 - classification_loss: 0.1212 463/500 [==========================>...] - ETA: 8s - loss: 1.0343 - regression_loss: 0.9131 - classification_loss: 0.1213 464/500 [==========================>...] - ETA: 8s - loss: 1.0341 - regression_loss: 0.9128 - classification_loss: 0.1212 465/500 [==========================>...] - ETA: 8s - loss: 1.0329 - regression_loss: 0.9119 - classification_loss: 0.1210 466/500 [==========================>...] - ETA: 7s - loss: 1.0317 - regression_loss: 0.9108 - classification_loss: 0.1208 467/500 [===========================>..] - ETA: 7s - loss: 1.0332 - regression_loss: 0.9120 - classification_loss: 0.1213 468/500 [===========================>..] - ETA: 7s - loss: 1.0328 - regression_loss: 0.9116 - classification_loss: 0.1212 469/500 [===========================>..] - ETA: 7s - loss: 1.0324 - regression_loss: 0.9112 - classification_loss: 0.1212 470/500 [===========================>..] - ETA: 7s - loss: 1.0315 - regression_loss: 0.9104 - classification_loss: 0.1211 471/500 [===========================>..] - ETA: 6s - loss: 1.0317 - regression_loss: 0.9107 - classification_loss: 0.1210 472/500 [===========================>..] - ETA: 6s - loss: 1.0318 - regression_loss: 0.9106 - classification_loss: 0.1212 473/500 [===========================>..] 
- ETA: 6s - loss: 1.0322 - regression_loss: 0.9111 - classification_loss: 0.1212 474/500 [===========================>..] - ETA: 6s - loss: 1.0315 - regression_loss: 0.9105 - classification_loss: 0.1210 475/500 [===========================>..] - ETA: 5s - loss: 1.0311 - regression_loss: 0.9102 - classification_loss: 0.1209 476/500 [===========================>..] - ETA: 5s - loss: 1.0302 - regression_loss: 0.9095 - classification_loss: 0.1208 477/500 [===========================>..] - ETA: 5s - loss: 1.0299 - regression_loss: 0.9093 - classification_loss: 0.1207 478/500 [===========================>..] - ETA: 5s - loss: 1.0296 - regression_loss: 0.9090 - classification_loss: 0.1206 479/500 [===========================>..] - ETA: 4s - loss: 1.0292 - regression_loss: 0.9087 - classification_loss: 0.1204 480/500 [===========================>..] - ETA: 4s - loss: 1.0296 - regression_loss: 0.9091 - classification_loss: 0.1205 481/500 [===========================>..] - ETA: 4s - loss: 1.0311 - regression_loss: 0.9103 - classification_loss: 0.1208 482/500 [===========================>..] - ETA: 4s - loss: 1.0320 - regression_loss: 0.9108 - classification_loss: 0.1212 483/500 [===========================>..] - ETA: 3s - loss: 1.0319 - regression_loss: 0.9109 - classification_loss: 0.1211 484/500 [============================>.] - ETA: 3s - loss: 1.0315 - regression_loss: 0.9106 - classification_loss: 0.1209 485/500 [============================>.] - ETA: 3s - loss: 1.0310 - regression_loss: 0.9101 - classification_loss: 0.1208 486/500 [============================>.] - ETA: 3s - loss: 1.0319 - regression_loss: 0.9110 - classification_loss: 0.1209 487/500 [============================>.] - ETA: 3s - loss: 1.0317 - regression_loss: 0.9109 - classification_loss: 0.1208 488/500 [============================>.] - ETA: 2s - loss: 1.0315 - regression_loss: 0.9107 - classification_loss: 0.1208 489/500 [============================>.] 
500/500 [==============================] - 117s 235ms/step - loss: 1.0295 - regression_loss: 0.9095 - classification_loss: 0.1199
326 instances of class plum with average precision: 0.8210
mAP: 0.8210
Epoch 00027: saving model to ./training/snapshots/resnet50_pascal_27.h5
Epoch 28/150
- ETA: 1:41 - loss: 1.0758 - regression_loss: 0.9545 - classification_loss: 0.1212 69/500 [===>..........................] - ETA: 1:40 - loss: 1.0801 - regression_loss: 0.9570 - classification_loss: 0.1231 70/500 [===>..........................] - ETA: 1:40 - loss: 1.0831 - regression_loss: 0.9593 - classification_loss: 0.1238 71/500 [===>..........................] - ETA: 1:40 - loss: 1.0820 - regression_loss: 0.9589 - classification_loss: 0.1231 72/500 [===>..........................] - ETA: 1:40 - loss: 1.0805 - regression_loss: 0.9576 - classification_loss: 0.1229 73/500 [===>..........................] - ETA: 1:39 - loss: 1.0859 - regression_loss: 0.9629 - classification_loss: 0.1230 74/500 [===>..........................] - ETA: 1:39 - loss: 1.0857 - regression_loss: 0.9631 - classification_loss: 0.1226 75/500 [===>..........................] - ETA: 1:39 - loss: 1.0771 - regression_loss: 0.9552 - classification_loss: 0.1219 76/500 [===>..........................] - ETA: 1:39 - loss: 1.0738 - regression_loss: 0.9525 - classification_loss: 0.1213 77/500 [===>..........................] - ETA: 1:39 - loss: 1.0622 - regression_loss: 0.9420 - classification_loss: 0.1202 78/500 [===>..........................] - ETA: 1:38 - loss: 1.0564 - regression_loss: 0.9367 - classification_loss: 0.1197 79/500 [===>..........................] - ETA: 1:38 - loss: 1.0458 - regression_loss: 0.9275 - classification_loss: 0.1183 80/500 [===>..........................] - ETA: 1:38 - loss: 1.0466 - regression_loss: 0.9287 - classification_loss: 0.1179 81/500 [===>..........................] - ETA: 1:38 - loss: 1.0565 - regression_loss: 0.9361 - classification_loss: 0.1203 82/500 [===>..........................] - ETA: 1:38 - loss: 1.0498 - regression_loss: 0.9303 - classification_loss: 0.1196 83/500 [===>..........................] - ETA: 1:38 - loss: 1.0450 - regression_loss: 0.9263 - classification_loss: 0.1186 84/500 [====>.........................] 
- ETA: 1:37 - loss: 1.0561 - regression_loss: 0.9356 - classification_loss: 0.1205 85/500 [====>.........................] - ETA: 1:37 - loss: 1.0618 - regression_loss: 0.9400 - classification_loss: 0.1218 86/500 [====>.........................] - ETA: 1:37 - loss: 1.0608 - regression_loss: 0.9390 - classification_loss: 0.1218 87/500 [====>.........................] - ETA: 1:37 - loss: 1.0604 - regression_loss: 0.9386 - classification_loss: 0.1218 88/500 [====>.........................] - ETA: 1:37 - loss: 1.0597 - regression_loss: 0.9279 - classification_loss: 0.1318 89/500 [====>.........................] - ETA: 1:36 - loss: 1.0594 - regression_loss: 0.9276 - classification_loss: 0.1318 90/500 [====>.........................] - ETA: 1:36 - loss: 1.0597 - regression_loss: 0.9280 - classification_loss: 0.1316 91/500 [====>.........................] - ETA: 1:36 - loss: 1.0563 - regression_loss: 0.9254 - classification_loss: 0.1310 92/500 [====>.........................] - ETA: 1:36 - loss: 1.0543 - regression_loss: 0.9238 - classification_loss: 0.1304 93/500 [====>.........................] - ETA: 1:35 - loss: 1.0504 - regression_loss: 0.9204 - classification_loss: 0.1300 94/500 [====>.........................] - ETA: 1:35 - loss: 1.0522 - regression_loss: 0.9219 - classification_loss: 0.1302 95/500 [====>.........................] - ETA: 1:35 - loss: 1.0522 - regression_loss: 0.9223 - classification_loss: 0.1300 96/500 [====>.........................] - ETA: 1:35 - loss: 1.0613 - regression_loss: 0.9300 - classification_loss: 0.1313 97/500 [====>.........................] - ETA: 1:34 - loss: 1.0544 - regression_loss: 0.9240 - classification_loss: 0.1304 98/500 [====>.........................] - ETA: 1:34 - loss: 1.0604 - regression_loss: 0.9277 - classification_loss: 0.1327 99/500 [====>.........................] - ETA: 1:34 - loss: 1.0629 - regression_loss: 0.9298 - classification_loss: 0.1331 100/500 [=====>........................] 
- ETA: 1:34 - loss: 1.0577 - regression_loss: 0.9253 - classification_loss: 0.1324 101/500 [=====>........................] - ETA: 1:33 - loss: 1.0688 - regression_loss: 0.9349 - classification_loss: 0.1339 102/500 [=====>........................] - ETA: 1:33 - loss: 1.0675 - regression_loss: 0.9342 - classification_loss: 0.1333 103/500 [=====>........................] - ETA: 1:33 - loss: 1.0661 - regression_loss: 0.9332 - classification_loss: 0.1329 104/500 [=====>........................] - ETA: 1:33 - loss: 1.0717 - regression_loss: 0.9374 - classification_loss: 0.1343 105/500 [=====>........................] - ETA: 1:33 - loss: 1.0755 - regression_loss: 0.9408 - classification_loss: 0.1347 106/500 [=====>........................] - ETA: 1:32 - loss: 1.0731 - regression_loss: 0.9390 - classification_loss: 0.1341 107/500 [=====>........................] - ETA: 1:32 - loss: 1.0701 - regression_loss: 0.9364 - classification_loss: 0.1337 108/500 [=====>........................] - ETA: 1:32 - loss: 1.0726 - regression_loss: 0.9387 - classification_loss: 0.1338 109/500 [=====>........................] - ETA: 1:32 - loss: 1.0740 - regression_loss: 0.9408 - classification_loss: 0.1333 110/500 [=====>........................] - ETA: 1:31 - loss: 1.0744 - regression_loss: 0.9412 - classification_loss: 0.1332 111/500 [=====>........................] - ETA: 1:31 - loss: 1.0763 - regression_loss: 0.9433 - classification_loss: 0.1330 112/500 [=====>........................] - ETA: 1:31 - loss: 1.0717 - regression_loss: 0.9394 - classification_loss: 0.1323 113/500 [=====>........................] - ETA: 1:31 - loss: 1.0706 - regression_loss: 0.9387 - classification_loss: 0.1319 114/500 [=====>........................] - ETA: 1:30 - loss: 1.0718 - regression_loss: 0.9396 - classification_loss: 0.1321 115/500 [=====>........................] - ETA: 1:30 - loss: 1.0733 - regression_loss: 0.9414 - classification_loss: 0.1319 116/500 [=====>........................] 
- ETA: 1:30 - loss: 1.0732 - regression_loss: 0.9407 - classification_loss: 0.1325 117/500 [======>.......................] - ETA: 1:30 - loss: 1.0687 - regression_loss: 0.9370 - classification_loss: 0.1318 118/500 [======>.......................] - ETA: 1:29 - loss: 1.0691 - regression_loss: 0.9375 - classification_loss: 0.1316 119/500 [======>.......................] - ETA: 1:29 - loss: 1.0724 - regression_loss: 0.9396 - classification_loss: 0.1328 120/500 [======>.......................] - ETA: 1:29 - loss: 1.0756 - regression_loss: 0.9428 - classification_loss: 0.1328 121/500 [======>.......................] - ETA: 1:29 - loss: 1.0718 - regression_loss: 0.9397 - classification_loss: 0.1321 122/500 [======>.......................] - ETA: 1:28 - loss: 1.0770 - regression_loss: 0.9432 - classification_loss: 0.1338 123/500 [======>.......................] - ETA: 1:28 - loss: 1.0752 - regression_loss: 0.9418 - classification_loss: 0.1334 124/500 [======>.......................] - ETA: 1:28 - loss: 1.0735 - regression_loss: 0.9404 - classification_loss: 0.1331 125/500 [======>.......................] - ETA: 1:28 - loss: 1.0731 - regression_loss: 0.9401 - classification_loss: 0.1330 126/500 [======>.......................] - ETA: 1:27 - loss: 1.0721 - regression_loss: 0.9394 - classification_loss: 0.1327 127/500 [======>.......................] - ETA: 1:27 - loss: 1.0707 - regression_loss: 0.9383 - classification_loss: 0.1324 128/500 [======>.......................] - ETA: 1:27 - loss: 1.0710 - regression_loss: 0.9387 - classification_loss: 0.1323 129/500 [======>.......................] - ETA: 1:26 - loss: 1.0690 - regression_loss: 0.9369 - classification_loss: 0.1321 130/500 [======>.......................] - ETA: 1:26 - loss: 1.0655 - regression_loss: 0.9338 - classification_loss: 0.1317 131/500 [======>.......................] - ETA: 1:26 - loss: 1.0614 - regression_loss: 0.9304 - classification_loss: 0.1311 132/500 [======>.......................] 
- ETA: 1:26 - loss: 1.0628 - regression_loss: 0.9316 - classification_loss: 0.1312 133/500 [======>.......................] - ETA: 1:25 - loss: 1.0607 - regression_loss: 0.9301 - classification_loss: 0.1307 134/500 [=======>......................] - ETA: 1:25 - loss: 1.0567 - regression_loss: 0.9265 - classification_loss: 0.1302 135/500 [=======>......................] - ETA: 1:25 - loss: 1.0563 - regression_loss: 0.9261 - classification_loss: 0.1303 136/500 [=======>......................] - ETA: 1:25 - loss: 1.0556 - regression_loss: 0.9253 - classification_loss: 0.1302 137/500 [=======>......................] - ETA: 1:24 - loss: 1.0520 - regression_loss: 0.9224 - classification_loss: 0.1297 138/500 [=======>......................] - ETA: 1:24 - loss: 1.0508 - regression_loss: 0.9216 - classification_loss: 0.1293 139/500 [=======>......................] - ETA: 1:24 - loss: 1.0510 - regression_loss: 0.9222 - classification_loss: 0.1288 140/500 [=======>......................] - ETA: 1:24 - loss: 1.0486 - regression_loss: 0.9205 - classification_loss: 0.1281 141/500 [=======>......................] - ETA: 1:24 - loss: 1.0472 - regression_loss: 0.9191 - classification_loss: 0.1281 142/500 [=======>......................] - ETA: 1:23 - loss: 1.0464 - regression_loss: 0.9183 - classification_loss: 0.1281 143/500 [=======>......................] - ETA: 1:23 - loss: 1.0416 - regression_loss: 0.9141 - classification_loss: 0.1275 144/500 [=======>......................] - ETA: 1:23 - loss: 1.0398 - regression_loss: 0.9126 - classification_loss: 0.1271 145/500 [=======>......................] - ETA: 1:23 - loss: 1.0362 - regression_loss: 0.9098 - classification_loss: 0.1264 146/500 [=======>......................] - ETA: 1:22 - loss: 1.0367 - regression_loss: 0.9106 - classification_loss: 0.1261 147/500 [=======>......................] - ETA: 1:22 - loss: 1.0359 - regression_loss: 0.9104 - classification_loss: 0.1255 148/500 [=======>......................] 
- ETA: 1:22 - loss: 1.0379 - regression_loss: 0.9120 - classification_loss: 0.1259 149/500 [=======>......................] - ETA: 1:22 - loss: 1.0353 - regression_loss: 0.9099 - classification_loss: 0.1254 150/500 [========>.....................] - ETA: 1:21 - loss: 1.0310 - regression_loss: 0.9063 - classification_loss: 0.1247 151/500 [========>.....................] - ETA: 1:21 - loss: 1.0334 - regression_loss: 0.9084 - classification_loss: 0.1250 152/500 [========>.....................] - ETA: 1:21 - loss: 1.0304 - regression_loss: 0.9059 - classification_loss: 0.1245 153/500 [========>.....................] - ETA: 1:21 - loss: 1.0296 - regression_loss: 0.9054 - classification_loss: 0.1241 154/500 [========>.....................] - ETA: 1:20 - loss: 1.0310 - regression_loss: 0.9063 - classification_loss: 0.1247 155/500 [========>.....................] - ETA: 1:20 - loss: 1.0286 - regression_loss: 0.9045 - classification_loss: 0.1241 156/500 [========>.....................] - ETA: 1:20 - loss: 1.0284 - regression_loss: 0.9043 - classification_loss: 0.1241 157/500 [========>.....................] - ETA: 1:20 - loss: 1.0285 - regression_loss: 0.9047 - classification_loss: 0.1238 158/500 [========>.....................] - ETA: 1:20 - loss: 1.0284 - regression_loss: 0.9045 - classification_loss: 0.1238 159/500 [========>.....................] - ETA: 1:19 - loss: 1.0285 - regression_loss: 0.9046 - classification_loss: 0.1240 160/500 [========>.....................] - ETA: 1:19 - loss: 1.0322 - regression_loss: 0.9079 - classification_loss: 0.1243 161/500 [========>.....................] - ETA: 1:19 - loss: 1.0302 - regression_loss: 0.9064 - classification_loss: 0.1238 162/500 [========>.....................] - ETA: 1:19 - loss: 1.0309 - regression_loss: 0.9067 - classification_loss: 0.1242 163/500 [========>.....................] - ETA: 1:18 - loss: 1.0303 - regression_loss: 0.9063 - classification_loss: 0.1240 164/500 [========>.....................] 
- ETA: 1:18 - loss: 1.0299 - regression_loss: 0.9061 - classification_loss: 0.1238 165/500 [========>.....................] - ETA: 1:18 - loss: 1.0274 - regression_loss: 0.9039 - classification_loss: 0.1236 166/500 [========>.....................] - ETA: 1:18 - loss: 1.0260 - regression_loss: 0.9026 - classification_loss: 0.1234 167/500 [=========>....................] - ETA: 1:17 - loss: 1.0241 - regression_loss: 0.9010 - classification_loss: 0.1231 168/500 [=========>....................] - ETA: 1:17 - loss: 1.0235 - regression_loss: 0.9008 - classification_loss: 0.1227 169/500 [=========>....................] - ETA: 1:17 - loss: 1.0219 - regression_loss: 0.8995 - classification_loss: 0.1225 170/500 [=========>....................] - ETA: 1:17 - loss: 1.0234 - regression_loss: 0.9006 - classification_loss: 0.1227 171/500 [=========>....................] - ETA: 1:17 - loss: 1.0248 - regression_loss: 0.9021 - classification_loss: 0.1227 172/500 [=========>....................] - ETA: 1:16 - loss: 1.0230 - regression_loss: 0.9004 - classification_loss: 0.1227 173/500 [=========>....................] - ETA: 1:16 - loss: 1.0216 - regression_loss: 0.8991 - classification_loss: 0.1225 174/500 [=========>....................] - ETA: 1:16 - loss: 1.0239 - regression_loss: 0.9012 - classification_loss: 0.1227 175/500 [=========>....................] - ETA: 1:16 - loss: 1.0237 - regression_loss: 0.9010 - classification_loss: 0.1227 176/500 [=========>....................] - ETA: 1:15 - loss: 1.0207 - regression_loss: 0.8986 - classification_loss: 0.1221 177/500 [=========>....................] - ETA: 1:15 - loss: 1.0183 - regression_loss: 0.8967 - classification_loss: 0.1216 178/500 [=========>....................] - ETA: 1:15 - loss: 1.0164 - regression_loss: 0.8951 - classification_loss: 0.1212 179/500 [=========>....................] - ETA: 1:15 - loss: 1.0220 - regression_loss: 0.8994 - classification_loss: 0.1226 180/500 [=========>....................] 
- ETA: 1:14 - loss: 1.0231 - regression_loss: 0.9007 - classification_loss: 0.1224 181/500 [=========>....................] - ETA: 1:14 - loss: 1.0215 - regression_loss: 0.8995 - classification_loss: 0.1221 182/500 [=========>....................] - ETA: 1:14 - loss: 1.0185 - regression_loss: 0.8971 - classification_loss: 0.1215 183/500 [=========>....................] - ETA: 1:14 - loss: 1.0171 - regression_loss: 0.8957 - classification_loss: 0.1214 184/500 [==========>...................] - ETA: 1:13 - loss: 1.0184 - regression_loss: 0.8969 - classification_loss: 0.1216 185/500 [==========>...................] - ETA: 1:13 - loss: 1.0191 - regression_loss: 0.8974 - classification_loss: 0.1217 186/500 [==========>...................] - ETA: 1:13 - loss: 1.0206 - regression_loss: 0.8987 - classification_loss: 0.1219 187/500 [==========>...................] - ETA: 1:13 - loss: 1.0186 - regression_loss: 0.8971 - classification_loss: 0.1215 188/500 [==========>...................] - ETA: 1:13 - loss: 1.0180 - regression_loss: 0.8968 - classification_loss: 0.1212 189/500 [==========>...................] - ETA: 1:12 - loss: 1.0176 - regression_loss: 0.8964 - classification_loss: 0.1211 190/500 [==========>...................] - ETA: 1:12 - loss: 1.0167 - regression_loss: 0.8957 - classification_loss: 0.1210 191/500 [==========>...................] - ETA: 1:12 - loss: 1.0151 - regression_loss: 0.8941 - classification_loss: 0.1209 192/500 [==========>...................] - ETA: 1:12 - loss: 1.0146 - regression_loss: 0.8935 - classification_loss: 0.1212 193/500 [==========>...................] - ETA: 1:11 - loss: 1.0117 - regression_loss: 0.8910 - classification_loss: 0.1207 194/500 [==========>...................] - ETA: 1:11 - loss: 1.0130 - regression_loss: 0.8921 - classification_loss: 0.1209 195/500 [==========>...................] - ETA: 1:11 - loss: 1.0104 - regression_loss: 0.8898 - classification_loss: 0.1205 196/500 [==========>...................] 
- ETA: 1:11 - loss: 1.0123 - regression_loss: 0.8916 - classification_loss: 0.1207 197/500 [==========>...................] - ETA: 1:10 - loss: 1.0112 - regression_loss: 0.8906 - classification_loss: 0.1206 198/500 [==========>...................] - ETA: 1:10 - loss: 1.0115 - regression_loss: 0.8910 - classification_loss: 0.1205 199/500 [==========>...................] - ETA: 1:10 - loss: 1.0132 - regression_loss: 0.8927 - classification_loss: 0.1205 200/500 [===========>..................] - ETA: 1:10 - loss: 1.0173 - regression_loss: 0.8963 - classification_loss: 0.1210 201/500 [===========>..................] - ETA: 1:09 - loss: 1.0147 - regression_loss: 0.8940 - classification_loss: 0.1206 202/500 [===========>..................] - ETA: 1:09 - loss: 1.0139 - regression_loss: 0.8931 - classification_loss: 0.1208 203/500 [===========>..................] - ETA: 1:09 - loss: 1.0175 - regression_loss: 0.8967 - classification_loss: 0.1208 204/500 [===========>..................] - ETA: 1:09 - loss: 1.0167 - regression_loss: 0.8961 - classification_loss: 0.1206 205/500 [===========>..................] - ETA: 1:09 - loss: 1.0180 - regression_loss: 0.8972 - classification_loss: 0.1207 206/500 [===========>..................] - ETA: 1:08 - loss: 1.0153 - regression_loss: 0.8950 - classification_loss: 0.1203 207/500 [===========>..................] - ETA: 1:08 - loss: 1.0156 - regression_loss: 0.8954 - classification_loss: 0.1202 208/500 [===========>..................] - ETA: 1:08 - loss: 1.0173 - regression_loss: 0.8973 - classification_loss: 0.1200 209/500 [===========>..................] - ETA: 1:08 - loss: 1.0156 - regression_loss: 0.8957 - classification_loss: 0.1199 210/500 [===========>..................] - ETA: 1:07 - loss: 1.0188 - regression_loss: 0.8986 - classification_loss: 0.1202 211/500 [===========>..................] - ETA: 1:07 - loss: 1.0182 - regression_loss: 0.8982 - classification_loss: 0.1200 212/500 [===========>..................] 
- ETA: 1:07 - loss: 1.0213 - regression_loss: 0.9004 - classification_loss: 0.1208 213/500 [===========>..................] - ETA: 1:07 - loss: 1.0204 - regression_loss: 0.8998 - classification_loss: 0.1205 214/500 [===========>..................] - ETA: 1:07 - loss: 1.0203 - regression_loss: 0.8998 - classification_loss: 0.1205 215/500 [===========>..................] - ETA: 1:06 - loss: 1.0207 - regression_loss: 0.8999 - classification_loss: 0.1208 216/500 [===========>..................] - ETA: 1:06 - loss: 1.0209 - regression_loss: 0.9000 - classification_loss: 0.1209 217/500 [============>.................] - ETA: 1:06 - loss: 1.0185 - regression_loss: 0.8981 - classification_loss: 0.1204 218/500 [============>.................] - ETA: 1:06 - loss: 1.0177 - regression_loss: 0.8975 - classification_loss: 0.1202 219/500 [============>.................] - ETA: 1:05 - loss: 1.0188 - regression_loss: 0.8989 - classification_loss: 0.1200 220/500 [============>.................] - ETA: 1:05 - loss: 1.0164 - regression_loss: 0.8966 - classification_loss: 0.1198 221/500 [============>.................] - ETA: 1:05 - loss: 1.0169 - regression_loss: 0.8971 - classification_loss: 0.1198 222/500 [============>.................] - ETA: 1:05 - loss: 1.0162 - regression_loss: 0.8964 - classification_loss: 0.1198 223/500 [============>.................] - ETA: 1:04 - loss: 1.0155 - regression_loss: 0.8959 - classification_loss: 0.1196 224/500 [============>.................] - ETA: 1:04 - loss: 1.0170 - regression_loss: 0.8970 - classification_loss: 0.1200 225/500 [============>.................] - ETA: 1:04 - loss: 1.0287 - regression_loss: 0.9037 - classification_loss: 0.1250 226/500 [============>.................] - ETA: 1:04 - loss: 1.0289 - regression_loss: 0.9039 - classification_loss: 0.1250 227/500 [============>.................] - ETA: 1:04 - loss: 1.0278 - regression_loss: 0.9030 - classification_loss: 0.1248 228/500 [============>.................] 
- ETA: 1:03 - loss: 1.0297 - regression_loss: 0.9049 - classification_loss: 0.1248 229/500 [============>.................] - ETA: 1:03 - loss: 1.0296 - regression_loss: 0.9051 - classification_loss: 0.1245 230/500 [============>.................] - ETA: 1:03 - loss: 1.0345 - regression_loss: 0.9092 - classification_loss: 0.1252 231/500 [============>.................] - ETA: 1:03 - loss: 1.0343 - regression_loss: 0.9092 - classification_loss: 0.1251 232/500 [============>.................] - ETA: 1:02 - loss: 1.0334 - regression_loss: 0.9085 - classification_loss: 0.1249 233/500 [============>.................] - ETA: 1:02 - loss: 1.0319 - regression_loss: 0.9070 - classification_loss: 0.1249 234/500 [=============>................] - ETA: 1:02 - loss: 1.0295 - regression_loss: 0.9050 - classification_loss: 0.1245 235/500 [=============>................] - ETA: 1:02 - loss: 1.0284 - regression_loss: 0.9042 - classification_loss: 0.1242 236/500 [=============>................] - ETA: 1:01 - loss: 1.0306 - regression_loss: 0.9058 - classification_loss: 0.1248 237/500 [=============>................] - ETA: 1:01 - loss: 1.0292 - regression_loss: 0.9047 - classification_loss: 0.1245 238/500 [=============>................] - ETA: 1:01 - loss: 1.0333 - regression_loss: 0.9078 - classification_loss: 0.1254 239/500 [=============>................] - ETA: 1:01 - loss: 1.0319 - regression_loss: 0.9067 - classification_loss: 0.1252 240/500 [=============>................] - ETA: 1:01 - loss: 1.0333 - regression_loss: 0.9079 - classification_loss: 0.1254 241/500 [=============>................] - ETA: 1:00 - loss: 1.0310 - regression_loss: 0.9059 - classification_loss: 0.1252 242/500 [=============>................] - ETA: 1:00 - loss: 1.0323 - regression_loss: 0.9071 - classification_loss: 0.1251 243/500 [=============>................] - ETA: 1:00 - loss: 1.0322 - regression_loss: 0.9072 - classification_loss: 0.1249 244/500 [=============>................] 
- ETA: 1:00 - loss: 1.0330 - regression_loss: 0.9079 - classification_loss: 0.1251 245/500 [=============>................] - ETA: 59s - loss: 1.0324 - regression_loss: 0.9075 - classification_loss: 0.1250  246/500 [=============>................] - ETA: 59s - loss: 1.0308 - regression_loss: 0.9060 - classification_loss: 0.1248 247/500 [=============>................] - ETA: 59s - loss: 1.0336 - regression_loss: 0.9087 - classification_loss: 0.1249 248/500 [=============>................] - ETA: 59s - loss: 1.0337 - regression_loss: 0.9087 - classification_loss: 0.1249 249/500 [=============>................] - ETA: 58s - loss: 1.0326 - regression_loss: 0.9079 - classification_loss: 0.1247 250/500 [==============>...............] - ETA: 58s - loss: 1.0333 - regression_loss: 0.9084 - classification_loss: 0.1249 251/500 [==============>...............] - ETA: 58s - loss: 1.0325 - regression_loss: 0.9078 - classification_loss: 0.1247 252/500 [==============>...............] - ETA: 58s - loss: 1.0334 - regression_loss: 0.9085 - classification_loss: 0.1249 253/500 [==============>...............] - ETA: 58s - loss: 1.0368 - regression_loss: 0.9112 - classification_loss: 0.1256 254/500 [==============>...............] - ETA: 57s - loss: 1.0377 - regression_loss: 0.9119 - classification_loss: 0.1259 255/500 [==============>...............] - ETA: 57s - loss: 1.0391 - regression_loss: 0.9128 - classification_loss: 0.1263 256/500 [==============>...............] - ETA: 57s - loss: 1.0366 - regression_loss: 0.9107 - classification_loss: 0.1259 257/500 [==============>...............] - ETA: 57s - loss: 1.0361 - regression_loss: 0.9103 - classification_loss: 0.1257 258/500 [==============>...............] - ETA: 56s - loss: 1.0381 - regression_loss: 0.9121 - classification_loss: 0.1260 259/500 [==============>...............] - ETA: 56s - loss: 1.0375 - regression_loss: 0.9118 - classification_loss: 0.1257 260/500 [==============>...............] 
- ETA: 56s - loss: 1.0374 - regression_loss: 0.9119 - classification_loss: 0.1255 261/500 [==============>...............] - ETA: 56s - loss: 1.0361 - regression_loss: 0.9107 - classification_loss: 0.1253 262/500 [==============>...............] - ETA: 55s - loss: 1.0359 - regression_loss: 0.9104 - classification_loss: 0.1255 263/500 [==============>...............] - ETA: 55s - loss: 1.0342 - regression_loss: 0.9091 - classification_loss: 0.1251 264/500 [==============>...............] - ETA: 55s - loss: 1.0335 - regression_loss: 0.9086 - classification_loss: 0.1250 265/500 [==============>...............] - ETA: 55s - loss: 1.0316 - regression_loss: 0.9070 - classification_loss: 0.1245 266/500 [==============>...............] - ETA: 54s - loss: 1.0311 - regression_loss: 0.9066 - classification_loss: 0.1244 267/500 [===============>..............] - ETA: 54s - loss: 1.0310 - regression_loss: 0.9069 - classification_loss: 0.1241 268/500 [===============>..............] - ETA: 54s - loss: 1.0301 - regression_loss: 0.9063 - classification_loss: 0.1238 269/500 [===============>..............] - ETA: 54s - loss: 1.0280 - regression_loss: 0.9045 - classification_loss: 0.1235 270/500 [===============>..............] - ETA: 54s - loss: 1.0262 - regression_loss: 0.9028 - classification_loss: 0.1234 271/500 [===============>..............] - ETA: 53s - loss: 1.0255 - regression_loss: 0.9023 - classification_loss: 0.1232 272/500 [===============>..............] - ETA: 53s - loss: 1.0257 - regression_loss: 0.9026 - classification_loss: 0.1231 273/500 [===============>..............] - ETA: 53s - loss: 1.0261 - regression_loss: 0.9029 - classification_loss: 0.1231 274/500 [===============>..............] - ETA: 53s - loss: 1.0249 - regression_loss: 0.9021 - classification_loss: 0.1228 275/500 [===============>..............] - ETA: 52s - loss: 1.0233 - regression_loss: 0.9007 - classification_loss: 0.1226 276/500 [===============>..............] 
[training progress, epoch 28: batches 277-499/500; loss ~1.02-1.04, regression_loss ~0.90-0.91, classification_loss ~0.12]
500/500 [==============================] - 117s 235ms/step - loss: 1.0250 - regression_loss: 0.9050 - classification_loss: 0.1201
326 instances of class plum with average precision: 0.8458
mAP: 0.8458
Epoch 00028: saving model to ./training/snapshots/resnet50_pascal_28.h5
Epoch 29/150
[training progress, epoch 29: batches 1-110/500; loss ~0.97-1.13, regression_loss ~0.87-1.03, classification_loss ~0.08-0.12]
- ETA: 1:30 - loss: 0.9904 - regression_loss: 0.8796 - classification_loss: 0.1108 111/500 [=====>........................] - ETA: 1:30 - loss: 0.9875 - regression_loss: 0.8768 - classification_loss: 0.1107 112/500 [=====>........................] - ETA: 1:30 - loss: 0.9852 - regression_loss: 0.8750 - classification_loss: 0.1101 113/500 [=====>........................] - ETA: 1:30 - loss: 0.9846 - regression_loss: 0.8746 - classification_loss: 0.1100 114/500 [=====>........................] - ETA: 1:29 - loss: 0.9861 - regression_loss: 0.8757 - classification_loss: 0.1104 115/500 [=====>........................] - ETA: 1:29 - loss: 0.9852 - regression_loss: 0.8752 - classification_loss: 0.1100 116/500 [=====>........................] - ETA: 1:29 - loss: 0.9823 - regression_loss: 0.8728 - classification_loss: 0.1095 117/500 [======>.......................] - ETA: 1:29 - loss: 0.9817 - regression_loss: 0.8724 - classification_loss: 0.1093 118/500 [======>.......................] - ETA: 1:28 - loss: 0.9823 - regression_loss: 0.8732 - classification_loss: 0.1091 119/500 [======>.......................] - ETA: 1:28 - loss: 0.9843 - regression_loss: 0.8754 - classification_loss: 0.1089 120/500 [======>.......................] - ETA: 1:28 - loss: 0.9825 - regression_loss: 0.8739 - classification_loss: 0.1086 121/500 [======>.......................] - ETA: 1:28 - loss: 0.9778 - regression_loss: 0.8700 - classification_loss: 0.1078 122/500 [======>.......................] - ETA: 1:27 - loss: 0.9766 - regression_loss: 0.8693 - classification_loss: 0.1073 123/500 [======>.......................] - ETA: 1:27 - loss: 0.9781 - regression_loss: 0.8709 - classification_loss: 0.1073 124/500 [======>.......................] - ETA: 1:27 - loss: 0.9833 - regression_loss: 0.8749 - classification_loss: 0.1084 125/500 [======>.......................] - ETA: 1:27 - loss: 0.9826 - regression_loss: 0.8742 - classification_loss: 0.1084 126/500 [======>.......................] 
- ETA: 1:27 - loss: 0.9796 - regression_loss: 0.8717 - classification_loss: 0.1079 127/500 [======>.......................] - ETA: 1:26 - loss: 0.9829 - regression_loss: 0.8750 - classification_loss: 0.1079 128/500 [======>.......................] - ETA: 1:26 - loss: 0.9812 - regression_loss: 0.8737 - classification_loss: 0.1076 129/500 [======>.......................] - ETA: 1:26 - loss: 0.9823 - regression_loss: 0.8742 - classification_loss: 0.1081 130/500 [======>.......................] - ETA: 1:26 - loss: 0.9845 - regression_loss: 0.8760 - classification_loss: 0.1086 131/500 [======>.......................] - ETA: 1:25 - loss: 0.9904 - regression_loss: 0.8802 - classification_loss: 0.1102 132/500 [======>.......................] - ETA: 1:25 - loss: 0.9932 - regression_loss: 0.8831 - classification_loss: 0.1101 133/500 [======>.......................] - ETA: 1:25 - loss: 0.9892 - regression_loss: 0.8797 - classification_loss: 0.1095 134/500 [=======>......................] - ETA: 1:25 - loss: 0.9922 - regression_loss: 0.8827 - classification_loss: 0.1094 135/500 [=======>......................] - ETA: 1:24 - loss: 0.9908 - regression_loss: 0.8819 - classification_loss: 0.1090 136/500 [=======>......................] - ETA: 1:24 - loss: 0.9892 - regression_loss: 0.8807 - classification_loss: 0.1085 137/500 [=======>......................] - ETA: 1:24 - loss: 0.9858 - regression_loss: 0.8778 - classification_loss: 0.1081 138/500 [=======>......................] - ETA: 1:24 - loss: 0.9851 - regression_loss: 0.8771 - classification_loss: 0.1080 139/500 [=======>......................] - ETA: 1:24 - loss: 0.9836 - regression_loss: 0.8758 - classification_loss: 0.1078 140/500 [=======>......................] - ETA: 1:23 - loss: 0.9871 - regression_loss: 0.8783 - classification_loss: 0.1087 141/500 [=======>......................] - ETA: 1:23 - loss: 0.9892 - regression_loss: 0.8802 - classification_loss: 0.1090 142/500 [=======>......................] 
- ETA: 1:23 - loss: 0.9865 - regression_loss: 0.8780 - classification_loss: 0.1085 143/500 [=======>......................] - ETA: 1:23 - loss: 0.9883 - regression_loss: 0.8792 - classification_loss: 0.1090 144/500 [=======>......................] - ETA: 1:22 - loss: 0.9966 - regression_loss: 0.8863 - classification_loss: 0.1103 145/500 [=======>......................] - ETA: 1:22 - loss: 0.9991 - regression_loss: 0.8883 - classification_loss: 0.1108 146/500 [=======>......................] - ETA: 1:22 - loss: 1.0004 - regression_loss: 0.8894 - classification_loss: 0.1110 147/500 [=======>......................] - ETA: 1:22 - loss: 1.0014 - regression_loss: 0.8906 - classification_loss: 0.1109 148/500 [=======>......................] - ETA: 1:22 - loss: 1.0017 - regression_loss: 0.8910 - classification_loss: 0.1107 149/500 [=======>......................] - ETA: 1:21 - loss: 1.0008 - regression_loss: 0.8903 - classification_loss: 0.1105 150/500 [========>.....................] - ETA: 1:21 - loss: 1.0069 - regression_loss: 0.8961 - classification_loss: 0.1109 151/500 [========>.....................] - ETA: 1:21 - loss: 1.0037 - regression_loss: 0.8934 - classification_loss: 0.1103 152/500 [========>.....................] - ETA: 1:21 - loss: 0.9994 - regression_loss: 0.8897 - classification_loss: 0.1098 153/500 [========>.....................] - ETA: 1:20 - loss: 1.0001 - regression_loss: 0.8905 - classification_loss: 0.1095 154/500 [========>.....................] - ETA: 1:20 - loss: 0.9972 - regression_loss: 0.8881 - classification_loss: 0.1090 155/500 [========>.....................] - ETA: 1:20 - loss: 0.9963 - regression_loss: 0.8875 - classification_loss: 0.1088 156/500 [========>.....................] - ETA: 1:20 - loss: 0.9935 - regression_loss: 0.8848 - classification_loss: 0.1086 157/500 [========>.....................] - ETA: 1:20 - loss: 0.9970 - regression_loss: 0.8876 - classification_loss: 0.1094 158/500 [========>.....................] 
- ETA: 1:19 - loss: 0.9942 - regression_loss: 0.8851 - classification_loss: 0.1091 159/500 [========>.....................] - ETA: 1:19 - loss: 0.9917 - regression_loss: 0.8820 - classification_loss: 0.1097 160/500 [========>.....................] - ETA: 1:19 - loss: 0.9914 - regression_loss: 0.8817 - classification_loss: 0.1097 161/500 [========>.....................] - ETA: 1:19 - loss: 0.9906 - regression_loss: 0.8810 - classification_loss: 0.1095 162/500 [========>.....................] - ETA: 1:18 - loss: 0.9899 - regression_loss: 0.8806 - classification_loss: 0.1092 163/500 [========>.....................] - ETA: 1:18 - loss: 0.9889 - regression_loss: 0.8798 - classification_loss: 0.1090 164/500 [========>.....................] - ETA: 1:18 - loss: 0.9871 - regression_loss: 0.8785 - classification_loss: 0.1086 165/500 [========>.....................] - ETA: 1:18 - loss: 0.9866 - regression_loss: 0.8784 - classification_loss: 0.1083 166/500 [========>.....................] - ETA: 1:18 - loss: 0.9858 - regression_loss: 0.8779 - classification_loss: 0.1079 167/500 [=========>....................] - ETA: 1:17 - loss: 0.9810 - regression_loss: 0.8736 - classification_loss: 0.1074 168/500 [=========>....................] - ETA: 1:17 - loss: 0.9805 - regression_loss: 0.8732 - classification_loss: 0.1073 169/500 [=========>....................] - ETA: 1:17 - loss: 0.9819 - regression_loss: 0.8742 - classification_loss: 0.1078 170/500 [=========>....................] - ETA: 1:17 - loss: 0.9807 - regression_loss: 0.8732 - classification_loss: 0.1075 171/500 [=========>....................] - ETA: 1:16 - loss: 0.9790 - regression_loss: 0.8719 - classification_loss: 0.1071 172/500 [=========>....................] - ETA: 1:16 - loss: 0.9793 - regression_loss: 0.8719 - classification_loss: 0.1073 173/500 [=========>....................] - ETA: 1:16 - loss: 0.9772 - regression_loss: 0.8702 - classification_loss: 0.1069 174/500 [=========>....................] 
- ETA: 1:16 - loss: 0.9764 - regression_loss: 0.8698 - classification_loss: 0.1066 175/500 [=========>....................] - ETA: 1:15 - loss: 0.9783 - regression_loss: 0.8716 - classification_loss: 0.1067 176/500 [=========>....................] - ETA: 1:15 - loss: 0.9761 - regression_loss: 0.8697 - classification_loss: 0.1064 177/500 [=========>....................] - ETA: 1:15 - loss: 0.9754 - regression_loss: 0.8692 - classification_loss: 0.1062 178/500 [=========>....................] - ETA: 1:15 - loss: 0.9741 - regression_loss: 0.8684 - classification_loss: 0.1057 179/500 [=========>....................] - ETA: 1:14 - loss: 0.9703 - regression_loss: 0.8650 - classification_loss: 0.1053 180/500 [=========>....................] - ETA: 1:14 - loss: 0.9688 - regression_loss: 0.8638 - classification_loss: 0.1050 181/500 [=========>....................] - ETA: 1:14 - loss: 0.9696 - regression_loss: 0.8646 - classification_loss: 0.1050 182/500 [=========>....................] - ETA: 1:14 - loss: 0.9699 - regression_loss: 0.8649 - classification_loss: 0.1051 183/500 [=========>....................] - ETA: 1:14 - loss: 0.9698 - regression_loss: 0.8647 - classification_loss: 0.1051 184/500 [==========>...................] - ETA: 1:13 - loss: 0.9683 - regression_loss: 0.8633 - classification_loss: 0.1051 185/500 [==========>...................] - ETA: 1:13 - loss: 0.9728 - regression_loss: 0.8674 - classification_loss: 0.1053 186/500 [==========>...................] - ETA: 1:13 - loss: 0.9736 - regression_loss: 0.8683 - classification_loss: 0.1053 187/500 [==========>...................] - ETA: 1:13 - loss: 0.9750 - regression_loss: 0.8695 - classification_loss: 0.1054 188/500 [==========>...................] - ETA: 1:12 - loss: 0.9766 - regression_loss: 0.8708 - classification_loss: 0.1058 189/500 [==========>...................] - ETA: 1:12 - loss: 0.9760 - regression_loss: 0.8702 - classification_loss: 0.1058 190/500 [==========>...................] 
- ETA: 1:12 - loss: 0.9760 - regression_loss: 0.8697 - classification_loss: 0.1063 191/500 [==========>...................] - ETA: 1:12 - loss: 0.9771 - regression_loss: 0.8709 - classification_loss: 0.1062 192/500 [==========>...................] - ETA: 1:12 - loss: 0.9789 - regression_loss: 0.8724 - classification_loss: 0.1065 193/500 [==========>...................] - ETA: 1:11 - loss: 0.9798 - regression_loss: 0.8735 - classification_loss: 0.1063 194/500 [==========>...................] - ETA: 1:11 - loss: 0.9786 - regression_loss: 0.8724 - classification_loss: 0.1062 195/500 [==========>...................] - ETA: 1:11 - loss: 0.9786 - regression_loss: 0.8726 - classification_loss: 0.1061 196/500 [==========>...................] - ETA: 1:10 - loss: 0.9775 - regression_loss: 0.8716 - classification_loss: 0.1059 197/500 [==========>...................] - ETA: 1:10 - loss: 0.9779 - regression_loss: 0.8719 - classification_loss: 0.1060 198/500 [==========>...................] - ETA: 1:10 - loss: 0.9788 - regression_loss: 0.8725 - classification_loss: 0.1063 199/500 [==========>...................] - ETA: 1:10 - loss: 0.9762 - regression_loss: 0.8701 - classification_loss: 0.1061 200/500 [===========>..................] - ETA: 1:10 - loss: 0.9793 - regression_loss: 0.8733 - classification_loss: 0.1060 201/500 [===========>..................] - ETA: 1:09 - loss: 0.9797 - regression_loss: 0.8736 - classification_loss: 0.1062 202/500 [===========>..................] - ETA: 1:09 - loss: 0.9811 - regression_loss: 0.8746 - classification_loss: 0.1065 203/500 [===========>..................] - ETA: 1:09 - loss: 0.9816 - regression_loss: 0.8749 - classification_loss: 0.1067 204/500 [===========>..................] - ETA: 1:09 - loss: 0.9835 - regression_loss: 0.8765 - classification_loss: 0.1070 205/500 [===========>..................] - ETA: 1:08 - loss: 0.9817 - regression_loss: 0.8751 - classification_loss: 0.1066 206/500 [===========>..................] 
- ETA: 1:08 - loss: 0.9799 - regression_loss: 0.8735 - classification_loss: 0.1064 207/500 [===========>..................] - ETA: 1:08 - loss: 0.9805 - regression_loss: 0.8740 - classification_loss: 0.1064 208/500 [===========>..................] - ETA: 1:08 - loss: 0.9782 - regression_loss: 0.8719 - classification_loss: 0.1063 209/500 [===========>..................] - ETA: 1:07 - loss: 0.9774 - regression_loss: 0.8712 - classification_loss: 0.1062 210/500 [===========>..................] - ETA: 1:07 - loss: 0.9747 - regression_loss: 0.8689 - classification_loss: 0.1058 211/500 [===========>..................] - ETA: 1:07 - loss: 0.9732 - regression_loss: 0.8676 - classification_loss: 0.1057 212/500 [===========>..................] - ETA: 1:07 - loss: 0.9773 - regression_loss: 0.8712 - classification_loss: 0.1062 213/500 [===========>..................] - ETA: 1:07 - loss: 0.9750 - regression_loss: 0.8692 - classification_loss: 0.1058 214/500 [===========>..................] - ETA: 1:06 - loss: 0.9793 - regression_loss: 0.8725 - classification_loss: 0.1068 215/500 [===========>..................] - ETA: 1:06 - loss: 0.9922 - regression_loss: 0.8776 - classification_loss: 0.1146 216/500 [===========>..................] - ETA: 1:06 - loss: 0.9909 - regression_loss: 0.8765 - classification_loss: 0.1144 217/500 [============>.................] - ETA: 1:06 - loss: 0.9947 - regression_loss: 0.8799 - classification_loss: 0.1148 218/500 [============>.................] - ETA: 1:05 - loss: 0.9926 - regression_loss: 0.8781 - classification_loss: 0.1145 219/500 [============>.................] - ETA: 1:05 - loss: 0.9914 - regression_loss: 0.8770 - classification_loss: 0.1144 220/500 [============>.................] - ETA: 1:05 - loss: 0.9901 - regression_loss: 0.8757 - classification_loss: 0.1143 221/500 [============>.................] - ETA: 1:05 - loss: 0.9895 - regression_loss: 0.8754 - classification_loss: 0.1141 222/500 [============>.................] 
- ETA: 1:04 - loss: 0.9908 - regression_loss: 0.8769 - classification_loss: 0.1138 223/500 [============>.................] - ETA: 1:04 - loss: 0.9895 - regression_loss: 0.8759 - classification_loss: 0.1136 224/500 [============>.................] - ETA: 1:04 - loss: 0.9904 - regression_loss: 0.8768 - classification_loss: 0.1137 225/500 [============>.................] - ETA: 1:04 - loss: 0.9889 - regression_loss: 0.8755 - classification_loss: 0.1134 226/500 [============>.................] - ETA: 1:03 - loss: 0.9881 - regression_loss: 0.8748 - classification_loss: 0.1133 227/500 [============>.................] - ETA: 1:03 - loss: 0.9871 - regression_loss: 0.8741 - classification_loss: 0.1130 228/500 [============>.................] - ETA: 1:03 - loss: 0.9860 - regression_loss: 0.8731 - classification_loss: 0.1129 229/500 [============>.................] - ETA: 1:03 - loss: 0.9878 - regression_loss: 0.8745 - classification_loss: 0.1133 230/500 [============>.................] - ETA: 1:02 - loss: 0.9874 - regression_loss: 0.8738 - classification_loss: 0.1136 231/500 [============>.................] - ETA: 1:02 - loss: 0.9886 - regression_loss: 0.8749 - classification_loss: 0.1137 232/500 [============>.................] - ETA: 1:02 - loss: 0.9865 - regression_loss: 0.8732 - classification_loss: 0.1133 233/500 [============>.................] - ETA: 1:02 - loss: 0.9888 - regression_loss: 0.8753 - classification_loss: 0.1135 234/500 [=============>................] - ETA: 1:01 - loss: 0.9890 - regression_loss: 0.8755 - classification_loss: 0.1135 235/500 [=============>................] - ETA: 1:01 - loss: 0.9908 - regression_loss: 0.8767 - classification_loss: 0.1141 236/500 [=============>................] - ETA: 1:01 - loss: 0.9901 - regression_loss: 0.8762 - classification_loss: 0.1140 237/500 [=============>................] - ETA: 1:01 - loss: 0.9899 - regression_loss: 0.8763 - classification_loss: 0.1136 238/500 [=============>................] 
- ETA: 1:01 - loss: 0.9896 - regression_loss: 0.8759 - classification_loss: 0.1137 239/500 [=============>................] - ETA: 1:00 - loss: 0.9912 - regression_loss: 0.8772 - classification_loss: 0.1140 240/500 [=============>................] - ETA: 1:00 - loss: 0.9887 - regression_loss: 0.8751 - classification_loss: 0.1136 241/500 [=============>................] - ETA: 1:00 - loss: 0.9890 - regression_loss: 0.8755 - classification_loss: 0.1135 242/500 [=============>................] - ETA: 1:00 - loss: 0.9890 - regression_loss: 0.8756 - classification_loss: 0.1134 243/500 [=============>................] - ETA: 59s - loss: 0.9885 - regression_loss: 0.8755 - classification_loss: 0.1130  244/500 [=============>................] - ETA: 59s - loss: 0.9877 - regression_loss: 0.8746 - classification_loss: 0.1131 245/500 [=============>................] - ETA: 59s - loss: 0.9879 - regression_loss: 0.8747 - classification_loss: 0.1132 246/500 [=============>................] - ETA: 59s - loss: 0.9928 - regression_loss: 0.8790 - classification_loss: 0.1138 247/500 [=============>................] - ETA: 58s - loss: 0.9937 - regression_loss: 0.8798 - classification_loss: 0.1139 248/500 [=============>................] - ETA: 58s - loss: 0.9939 - regression_loss: 0.8801 - classification_loss: 0.1137 249/500 [=============>................] - ETA: 58s - loss: 0.9959 - regression_loss: 0.8819 - classification_loss: 0.1140 250/500 [==============>...............] - ETA: 58s - loss: 0.9951 - regression_loss: 0.8810 - classification_loss: 0.1141 251/500 [==============>...............] - ETA: 57s - loss: 0.9947 - regression_loss: 0.8806 - classification_loss: 0.1141 252/500 [==============>...............] - ETA: 57s - loss: 0.9939 - regression_loss: 0.8801 - classification_loss: 0.1139 253/500 [==============>...............] - ETA: 57s - loss: 0.9970 - regression_loss: 0.8831 - classification_loss: 0.1139 254/500 [==============>...............] 
- ETA: 57s - loss: 0.9974 - regression_loss: 0.8837 - classification_loss: 0.1137 255/500 [==============>...............] - ETA: 57s - loss: 0.9979 - regression_loss: 0.8840 - classification_loss: 0.1139 256/500 [==============>...............] - ETA: 56s - loss: 0.9983 - regression_loss: 0.8845 - classification_loss: 0.1138 257/500 [==============>...............] - ETA: 56s - loss: 0.9980 - regression_loss: 0.8844 - classification_loss: 0.1136 258/500 [==============>...............] - ETA: 56s - loss: 0.9968 - regression_loss: 0.8832 - classification_loss: 0.1136 259/500 [==============>...............] - ETA: 56s - loss: 0.9971 - regression_loss: 0.8835 - classification_loss: 0.1136 260/500 [==============>...............] - ETA: 55s - loss: 0.9943 - regression_loss: 0.8811 - classification_loss: 0.1132 261/500 [==============>...............] - ETA: 55s - loss: 0.9978 - regression_loss: 0.8839 - classification_loss: 0.1139 262/500 [==============>...............] - ETA: 55s - loss: 0.9975 - regression_loss: 0.8838 - classification_loss: 0.1137 263/500 [==============>...............] - ETA: 55s - loss: 0.9979 - regression_loss: 0.8842 - classification_loss: 0.1137 264/500 [==============>...............] - ETA: 54s - loss: 0.9984 - regression_loss: 0.8848 - classification_loss: 0.1137 265/500 [==============>...............] - ETA: 54s - loss: 1.0018 - regression_loss: 0.8877 - classification_loss: 0.1141 266/500 [==============>...............] - ETA: 54s - loss: 1.0048 - regression_loss: 0.8899 - classification_loss: 0.1149 267/500 [===============>..............] - ETA: 54s - loss: 1.0046 - regression_loss: 0.8899 - classification_loss: 0.1147 268/500 [===============>..............] - ETA: 54s - loss: 1.0025 - regression_loss: 0.8882 - classification_loss: 0.1144 269/500 [===============>..............] - ETA: 53s - loss: 1.0027 - regression_loss: 0.8883 - classification_loss: 0.1143 270/500 [===============>..............] 
- ETA: 53s - loss: 1.0036 - regression_loss: 0.8891 - classification_loss: 0.1144 271/500 [===============>..............] - ETA: 53s - loss: 1.0017 - regression_loss: 0.8875 - classification_loss: 0.1142 272/500 [===============>..............] - ETA: 53s - loss: 1.0021 - regression_loss: 0.8880 - classification_loss: 0.1141 273/500 [===============>..............] - ETA: 52s - loss: 1.0028 - regression_loss: 0.8886 - classification_loss: 0.1143 274/500 [===============>..............] - ETA: 52s - loss: 1.0042 - regression_loss: 0.8894 - classification_loss: 0.1147 275/500 [===============>..............] - ETA: 52s - loss: 1.0054 - regression_loss: 0.8905 - classification_loss: 0.1149 276/500 [===============>..............] - ETA: 52s - loss: 1.0045 - regression_loss: 0.8896 - classification_loss: 0.1148 277/500 [===============>..............] - ETA: 51s - loss: 1.0040 - regression_loss: 0.8893 - classification_loss: 0.1147 278/500 [===============>..............] - ETA: 51s - loss: 1.0037 - regression_loss: 0.8892 - classification_loss: 0.1145 279/500 [===============>..............] - ETA: 51s - loss: 1.0035 - regression_loss: 0.8892 - classification_loss: 0.1142 280/500 [===============>..............] - ETA: 51s - loss: 1.0021 - regression_loss: 0.8881 - classification_loss: 0.1139 281/500 [===============>..............] - ETA: 51s - loss: 1.0015 - regression_loss: 0.8876 - classification_loss: 0.1138 282/500 [===============>..............] - ETA: 50s - loss: 1.0016 - regression_loss: 0.8877 - classification_loss: 0.1139 283/500 [===============>..............] - ETA: 50s - loss: 1.0055 - regression_loss: 0.8910 - classification_loss: 0.1146 284/500 [================>.............] - ETA: 50s - loss: 1.0052 - regression_loss: 0.8907 - classification_loss: 0.1145 285/500 [================>.............] - ETA: 50s - loss: 1.0046 - regression_loss: 0.8903 - classification_loss: 0.1143 286/500 [================>.............] 
- ETA: 49s - loss: 1.0049 - regression_loss: 0.8906 - classification_loss: 0.1143 287/500 [================>.............] - ETA: 49s - loss: 1.0029 - regression_loss: 0.8889 - classification_loss: 0.1140 288/500 [================>.............] - ETA: 49s - loss: 1.0036 - regression_loss: 0.8894 - classification_loss: 0.1142 289/500 [================>.............] - ETA: 49s - loss: 1.0039 - regression_loss: 0.8894 - classification_loss: 0.1144 290/500 [================>.............] - ETA: 48s - loss: 1.0046 - regression_loss: 0.8899 - classification_loss: 0.1147 291/500 [================>.............] - ETA: 48s - loss: 1.0030 - regression_loss: 0.8885 - classification_loss: 0.1144 292/500 [================>.............] - ETA: 48s - loss: 1.0024 - regression_loss: 0.8881 - classification_loss: 0.1143 293/500 [================>.............] - ETA: 48s - loss: 1.0030 - regression_loss: 0.8887 - classification_loss: 0.1143 294/500 [================>.............] - ETA: 48s - loss: 1.0025 - regression_loss: 0.8882 - classification_loss: 0.1143 295/500 [================>.............] - ETA: 47s - loss: 1.0002 - regression_loss: 0.8863 - classification_loss: 0.1139 296/500 [================>.............] - ETA: 47s - loss: 1.0018 - regression_loss: 0.8877 - classification_loss: 0.1142 297/500 [================>.............] - ETA: 47s - loss: 1.0026 - regression_loss: 0.8883 - classification_loss: 0.1142 298/500 [================>.............] - ETA: 47s - loss: 1.0059 - regression_loss: 0.8912 - classification_loss: 0.1146 299/500 [================>.............] - ETA: 46s - loss: 1.0055 - regression_loss: 0.8910 - classification_loss: 0.1145 300/500 [=================>............] - ETA: 46s - loss: 1.0048 - regression_loss: 0.8904 - classification_loss: 0.1144 301/500 [=================>............] - ETA: 46s - loss: 1.0027 - regression_loss: 0.8886 - classification_loss: 0.1141 302/500 [=================>............] 
- ETA: 46s - loss: 1.0022 - regression_loss: 0.8881 - classification_loss: 0.1141 303/500 [=================>............] - ETA: 45s - loss: 1.0052 - regression_loss: 0.8908 - classification_loss: 0.1144 304/500 [=================>............] - ETA: 45s - loss: 1.0062 - regression_loss: 0.8919 - classification_loss: 0.1143 305/500 [=================>............] - ETA: 45s - loss: 1.0064 - regression_loss: 0.8920 - classification_loss: 0.1144 306/500 [=================>............] - ETA: 45s - loss: 1.0070 - regression_loss: 0.8928 - classification_loss: 0.1142 307/500 [=================>............] - ETA: 45s - loss: 1.0068 - regression_loss: 0.8927 - classification_loss: 0.1141 308/500 [=================>............] - ETA: 44s - loss: 1.0067 - regression_loss: 0.8928 - classification_loss: 0.1139 309/500 [=================>............] - ETA: 44s - loss: 1.0077 - regression_loss: 0.8936 - classification_loss: 0.1140 310/500 [=================>............] - ETA: 44s - loss: 1.0086 - regression_loss: 0.8945 - classification_loss: 0.1140 311/500 [=================>............] - ETA: 44s - loss: 1.0085 - regression_loss: 0.8944 - classification_loss: 0.1140 312/500 [=================>............] - ETA: 43s - loss: 1.0086 - regression_loss: 0.8946 - classification_loss: 0.1140 313/500 [=================>............] - ETA: 43s - loss: 1.0066 - regression_loss: 0.8930 - classification_loss: 0.1136 314/500 [=================>............] - ETA: 43s - loss: 1.0074 - regression_loss: 0.8936 - classification_loss: 0.1138 315/500 [=================>............] - ETA: 43s - loss: 1.0061 - regression_loss: 0.8926 - classification_loss: 0.1136 316/500 [=================>............] - ETA: 42s - loss: 1.0108 - regression_loss: 0.8964 - classification_loss: 0.1144 317/500 [==================>...........] - ETA: 42s - loss: 1.0112 - regression_loss: 0.8968 - classification_loss: 0.1144 318/500 [==================>...........] 
[... per-batch progress updates for epoch 29 omitted ...]
500/500 [==============================] - 117s 234ms/step - loss: 1.0138 - regression_loss: 0.8981 - classification_loss: 0.1157
326 instances of class plum with average precision: 0.8307
mAP: 0.8307
Epoch 00029: saving model to ./training/snapshots/resnet50_pascal_29.h5
Epoch 30/150
[... per-batch progress updates for epoch 30 omitted ...]
- ETA: 1:20 - loss: 1.0463 - regression_loss: 0.9272 - classification_loss: 0.1191 154/500 [========>.....................] - ETA: 1:20 - loss: 1.0462 - regression_loss: 0.9269 - classification_loss: 0.1193 155/500 [========>.....................] - ETA: 1:20 - loss: 1.0463 - regression_loss: 0.9272 - classification_loss: 0.1190 156/500 [========>.....................] - ETA: 1:20 - loss: 1.0462 - regression_loss: 0.9271 - classification_loss: 0.1191 157/500 [========>.....................] - ETA: 1:20 - loss: 1.0477 - regression_loss: 0.9282 - classification_loss: 0.1195 158/500 [========>.....................] - ETA: 1:19 - loss: 1.0503 - regression_loss: 0.9307 - classification_loss: 0.1196 159/500 [========>.....................] - ETA: 1:19 - loss: 1.0507 - regression_loss: 0.9312 - classification_loss: 0.1196 160/500 [========>.....................] - ETA: 1:19 - loss: 1.0528 - regression_loss: 0.9331 - classification_loss: 0.1197 161/500 [========>.....................] - ETA: 1:19 - loss: 1.0497 - regression_loss: 0.9305 - classification_loss: 0.1192 162/500 [========>.....................] - ETA: 1:18 - loss: 1.0480 - regression_loss: 0.9291 - classification_loss: 0.1189 163/500 [========>.....................] - ETA: 1:18 - loss: 1.0507 - regression_loss: 0.9311 - classification_loss: 0.1196 164/500 [========>.....................] - ETA: 1:18 - loss: 1.0523 - regression_loss: 0.9326 - classification_loss: 0.1197 165/500 [========>.....................] - ETA: 1:18 - loss: 1.0520 - regression_loss: 0.9327 - classification_loss: 0.1193 166/500 [========>.....................] - ETA: 1:18 - loss: 1.0508 - regression_loss: 0.9319 - classification_loss: 0.1189 167/500 [=========>....................] - ETA: 1:17 - loss: 1.0495 - regression_loss: 0.9308 - classification_loss: 0.1187 168/500 [=========>....................] - ETA: 1:17 - loss: 1.0502 - regression_loss: 0.9313 - classification_loss: 0.1189 169/500 [=========>....................] 
- ETA: 1:17 - loss: 1.0525 - regression_loss: 0.9329 - classification_loss: 0.1197 170/500 [=========>....................] - ETA: 1:17 - loss: 1.0509 - regression_loss: 0.9316 - classification_loss: 0.1193 171/500 [=========>....................] - ETA: 1:17 - loss: 1.0475 - regression_loss: 0.9287 - classification_loss: 0.1188 172/500 [=========>....................] - ETA: 1:16 - loss: 1.0472 - regression_loss: 0.9284 - classification_loss: 0.1188 173/500 [=========>....................] - ETA: 1:16 - loss: 1.0454 - regression_loss: 0.9271 - classification_loss: 0.1183 174/500 [=========>....................] - ETA: 1:16 - loss: 1.0501 - regression_loss: 0.9316 - classification_loss: 0.1185 175/500 [=========>....................] - ETA: 1:16 - loss: 1.0569 - regression_loss: 0.9370 - classification_loss: 0.1199 176/500 [=========>....................] - ETA: 1:15 - loss: 1.0528 - regression_loss: 0.9334 - classification_loss: 0.1194 177/500 [=========>....................] - ETA: 1:15 - loss: 1.0508 - regression_loss: 0.9316 - classification_loss: 0.1192 178/500 [=========>....................] - ETA: 1:15 - loss: 1.0519 - regression_loss: 0.9326 - classification_loss: 0.1193 179/500 [=========>....................] - ETA: 1:15 - loss: 1.0499 - regression_loss: 0.9309 - classification_loss: 0.1190 180/500 [=========>....................] - ETA: 1:14 - loss: 1.0485 - regression_loss: 0.9298 - classification_loss: 0.1187 181/500 [=========>....................] - ETA: 1:14 - loss: 1.0482 - regression_loss: 0.9297 - classification_loss: 0.1185 182/500 [=========>....................] - ETA: 1:14 - loss: 1.0487 - regression_loss: 0.9303 - classification_loss: 0.1184 183/500 [=========>....................] - ETA: 1:14 - loss: 1.0460 - regression_loss: 0.9280 - classification_loss: 0.1179 184/500 [==========>...................] - ETA: 1:13 - loss: 1.0435 - regression_loss: 0.9261 - classification_loss: 0.1174 185/500 [==========>...................] 
- ETA: 1:13 - loss: 1.0443 - regression_loss: 0.9261 - classification_loss: 0.1181 186/500 [==========>...................] - ETA: 1:13 - loss: 1.0418 - regression_loss: 0.9241 - classification_loss: 0.1177 187/500 [==========>...................] - ETA: 1:13 - loss: 1.0421 - regression_loss: 0.9244 - classification_loss: 0.1176 188/500 [==========>...................] - ETA: 1:13 - loss: 1.0535 - regression_loss: 0.9307 - classification_loss: 0.1228 189/500 [==========>...................] - ETA: 1:12 - loss: 1.0505 - regression_loss: 0.9282 - classification_loss: 0.1223 190/500 [==========>...................] - ETA: 1:12 - loss: 1.0475 - regression_loss: 0.9256 - classification_loss: 0.1218 191/500 [==========>...................] - ETA: 1:12 - loss: 1.0498 - regression_loss: 0.9276 - classification_loss: 0.1222 192/500 [==========>...................] - ETA: 1:12 - loss: 1.0474 - regression_loss: 0.9256 - classification_loss: 0.1218 193/500 [==========>...................] - ETA: 1:11 - loss: 1.0462 - regression_loss: 0.9245 - classification_loss: 0.1216 194/500 [==========>...................] - ETA: 1:11 - loss: 1.0435 - regression_loss: 0.9221 - classification_loss: 0.1214 195/500 [==========>...................] - ETA: 1:11 - loss: 1.0450 - regression_loss: 0.9235 - classification_loss: 0.1215 196/500 [==========>...................] - ETA: 1:11 - loss: 1.0465 - regression_loss: 0.9250 - classification_loss: 0.1215 197/500 [==========>...................] - ETA: 1:11 - loss: 1.0481 - regression_loss: 0.9262 - classification_loss: 0.1219 198/500 [==========>...................] - ETA: 1:10 - loss: 1.0505 - regression_loss: 0.9289 - classification_loss: 0.1217 199/500 [==========>...................] - ETA: 1:10 - loss: 1.0495 - regression_loss: 0.9281 - classification_loss: 0.1214 200/500 [===========>..................] - ETA: 1:10 - loss: 1.0491 - regression_loss: 0.9278 - classification_loss: 0.1213 201/500 [===========>..................] 
- ETA: 1:10 - loss: 1.0464 - regression_loss: 0.9253 - classification_loss: 0.1211 202/500 [===========>..................] - ETA: 1:09 - loss: 1.0439 - regression_loss: 0.9232 - classification_loss: 0.1207 203/500 [===========>..................] - ETA: 1:09 - loss: 1.0455 - regression_loss: 0.9243 - classification_loss: 0.1212 204/500 [===========>..................] - ETA: 1:09 - loss: 1.0434 - regression_loss: 0.9225 - classification_loss: 0.1209 205/500 [===========>..................] - ETA: 1:09 - loss: 1.0433 - regression_loss: 0.9226 - classification_loss: 0.1207 206/500 [===========>..................] - ETA: 1:08 - loss: 1.0422 - regression_loss: 0.9216 - classification_loss: 0.1206 207/500 [===========>..................] - ETA: 1:08 - loss: 1.0423 - regression_loss: 0.9217 - classification_loss: 0.1206 208/500 [===========>..................] - ETA: 1:08 - loss: 1.0395 - regression_loss: 0.9193 - classification_loss: 0.1202 209/500 [===========>..................] - ETA: 1:08 - loss: 1.0419 - regression_loss: 0.9214 - classification_loss: 0.1205 210/500 [===========>..................] - ETA: 1:08 - loss: 1.0394 - regression_loss: 0.9194 - classification_loss: 0.1200 211/500 [===========>..................] - ETA: 1:07 - loss: 1.0414 - regression_loss: 0.9211 - classification_loss: 0.1203 212/500 [===========>..................] - ETA: 1:07 - loss: 1.0431 - regression_loss: 0.9225 - classification_loss: 0.1206 213/500 [===========>..................] - ETA: 1:07 - loss: 1.0406 - regression_loss: 0.9204 - classification_loss: 0.1203 214/500 [===========>..................] - ETA: 1:07 - loss: 1.0401 - regression_loss: 0.9199 - classification_loss: 0.1202 215/500 [===========>..................] - ETA: 1:06 - loss: 1.0375 - regression_loss: 0.9173 - classification_loss: 0.1202 216/500 [===========>..................] - ETA: 1:06 - loss: 1.0377 - regression_loss: 0.9174 - classification_loss: 0.1203 217/500 [============>.................] 
- ETA: 1:06 - loss: 1.0364 - regression_loss: 0.9164 - classification_loss: 0.1200 218/500 [============>.................] - ETA: 1:06 - loss: 1.0410 - regression_loss: 0.9206 - classification_loss: 0.1204 219/500 [============>.................] - ETA: 1:05 - loss: 1.0421 - regression_loss: 0.9217 - classification_loss: 0.1204 220/500 [============>.................] - ETA: 1:05 - loss: 1.0460 - regression_loss: 0.9257 - classification_loss: 0.1203 221/500 [============>.................] - ETA: 1:05 - loss: 1.0477 - regression_loss: 0.9270 - classification_loss: 0.1207 222/500 [============>.................] - ETA: 1:05 - loss: 1.0476 - regression_loss: 0.9270 - classification_loss: 0.1206 223/500 [============>.................] - ETA: 1:05 - loss: 1.0487 - regression_loss: 0.9280 - classification_loss: 0.1207 224/500 [============>.................] - ETA: 1:04 - loss: 1.0486 - regression_loss: 0.9280 - classification_loss: 0.1206 225/500 [============>.................] - ETA: 1:04 - loss: 1.0499 - regression_loss: 0.9292 - classification_loss: 0.1208 226/500 [============>.................] - ETA: 1:04 - loss: 1.0502 - regression_loss: 0.9294 - classification_loss: 0.1208 227/500 [============>.................] - ETA: 1:04 - loss: 1.0496 - regression_loss: 0.9287 - classification_loss: 0.1209 228/500 [============>.................] - ETA: 1:03 - loss: 1.0492 - regression_loss: 0.9285 - classification_loss: 0.1207 229/500 [============>.................] - ETA: 1:03 - loss: 1.0519 - regression_loss: 0.9305 - classification_loss: 0.1214 230/500 [============>.................] - ETA: 1:03 - loss: 1.0513 - regression_loss: 0.9301 - classification_loss: 0.1212 231/500 [============>.................] - ETA: 1:03 - loss: 1.0513 - regression_loss: 0.9302 - classification_loss: 0.1211 232/500 [============>.................] - ETA: 1:02 - loss: 1.0539 - regression_loss: 0.9330 - classification_loss: 0.1209 233/500 [============>.................] 
- ETA: 1:02 - loss: 1.0541 - regression_loss: 0.9334 - classification_loss: 0.1207 234/500 [=============>................] - ETA: 1:02 - loss: 1.0520 - regression_loss: 0.9314 - classification_loss: 0.1206 235/500 [=============>................] - ETA: 1:02 - loss: 1.0536 - regression_loss: 0.9327 - classification_loss: 0.1209 236/500 [=============>................] - ETA: 1:01 - loss: 1.0532 - regression_loss: 0.9325 - classification_loss: 0.1207 237/500 [=============>................] - ETA: 1:01 - loss: 1.0526 - regression_loss: 0.9321 - classification_loss: 0.1205 238/500 [=============>................] - ETA: 1:01 - loss: 1.0525 - regression_loss: 0.9321 - classification_loss: 0.1203 239/500 [=============>................] - ETA: 1:01 - loss: 1.0506 - regression_loss: 0.9307 - classification_loss: 0.1199 240/500 [=============>................] - ETA: 1:00 - loss: 1.0488 - regression_loss: 0.9292 - classification_loss: 0.1196 241/500 [=============>................] - ETA: 1:00 - loss: 1.0495 - regression_loss: 0.9300 - classification_loss: 0.1195 242/500 [=============>................] - ETA: 1:00 - loss: 1.0521 - regression_loss: 0.9321 - classification_loss: 0.1199 243/500 [=============>................] - ETA: 1:00 - loss: 1.0505 - regression_loss: 0.9310 - classification_loss: 0.1196 244/500 [=============>................] - ETA: 1:00 - loss: 1.0487 - regression_loss: 0.9293 - classification_loss: 0.1194 245/500 [=============>................] - ETA: 59s - loss: 1.0515 - regression_loss: 0.9314 - classification_loss: 0.1201  246/500 [=============>................] - ETA: 59s - loss: 1.0506 - regression_loss: 0.9307 - classification_loss: 0.1198 247/500 [=============>................] - ETA: 59s - loss: 1.0497 - regression_loss: 0.9301 - classification_loss: 0.1196 248/500 [=============>................] - ETA: 59s - loss: 1.0479 - regression_loss: 0.9285 - classification_loss: 0.1194 249/500 [=============>................] 
- ETA: 58s - loss: 1.0469 - regression_loss: 0.9278 - classification_loss: 0.1191 250/500 [==============>...............] - ETA: 58s - loss: 1.0471 - regression_loss: 0.9280 - classification_loss: 0.1191 251/500 [==============>...............] - ETA: 58s - loss: 1.0490 - regression_loss: 0.9297 - classification_loss: 0.1193 252/500 [==============>...............] - ETA: 58s - loss: 1.0480 - regression_loss: 0.9288 - classification_loss: 0.1192 253/500 [==============>...............] - ETA: 58s - loss: 1.0449 - regression_loss: 0.9261 - classification_loss: 0.1188 254/500 [==============>...............] - ETA: 57s - loss: 1.0427 - regression_loss: 0.9243 - classification_loss: 0.1184 255/500 [==============>...............] - ETA: 57s - loss: 1.0430 - regression_loss: 0.9245 - classification_loss: 0.1186 256/500 [==============>...............] - ETA: 57s - loss: 1.0414 - regression_loss: 0.9231 - classification_loss: 0.1183 257/500 [==============>...............] - ETA: 57s - loss: 1.0411 - regression_loss: 0.9229 - classification_loss: 0.1181 258/500 [==============>...............] - ETA: 56s - loss: 1.0427 - regression_loss: 0.9241 - classification_loss: 0.1186 259/500 [==============>...............] - ETA: 56s - loss: 1.0402 - regression_loss: 0.9220 - classification_loss: 0.1183 260/500 [==============>...............] - ETA: 56s - loss: 1.0394 - regression_loss: 0.9213 - classification_loss: 0.1181 261/500 [==============>...............] - ETA: 56s - loss: 1.0360 - regression_loss: 0.9183 - classification_loss: 0.1177 262/500 [==============>...............] - ETA: 55s - loss: 1.0370 - regression_loss: 0.9189 - classification_loss: 0.1181 263/500 [==============>...............] - ETA: 55s - loss: 1.0371 - regression_loss: 0.9188 - classification_loss: 0.1183 264/500 [==============>...............] - ETA: 55s - loss: 1.0375 - regression_loss: 0.9194 - classification_loss: 0.1181 265/500 [==============>...............] 
- ETA: 55s - loss: 1.0363 - regression_loss: 0.9182 - classification_loss: 0.1181 266/500 [==============>...............] - ETA: 54s - loss: 1.0371 - regression_loss: 0.9192 - classification_loss: 0.1179 267/500 [===============>..............] - ETA: 54s - loss: 1.0386 - regression_loss: 0.9202 - classification_loss: 0.1184 268/500 [===============>..............] - ETA: 54s - loss: 1.0373 - regression_loss: 0.9191 - classification_loss: 0.1182 269/500 [===============>..............] - ETA: 54s - loss: 1.0431 - regression_loss: 0.9236 - classification_loss: 0.1195 270/500 [===============>..............] - ETA: 54s - loss: 1.0422 - regression_loss: 0.9228 - classification_loss: 0.1194 271/500 [===============>..............] - ETA: 53s - loss: 1.0418 - regression_loss: 0.9224 - classification_loss: 0.1194 272/500 [===============>..............] - ETA: 53s - loss: 1.0402 - regression_loss: 0.9210 - classification_loss: 0.1192 273/500 [===============>..............] - ETA: 53s - loss: 1.0390 - regression_loss: 0.9201 - classification_loss: 0.1189 274/500 [===============>..............] - ETA: 53s - loss: 1.0391 - regression_loss: 0.9202 - classification_loss: 0.1190 275/500 [===============>..............] - ETA: 52s - loss: 1.0394 - regression_loss: 0.9205 - classification_loss: 0.1188 276/500 [===============>..............] - ETA: 52s - loss: 1.0383 - regression_loss: 0.9197 - classification_loss: 0.1186 277/500 [===============>..............] - ETA: 52s - loss: 1.0359 - regression_loss: 0.9175 - classification_loss: 0.1184 278/500 [===============>..............] - ETA: 52s - loss: 1.0339 - regression_loss: 0.9157 - classification_loss: 0.1182 279/500 [===============>..............] - ETA: 51s - loss: 1.0333 - regression_loss: 0.9152 - classification_loss: 0.1181 280/500 [===============>..............] - ETA: 51s - loss: 1.0322 - regression_loss: 0.9144 - classification_loss: 0.1178 281/500 [===============>..............] 
- ETA: 51s - loss: 1.0309 - regression_loss: 0.9133 - classification_loss: 0.1175 282/500 [===============>..............] - ETA: 51s - loss: 1.0321 - regression_loss: 0.9144 - classification_loss: 0.1177 283/500 [===============>..............] - ETA: 51s - loss: 1.0303 - regression_loss: 0.9129 - classification_loss: 0.1174 284/500 [================>.............] - ETA: 50s - loss: 1.0302 - regression_loss: 0.9127 - classification_loss: 0.1174 285/500 [================>.............] - ETA: 50s - loss: 1.0279 - regression_loss: 0.9106 - classification_loss: 0.1173 286/500 [================>.............] - ETA: 50s - loss: 1.0296 - regression_loss: 0.9123 - classification_loss: 0.1173 287/500 [================>.............] - ETA: 50s - loss: 1.0307 - regression_loss: 0.9134 - classification_loss: 0.1174 288/500 [================>.............] - ETA: 49s - loss: 1.0299 - regression_loss: 0.9127 - classification_loss: 0.1171 289/500 [================>.............] - ETA: 49s - loss: 1.0283 - regression_loss: 0.9113 - classification_loss: 0.1169 290/500 [================>.............] - ETA: 49s - loss: 1.0284 - regression_loss: 0.9116 - classification_loss: 0.1169 291/500 [================>.............] - ETA: 49s - loss: 1.0322 - regression_loss: 0.9149 - classification_loss: 0.1174 292/500 [================>.............] - ETA: 48s - loss: 1.0321 - regression_loss: 0.9148 - classification_loss: 0.1173 293/500 [================>.............] - ETA: 48s - loss: 1.0301 - regression_loss: 0.9130 - classification_loss: 0.1171 294/500 [================>.............] - ETA: 48s - loss: 1.0292 - regression_loss: 0.9123 - classification_loss: 0.1169 295/500 [================>.............] - ETA: 48s - loss: 1.0274 - regression_loss: 0.9107 - classification_loss: 0.1166 296/500 [================>.............] - ETA: 47s - loss: 1.0251 - regression_loss: 0.9087 - classification_loss: 0.1164 297/500 [================>.............] 
- ETA: 47s - loss: 1.0264 - regression_loss: 0.9096 - classification_loss: 0.1168 298/500 [================>.............] - ETA: 47s - loss: 1.0264 - regression_loss: 0.9097 - classification_loss: 0.1168 299/500 [================>.............] - ETA: 47s - loss: 1.0244 - regression_loss: 0.9078 - classification_loss: 0.1166 300/500 [=================>............] - ETA: 47s - loss: 1.0219 - regression_loss: 0.9056 - classification_loss: 0.1163 301/500 [=================>............] - ETA: 46s - loss: 1.0227 - regression_loss: 0.9060 - classification_loss: 0.1167 302/500 [=================>............] - ETA: 46s - loss: 1.0237 - regression_loss: 0.9068 - classification_loss: 0.1169 303/500 [=================>............] - ETA: 46s - loss: 1.0241 - regression_loss: 0.9070 - classification_loss: 0.1171 304/500 [=================>............] - ETA: 46s - loss: 1.0230 - regression_loss: 0.9061 - classification_loss: 0.1169 305/500 [=================>............] - ETA: 45s - loss: 1.0231 - regression_loss: 0.9062 - classification_loss: 0.1170 306/500 [=================>............] - ETA: 45s - loss: 1.0237 - regression_loss: 0.9068 - classification_loss: 0.1169 307/500 [=================>............] - ETA: 45s - loss: 1.0220 - regression_loss: 0.9053 - classification_loss: 0.1166 308/500 [=================>............] - ETA: 45s - loss: 1.0209 - regression_loss: 0.9045 - classification_loss: 0.1164 309/500 [=================>............] - ETA: 44s - loss: 1.0222 - regression_loss: 0.9056 - classification_loss: 0.1166 310/500 [=================>............] - ETA: 44s - loss: 1.0199 - regression_loss: 0.9036 - classification_loss: 0.1164 311/500 [=================>............] - ETA: 44s - loss: 1.0180 - regression_loss: 0.9019 - classification_loss: 0.1161 312/500 [=================>............] - ETA: 44s - loss: 1.0173 - regression_loss: 0.9013 - classification_loss: 0.1160 313/500 [=================>............] 
- ETA: 43s - loss: 1.0179 - regression_loss: 0.9018 - classification_loss: 0.1160 314/500 [=================>............] - ETA: 43s - loss: 1.0181 - regression_loss: 0.9020 - classification_loss: 0.1161 315/500 [=================>............] - ETA: 43s - loss: 1.0184 - regression_loss: 0.9023 - classification_loss: 0.1161 316/500 [=================>............] - ETA: 43s - loss: 1.0187 - regression_loss: 0.9027 - classification_loss: 0.1160 317/500 [==================>...........] - ETA: 43s - loss: 1.0181 - regression_loss: 0.9023 - classification_loss: 0.1158 318/500 [==================>...........] - ETA: 42s - loss: 1.0174 - regression_loss: 0.9017 - classification_loss: 0.1157 319/500 [==================>...........] - ETA: 42s - loss: 1.0162 - regression_loss: 0.9007 - classification_loss: 0.1155 320/500 [==================>...........] - ETA: 42s - loss: 1.0153 - regression_loss: 0.8999 - classification_loss: 0.1154 321/500 [==================>...........] - ETA: 42s - loss: 1.0145 - regression_loss: 0.8994 - classification_loss: 0.1151 322/500 [==================>...........] - ETA: 41s - loss: 1.0144 - regression_loss: 0.8994 - classification_loss: 0.1150 323/500 [==================>...........] - ETA: 41s - loss: 1.0129 - regression_loss: 0.8982 - classification_loss: 0.1147 324/500 [==================>...........] - ETA: 41s - loss: 1.0134 - regression_loss: 0.8988 - classification_loss: 0.1146 325/500 [==================>...........] - ETA: 41s - loss: 1.0118 - regression_loss: 0.8975 - classification_loss: 0.1144 326/500 [==================>...........] - ETA: 40s - loss: 1.0106 - regression_loss: 0.8964 - classification_loss: 0.1142 327/500 [==================>...........] - ETA: 40s - loss: 1.0093 - regression_loss: 0.8953 - classification_loss: 0.1140 328/500 [==================>...........] - ETA: 40s - loss: 1.0078 - regression_loss: 0.8940 - classification_loss: 0.1138 329/500 [==================>...........] 
- ETA: 40s - loss: 1.0068 - regression_loss: 0.8931 - classification_loss: 0.1136 330/500 [==================>...........] - ETA: 39s - loss: 1.0047 - regression_loss: 0.8913 - classification_loss: 0.1134 331/500 [==================>...........] - ETA: 39s - loss: 1.0055 - regression_loss: 0.8920 - classification_loss: 0.1135 332/500 [==================>...........] - ETA: 39s - loss: 1.0060 - regression_loss: 0.8927 - classification_loss: 0.1133 333/500 [==================>...........] - ETA: 39s - loss: 1.0069 - regression_loss: 0.8935 - classification_loss: 0.1134 334/500 [===================>..........] - ETA: 39s - loss: 1.0077 - regression_loss: 0.8945 - classification_loss: 0.1132 335/500 [===================>..........] - ETA: 38s - loss: 1.0084 - regression_loss: 0.8951 - classification_loss: 0.1133 336/500 [===================>..........] - ETA: 38s - loss: 1.0081 - regression_loss: 0.8946 - classification_loss: 0.1134 337/500 [===================>..........] - ETA: 38s - loss: 1.0094 - regression_loss: 0.8958 - classification_loss: 0.1136 338/500 [===================>..........] - ETA: 38s - loss: 1.0127 - regression_loss: 0.8985 - classification_loss: 0.1141 339/500 [===================>..........] - ETA: 37s - loss: 1.0119 - regression_loss: 0.8979 - classification_loss: 0.1140 340/500 [===================>..........] - ETA: 37s - loss: 1.0115 - regression_loss: 0.8976 - classification_loss: 0.1139 341/500 [===================>..........] - ETA: 37s - loss: 1.0128 - regression_loss: 0.8988 - classification_loss: 0.1141 342/500 [===================>..........] - ETA: 37s - loss: 1.0125 - regression_loss: 0.8984 - classification_loss: 0.1140 343/500 [===================>..........] - ETA: 36s - loss: 1.0127 - regression_loss: 0.8988 - classification_loss: 0.1139 344/500 [===================>..........] - ETA: 36s - loss: 1.0143 - regression_loss: 0.8999 - classification_loss: 0.1144 345/500 [===================>..........] 
- ETA: 36s - loss: 1.0151 - regression_loss: 0.9007 - classification_loss: 0.1143 346/500 [===================>..........] - ETA: 36s - loss: 1.0138 - regression_loss: 0.8997 - classification_loss: 0.1141 347/500 [===================>..........] - ETA: 35s - loss: 1.0132 - regression_loss: 0.8992 - classification_loss: 0.1140 348/500 [===================>..........] - ETA: 35s - loss: 1.0133 - regression_loss: 0.8991 - classification_loss: 0.1142 349/500 [===================>..........] - ETA: 35s - loss: 1.0154 - regression_loss: 0.9012 - classification_loss: 0.1142 350/500 [====================>.........] - ETA: 35s - loss: 1.0155 - regression_loss: 0.9011 - classification_loss: 0.1144 351/500 [====================>.........] - ETA: 35s - loss: 1.0146 - regression_loss: 0.9003 - classification_loss: 0.1142 352/500 [====================>.........] - ETA: 34s - loss: 1.0141 - regression_loss: 0.9000 - classification_loss: 0.1141 353/500 [====================>.........] - ETA: 34s - loss: 1.0156 - regression_loss: 0.9016 - classification_loss: 0.1140 354/500 [====================>.........] - ETA: 34s - loss: 1.0152 - regression_loss: 0.9013 - classification_loss: 0.1139 355/500 [====================>.........] - ETA: 34s - loss: 1.0163 - regression_loss: 0.9020 - classification_loss: 0.1143 356/500 [====================>.........] - ETA: 33s - loss: 1.0165 - regression_loss: 0.9022 - classification_loss: 0.1143 357/500 [====================>.........] - ETA: 33s - loss: 1.0161 - regression_loss: 0.9019 - classification_loss: 0.1142 358/500 [====================>.........] - ETA: 33s - loss: 1.0149 - regression_loss: 0.9007 - classification_loss: 0.1141 359/500 [====================>.........] - ETA: 33s - loss: 1.0153 - regression_loss: 0.9012 - classification_loss: 0.1141 360/500 [====================>.........] - ETA: 32s - loss: 1.0152 - regression_loss: 0.9011 - classification_loss: 0.1142 361/500 [====================>.........] 
[epoch 30: intermediate per-batch progress-bar updates (batches 362-499) elided]
500/500 [==============================] - 117s 235ms/step - loss: 1.0078 - regression_loss: 0.8952 - classification_loss: 0.1126
326 instances of class plum with average precision: 0.8554
mAP: 0.8554
Epoch 00030: saving model to ./training/snapshots/resnet50_pascal_30.h5
Epoch 31/150
[epoch 31: intermediate per-batch progress-bar updates (batches 1-196) elided]
- ETA: 1:11 - loss: 0.9694 - regression_loss: 0.8607 - classification_loss: 0.1086 197/500 [==========>...................] - ETA: 1:11 - loss: 0.9703 - regression_loss: 0.8618 - classification_loss: 0.1085 198/500 [==========>...................] - ETA: 1:10 - loss: 0.9703 - regression_loss: 0.8619 - classification_loss: 0.1083 199/500 [==========>...................] - ETA: 1:10 - loss: 0.9711 - regression_loss: 0.8629 - classification_loss: 0.1082 200/500 [===========>..................] - ETA: 1:10 - loss: 0.9702 - regression_loss: 0.8624 - classification_loss: 0.1078 201/500 [===========>..................] - ETA: 1:10 - loss: 0.9720 - regression_loss: 0.8641 - classification_loss: 0.1079 202/500 [===========>..................] - ETA: 1:09 - loss: 0.9690 - regression_loss: 0.8614 - classification_loss: 0.1076 203/500 [===========>..................] - ETA: 1:09 - loss: 0.9658 - regression_loss: 0.8586 - classification_loss: 0.1072 204/500 [===========>..................] - ETA: 1:09 - loss: 0.9685 - regression_loss: 0.8609 - classification_loss: 0.1076 205/500 [===========>..................] - ETA: 1:09 - loss: 0.9675 - regression_loss: 0.8601 - classification_loss: 0.1074 206/500 [===========>..................] - ETA: 1:08 - loss: 0.9665 - regression_loss: 0.8592 - classification_loss: 0.1073 207/500 [===========>..................] - ETA: 1:08 - loss: 0.9673 - regression_loss: 0.8600 - classification_loss: 0.1072 208/500 [===========>..................] - ETA: 1:08 - loss: 0.9650 - regression_loss: 0.8581 - classification_loss: 0.1069 209/500 [===========>..................] - ETA: 1:08 - loss: 0.9649 - regression_loss: 0.8579 - classification_loss: 0.1069 210/500 [===========>..................] - ETA: 1:07 - loss: 0.9680 - regression_loss: 0.8607 - classification_loss: 0.1073 211/500 [===========>..................] - ETA: 1:07 - loss: 0.9693 - regression_loss: 0.8619 - classification_loss: 0.1074 212/500 [===========>..................] 
- ETA: 1:07 - loss: 0.9680 - regression_loss: 0.8606 - classification_loss: 0.1074 213/500 [===========>..................] - ETA: 1:07 - loss: 0.9646 - regression_loss: 0.8576 - classification_loss: 0.1070 214/500 [===========>..................] - ETA: 1:07 - loss: 0.9637 - regression_loss: 0.8568 - classification_loss: 0.1068 215/500 [===========>..................] - ETA: 1:06 - loss: 0.9612 - regression_loss: 0.8547 - classification_loss: 0.1064 216/500 [===========>..................] - ETA: 1:06 - loss: 0.9604 - regression_loss: 0.8541 - classification_loss: 0.1062 217/500 [============>.................] - ETA: 1:06 - loss: 0.9575 - regression_loss: 0.8516 - classification_loss: 0.1059 218/500 [============>.................] - ETA: 1:06 - loss: 0.9570 - regression_loss: 0.8514 - classification_loss: 0.1057 219/500 [============>.................] - ETA: 1:05 - loss: 0.9587 - regression_loss: 0.8529 - classification_loss: 0.1057 220/500 [============>.................] - ETA: 1:05 - loss: 0.9583 - regression_loss: 0.8527 - classification_loss: 0.1057 221/500 [============>.................] - ETA: 1:05 - loss: 0.9602 - regression_loss: 0.8540 - classification_loss: 0.1062 222/500 [============>.................] - ETA: 1:05 - loss: 0.9586 - regression_loss: 0.8525 - classification_loss: 0.1061 223/500 [============>.................] - ETA: 1:04 - loss: 0.9622 - regression_loss: 0.8558 - classification_loss: 0.1064 224/500 [============>.................] - ETA: 1:04 - loss: 0.9607 - regression_loss: 0.8545 - classification_loss: 0.1062 225/500 [============>.................] - ETA: 1:04 - loss: 0.9599 - regression_loss: 0.8538 - classification_loss: 0.1060 226/500 [============>.................] - ETA: 1:04 - loss: 0.9579 - regression_loss: 0.8501 - classification_loss: 0.1078 227/500 [============>.................] - ETA: 1:03 - loss: 0.9577 - regression_loss: 0.8498 - classification_loss: 0.1079 228/500 [============>.................] 
- ETA: 1:03 - loss: 0.9566 - regression_loss: 0.8490 - classification_loss: 0.1077 229/500 [============>.................] - ETA: 1:03 - loss: 0.9563 - regression_loss: 0.8487 - classification_loss: 0.1076 230/500 [============>.................] - ETA: 1:03 - loss: 0.9563 - regression_loss: 0.8489 - classification_loss: 0.1074 231/500 [============>.................] - ETA: 1:02 - loss: 0.9559 - regression_loss: 0.8485 - classification_loss: 0.1073 232/500 [============>.................] - ETA: 1:02 - loss: 0.9554 - regression_loss: 0.8482 - classification_loss: 0.1071 233/500 [============>.................] - ETA: 1:02 - loss: 0.9547 - regression_loss: 0.8478 - classification_loss: 0.1069 234/500 [=============>................] - ETA: 1:02 - loss: 0.9546 - regression_loss: 0.8477 - classification_loss: 0.1069 235/500 [=============>................] - ETA: 1:02 - loss: 0.9534 - regression_loss: 0.8467 - classification_loss: 0.1067 236/500 [=============>................] - ETA: 1:01 - loss: 0.9550 - regression_loss: 0.8485 - classification_loss: 0.1065 237/500 [=============>................] - ETA: 1:01 - loss: 0.9527 - regression_loss: 0.8465 - classification_loss: 0.1062 238/500 [=============>................] - ETA: 1:01 - loss: 0.9529 - regression_loss: 0.8467 - classification_loss: 0.1062 239/500 [=============>................] - ETA: 1:01 - loss: 0.9529 - regression_loss: 0.8467 - classification_loss: 0.1062 240/500 [=============>................] - ETA: 1:00 - loss: 0.9515 - regression_loss: 0.8456 - classification_loss: 0.1059 241/500 [=============>................] - ETA: 1:00 - loss: 0.9539 - regression_loss: 0.8477 - classification_loss: 0.1063 242/500 [=============>................] - ETA: 1:00 - loss: 0.9531 - regression_loss: 0.8468 - classification_loss: 0.1063 243/500 [=============>................] - ETA: 1:00 - loss: 0.9549 - regression_loss: 0.8485 - classification_loss: 0.1063 244/500 [=============>................] 
- ETA: 59s - loss: 0.9552 - regression_loss: 0.8488 - classification_loss: 0.1063  245/500 [=============>................] - ETA: 59s - loss: 0.9558 - regression_loss: 0.8491 - classification_loss: 0.1067 246/500 [=============>................] - ETA: 59s - loss: 0.9582 - regression_loss: 0.8511 - classification_loss: 0.1071 247/500 [=============>................] - ETA: 59s - loss: 0.9570 - regression_loss: 0.8501 - classification_loss: 0.1069 248/500 [=============>................] - ETA: 58s - loss: 0.9574 - regression_loss: 0.8505 - classification_loss: 0.1070 249/500 [=============>................] - ETA: 58s - loss: 0.9586 - regression_loss: 0.8517 - classification_loss: 0.1070 250/500 [==============>...............] - ETA: 58s - loss: 0.9581 - regression_loss: 0.8511 - classification_loss: 0.1070 251/500 [==============>...............] - ETA: 58s - loss: 0.9576 - regression_loss: 0.8509 - classification_loss: 0.1067 252/500 [==============>...............] - ETA: 58s - loss: 0.9583 - regression_loss: 0.8515 - classification_loss: 0.1067 253/500 [==============>...............] - ETA: 57s - loss: 0.9605 - regression_loss: 0.8535 - classification_loss: 0.1070 254/500 [==============>...............] - ETA: 57s - loss: 0.9611 - regression_loss: 0.8542 - classification_loss: 0.1069 255/500 [==============>...............] - ETA: 57s - loss: 0.9607 - regression_loss: 0.8539 - classification_loss: 0.1068 256/500 [==============>...............] - ETA: 57s - loss: 0.9609 - regression_loss: 0.8542 - classification_loss: 0.1067 257/500 [==============>...............] - ETA: 56s - loss: 0.9593 - regression_loss: 0.8527 - classification_loss: 0.1066 258/500 [==============>...............] - ETA: 56s - loss: 0.9598 - regression_loss: 0.8531 - classification_loss: 0.1067 259/500 [==============>...............] - ETA: 56s - loss: 0.9595 - regression_loss: 0.8528 - classification_loss: 0.1066 260/500 [==============>...............] 
- ETA: 56s - loss: 0.9621 - regression_loss: 0.8552 - classification_loss: 0.1069 261/500 [==============>...............] - ETA: 55s - loss: 0.9639 - regression_loss: 0.8567 - classification_loss: 0.1071 262/500 [==============>...............] - ETA: 55s - loss: 0.9646 - regression_loss: 0.8575 - classification_loss: 0.1072 263/500 [==============>...............] - ETA: 55s - loss: 0.9652 - regression_loss: 0.8579 - classification_loss: 0.1073 264/500 [==============>...............] - ETA: 55s - loss: 0.9629 - regression_loss: 0.8558 - classification_loss: 0.1071 265/500 [==============>...............] - ETA: 54s - loss: 0.9604 - regression_loss: 0.8535 - classification_loss: 0.1069 266/500 [==============>...............] - ETA: 54s - loss: 0.9600 - regression_loss: 0.8531 - classification_loss: 0.1069 267/500 [===============>..............] - ETA: 54s - loss: 0.9604 - regression_loss: 0.8535 - classification_loss: 0.1070 268/500 [===============>..............] - ETA: 54s - loss: 0.9616 - regression_loss: 0.8546 - classification_loss: 0.1070 269/500 [===============>..............] - ETA: 54s - loss: 0.9636 - regression_loss: 0.8564 - classification_loss: 0.1073 270/500 [===============>..............] - ETA: 53s - loss: 0.9621 - regression_loss: 0.8549 - classification_loss: 0.1071 271/500 [===============>..............] - ETA: 53s - loss: 0.9628 - regression_loss: 0.8557 - classification_loss: 0.1071 272/500 [===============>..............] - ETA: 53s - loss: 0.9624 - regression_loss: 0.8555 - classification_loss: 0.1070 273/500 [===============>..............] - ETA: 53s - loss: 0.9621 - regression_loss: 0.8552 - classification_loss: 0.1069 274/500 [===============>..............] - ETA: 52s - loss: 0.9628 - regression_loss: 0.8560 - classification_loss: 0.1068 275/500 [===============>..............] - ETA: 52s - loss: 0.9624 - regression_loss: 0.8557 - classification_loss: 0.1067 276/500 [===============>..............] 
- ETA: 52s - loss: 0.9609 - regression_loss: 0.8545 - classification_loss: 0.1064 277/500 [===============>..............] - ETA: 52s - loss: 0.9594 - regression_loss: 0.8530 - classification_loss: 0.1064 278/500 [===============>..............] - ETA: 51s - loss: 0.9601 - regression_loss: 0.8538 - classification_loss: 0.1064 279/500 [===============>..............] - ETA: 51s - loss: 0.9605 - regression_loss: 0.8540 - classification_loss: 0.1065 280/500 [===============>..............] - ETA: 51s - loss: 0.9629 - regression_loss: 0.8559 - classification_loss: 0.1070 281/500 [===============>..............] - ETA: 51s - loss: 0.9620 - regression_loss: 0.8551 - classification_loss: 0.1069 282/500 [===============>..............] - ETA: 51s - loss: 0.9616 - regression_loss: 0.8549 - classification_loss: 0.1067 283/500 [===============>..............] - ETA: 50s - loss: 0.9596 - regression_loss: 0.8533 - classification_loss: 0.1064 284/500 [================>.............] - ETA: 50s - loss: 0.9587 - regression_loss: 0.8525 - classification_loss: 0.1061 285/500 [================>.............] - ETA: 50s - loss: 0.9586 - regression_loss: 0.8525 - classification_loss: 0.1061 286/500 [================>.............] - ETA: 50s - loss: 0.9598 - regression_loss: 0.8536 - classification_loss: 0.1063 287/500 [================>.............] - ETA: 49s - loss: 0.9591 - regression_loss: 0.8529 - classification_loss: 0.1062 288/500 [================>.............] - ETA: 49s - loss: 0.9605 - regression_loss: 0.8540 - classification_loss: 0.1065 289/500 [================>.............] - ETA: 49s - loss: 0.9584 - regression_loss: 0.8523 - classification_loss: 0.1061 290/500 [================>.............] - ETA: 49s - loss: 0.9607 - regression_loss: 0.8543 - classification_loss: 0.1064 291/500 [================>.............] - ETA: 48s - loss: 0.9594 - regression_loss: 0.8533 - classification_loss: 0.1061 292/500 [================>.............] 
- ETA: 48s - loss: 0.9602 - regression_loss: 0.8539 - classification_loss: 0.1063 293/500 [================>.............] - ETA: 48s - loss: 0.9629 - regression_loss: 0.8557 - classification_loss: 0.1072 294/500 [================>.............] - ETA: 48s - loss: 0.9647 - regression_loss: 0.8573 - classification_loss: 0.1073 295/500 [================>.............] - ETA: 48s - loss: 0.9657 - regression_loss: 0.8581 - classification_loss: 0.1076 296/500 [================>.............] - ETA: 47s - loss: 0.9666 - regression_loss: 0.8589 - classification_loss: 0.1077 297/500 [================>.............] - ETA: 47s - loss: 0.9657 - regression_loss: 0.8582 - classification_loss: 0.1075 298/500 [================>.............] - ETA: 47s - loss: 0.9658 - regression_loss: 0.8583 - classification_loss: 0.1075 299/500 [================>.............] - ETA: 47s - loss: 0.9655 - regression_loss: 0.8580 - classification_loss: 0.1075 300/500 [=================>............] - ETA: 46s - loss: 0.9645 - regression_loss: 0.8572 - classification_loss: 0.1073 301/500 [=================>............] - ETA: 46s - loss: 0.9633 - regression_loss: 0.8562 - classification_loss: 0.1071 302/500 [=================>............] - ETA: 46s - loss: 0.9636 - regression_loss: 0.8564 - classification_loss: 0.1072 303/500 [=================>............] - ETA: 46s - loss: 0.9657 - regression_loss: 0.8582 - classification_loss: 0.1075 304/500 [=================>............] - ETA: 45s - loss: 0.9652 - regression_loss: 0.8577 - classification_loss: 0.1074 305/500 [=================>............] - ETA: 45s - loss: 0.9650 - regression_loss: 0.8577 - classification_loss: 0.1073 306/500 [=================>............] - ETA: 45s - loss: 0.9643 - regression_loss: 0.8572 - classification_loss: 0.1071 307/500 [=================>............] - ETA: 45s - loss: 0.9662 - regression_loss: 0.8586 - classification_loss: 0.1076 308/500 [=================>............] 
- ETA: 45s - loss: 0.9668 - regression_loss: 0.8592 - classification_loss: 0.1076 309/500 [=================>............] - ETA: 44s - loss: 0.9649 - regression_loss: 0.8577 - classification_loss: 0.1073 310/500 [=================>............] - ETA: 44s - loss: 0.9636 - regression_loss: 0.8565 - classification_loss: 0.1071 311/500 [=================>............] - ETA: 44s - loss: 0.9622 - regression_loss: 0.8554 - classification_loss: 0.1068 312/500 [=================>............] - ETA: 44s - loss: 0.9602 - regression_loss: 0.8534 - classification_loss: 0.1068 313/500 [=================>............] - ETA: 43s - loss: 0.9579 - regression_loss: 0.8513 - classification_loss: 0.1066 314/500 [=================>............] - ETA: 43s - loss: 0.9580 - regression_loss: 0.8512 - classification_loss: 0.1068 315/500 [=================>............] - ETA: 43s - loss: 0.9587 - regression_loss: 0.8517 - classification_loss: 0.1070 316/500 [=================>............] - ETA: 43s - loss: 0.9591 - regression_loss: 0.8521 - classification_loss: 0.1070 317/500 [==================>...........] - ETA: 42s - loss: 0.9587 - regression_loss: 0.8518 - classification_loss: 0.1069 318/500 [==================>...........] - ETA: 42s - loss: 0.9574 - regression_loss: 0.8506 - classification_loss: 0.1068 319/500 [==================>...........] - ETA: 42s - loss: 0.9563 - regression_loss: 0.8497 - classification_loss: 0.1066 320/500 [==================>...........] - ETA: 42s - loss: 0.9573 - regression_loss: 0.8506 - classification_loss: 0.1067 321/500 [==================>...........] - ETA: 41s - loss: 0.9556 - regression_loss: 0.8491 - classification_loss: 0.1065 322/500 [==================>...........] - ETA: 41s - loss: 0.9557 - regression_loss: 0.8493 - classification_loss: 0.1064 323/500 [==================>...........] - ETA: 41s - loss: 0.9551 - regression_loss: 0.8487 - classification_loss: 0.1063 324/500 [==================>...........] 
- ETA: 41s - loss: 0.9549 - regression_loss: 0.8488 - classification_loss: 0.1061 325/500 [==================>...........] - ETA: 41s - loss: 0.9541 - regression_loss: 0.8481 - classification_loss: 0.1060 326/500 [==================>...........] - ETA: 40s - loss: 0.9552 - regression_loss: 0.8490 - classification_loss: 0.1062 327/500 [==================>...........] - ETA: 40s - loss: 0.9557 - regression_loss: 0.8494 - classification_loss: 0.1062 328/500 [==================>...........] - ETA: 40s - loss: 0.9574 - regression_loss: 0.8508 - classification_loss: 0.1066 329/500 [==================>...........] - ETA: 40s - loss: 0.9561 - regression_loss: 0.8491 - classification_loss: 0.1070 330/500 [==================>...........] - ETA: 39s - loss: 0.9566 - regression_loss: 0.8496 - classification_loss: 0.1069 331/500 [==================>...........] - ETA: 39s - loss: 0.9568 - regression_loss: 0.8499 - classification_loss: 0.1069 332/500 [==================>...........] - ETA: 39s - loss: 0.9553 - regression_loss: 0.8486 - classification_loss: 0.1067 333/500 [==================>...........] - ETA: 39s - loss: 0.9532 - regression_loss: 0.8468 - classification_loss: 0.1064 334/500 [===================>..........] - ETA: 38s - loss: 0.9518 - regression_loss: 0.8455 - classification_loss: 0.1062 335/500 [===================>..........] - ETA: 38s - loss: 0.9510 - regression_loss: 0.8450 - classification_loss: 0.1060 336/500 [===================>..........] - ETA: 38s - loss: 0.9524 - regression_loss: 0.8460 - classification_loss: 0.1064 337/500 [===================>..........] - ETA: 38s - loss: 0.9509 - regression_loss: 0.8447 - classification_loss: 0.1062 338/500 [===================>..........] - ETA: 37s - loss: 0.9546 - regression_loss: 0.8474 - classification_loss: 0.1072 339/500 [===================>..........] - ETA: 37s - loss: 0.9552 - regression_loss: 0.8480 - classification_loss: 0.1073 340/500 [===================>..........] 
- ETA: 37s - loss: 0.9585 - regression_loss: 0.8510 - classification_loss: 0.1076 341/500 [===================>..........] - ETA: 37s - loss: 0.9578 - regression_loss: 0.8503 - classification_loss: 0.1075 342/500 [===================>..........] - ETA: 37s - loss: 0.9575 - regression_loss: 0.8500 - classification_loss: 0.1075 343/500 [===================>..........] - ETA: 36s - loss: 0.9569 - regression_loss: 0.8494 - classification_loss: 0.1075 344/500 [===================>..........] - ETA: 36s - loss: 0.9585 - regression_loss: 0.8511 - classification_loss: 0.1074 345/500 [===================>..........] - ETA: 36s - loss: 0.9589 - regression_loss: 0.8513 - classification_loss: 0.1076 346/500 [===================>..........] - ETA: 36s - loss: 0.9600 - regression_loss: 0.8524 - classification_loss: 0.1076 347/500 [===================>..........] - ETA: 35s - loss: 0.9613 - regression_loss: 0.8534 - classification_loss: 0.1080 348/500 [===================>..........] - ETA: 35s - loss: 0.9620 - regression_loss: 0.8540 - classification_loss: 0.1080 349/500 [===================>..........] - ETA: 35s - loss: 0.9641 - regression_loss: 0.8559 - classification_loss: 0.1082 350/500 [====================>.........] - ETA: 35s - loss: 0.9637 - regression_loss: 0.8558 - classification_loss: 0.1080 351/500 [====================>.........] - ETA: 34s - loss: 0.9631 - regression_loss: 0.8552 - classification_loss: 0.1079 352/500 [====================>.........] - ETA: 34s - loss: 0.9617 - regression_loss: 0.8540 - classification_loss: 0.1076 353/500 [====================>.........] - ETA: 34s - loss: 0.9639 - regression_loss: 0.8557 - classification_loss: 0.1082 354/500 [====================>.........] - ETA: 34s - loss: 0.9660 - regression_loss: 0.8576 - classification_loss: 0.1084 355/500 [====================>.........] - ETA: 34s - loss: 0.9664 - regression_loss: 0.8579 - classification_loss: 0.1085 356/500 [====================>.........] 
- ETA: 33s - loss: 0.9663 - regression_loss: 0.8579 - classification_loss: 0.1084 357/500 [====================>.........] - ETA: 33s - loss: 0.9661 - regression_loss: 0.8578 - classification_loss: 0.1083 358/500 [====================>.........] - ETA: 33s - loss: 0.9654 - regression_loss: 0.8573 - classification_loss: 0.1081 359/500 [====================>.........] - ETA: 33s - loss: 0.9658 - regression_loss: 0.8577 - classification_loss: 0.1081 360/500 [====================>.........] - ETA: 32s - loss: 0.9656 - regression_loss: 0.8575 - classification_loss: 0.1080 361/500 [====================>.........] - ETA: 32s - loss: 0.9655 - regression_loss: 0.8576 - classification_loss: 0.1078 362/500 [====================>.........] - ETA: 32s - loss: 0.9638 - regression_loss: 0.8561 - classification_loss: 0.1077 363/500 [====================>.........] - ETA: 32s - loss: 0.9631 - regression_loss: 0.8557 - classification_loss: 0.1074 364/500 [====================>.........] - ETA: 31s - loss: 0.9626 - regression_loss: 0.8553 - classification_loss: 0.1073 365/500 [====================>.........] - ETA: 31s - loss: 0.9620 - regression_loss: 0.8548 - classification_loss: 0.1072 366/500 [====================>.........] - ETA: 31s - loss: 0.9611 - regression_loss: 0.8541 - classification_loss: 0.1070 367/500 [=====================>........] - ETA: 31s - loss: 0.9606 - regression_loss: 0.8536 - classification_loss: 0.1070 368/500 [=====================>........] - ETA: 30s - loss: 0.9603 - regression_loss: 0.8534 - classification_loss: 0.1068 369/500 [=====================>........] - ETA: 30s - loss: 0.9604 - regression_loss: 0.8535 - classification_loss: 0.1069 370/500 [=====================>........] - ETA: 30s - loss: 0.9601 - regression_loss: 0.8532 - classification_loss: 0.1069 371/500 [=====================>........] - ETA: 30s - loss: 0.9594 - regression_loss: 0.8527 - classification_loss: 0.1067 372/500 [=====================>........] 
- ETA: 29s - loss: 0.9594 - regression_loss: 0.8526 - classification_loss: 0.1068 373/500 [=====================>........] - ETA: 29s - loss: 0.9584 - regression_loss: 0.8518 - classification_loss: 0.1066 374/500 [=====================>........] - ETA: 29s - loss: 0.9581 - regression_loss: 0.8515 - classification_loss: 0.1066 375/500 [=====================>........] - ETA: 29s - loss: 0.9571 - regression_loss: 0.8506 - classification_loss: 0.1064 376/500 [=====================>........] - ETA: 29s - loss: 0.9564 - regression_loss: 0.8501 - classification_loss: 0.1063 377/500 [=====================>........] - ETA: 28s - loss: 0.9559 - regression_loss: 0.8497 - classification_loss: 0.1061 378/500 [=====================>........] - ETA: 28s - loss: 0.9560 - regression_loss: 0.8500 - classification_loss: 0.1061 379/500 [=====================>........] - ETA: 28s - loss: 0.9558 - regression_loss: 0.8497 - classification_loss: 0.1060 380/500 [=====================>........] - ETA: 28s - loss: 0.9553 - regression_loss: 0.8493 - classification_loss: 0.1059 381/500 [=====================>........] - ETA: 27s - loss: 0.9555 - regression_loss: 0.8496 - classification_loss: 0.1059 382/500 [=====================>........] - ETA: 27s - loss: 0.9553 - regression_loss: 0.8495 - classification_loss: 0.1058 383/500 [=====================>........] - ETA: 27s - loss: 0.9561 - regression_loss: 0.8501 - classification_loss: 0.1060 384/500 [======================>.......] - ETA: 27s - loss: 0.9556 - regression_loss: 0.8497 - classification_loss: 0.1058 385/500 [======================>.......] - ETA: 26s - loss: 0.9562 - regression_loss: 0.8502 - classification_loss: 0.1060 386/500 [======================>.......] - ETA: 26s - loss: 0.9554 - regression_loss: 0.8496 - classification_loss: 0.1059 387/500 [======================>.......] - ETA: 26s - loss: 0.9556 - regression_loss: 0.8497 - classification_loss: 0.1059 388/500 [======================>.......] 
- ETA: 26s - loss: 0.9550 - regression_loss: 0.8492 - classification_loss: 0.1058 389/500 [======================>.......] - ETA: 26s - loss: 0.9555 - regression_loss: 0.8496 - classification_loss: 0.1059 390/500 [======================>.......] - ETA: 25s - loss: 0.9531 - regression_loss: 0.8475 - classification_loss: 0.1056 391/500 [======================>.......] - ETA: 25s - loss: 0.9524 - regression_loss: 0.8468 - classification_loss: 0.1055 392/500 [======================>.......] - ETA: 25s - loss: 0.9531 - regression_loss: 0.8475 - classification_loss: 0.1056 393/500 [======================>.......] - ETA: 25s - loss: 0.9535 - regression_loss: 0.8479 - classification_loss: 0.1056 394/500 [======================>.......] - ETA: 24s - loss: 0.9538 - regression_loss: 0.8480 - classification_loss: 0.1058 395/500 [======================>.......] - ETA: 24s - loss: 0.9546 - regression_loss: 0.8488 - classification_loss: 0.1058 396/500 [======================>.......] - ETA: 24s - loss: 0.9550 - regression_loss: 0.8492 - classification_loss: 0.1058 397/500 [======================>.......] - ETA: 24s - loss: 0.9553 - regression_loss: 0.8494 - classification_loss: 0.1060 398/500 [======================>.......] - ETA: 23s - loss: 0.9556 - regression_loss: 0.8495 - classification_loss: 0.1061 399/500 [======================>.......] - ETA: 23s - loss: 0.9559 - regression_loss: 0.8497 - classification_loss: 0.1061 400/500 [=======================>......] - ETA: 23s - loss: 0.9545 - regression_loss: 0.8485 - classification_loss: 0.1059 401/500 [=======================>......] - ETA: 23s - loss: 0.9549 - regression_loss: 0.8489 - classification_loss: 0.1060 402/500 [=======================>......] - ETA: 22s - loss: 0.9546 - regression_loss: 0.8488 - classification_loss: 0.1058 403/500 [=======================>......] - ETA: 22s - loss: 0.9557 - regression_loss: 0.8497 - classification_loss: 0.1060 404/500 [=======================>......] 
- ETA: 22s - loss: 0.9551 - regression_loss: 0.8492 - classification_loss: 0.1059
[Epoch 31: per-step progress output for steps 405–499 elided; running loss held near 0.96 (regression_loss ≈ 0.85, classification_loss ≈ 0.11)]
500/500 [==============================] - 117s 234ms/step - loss: 0.9584 - regression_loss: 0.8525 - classification_loss: 0.1059
326 instances of class plum with average precision: 0.8581
mAP: 0.8581
Epoch 00031: saving model to ./training/snapshots/resnet50_pascal_31.h5
Epoch 32/150
[Epoch 32: per-step progress output for steps 1–237 elided; running loss settled from ≈ 1.65 at step 3 to ≈ 0.97 by step 237]
238/500 [=============>................]
- ETA: 1:01 - loss: 0.9667 - regression_loss: 0.8625 - classification_loss: 0.1042 239/500 [=============>................] - ETA: 1:01 - loss: 0.9664 - regression_loss: 0.8622 - classification_loss: 0.1042 240/500 [=============>................] - ETA: 1:01 - loss: 0.9658 - regression_loss: 0.8613 - classification_loss: 0.1045 241/500 [=============>................] - ETA: 1:01 - loss: 0.9662 - regression_loss: 0.8619 - classification_loss: 0.1043 242/500 [=============>................] - ETA: 1:00 - loss: 0.9668 - regression_loss: 0.8624 - classification_loss: 0.1044 243/500 [=============>................] - ETA: 1:00 - loss: 0.9673 - regression_loss: 0.8626 - classification_loss: 0.1047 244/500 [=============>................] - ETA: 1:00 - loss: 0.9673 - regression_loss: 0.8626 - classification_loss: 0.1046 245/500 [=============>................] - ETA: 1:00 - loss: 0.9653 - regression_loss: 0.8609 - classification_loss: 0.1044 246/500 [=============>................] - ETA: 59s - loss: 0.9652 - regression_loss: 0.8609 - classification_loss: 0.1043  247/500 [=============>................] - ETA: 59s - loss: 0.9642 - regression_loss: 0.8602 - classification_loss: 0.1041 248/500 [=============>................] - ETA: 59s - loss: 0.9640 - regression_loss: 0.8602 - classification_loss: 0.1039 249/500 [=============>................] - ETA: 59s - loss: 0.9654 - regression_loss: 0.8614 - classification_loss: 0.1040 250/500 [==============>...............] - ETA: 58s - loss: 0.9642 - regression_loss: 0.8605 - classification_loss: 0.1037 251/500 [==============>...............] - ETA: 58s - loss: 0.9641 - regression_loss: 0.8603 - classification_loss: 0.1038 252/500 [==============>...............] - ETA: 58s - loss: 0.9622 - regression_loss: 0.8587 - classification_loss: 0.1035 253/500 [==============>...............] - ETA: 58s - loss: 0.9616 - regression_loss: 0.8584 - classification_loss: 0.1032 254/500 [==============>...............] 
- ETA: 57s - loss: 0.9605 - regression_loss: 0.8574 - classification_loss: 0.1030 255/500 [==============>...............] - ETA: 57s - loss: 0.9609 - regression_loss: 0.8575 - classification_loss: 0.1034 256/500 [==============>...............] - ETA: 57s - loss: 0.9623 - regression_loss: 0.8588 - classification_loss: 0.1035 257/500 [==============>...............] - ETA: 57s - loss: 0.9628 - regression_loss: 0.8592 - classification_loss: 0.1036 258/500 [==============>...............] - ETA: 56s - loss: 0.9638 - regression_loss: 0.8601 - classification_loss: 0.1037 259/500 [==============>...............] - ETA: 56s - loss: 0.9630 - regression_loss: 0.8595 - classification_loss: 0.1036 260/500 [==============>...............] - ETA: 56s - loss: 0.9613 - regression_loss: 0.8580 - classification_loss: 0.1033 261/500 [==============>...............] - ETA: 56s - loss: 0.9613 - regression_loss: 0.8580 - classification_loss: 0.1033 262/500 [==============>...............] - ETA: 56s - loss: 0.9581 - regression_loss: 0.8548 - classification_loss: 0.1033 263/500 [==============>...............] - ETA: 55s - loss: 0.9600 - regression_loss: 0.8565 - classification_loss: 0.1035 264/500 [==============>...............] - ETA: 55s - loss: 0.9626 - regression_loss: 0.8587 - classification_loss: 0.1039 265/500 [==============>...............] - ETA: 55s - loss: 0.9629 - regression_loss: 0.8592 - classification_loss: 0.1037 266/500 [==============>...............] - ETA: 55s - loss: 0.9621 - regression_loss: 0.8585 - classification_loss: 0.1036 267/500 [===============>..............] - ETA: 54s - loss: 0.9605 - regression_loss: 0.8571 - classification_loss: 0.1034 268/500 [===============>..............] - ETA: 54s - loss: 0.9603 - regression_loss: 0.8566 - classification_loss: 0.1036 269/500 [===============>..............] - ETA: 54s - loss: 0.9581 - regression_loss: 0.8547 - classification_loss: 0.1034 270/500 [===============>..............] 
- ETA: 54s - loss: 0.9605 - regression_loss: 0.8568 - classification_loss: 0.1037 271/500 [===============>..............] - ETA: 53s - loss: 0.9597 - regression_loss: 0.8560 - classification_loss: 0.1037 272/500 [===============>..............] - ETA: 53s - loss: 0.9597 - regression_loss: 0.8558 - classification_loss: 0.1039 273/500 [===============>..............] - ETA: 53s - loss: 0.9605 - regression_loss: 0.8566 - classification_loss: 0.1039 274/500 [===============>..............] - ETA: 53s - loss: 0.9606 - regression_loss: 0.8568 - classification_loss: 0.1038 275/500 [===============>..............] - ETA: 52s - loss: 0.9579 - regression_loss: 0.8544 - classification_loss: 0.1035 276/500 [===============>..............] - ETA: 52s - loss: 0.9572 - regression_loss: 0.8539 - classification_loss: 0.1033 277/500 [===============>..............] - ETA: 52s - loss: 0.9588 - regression_loss: 0.8553 - classification_loss: 0.1035 278/500 [===============>..............] - ETA: 52s - loss: 0.9583 - regression_loss: 0.8548 - classification_loss: 0.1035 279/500 [===============>..............] - ETA: 52s - loss: 0.9607 - regression_loss: 0.8569 - classification_loss: 0.1037 280/500 [===============>..............] - ETA: 51s - loss: 0.9624 - regression_loss: 0.8586 - classification_loss: 0.1038 281/500 [===============>..............] - ETA: 51s - loss: 0.9609 - regression_loss: 0.8573 - classification_loss: 0.1036 282/500 [===============>..............] - ETA: 51s - loss: 0.9601 - regression_loss: 0.8567 - classification_loss: 0.1035 283/500 [===============>..............] - ETA: 51s - loss: 0.9600 - regression_loss: 0.8565 - classification_loss: 0.1034 284/500 [================>.............] - ETA: 50s - loss: 0.9586 - regression_loss: 0.8552 - classification_loss: 0.1035 285/500 [================>.............] - ETA: 50s - loss: 0.9595 - regression_loss: 0.8560 - classification_loss: 0.1035 286/500 [================>.............] 
- ETA: 50s - loss: 0.9590 - regression_loss: 0.8555 - classification_loss: 0.1035 287/500 [================>.............] - ETA: 50s - loss: 0.9580 - regression_loss: 0.8547 - classification_loss: 0.1033 288/500 [================>.............] - ETA: 49s - loss: 0.9584 - regression_loss: 0.8551 - classification_loss: 0.1033 289/500 [================>.............] - ETA: 49s - loss: 0.9570 - regression_loss: 0.8540 - classification_loss: 0.1030 290/500 [================>.............] - ETA: 49s - loss: 0.9606 - regression_loss: 0.8569 - classification_loss: 0.1037 291/500 [================>.............] - ETA: 49s - loss: 0.9626 - regression_loss: 0.8587 - classification_loss: 0.1039 292/500 [================>.............] - ETA: 48s - loss: 0.9630 - regression_loss: 0.8590 - classification_loss: 0.1039 293/500 [================>.............] - ETA: 48s - loss: 0.9659 - regression_loss: 0.8610 - classification_loss: 0.1049 294/500 [================>.............] - ETA: 48s - loss: 0.9653 - regression_loss: 0.8604 - classification_loss: 0.1049 295/500 [================>.............] - ETA: 48s - loss: 0.9645 - regression_loss: 0.8598 - classification_loss: 0.1047 296/500 [================>.............] - ETA: 48s - loss: 0.9632 - regression_loss: 0.8586 - classification_loss: 0.1046 297/500 [================>.............] - ETA: 47s - loss: 0.9615 - regression_loss: 0.8571 - classification_loss: 0.1044 298/500 [================>.............] - ETA: 47s - loss: 0.9616 - regression_loss: 0.8571 - classification_loss: 0.1044 299/500 [================>.............] - ETA: 47s - loss: 0.9644 - regression_loss: 0.8600 - classification_loss: 0.1044 300/500 [=================>............] - ETA: 47s - loss: 0.9630 - regression_loss: 0.8589 - classification_loss: 0.1041 301/500 [=================>............] - ETA: 46s - loss: 0.9625 - regression_loss: 0.8585 - classification_loss: 0.1040 302/500 [=================>............] 
- ETA: 46s - loss: 0.9608 - regression_loss: 0.8570 - classification_loss: 0.1038 303/500 [=================>............] - ETA: 46s - loss: 0.9594 - regression_loss: 0.8558 - classification_loss: 0.1036 304/500 [=================>............] - ETA: 46s - loss: 0.9589 - regression_loss: 0.8555 - classification_loss: 0.1034 305/500 [=================>............] - ETA: 45s - loss: 0.9580 - regression_loss: 0.8547 - classification_loss: 0.1032 306/500 [=================>............] - ETA: 45s - loss: 0.9564 - regression_loss: 0.8534 - classification_loss: 0.1030 307/500 [=================>............] - ETA: 45s - loss: 0.9561 - regression_loss: 0.8531 - classification_loss: 0.1030 308/500 [=================>............] - ETA: 45s - loss: 0.9562 - regression_loss: 0.8534 - classification_loss: 0.1028 309/500 [=================>............] - ETA: 44s - loss: 0.9544 - regression_loss: 0.8518 - classification_loss: 0.1026 310/500 [=================>............] - ETA: 44s - loss: 0.9545 - regression_loss: 0.8521 - classification_loss: 0.1024 311/500 [=================>............] - ETA: 44s - loss: 0.9542 - regression_loss: 0.8518 - classification_loss: 0.1024 312/500 [=================>............] - ETA: 44s - loss: 0.9531 - regression_loss: 0.8509 - classification_loss: 0.1023 313/500 [=================>............] - ETA: 43s - loss: 0.9544 - regression_loss: 0.8519 - classification_loss: 0.1025 314/500 [=================>............] - ETA: 43s - loss: 0.9530 - regression_loss: 0.8508 - classification_loss: 0.1022 315/500 [=================>............] - ETA: 43s - loss: 0.9542 - regression_loss: 0.8519 - classification_loss: 0.1023 316/500 [=================>............] - ETA: 43s - loss: 0.9547 - regression_loss: 0.8525 - classification_loss: 0.1022 317/500 [==================>...........] - ETA: 43s - loss: 0.9542 - regression_loss: 0.8521 - classification_loss: 0.1021 318/500 [==================>...........] 
- ETA: 42s - loss: 0.9553 - regression_loss: 0.8530 - classification_loss: 0.1023 319/500 [==================>...........] - ETA: 42s - loss: 0.9542 - regression_loss: 0.8521 - classification_loss: 0.1021 320/500 [==================>...........] - ETA: 42s - loss: 0.9532 - regression_loss: 0.8513 - classification_loss: 0.1019 321/500 [==================>...........] - ETA: 42s - loss: 0.9546 - regression_loss: 0.8525 - classification_loss: 0.1021 322/500 [==================>...........] - ETA: 41s - loss: 0.9566 - regression_loss: 0.8538 - classification_loss: 0.1027 323/500 [==================>...........] - ETA: 41s - loss: 0.9553 - regression_loss: 0.8526 - classification_loss: 0.1027 324/500 [==================>...........] - ETA: 41s - loss: 0.9555 - regression_loss: 0.8529 - classification_loss: 0.1026 325/500 [==================>...........] - ETA: 41s - loss: 0.9571 - regression_loss: 0.8544 - classification_loss: 0.1027 326/500 [==================>...........] - ETA: 40s - loss: 0.9567 - regression_loss: 0.8540 - classification_loss: 0.1027 327/500 [==================>...........] - ETA: 40s - loss: 0.9556 - regression_loss: 0.8531 - classification_loss: 0.1025 328/500 [==================>...........] - ETA: 40s - loss: 0.9543 - regression_loss: 0.8519 - classification_loss: 0.1024 329/500 [==================>...........] - ETA: 40s - loss: 0.9549 - regression_loss: 0.8524 - classification_loss: 0.1025 330/500 [==================>...........] - ETA: 40s - loss: 0.9557 - regression_loss: 0.8532 - classification_loss: 0.1025 331/500 [==================>...........] - ETA: 39s - loss: 0.9544 - regression_loss: 0.8520 - classification_loss: 0.1024 332/500 [==================>...........] - ETA: 39s - loss: 0.9546 - regression_loss: 0.8521 - classification_loss: 0.1025 333/500 [==================>...........] - ETA: 39s - loss: 0.9533 - regression_loss: 0.8509 - classification_loss: 0.1024 334/500 [===================>..........] 
- ETA: 39s - loss: 0.9521 - regression_loss: 0.8500 - classification_loss: 0.1022 335/500 [===================>..........] - ETA: 38s - loss: 0.9516 - regression_loss: 0.8495 - classification_loss: 0.1021 336/500 [===================>..........] - ETA: 38s - loss: 0.9532 - regression_loss: 0.8508 - classification_loss: 0.1024 337/500 [===================>..........] - ETA: 38s - loss: 0.9535 - regression_loss: 0.8509 - classification_loss: 0.1026 338/500 [===================>..........] - ETA: 38s - loss: 0.9528 - regression_loss: 0.8503 - classification_loss: 0.1025 339/500 [===================>..........] - ETA: 37s - loss: 0.9530 - regression_loss: 0.8505 - classification_loss: 0.1024 340/500 [===================>..........] - ETA: 37s - loss: 0.9549 - regression_loss: 0.8520 - classification_loss: 0.1030 341/500 [===================>..........] - ETA: 37s - loss: 0.9547 - regression_loss: 0.8518 - classification_loss: 0.1029 342/500 [===================>..........] - ETA: 37s - loss: 0.9560 - regression_loss: 0.8529 - classification_loss: 0.1031 343/500 [===================>..........] - ETA: 36s - loss: 0.9562 - regression_loss: 0.8532 - classification_loss: 0.1031 344/500 [===================>..........] - ETA: 36s - loss: 0.9548 - regression_loss: 0.8519 - classification_loss: 0.1029 345/500 [===================>..........] - ETA: 36s - loss: 0.9539 - regression_loss: 0.8512 - classification_loss: 0.1027 346/500 [===================>..........] - ETA: 36s - loss: 0.9553 - regression_loss: 0.8523 - classification_loss: 0.1029 347/500 [===================>..........] - ETA: 35s - loss: 0.9551 - regression_loss: 0.8522 - classification_loss: 0.1030 348/500 [===================>..........] - ETA: 35s - loss: 0.9551 - regression_loss: 0.8522 - classification_loss: 0.1029 349/500 [===================>..........] - ETA: 35s - loss: 0.9548 - regression_loss: 0.8520 - classification_loss: 0.1029 350/500 [====================>.........] 
- ETA: 35s - loss: 0.9547 - regression_loss: 0.8517 - classification_loss: 0.1030 351/500 [====================>.........] - ETA: 35s - loss: 0.9526 - regression_loss: 0.8499 - classification_loss: 0.1027 352/500 [====================>.........] - ETA: 34s - loss: 0.9541 - regression_loss: 0.8510 - classification_loss: 0.1031 353/500 [====================>.........] - ETA: 34s - loss: 0.9542 - regression_loss: 0.8510 - classification_loss: 0.1032 354/500 [====================>.........] - ETA: 34s - loss: 0.9529 - regression_loss: 0.8499 - classification_loss: 0.1030 355/500 [====================>.........] - ETA: 34s - loss: 0.9531 - regression_loss: 0.8503 - classification_loss: 0.1028 356/500 [====================>.........] - ETA: 33s - loss: 0.9543 - regression_loss: 0.8512 - classification_loss: 0.1030 357/500 [====================>.........] - ETA: 33s - loss: 0.9540 - regression_loss: 0.8510 - classification_loss: 0.1030 358/500 [====================>.........] - ETA: 33s - loss: 0.9524 - regression_loss: 0.8496 - classification_loss: 0.1028 359/500 [====================>.........] - ETA: 33s - loss: 0.9537 - regression_loss: 0.8506 - classification_loss: 0.1031 360/500 [====================>.........] - ETA: 32s - loss: 0.9543 - regression_loss: 0.8513 - classification_loss: 0.1031 361/500 [====================>.........] - ETA: 32s - loss: 0.9531 - regression_loss: 0.8502 - classification_loss: 0.1029 362/500 [====================>.........] - ETA: 32s - loss: 0.9529 - regression_loss: 0.8500 - classification_loss: 0.1029 363/500 [====================>.........] - ETA: 32s - loss: 0.9533 - regression_loss: 0.8505 - classification_loss: 0.1028 364/500 [====================>.........] - ETA: 31s - loss: 0.9534 - regression_loss: 0.8505 - classification_loss: 0.1029 365/500 [====================>.........] - ETA: 31s - loss: 0.9537 - regression_loss: 0.8507 - classification_loss: 0.1029 366/500 [====================>.........] 
- ETA: 31s - loss: 0.9525 - regression_loss: 0.8498 - classification_loss: 0.1027 367/500 [=====================>........] - ETA: 31s - loss: 0.9520 - regression_loss: 0.8493 - classification_loss: 0.1027 368/500 [=====================>........] - ETA: 31s - loss: 0.9510 - regression_loss: 0.8485 - classification_loss: 0.1025 369/500 [=====================>........] - ETA: 30s - loss: 0.9500 - regression_loss: 0.8477 - classification_loss: 0.1023 370/500 [=====================>........] - ETA: 30s - loss: 0.9519 - regression_loss: 0.8490 - classification_loss: 0.1028 371/500 [=====================>........] - ETA: 30s - loss: 0.9522 - regression_loss: 0.8493 - classification_loss: 0.1028 372/500 [=====================>........] - ETA: 30s - loss: 0.9514 - regression_loss: 0.8486 - classification_loss: 0.1029 373/500 [=====================>........] - ETA: 29s - loss: 0.9518 - regression_loss: 0.8490 - classification_loss: 0.1028 374/500 [=====================>........] - ETA: 29s - loss: 0.9508 - regression_loss: 0.8481 - classification_loss: 0.1028 375/500 [=====================>........] - ETA: 29s - loss: 0.9516 - regression_loss: 0.8483 - classification_loss: 0.1033 376/500 [=====================>........] - ETA: 29s - loss: 0.9496 - regression_loss: 0.8466 - classification_loss: 0.1031 377/500 [=====================>........] - ETA: 28s - loss: 0.9483 - regression_loss: 0.8454 - classification_loss: 0.1029 378/500 [=====================>........] - ETA: 28s - loss: 0.9487 - regression_loss: 0.8457 - classification_loss: 0.1029 379/500 [=====================>........] - ETA: 28s - loss: 0.9481 - regression_loss: 0.8452 - classification_loss: 0.1028 380/500 [=====================>........] - ETA: 28s - loss: 0.9473 - regression_loss: 0.8446 - classification_loss: 0.1027 381/500 [=====================>........] - ETA: 28s - loss: 0.9472 - regression_loss: 0.8446 - classification_loss: 0.1026 382/500 [=====================>........] 
- ETA: 27s - loss: 0.9482 - regression_loss: 0.8455 - classification_loss: 0.1027 383/500 [=====================>........] - ETA: 27s - loss: 0.9484 - regression_loss: 0.8457 - classification_loss: 0.1027 384/500 [======================>.......] - ETA: 27s - loss: 0.9484 - regression_loss: 0.8457 - classification_loss: 0.1027 385/500 [======================>.......] - ETA: 27s - loss: 0.9479 - regression_loss: 0.8452 - classification_loss: 0.1026 386/500 [======================>.......] - ETA: 26s - loss: 0.9468 - regression_loss: 0.8443 - classification_loss: 0.1025 387/500 [======================>.......] - ETA: 26s - loss: 0.9474 - regression_loss: 0.8447 - classification_loss: 0.1027 388/500 [======================>.......] - ETA: 26s - loss: 0.9473 - regression_loss: 0.8445 - classification_loss: 0.1027 389/500 [======================>.......] - ETA: 26s - loss: 0.9484 - regression_loss: 0.8458 - classification_loss: 0.1026 390/500 [======================>.......] - ETA: 25s - loss: 0.9474 - regression_loss: 0.8449 - classification_loss: 0.1025 391/500 [======================>.......] - ETA: 25s - loss: 0.9467 - regression_loss: 0.8443 - classification_loss: 0.1023 392/500 [======================>.......] - ETA: 25s - loss: 0.9475 - regression_loss: 0.8450 - classification_loss: 0.1025 393/500 [======================>.......] - ETA: 25s - loss: 0.9479 - regression_loss: 0.8454 - classification_loss: 0.1025 394/500 [======================>.......] - ETA: 24s - loss: 0.9478 - regression_loss: 0.8453 - classification_loss: 0.1024 395/500 [======================>.......] - ETA: 24s - loss: 0.9492 - regression_loss: 0.8462 - classification_loss: 0.1029 396/500 [======================>.......] - ETA: 24s - loss: 0.9486 - regression_loss: 0.8458 - classification_loss: 0.1028 397/500 [======================>.......] - ETA: 24s - loss: 0.9473 - regression_loss: 0.8446 - classification_loss: 0.1027 398/500 [======================>.......] 
- ETA: 24s - loss: 0.9474 - regression_loss: 0.8448 - classification_loss: 0.1026 399/500 [======================>.......] - ETA: 23s - loss: 0.9468 - regression_loss: 0.8444 - classification_loss: 0.1025 400/500 [=======================>......] - ETA: 23s - loss: 0.9471 - regression_loss: 0.8445 - classification_loss: 0.1026 401/500 [=======================>......] - ETA: 23s - loss: 0.9473 - regression_loss: 0.8445 - classification_loss: 0.1028 402/500 [=======================>......] - ETA: 23s - loss: 0.9481 - regression_loss: 0.8454 - classification_loss: 0.1027 403/500 [=======================>......] - ETA: 22s - loss: 0.9475 - regression_loss: 0.8449 - classification_loss: 0.1026 404/500 [=======================>......] - ETA: 22s - loss: 0.9473 - regression_loss: 0.8448 - classification_loss: 0.1025 405/500 [=======================>......] - ETA: 22s - loss: 0.9468 - regression_loss: 0.8445 - classification_loss: 0.1023 406/500 [=======================>......] - ETA: 22s - loss: 0.9458 - regression_loss: 0.8436 - classification_loss: 0.1022 407/500 [=======================>......] - ETA: 21s - loss: 0.9458 - regression_loss: 0.8435 - classification_loss: 0.1023 408/500 [=======================>......] - ETA: 21s - loss: 0.9474 - regression_loss: 0.8450 - classification_loss: 0.1023 409/500 [=======================>......] - ETA: 21s - loss: 0.9480 - regression_loss: 0.8457 - classification_loss: 0.1024 410/500 [=======================>......] - ETA: 21s - loss: 0.9482 - regression_loss: 0.8458 - classification_loss: 0.1024 411/500 [=======================>......] - ETA: 20s - loss: 0.9484 - regression_loss: 0.8460 - classification_loss: 0.1023 412/500 [=======================>......] - ETA: 20s - loss: 0.9477 - regression_loss: 0.8455 - classification_loss: 0.1022 413/500 [=======================>......] - ETA: 20s - loss: 0.9494 - regression_loss: 0.8470 - classification_loss: 0.1024 414/500 [=======================>......] 
- ETA: 20s - loss: 0.9498 - regression_loss: 0.8473 - classification_loss: 0.1025 415/500 [=======================>......] - ETA: 20s - loss: 0.9506 - regression_loss: 0.8480 - classification_loss: 0.1026 416/500 [=======================>......] - ETA: 19s - loss: 0.9504 - regression_loss: 0.8479 - classification_loss: 0.1025 417/500 [========================>.....] - ETA: 19s - loss: 0.9504 - regression_loss: 0.8479 - classification_loss: 0.1025 418/500 [========================>.....] - ETA: 19s - loss: 0.9497 - regression_loss: 0.8473 - classification_loss: 0.1024 419/500 [========================>.....] - ETA: 19s - loss: 0.9493 - regression_loss: 0.8470 - classification_loss: 0.1024 420/500 [========================>.....] - ETA: 18s - loss: 0.9480 - regression_loss: 0.8456 - classification_loss: 0.1023 421/500 [========================>.....] - ETA: 18s - loss: 0.9486 - regression_loss: 0.8462 - classification_loss: 0.1023 422/500 [========================>.....] - ETA: 18s - loss: 0.9484 - regression_loss: 0.8462 - classification_loss: 0.1022 423/500 [========================>.....] - ETA: 18s - loss: 0.9488 - regression_loss: 0.8466 - classification_loss: 0.1021 424/500 [========================>.....] - ETA: 17s - loss: 0.9489 - regression_loss: 0.8468 - classification_loss: 0.1021 425/500 [========================>.....] - ETA: 17s - loss: 0.9479 - regression_loss: 0.8460 - classification_loss: 0.1019 426/500 [========================>.....] - ETA: 17s - loss: 0.9483 - regression_loss: 0.8463 - classification_loss: 0.1020 427/500 [========================>.....] - ETA: 17s - loss: 0.9487 - regression_loss: 0.8465 - classification_loss: 0.1022 428/500 [========================>.....] - ETA: 16s - loss: 0.9489 - regression_loss: 0.8466 - classification_loss: 0.1023 429/500 [========================>.....] - ETA: 16s - loss: 0.9478 - regression_loss: 0.8456 - classification_loss: 0.1022 430/500 [========================>.....] 
- ETA: 16s - loss: 0.9471 - regression_loss: 0.8451 - classification_loss: 0.1020 431/500 [========================>.....] - ETA: 16s - loss: 0.9473 - regression_loss: 0.8454 - classification_loss: 0.1020 432/500 [========================>.....] - ETA: 16s - loss: 0.9464 - regression_loss: 0.8445 - classification_loss: 0.1019 433/500 [========================>.....] - ETA: 15s - loss: 0.9450 - regression_loss: 0.8433 - classification_loss: 0.1017 434/500 [=========================>....] - ETA: 15s - loss: 0.9449 - regression_loss: 0.8432 - classification_loss: 0.1017 435/500 [=========================>....] - ETA: 15s - loss: 0.9452 - regression_loss: 0.8435 - classification_loss: 0.1017 436/500 [=========================>....] - ETA: 15s - loss: 0.9451 - regression_loss: 0.8435 - classification_loss: 0.1016 437/500 [=========================>....] - ETA: 14s - loss: 0.9439 - regression_loss: 0.8425 - classification_loss: 0.1014 438/500 [=========================>....] - ETA: 14s - loss: 0.9431 - regression_loss: 0.8418 - classification_loss: 0.1013 439/500 [=========================>....] - ETA: 14s - loss: 0.9428 - regression_loss: 0.8414 - classification_loss: 0.1014 440/500 [=========================>....] - ETA: 14s - loss: 0.9415 - regression_loss: 0.8403 - classification_loss: 0.1012 441/500 [=========================>....] - ETA: 13s - loss: 0.9419 - regression_loss: 0.8406 - classification_loss: 0.1013 442/500 [=========================>....] - ETA: 13s - loss: 0.9405 - regression_loss: 0.8395 - classification_loss: 0.1010 443/500 [=========================>....] - ETA: 13s - loss: 0.9400 - regression_loss: 0.8390 - classification_loss: 0.1010 444/500 [=========================>....] - ETA: 13s - loss: 0.9403 - regression_loss: 0.8393 - classification_loss: 0.1010 445/500 [=========================>....] - ETA: 12s - loss: 0.9397 - regression_loss: 0.8388 - classification_loss: 0.1010 446/500 [=========================>....] 
- ETA: 12s - loss: 0.9384 - regression_loss: 0.8376 - classification_loss: 0.1008
[... per-batch progress updates for steps 447-499 of epoch 32 condensed ...]
500/500 [==============================] - 118s 235ms/step - loss: 0.9337 - regression_loss: 0.8337 - classification_loss: 0.1000
326 instances of class plum with average precision: 0.8442
mAP: 0.8442
Epoch 00032: saving model to ./training/snapshots/resnet50_pascal_32.h5
Epoch 33/150
1/500 [..............................] - ETA: 1:50 - loss: 0.9208 - regression_loss: 0.7763 - classification_loss: 0.1444
[... per-batch progress updates for steps 2-279 of epoch 33 condensed ...]
280/500 [===============>..............] - ETA: 51s - loss: 0.9613 - regression_loss: 0.8527 - classification_loss: 0.1086
281/500 [===============>..............]
- ETA: 51s - loss: 0.9684 - regression_loss: 0.8587 - classification_loss: 0.1097 282/500 [===============>..............] - ETA: 50s - loss: 0.9717 - regression_loss: 0.8615 - classification_loss: 0.1102 283/500 [===============>..............] - ETA: 50s - loss: 0.9704 - regression_loss: 0.8603 - classification_loss: 0.1101 284/500 [================>.............] - ETA: 50s - loss: 0.9694 - regression_loss: 0.8594 - classification_loss: 0.1099 285/500 [================>.............] - ETA: 50s - loss: 0.9699 - regression_loss: 0.8600 - classification_loss: 0.1099 286/500 [================>.............] - ETA: 50s - loss: 0.9697 - regression_loss: 0.8598 - classification_loss: 0.1099 287/500 [================>.............] - ETA: 49s - loss: 0.9675 - regression_loss: 0.8579 - classification_loss: 0.1096 288/500 [================>.............] - ETA: 49s - loss: 0.9672 - regression_loss: 0.8577 - classification_loss: 0.1095 289/500 [================>.............] - ETA: 49s - loss: 0.9659 - regression_loss: 0.8566 - classification_loss: 0.1092 290/500 [================>.............] - ETA: 49s - loss: 0.9694 - regression_loss: 0.8600 - classification_loss: 0.1094 291/500 [================>.............] - ETA: 48s - loss: 0.9692 - regression_loss: 0.8598 - classification_loss: 0.1094 292/500 [================>.............] - ETA: 48s - loss: 0.9694 - regression_loss: 0.8600 - classification_loss: 0.1094 293/500 [================>.............] - ETA: 48s - loss: 0.9716 - regression_loss: 0.8617 - classification_loss: 0.1099 294/500 [================>.............] - ETA: 48s - loss: 0.9722 - regression_loss: 0.8622 - classification_loss: 0.1100 295/500 [================>.............] - ETA: 47s - loss: 0.9727 - regression_loss: 0.8628 - classification_loss: 0.1099 296/500 [================>.............] - ETA: 47s - loss: 0.9747 - regression_loss: 0.8646 - classification_loss: 0.1100 297/500 [================>.............] 
- ETA: 47s - loss: 0.9749 - regression_loss: 0.8648 - classification_loss: 0.1100 298/500 [================>.............] - ETA: 47s - loss: 0.9753 - regression_loss: 0.8650 - classification_loss: 0.1103 299/500 [================>.............] - ETA: 46s - loss: 0.9737 - regression_loss: 0.8637 - classification_loss: 0.1100 300/500 [=================>............] - ETA: 46s - loss: 0.9733 - regression_loss: 0.8633 - classification_loss: 0.1100 301/500 [=================>............] - ETA: 46s - loss: 0.9782 - regression_loss: 0.8670 - classification_loss: 0.1111 302/500 [=================>............] - ETA: 46s - loss: 0.9800 - regression_loss: 0.8688 - classification_loss: 0.1113 303/500 [=================>............] - ETA: 46s - loss: 0.9837 - regression_loss: 0.8722 - classification_loss: 0.1116 304/500 [=================>............] - ETA: 45s - loss: 0.9831 - regression_loss: 0.8717 - classification_loss: 0.1115 305/500 [=================>............] - ETA: 45s - loss: 0.9818 - regression_loss: 0.8706 - classification_loss: 0.1113 306/500 [=================>............] - ETA: 45s - loss: 0.9808 - regression_loss: 0.8697 - classification_loss: 0.1111 307/500 [=================>............] - ETA: 45s - loss: 0.9819 - regression_loss: 0.8705 - classification_loss: 0.1114 308/500 [=================>............] - ETA: 44s - loss: 0.9802 - regression_loss: 0.8691 - classification_loss: 0.1111 309/500 [=================>............] - ETA: 44s - loss: 0.9815 - regression_loss: 0.8699 - classification_loss: 0.1116 310/500 [=================>............] - ETA: 44s - loss: 0.9805 - regression_loss: 0.8691 - classification_loss: 0.1114 311/500 [=================>............] - ETA: 44s - loss: 0.9799 - regression_loss: 0.8686 - classification_loss: 0.1113 312/500 [=================>............] - ETA: 43s - loss: 0.9790 - regression_loss: 0.8679 - classification_loss: 0.1111 313/500 [=================>............] 
- ETA: 43s - loss: 0.9798 - regression_loss: 0.8686 - classification_loss: 0.1112 314/500 [=================>............] - ETA: 43s - loss: 0.9808 - regression_loss: 0.8694 - classification_loss: 0.1114 315/500 [=================>............] - ETA: 43s - loss: 0.9798 - regression_loss: 0.8686 - classification_loss: 0.1112 316/500 [=================>............] - ETA: 42s - loss: 0.9795 - regression_loss: 0.8682 - classification_loss: 0.1113 317/500 [==================>...........] - ETA: 42s - loss: 0.9787 - regression_loss: 0.8676 - classification_loss: 0.1111 318/500 [==================>...........] - ETA: 42s - loss: 0.9767 - regression_loss: 0.8659 - classification_loss: 0.1108 319/500 [==================>...........] - ETA: 42s - loss: 0.9770 - regression_loss: 0.8661 - classification_loss: 0.1109 320/500 [==================>...........] - ETA: 42s - loss: 0.9763 - regression_loss: 0.8656 - classification_loss: 0.1108 321/500 [==================>...........] - ETA: 41s - loss: 0.9772 - regression_loss: 0.8664 - classification_loss: 0.1109 322/500 [==================>...........] - ETA: 41s - loss: 0.9778 - regression_loss: 0.8669 - classification_loss: 0.1109 323/500 [==================>...........] - ETA: 41s - loss: 0.9758 - regression_loss: 0.8652 - classification_loss: 0.1106 324/500 [==================>...........] - ETA: 41s - loss: 0.9753 - regression_loss: 0.8648 - classification_loss: 0.1105 325/500 [==================>...........] - ETA: 40s - loss: 0.9758 - regression_loss: 0.8652 - classification_loss: 0.1106 326/500 [==================>...........] - ETA: 40s - loss: 0.9752 - regression_loss: 0.8647 - classification_loss: 0.1104 327/500 [==================>...........] - ETA: 40s - loss: 0.9740 - regression_loss: 0.8638 - classification_loss: 0.1102 328/500 [==================>...........] - ETA: 40s - loss: 0.9729 - regression_loss: 0.8627 - classification_loss: 0.1101 329/500 [==================>...........] 
- ETA: 39s - loss: 0.9727 - regression_loss: 0.8626 - classification_loss: 0.1101 330/500 [==================>...........] - ETA: 39s - loss: 0.9730 - regression_loss: 0.8629 - classification_loss: 0.1101 331/500 [==================>...........] - ETA: 39s - loss: 0.9753 - regression_loss: 0.8650 - classification_loss: 0.1103 332/500 [==================>...........] - ETA: 39s - loss: 0.9759 - regression_loss: 0.8657 - classification_loss: 0.1102 333/500 [==================>...........] - ETA: 39s - loss: 0.9769 - regression_loss: 0.8667 - classification_loss: 0.1102 334/500 [===================>..........] - ETA: 38s - loss: 0.9770 - regression_loss: 0.8669 - classification_loss: 0.1100 335/500 [===================>..........] - ETA: 38s - loss: 0.9766 - regression_loss: 0.8666 - classification_loss: 0.1100 336/500 [===================>..........] - ETA: 38s - loss: 0.9757 - regression_loss: 0.8660 - classification_loss: 0.1098 337/500 [===================>..........] - ETA: 38s - loss: 0.9759 - regression_loss: 0.8661 - classification_loss: 0.1098 338/500 [===================>..........] - ETA: 37s - loss: 0.9738 - regression_loss: 0.8643 - classification_loss: 0.1095 339/500 [===================>..........] - ETA: 37s - loss: 0.9754 - regression_loss: 0.8655 - classification_loss: 0.1099 340/500 [===================>..........] - ETA: 37s - loss: 0.9745 - regression_loss: 0.8648 - classification_loss: 0.1097 341/500 [===================>..........] - ETA: 37s - loss: 0.9751 - regression_loss: 0.8655 - classification_loss: 0.1096 342/500 [===================>..........] - ETA: 36s - loss: 0.9757 - regression_loss: 0.8660 - classification_loss: 0.1097 343/500 [===================>..........] - ETA: 36s - loss: 0.9763 - regression_loss: 0.8667 - classification_loss: 0.1097 344/500 [===================>..........] - ETA: 36s - loss: 0.9765 - regression_loss: 0.8668 - classification_loss: 0.1097 345/500 [===================>..........] 
- ETA: 36s - loss: 0.9787 - regression_loss: 0.8688 - classification_loss: 0.1099 346/500 [===================>..........] - ETA: 35s - loss: 0.9782 - regression_loss: 0.8685 - classification_loss: 0.1097 347/500 [===================>..........] - ETA: 35s - loss: 0.9773 - regression_loss: 0.8678 - classification_loss: 0.1095 348/500 [===================>..........] - ETA: 35s - loss: 0.9772 - regression_loss: 0.8675 - classification_loss: 0.1097 349/500 [===================>..........] - ETA: 35s - loss: 0.9758 - regression_loss: 0.8663 - classification_loss: 0.1095 350/500 [====================>.........] - ETA: 35s - loss: 0.9748 - regression_loss: 0.8655 - classification_loss: 0.1093 351/500 [====================>.........] - ETA: 34s - loss: 0.9746 - regression_loss: 0.8654 - classification_loss: 0.1092 352/500 [====================>.........] - ETA: 34s - loss: 0.9736 - regression_loss: 0.8645 - classification_loss: 0.1091 353/500 [====================>.........] - ETA: 34s - loss: 0.9744 - regression_loss: 0.8653 - classification_loss: 0.1091 354/500 [====================>.........] - ETA: 34s - loss: 0.9741 - regression_loss: 0.8651 - classification_loss: 0.1090 355/500 [====================>.........] - ETA: 33s - loss: 0.9737 - regression_loss: 0.8648 - classification_loss: 0.1089 356/500 [====================>.........] - ETA: 33s - loss: 0.9737 - regression_loss: 0.8649 - classification_loss: 0.1089 357/500 [====================>.........] - ETA: 33s - loss: 0.9733 - regression_loss: 0.8646 - classification_loss: 0.1088 358/500 [====================>.........] - ETA: 33s - loss: 0.9728 - regression_loss: 0.8642 - classification_loss: 0.1086 359/500 [====================>.........] - ETA: 32s - loss: 0.9707 - regression_loss: 0.8624 - classification_loss: 0.1083 360/500 [====================>.........] - ETA: 32s - loss: 0.9708 - regression_loss: 0.8624 - classification_loss: 0.1085 361/500 [====================>.........] 
- ETA: 32s - loss: 0.9735 - regression_loss: 0.8645 - classification_loss: 0.1089 362/500 [====================>.........] - ETA: 32s - loss: 0.9745 - regression_loss: 0.8654 - classification_loss: 0.1091 363/500 [====================>.........] - ETA: 32s - loss: 0.9751 - regression_loss: 0.8656 - classification_loss: 0.1095 364/500 [====================>.........] - ETA: 31s - loss: 0.9741 - regression_loss: 0.8648 - classification_loss: 0.1094 365/500 [====================>.........] - ETA: 31s - loss: 0.9740 - regression_loss: 0.8648 - classification_loss: 0.1093 366/500 [====================>.........] - ETA: 31s - loss: 0.9722 - regression_loss: 0.8631 - classification_loss: 0.1091 367/500 [=====================>........] - ETA: 31s - loss: 0.9701 - regression_loss: 0.8613 - classification_loss: 0.1088 368/500 [=====================>........] - ETA: 30s - loss: 0.9709 - regression_loss: 0.8619 - classification_loss: 0.1089 369/500 [=====================>........] - ETA: 30s - loss: 0.9713 - regression_loss: 0.8623 - classification_loss: 0.1090 370/500 [=====================>........] - ETA: 30s - loss: 0.9729 - regression_loss: 0.8634 - classification_loss: 0.1095 371/500 [=====================>........] - ETA: 30s - loss: 0.9726 - regression_loss: 0.8633 - classification_loss: 0.1093 372/500 [=====================>........] - ETA: 29s - loss: 0.9732 - regression_loss: 0.8638 - classification_loss: 0.1094 373/500 [=====================>........] - ETA: 29s - loss: 0.9740 - regression_loss: 0.8644 - classification_loss: 0.1096 374/500 [=====================>........] - ETA: 29s - loss: 0.9745 - regression_loss: 0.8648 - classification_loss: 0.1097 375/500 [=====================>........] - ETA: 29s - loss: 0.9752 - regression_loss: 0.8654 - classification_loss: 0.1098 376/500 [=====================>........] - ETA: 28s - loss: 0.9747 - regression_loss: 0.8650 - classification_loss: 0.1097 377/500 [=====================>........] 
- ETA: 28s - loss: 0.9736 - regression_loss: 0.8641 - classification_loss: 0.1095 378/500 [=====================>........] - ETA: 28s - loss: 0.9724 - regression_loss: 0.8631 - classification_loss: 0.1094 379/500 [=====================>........] - ETA: 28s - loss: 0.9727 - regression_loss: 0.8634 - classification_loss: 0.1094 380/500 [=====================>........] - ETA: 28s - loss: 0.9736 - regression_loss: 0.8638 - classification_loss: 0.1097 381/500 [=====================>........] - ETA: 27s - loss: 0.9737 - regression_loss: 0.8640 - classification_loss: 0.1097 382/500 [=====================>........] - ETA: 27s - loss: 0.9755 - regression_loss: 0.8656 - classification_loss: 0.1100 383/500 [=====================>........] - ETA: 27s - loss: 0.9740 - regression_loss: 0.8643 - classification_loss: 0.1097 384/500 [======================>.......] - ETA: 27s - loss: 0.9728 - regression_loss: 0.8633 - classification_loss: 0.1095 385/500 [======================>.......] - ETA: 26s - loss: 0.9744 - regression_loss: 0.8648 - classification_loss: 0.1097 386/500 [======================>.......] - ETA: 26s - loss: 0.9732 - regression_loss: 0.8638 - classification_loss: 0.1094 387/500 [======================>.......] - ETA: 26s - loss: 0.9729 - regression_loss: 0.8635 - classification_loss: 0.1093 388/500 [======================>.......] - ETA: 26s - loss: 0.9727 - regression_loss: 0.8634 - classification_loss: 0.1093 389/500 [======================>.......] - ETA: 25s - loss: 0.9718 - regression_loss: 0.8627 - classification_loss: 0.1091 390/500 [======================>.......] - ETA: 25s - loss: 0.9720 - regression_loss: 0.8629 - classification_loss: 0.1090 391/500 [======================>.......] - ETA: 25s - loss: 0.9730 - regression_loss: 0.8638 - classification_loss: 0.1092 392/500 [======================>.......] - ETA: 25s - loss: 0.9746 - regression_loss: 0.8651 - classification_loss: 0.1095 393/500 [======================>.......] 
- ETA: 24s - loss: 0.9742 - regression_loss: 0.8646 - classification_loss: 0.1096 394/500 [======================>.......] - ETA: 24s - loss: 0.9738 - regression_loss: 0.8643 - classification_loss: 0.1095 395/500 [======================>.......] - ETA: 24s - loss: 0.9731 - regression_loss: 0.8636 - classification_loss: 0.1095 396/500 [======================>.......] - ETA: 24s - loss: 0.9725 - regression_loss: 0.8631 - classification_loss: 0.1094 397/500 [======================>.......] - ETA: 24s - loss: 0.9720 - regression_loss: 0.8627 - classification_loss: 0.1094 398/500 [======================>.......] - ETA: 23s - loss: 0.9712 - regression_loss: 0.8619 - classification_loss: 0.1092 399/500 [======================>.......] - ETA: 23s - loss: 0.9729 - regression_loss: 0.8633 - classification_loss: 0.1097 400/500 [=======================>......] - ETA: 23s - loss: 0.9736 - regression_loss: 0.8639 - classification_loss: 0.1097 401/500 [=======================>......] - ETA: 23s - loss: 0.9750 - regression_loss: 0.8649 - classification_loss: 0.1101 402/500 [=======================>......] - ETA: 22s - loss: 0.9748 - regression_loss: 0.8648 - classification_loss: 0.1101 403/500 [=======================>......] - ETA: 22s - loss: 0.9734 - regression_loss: 0.8635 - classification_loss: 0.1099 404/500 [=======================>......] - ETA: 22s - loss: 0.9733 - regression_loss: 0.8635 - classification_loss: 0.1098 405/500 [=======================>......] - ETA: 22s - loss: 0.9736 - regression_loss: 0.8638 - classification_loss: 0.1097 406/500 [=======================>......] - ETA: 21s - loss: 0.9753 - regression_loss: 0.8653 - classification_loss: 0.1100 407/500 [=======================>......] - ETA: 21s - loss: 0.9740 - regression_loss: 0.8642 - classification_loss: 0.1098 408/500 [=======================>......] - ETA: 21s - loss: 0.9739 - regression_loss: 0.8642 - classification_loss: 0.1097 409/500 [=======================>......] 
- ETA: 21s - loss: 0.9756 - regression_loss: 0.8658 - classification_loss: 0.1098 410/500 [=======================>......] - ETA: 21s - loss: 0.9761 - regression_loss: 0.8663 - classification_loss: 0.1098 411/500 [=======================>......] - ETA: 20s - loss: 0.9773 - regression_loss: 0.8675 - classification_loss: 0.1098 412/500 [=======================>......] - ETA: 20s - loss: 0.9773 - regression_loss: 0.8674 - classification_loss: 0.1099 413/500 [=======================>......] - ETA: 20s - loss: 0.9785 - regression_loss: 0.8683 - classification_loss: 0.1102 414/500 [=======================>......] - ETA: 20s - loss: 0.9788 - regression_loss: 0.8684 - classification_loss: 0.1104 415/500 [=======================>......] - ETA: 19s - loss: 0.9799 - regression_loss: 0.8696 - classification_loss: 0.1103 416/500 [=======================>......] - ETA: 19s - loss: 0.9784 - regression_loss: 0.8683 - classification_loss: 0.1101 417/500 [========================>.....] - ETA: 19s - loss: 0.9782 - regression_loss: 0.8682 - classification_loss: 0.1100 418/500 [========================>.....] - ETA: 19s - loss: 0.9770 - regression_loss: 0.8672 - classification_loss: 0.1098 419/500 [========================>.....] - ETA: 18s - loss: 0.9763 - regression_loss: 0.8667 - classification_loss: 0.1096 420/500 [========================>.....] - ETA: 18s - loss: 0.9748 - regression_loss: 0.8654 - classification_loss: 0.1094 421/500 [========================>.....] - ETA: 18s - loss: 0.9734 - regression_loss: 0.8642 - classification_loss: 0.1093 422/500 [========================>.....] - ETA: 18s - loss: 0.9728 - regression_loss: 0.8637 - classification_loss: 0.1091 423/500 [========================>.....] - ETA: 17s - loss: 0.9718 - regression_loss: 0.8629 - classification_loss: 0.1089 424/500 [========================>.....] - ETA: 17s - loss: 0.9715 - regression_loss: 0.8625 - classification_loss: 0.1090 425/500 [========================>.....] 
- ETA: 17s - loss: 0.9711 - regression_loss: 0.8622 - classification_loss: 0.1090 426/500 [========================>.....] - ETA: 17s - loss: 0.9703 - regression_loss: 0.8615 - classification_loss: 0.1088 427/500 [========================>.....] - ETA: 17s - loss: 0.9710 - regression_loss: 0.8622 - classification_loss: 0.1089 428/500 [========================>.....] - ETA: 16s - loss: 0.9705 - regression_loss: 0.8617 - classification_loss: 0.1087 429/500 [========================>.....] - ETA: 16s - loss: 0.9694 - regression_loss: 0.8609 - classification_loss: 0.1086 430/500 [========================>.....] - ETA: 16s - loss: 0.9700 - regression_loss: 0.8614 - classification_loss: 0.1086 431/500 [========================>.....] - ETA: 16s - loss: 0.9692 - regression_loss: 0.8608 - classification_loss: 0.1085 432/500 [========================>.....] - ETA: 15s - loss: 0.9706 - regression_loss: 0.8620 - classification_loss: 0.1085 433/500 [========================>.....] - ETA: 15s - loss: 0.9693 - regression_loss: 0.8609 - classification_loss: 0.1083 434/500 [=========================>....] - ETA: 15s - loss: 0.9683 - regression_loss: 0.8600 - classification_loss: 0.1083 435/500 [=========================>....] - ETA: 15s - loss: 0.9681 - regression_loss: 0.8598 - classification_loss: 0.1083 436/500 [=========================>....] - ETA: 14s - loss: 0.9671 - regression_loss: 0.8589 - classification_loss: 0.1081 437/500 [=========================>....] - ETA: 14s - loss: 0.9677 - regression_loss: 0.8596 - classification_loss: 0.1081 438/500 [=========================>....] - ETA: 14s - loss: 0.9675 - regression_loss: 0.8595 - classification_loss: 0.1080 439/500 [=========================>....] - ETA: 14s - loss: 0.9666 - regression_loss: 0.8587 - classification_loss: 0.1079 440/500 [=========================>....] - ETA: 14s - loss: 0.9661 - regression_loss: 0.8583 - classification_loss: 0.1078 441/500 [=========================>....] 
- ETA: 13s - loss: 0.9653 - regression_loss: 0.8576 - classification_loss: 0.1076 442/500 [=========================>....] - ETA: 13s - loss: 0.9656 - regression_loss: 0.8580 - classification_loss: 0.1076 443/500 [=========================>....] - ETA: 13s - loss: 0.9667 - regression_loss: 0.8588 - classification_loss: 0.1079 444/500 [=========================>....] - ETA: 13s - loss: 0.9663 - regression_loss: 0.8585 - classification_loss: 0.1078 445/500 [=========================>....] - ETA: 12s - loss: 0.9668 - regression_loss: 0.8589 - classification_loss: 0.1078 446/500 [=========================>....] - ETA: 12s - loss: 0.9655 - regression_loss: 0.8578 - classification_loss: 0.1077 447/500 [=========================>....] - ETA: 12s - loss: 0.9659 - regression_loss: 0.8583 - classification_loss: 0.1076 448/500 [=========================>....] - ETA: 12s - loss: 0.9649 - regression_loss: 0.8574 - classification_loss: 0.1074 449/500 [=========================>....] - ETA: 11s - loss: 0.9650 - regression_loss: 0.8574 - classification_loss: 0.1075 450/500 [==========================>...] - ETA: 11s - loss: 0.9658 - regression_loss: 0.8583 - classification_loss: 0.1075 451/500 [==========================>...] - ETA: 11s - loss: 0.9659 - regression_loss: 0.8585 - classification_loss: 0.1074 452/500 [==========================>...] - ETA: 11s - loss: 0.9643 - regression_loss: 0.8570 - classification_loss: 0.1073 453/500 [==========================>...] - ETA: 10s - loss: 0.9641 - regression_loss: 0.8570 - classification_loss: 0.1071 454/500 [==========================>...] - ETA: 10s - loss: 0.9649 - regression_loss: 0.8575 - classification_loss: 0.1074 455/500 [==========================>...] - ETA: 10s - loss: 0.9654 - regression_loss: 0.8579 - classification_loss: 0.1074 456/500 [==========================>...] - ETA: 10s - loss: 0.9671 - regression_loss: 0.8595 - classification_loss: 0.1076 457/500 [==========================>...] 
- ETA: 10s - loss: 0.9672 - regression_loss: 0.8595 - classification_loss: 0.1077 458/500 [==========================>...] - ETA: 9s - loss: 0.9678 - regression_loss: 0.8602 - classification_loss: 0.1077  459/500 [==========================>...] - ETA: 9s - loss: 0.9669 - regression_loss: 0.8594 - classification_loss: 0.1075 460/500 [==========================>...] - ETA: 9s - loss: 0.9656 - regression_loss: 0.8583 - classification_loss: 0.1073 461/500 [==========================>...] - ETA: 9s - loss: 0.9675 - regression_loss: 0.8600 - classification_loss: 0.1075 462/500 [==========================>...] - ETA: 8s - loss: 0.9677 - regression_loss: 0.8602 - classification_loss: 0.1075 463/500 [==========================>...] - ETA: 8s - loss: 0.9675 - regression_loss: 0.8601 - classification_loss: 0.1074 464/500 [==========================>...] - ETA: 8s - loss: 0.9667 - regression_loss: 0.8595 - classification_loss: 0.1073 465/500 [==========================>...] - ETA: 8s - loss: 0.9661 - regression_loss: 0.8587 - classification_loss: 0.1074 466/500 [==========================>...] - ETA: 7s - loss: 0.9679 - regression_loss: 0.8603 - classification_loss: 0.1076 467/500 [===========================>..] - ETA: 7s - loss: 0.9663 - regression_loss: 0.8588 - classification_loss: 0.1074 468/500 [===========================>..] - ETA: 7s - loss: 0.9669 - regression_loss: 0.8594 - classification_loss: 0.1075 469/500 [===========================>..] - ETA: 7s - loss: 0.9659 - regression_loss: 0.8586 - classification_loss: 0.1073 470/500 [===========================>..] - ETA: 7s - loss: 0.9652 - regression_loss: 0.8581 - classification_loss: 0.1072 471/500 [===========================>..] - ETA: 6s - loss: 0.9661 - regression_loss: 0.8586 - classification_loss: 0.1075 472/500 [===========================>..] - ETA: 6s - loss: 0.9667 - regression_loss: 0.8591 - classification_loss: 0.1076 473/500 [===========================>..] 
- ETA: 6s - loss: 0.9662 - regression_loss: 0.8586 - classification_loss: 0.1076 474/500 [===========================>..] - ETA: 6s - loss: 0.9651 - regression_loss: 0.8577 - classification_loss: 0.1074 475/500 [===========================>..] - ETA: 5s - loss: 0.9655 - regression_loss: 0.8577 - classification_loss: 0.1077 476/500 [===========================>..] - ETA: 5s - loss: 0.9652 - regression_loss: 0.8574 - classification_loss: 0.1078 477/500 [===========================>..] - ETA: 5s - loss: 0.9645 - regression_loss: 0.8569 - classification_loss: 0.1076 478/500 [===========================>..] - ETA: 5s - loss: 0.9647 - regression_loss: 0.8571 - classification_loss: 0.1076 479/500 [===========================>..] - ETA: 4s - loss: 0.9644 - regression_loss: 0.8569 - classification_loss: 0.1076 480/500 [===========================>..] - ETA: 4s - loss: 0.9640 - regression_loss: 0.8565 - classification_loss: 0.1075 481/500 [===========================>..] - ETA: 4s - loss: 0.9632 - regression_loss: 0.8559 - classification_loss: 0.1073 482/500 [===========================>..] - ETA: 4s - loss: 0.9626 - regression_loss: 0.8554 - classification_loss: 0.1072 483/500 [===========================>..] - ETA: 3s - loss: 0.9625 - regression_loss: 0.8554 - classification_loss: 0.1071 484/500 [============================>.] - ETA: 3s - loss: 0.9639 - regression_loss: 0.8563 - classification_loss: 0.1076 485/500 [============================>.] - ETA: 3s - loss: 0.9634 - regression_loss: 0.8559 - classification_loss: 0.1074 486/500 [============================>.] - ETA: 3s - loss: 0.9635 - regression_loss: 0.8561 - classification_loss: 0.1074 487/500 [============================>.] - ETA: 3s - loss: 0.9638 - regression_loss: 0.8565 - classification_loss: 0.1073 488/500 [============================>.] - ETA: 2s - loss: 0.9629 - regression_loss: 0.8557 - classification_loss: 0.1072 489/500 [============================>.] 
[per-batch progress bars for the remainder of epoch 33 elided]
500/500 [==============================] - 117s 234ms/step - loss: 0.9643 - regression_loss: 0.8576 - classification_loss: 0.1067
326 instances of class plum with average precision: 0.8530
mAP: 0.8530
Epoch 00033: saving model to ./training/snapshots/resnet50_pascal_33.h5
Epoch 34/150
[per-batch progress bars for batches 1-323/500 of epoch 34 elided; the running loss fluctuated between roughly 0.60 and 1.09 over these batches, with the last complete update at batch 323: loss: 0.9825 - regression_loss: 0.8491 - classification_loss: 0.1334]
- ETA: 41s - loss: 0.9811 - regression_loss: 0.8480 - classification_loss: 0.1332 325/500 [==================>...........] - ETA: 40s - loss: 0.9800 - regression_loss: 0.8471 - classification_loss: 0.1330 326/500 [==================>...........] - ETA: 40s - loss: 0.9789 - regression_loss: 0.8463 - classification_loss: 0.1326 327/500 [==================>...........] - ETA: 40s - loss: 0.9779 - regression_loss: 0.8454 - classification_loss: 0.1325 328/500 [==================>...........] - ETA: 40s - loss: 0.9786 - regression_loss: 0.8463 - classification_loss: 0.1323 329/500 [==================>...........] - ETA: 39s - loss: 0.9795 - regression_loss: 0.8470 - classification_loss: 0.1325 330/500 [==================>...........] - ETA: 39s - loss: 0.9789 - regression_loss: 0.8465 - classification_loss: 0.1323 331/500 [==================>...........] - ETA: 39s - loss: 0.9781 - regression_loss: 0.8459 - classification_loss: 0.1322 332/500 [==================>...........] - ETA: 39s - loss: 0.9765 - regression_loss: 0.8446 - classification_loss: 0.1319 333/500 [==================>...........] - ETA: 39s - loss: 0.9786 - regression_loss: 0.8460 - classification_loss: 0.1326 334/500 [===================>..........] - ETA: 38s - loss: 0.9793 - regression_loss: 0.8468 - classification_loss: 0.1325 335/500 [===================>..........] - ETA: 38s - loss: 0.9797 - regression_loss: 0.8471 - classification_loss: 0.1325 336/500 [===================>..........] - ETA: 38s - loss: 0.9806 - regression_loss: 0.8482 - classification_loss: 0.1324 337/500 [===================>..........] - ETA: 38s - loss: 0.9806 - regression_loss: 0.8482 - classification_loss: 0.1323 338/500 [===================>..........] - ETA: 37s - loss: 0.9807 - regression_loss: 0.8485 - classification_loss: 0.1322 339/500 [===================>..........] - ETA: 37s - loss: 0.9811 - regression_loss: 0.8489 - classification_loss: 0.1322 340/500 [===================>..........] 
- ETA: 37s - loss: 0.9795 - regression_loss: 0.8477 - classification_loss: 0.1319 341/500 [===================>..........] - ETA: 37s - loss: 0.9790 - regression_loss: 0.8473 - classification_loss: 0.1318 342/500 [===================>..........] - ETA: 36s - loss: 0.9779 - regression_loss: 0.8465 - classification_loss: 0.1315 343/500 [===================>..........] - ETA: 36s - loss: 0.9783 - regression_loss: 0.8469 - classification_loss: 0.1314 344/500 [===================>..........] - ETA: 36s - loss: 0.9783 - regression_loss: 0.8469 - classification_loss: 0.1314 345/500 [===================>..........] - ETA: 36s - loss: 0.9783 - regression_loss: 0.8470 - classification_loss: 0.1313 346/500 [===================>..........] - ETA: 36s - loss: 0.9778 - regression_loss: 0.8467 - classification_loss: 0.1310 347/500 [===================>..........] - ETA: 35s - loss: 0.9769 - regression_loss: 0.8461 - classification_loss: 0.1308 348/500 [===================>..........] - ETA: 35s - loss: 0.9757 - regression_loss: 0.8452 - classification_loss: 0.1305 349/500 [===================>..........] - ETA: 35s - loss: 0.9757 - regression_loss: 0.8454 - classification_loss: 0.1303 350/500 [====================>.........] - ETA: 35s - loss: 0.9750 - regression_loss: 0.8448 - classification_loss: 0.1302 351/500 [====================>.........] - ETA: 34s - loss: 0.9739 - regression_loss: 0.8440 - classification_loss: 0.1299 352/500 [====================>.........] - ETA: 34s - loss: 0.9737 - regression_loss: 0.8439 - classification_loss: 0.1298 353/500 [====================>.........] - ETA: 34s - loss: 0.9772 - regression_loss: 0.8467 - classification_loss: 0.1306 354/500 [====================>.........] - ETA: 34s - loss: 0.9769 - regression_loss: 0.8464 - classification_loss: 0.1305 355/500 [====================>.........] - ETA: 33s - loss: 0.9775 - regression_loss: 0.8472 - classification_loss: 0.1303 356/500 [====================>.........] 
- ETA: 33s - loss: 0.9792 - regression_loss: 0.8489 - classification_loss: 0.1302 357/500 [====================>.........] - ETA: 33s - loss: 0.9797 - regression_loss: 0.8494 - classification_loss: 0.1303 358/500 [====================>.........] - ETA: 33s - loss: 0.9786 - regression_loss: 0.8486 - classification_loss: 0.1301 359/500 [====================>.........] - ETA: 32s - loss: 0.9786 - regression_loss: 0.8487 - classification_loss: 0.1300 360/500 [====================>.........] - ETA: 32s - loss: 0.9774 - regression_loss: 0.8477 - classification_loss: 0.1297 361/500 [====================>.........] - ETA: 32s - loss: 0.9753 - regression_loss: 0.8460 - classification_loss: 0.1294 362/500 [====================>.........] - ETA: 32s - loss: 0.9736 - regression_loss: 0.8445 - classification_loss: 0.1291 363/500 [====================>.........] - ETA: 32s - loss: 0.9749 - regression_loss: 0.8455 - classification_loss: 0.1295 364/500 [====================>.........] - ETA: 31s - loss: 0.9746 - regression_loss: 0.8453 - classification_loss: 0.1293 365/500 [====================>.........] - ETA: 31s - loss: 0.9755 - regression_loss: 0.8461 - classification_loss: 0.1294 366/500 [====================>.........] - ETA: 31s - loss: 0.9763 - regression_loss: 0.8468 - classification_loss: 0.1295 367/500 [=====================>........] - ETA: 31s - loss: 0.9763 - regression_loss: 0.8469 - classification_loss: 0.1293 368/500 [=====================>........] - ETA: 30s - loss: 0.9755 - regression_loss: 0.8463 - classification_loss: 0.1291 369/500 [=====================>........] - ETA: 30s - loss: 0.9752 - regression_loss: 0.8462 - classification_loss: 0.1289 370/500 [=====================>........] - ETA: 30s - loss: 0.9755 - regression_loss: 0.8467 - classification_loss: 0.1288 371/500 [=====================>........] - ETA: 30s - loss: 0.9741 - regression_loss: 0.8456 - classification_loss: 0.1285 372/500 [=====================>........] 
- ETA: 29s - loss: 0.9724 - regression_loss: 0.8442 - classification_loss: 0.1282 373/500 [=====================>........] - ETA: 29s - loss: 0.9719 - regression_loss: 0.8438 - classification_loss: 0.1280 374/500 [=====================>........] - ETA: 29s - loss: 0.9712 - regression_loss: 0.8432 - classification_loss: 0.1280 375/500 [=====================>........] - ETA: 29s - loss: 0.9722 - regression_loss: 0.8442 - classification_loss: 0.1280 376/500 [=====================>........] - ETA: 29s - loss: 0.9730 - regression_loss: 0.8448 - classification_loss: 0.1282 377/500 [=====================>........] - ETA: 28s - loss: 0.9735 - regression_loss: 0.8452 - classification_loss: 0.1283 378/500 [=====================>........] - ETA: 28s - loss: 0.9730 - regression_loss: 0.8448 - classification_loss: 0.1281 379/500 [=====================>........] - ETA: 28s - loss: 0.9749 - regression_loss: 0.8463 - classification_loss: 0.1286 380/500 [=====================>........] - ETA: 28s - loss: 0.9760 - regression_loss: 0.8473 - classification_loss: 0.1287 381/500 [=====================>........] - ETA: 27s - loss: 0.9750 - regression_loss: 0.8466 - classification_loss: 0.1284 382/500 [=====================>........] - ETA: 27s - loss: 0.9750 - regression_loss: 0.8467 - classification_loss: 0.1283 383/500 [=====================>........] - ETA: 27s - loss: 0.9750 - regression_loss: 0.8467 - classification_loss: 0.1283 384/500 [======================>.......] - ETA: 27s - loss: 0.9754 - regression_loss: 0.8472 - classification_loss: 0.1283 385/500 [======================>.......] - ETA: 26s - loss: 0.9747 - regression_loss: 0.8467 - classification_loss: 0.1281 386/500 [======================>.......] - ETA: 26s - loss: 0.9734 - regression_loss: 0.8456 - classification_loss: 0.1278 387/500 [======================>.......] - ETA: 26s - loss: 0.9769 - regression_loss: 0.8480 - classification_loss: 0.1289 388/500 [======================>.......] 
- ETA: 26s - loss: 0.9776 - regression_loss: 0.8486 - classification_loss: 0.1290 389/500 [======================>.......] - ETA: 25s - loss: 0.9779 - regression_loss: 0.8489 - classification_loss: 0.1290 390/500 [======================>.......] - ETA: 25s - loss: 0.9791 - regression_loss: 0.8500 - classification_loss: 0.1290 391/500 [======================>.......] - ETA: 25s - loss: 0.9799 - regression_loss: 0.8510 - classification_loss: 0.1290 392/500 [======================>.......] - ETA: 25s - loss: 0.9794 - regression_loss: 0.8506 - classification_loss: 0.1288 393/500 [======================>.......] - ETA: 25s - loss: 0.9786 - regression_loss: 0.8499 - classification_loss: 0.1287 394/500 [======================>.......] - ETA: 24s - loss: 0.9789 - regression_loss: 0.8501 - classification_loss: 0.1288 395/500 [======================>.......] - ETA: 24s - loss: 0.9794 - regression_loss: 0.8504 - classification_loss: 0.1290 396/500 [======================>.......] - ETA: 24s - loss: 0.9791 - regression_loss: 0.8500 - classification_loss: 0.1291 397/500 [======================>.......] - ETA: 24s - loss: 0.9781 - regression_loss: 0.8493 - classification_loss: 0.1288 398/500 [======================>.......] - ETA: 23s - loss: 0.9773 - regression_loss: 0.8486 - classification_loss: 0.1287 399/500 [======================>.......] - ETA: 23s - loss: 0.9756 - regression_loss: 0.8471 - classification_loss: 0.1285 400/500 [=======================>......] - ETA: 23s - loss: 0.9748 - regression_loss: 0.8464 - classification_loss: 0.1284 401/500 [=======================>......] - ETA: 23s - loss: 0.9738 - regression_loss: 0.8457 - classification_loss: 0.1281 402/500 [=======================>......] - ETA: 22s - loss: 0.9726 - regression_loss: 0.8447 - classification_loss: 0.1279 403/500 [=======================>......] - ETA: 22s - loss: 0.9714 - regression_loss: 0.8436 - classification_loss: 0.1278 404/500 [=======================>......] 
- ETA: 22s - loss: 0.9731 - regression_loss: 0.8451 - classification_loss: 0.1279 405/500 [=======================>......] - ETA: 22s - loss: 0.9751 - regression_loss: 0.8469 - classification_loss: 0.1282 406/500 [=======================>......] - ETA: 21s - loss: 0.9755 - regression_loss: 0.8474 - classification_loss: 0.1281 407/500 [=======================>......] - ETA: 21s - loss: 0.9753 - regression_loss: 0.8472 - classification_loss: 0.1281 408/500 [=======================>......] - ETA: 21s - loss: 0.9754 - regression_loss: 0.8473 - classification_loss: 0.1281 409/500 [=======================>......] - ETA: 21s - loss: 0.9756 - regression_loss: 0.8476 - classification_loss: 0.1280 410/500 [=======================>......] - ETA: 21s - loss: 0.9745 - regression_loss: 0.8467 - classification_loss: 0.1277 411/500 [=======================>......] - ETA: 20s - loss: 0.9739 - regression_loss: 0.8463 - classification_loss: 0.1276 412/500 [=======================>......] - ETA: 20s - loss: 0.9736 - regression_loss: 0.8461 - classification_loss: 0.1275 413/500 [=======================>......] - ETA: 20s - loss: 0.9738 - regression_loss: 0.8463 - classification_loss: 0.1275 414/500 [=======================>......] - ETA: 20s - loss: 0.9743 - regression_loss: 0.8468 - classification_loss: 0.1274 415/500 [=======================>......] - ETA: 19s - loss: 0.9739 - regression_loss: 0.8466 - classification_loss: 0.1274 416/500 [=======================>......] - ETA: 19s - loss: 0.9764 - regression_loss: 0.8490 - classification_loss: 0.1274 417/500 [========================>.....] - ETA: 19s - loss: 0.9749 - regression_loss: 0.8478 - classification_loss: 0.1272 418/500 [========================>.....] - ETA: 19s - loss: 0.9753 - regression_loss: 0.8482 - classification_loss: 0.1272 419/500 [========================>.....] - ETA: 18s - loss: 0.9760 - regression_loss: 0.8488 - classification_loss: 0.1272 420/500 [========================>.....] 
- ETA: 18s - loss: 0.9754 - regression_loss: 0.8483 - classification_loss: 0.1271 421/500 [========================>.....] - ETA: 18s - loss: 0.9758 - regression_loss: 0.8486 - classification_loss: 0.1271 422/500 [========================>.....] - ETA: 18s - loss: 0.9754 - regression_loss: 0.8484 - classification_loss: 0.1270 423/500 [========================>.....] - ETA: 18s - loss: 0.9747 - regression_loss: 0.8479 - classification_loss: 0.1268 424/500 [========================>.....] - ETA: 17s - loss: 0.9747 - regression_loss: 0.8480 - classification_loss: 0.1266 425/500 [========================>.....] - ETA: 17s - loss: 0.9729 - regression_loss: 0.8465 - classification_loss: 0.1264 426/500 [========================>.....] - ETA: 17s - loss: 0.9719 - regression_loss: 0.8457 - classification_loss: 0.1262 427/500 [========================>.....] - ETA: 17s - loss: 0.9710 - regression_loss: 0.8450 - classification_loss: 0.1260 428/500 [========================>.....] - ETA: 16s - loss: 0.9720 - regression_loss: 0.8457 - classification_loss: 0.1262 429/500 [========================>.....] - ETA: 16s - loss: 0.9724 - regression_loss: 0.8461 - classification_loss: 0.1264 430/500 [========================>.....] - ETA: 16s - loss: 0.9716 - regression_loss: 0.8455 - classification_loss: 0.1261 431/500 [========================>.....] - ETA: 16s - loss: 0.9720 - regression_loss: 0.8460 - classification_loss: 0.1260 432/500 [========================>.....] - ETA: 15s - loss: 0.9720 - regression_loss: 0.8458 - classification_loss: 0.1261 433/500 [========================>.....] - ETA: 15s - loss: 0.9717 - regression_loss: 0.8457 - classification_loss: 0.1260 434/500 [=========================>....] - ETA: 15s - loss: 0.9712 - regression_loss: 0.8452 - classification_loss: 0.1260 435/500 [=========================>....] - ETA: 15s - loss: 0.9724 - regression_loss: 0.8461 - classification_loss: 0.1263 436/500 [=========================>....] 
- ETA: 14s - loss: 0.9734 - regression_loss: 0.8469 - classification_loss: 0.1265 437/500 [=========================>....] - ETA: 14s - loss: 0.9731 - regression_loss: 0.8467 - classification_loss: 0.1264 438/500 [=========================>....] - ETA: 14s - loss: 0.9731 - regression_loss: 0.8467 - classification_loss: 0.1264 439/500 [=========================>....] - ETA: 14s - loss: 0.9738 - regression_loss: 0.8473 - classification_loss: 0.1265 440/500 [=========================>....] - ETA: 14s - loss: 0.9738 - regression_loss: 0.8474 - classification_loss: 0.1265 441/500 [=========================>....] - ETA: 13s - loss: 0.9728 - regression_loss: 0.8466 - classification_loss: 0.1262 442/500 [=========================>....] - ETA: 13s - loss: 0.9741 - regression_loss: 0.8477 - classification_loss: 0.1264 443/500 [=========================>....] - ETA: 13s - loss: 0.9743 - regression_loss: 0.8479 - classification_loss: 0.1264 444/500 [=========================>....] - ETA: 13s - loss: 0.9734 - regression_loss: 0.8472 - classification_loss: 0.1263 445/500 [=========================>....] - ETA: 12s - loss: 0.9733 - regression_loss: 0.8470 - classification_loss: 0.1263 446/500 [=========================>....] - ETA: 12s - loss: 0.9740 - regression_loss: 0.8479 - classification_loss: 0.1261 447/500 [=========================>....] - ETA: 12s - loss: 0.9741 - regression_loss: 0.8481 - classification_loss: 0.1260 448/500 [=========================>....] - ETA: 12s - loss: 0.9738 - regression_loss: 0.8478 - classification_loss: 0.1260 449/500 [=========================>....] - ETA: 11s - loss: 0.9738 - regression_loss: 0.8478 - classification_loss: 0.1260 450/500 [==========================>...] - ETA: 11s - loss: 0.9741 - regression_loss: 0.8483 - classification_loss: 0.1259 451/500 [==========================>...] - ETA: 11s - loss: 0.9738 - regression_loss: 0.8481 - classification_loss: 0.1257 452/500 [==========================>...] 
- ETA: 11s - loss: 0.9734 - regression_loss: 0.8478 - classification_loss: 0.1257 453/500 [==========================>...] - ETA: 10s - loss: 0.9726 - regression_loss: 0.8471 - classification_loss: 0.1255 454/500 [==========================>...] - ETA: 10s - loss: 0.9728 - regression_loss: 0.8474 - classification_loss: 0.1255 455/500 [==========================>...] - ETA: 10s - loss: 0.9710 - regression_loss: 0.8458 - classification_loss: 0.1252 456/500 [==========================>...] - ETA: 10s - loss: 0.9701 - regression_loss: 0.8451 - classification_loss: 0.1250 457/500 [==========================>...] - ETA: 10s - loss: 0.9691 - regression_loss: 0.8442 - classification_loss: 0.1249 458/500 [==========================>...] - ETA: 9s - loss: 0.9684 - regression_loss: 0.8437 - classification_loss: 0.1247  459/500 [==========================>...] - ETA: 9s - loss: 0.9677 - regression_loss: 0.8431 - classification_loss: 0.1245 460/500 [==========================>...] - ETA: 9s - loss: 0.9676 - regression_loss: 0.8433 - classification_loss: 0.1244 461/500 [==========================>...] - ETA: 9s - loss: 0.9693 - regression_loss: 0.8448 - classification_loss: 0.1245 462/500 [==========================>...] - ETA: 8s - loss: 0.9690 - regression_loss: 0.8447 - classification_loss: 0.1244 463/500 [==========================>...] - ETA: 8s - loss: 0.9688 - regression_loss: 0.8445 - classification_loss: 0.1242 464/500 [==========================>...] - ETA: 8s - loss: 0.9680 - regression_loss: 0.8439 - classification_loss: 0.1242 465/500 [==========================>...] - ETA: 8s - loss: 0.9685 - regression_loss: 0.8444 - classification_loss: 0.1241 466/500 [==========================>...] - ETA: 7s - loss: 0.9683 - regression_loss: 0.8443 - classification_loss: 0.1239 467/500 [===========================>..] - ETA: 7s - loss: 0.9686 - regression_loss: 0.8447 - classification_loss: 0.1239 468/500 [===========================>..] 
- ETA: 7s - loss: 0.9690 - regression_loss: 0.8451 - classification_loss: 0.1239 469/500 [===========================>..] - ETA: 7s - loss: 0.9696 - regression_loss: 0.8456 - classification_loss: 0.1240 470/500 [===========================>..] - ETA: 7s - loss: 0.9692 - regression_loss: 0.8453 - classification_loss: 0.1239 471/500 [===========================>..] - ETA: 6s - loss: 0.9684 - regression_loss: 0.8447 - classification_loss: 0.1237 472/500 [===========================>..] - ETA: 6s - loss: 0.9683 - regression_loss: 0.8445 - classification_loss: 0.1237 473/500 [===========================>..] - ETA: 6s - loss: 0.9705 - regression_loss: 0.8465 - classification_loss: 0.1240 474/500 [===========================>..] - ETA: 6s - loss: 0.9702 - regression_loss: 0.8463 - classification_loss: 0.1239 475/500 [===========================>..] - ETA: 5s - loss: 0.9710 - regression_loss: 0.8471 - classification_loss: 0.1239 476/500 [===========================>..] - ETA: 5s - loss: 0.9710 - regression_loss: 0.8471 - classification_loss: 0.1239 477/500 [===========================>..] - ETA: 5s - loss: 0.9701 - regression_loss: 0.8465 - classification_loss: 0.1237 478/500 [===========================>..] - ETA: 5s - loss: 0.9693 - regression_loss: 0.8459 - classification_loss: 0.1235 479/500 [===========================>..] - ETA: 4s - loss: 0.9690 - regression_loss: 0.8457 - classification_loss: 0.1233 480/500 [===========================>..] - ETA: 4s - loss: 0.9679 - regression_loss: 0.8448 - classification_loss: 0.1231 481/500 [===========================>..] - ETA: 4s - loss: 0.9687 - regression_loss: 0.8455 - classification_loss: 0.1232 482/500 [===========================>..] - ETA: 4s - loss: 0.9682 - regression_loss: 0.8451 - classification_loss: 0.1231 483/500 [===========================>..] - ETA: 3s - loss: 0.9680 - regression_loss: 0.8451 - classification_loss: 0.1229 484/500 [============================>.] 
- ETA: 3s - loss: 0.9687 - regression_loss: 0.8458 - classification_loss: 0.1229 485/500 [============================>.] - ETA: 3s - loss: 0.9674 - regression_loss: 0.8448 - classification_loss: 0.1226 486/500 [============================>.] - ETA: 3s - loss: 0.9685 - regression_loss: 0.8456 - classification_loss: 0.1229 487/500 [============================>.] - ETA: 3s - loss: 0.9688 - regression_loss: 0.8459 - classification_loss: 0.1229 488/500 [============================>.] - ETA: 2s - loss: 0.9697 - regression_loss: 0.8466 - classification_loss: 0.1230 489/500 [============================>.] - ETA: 2s - loss: 0.9692 - regression_loss: 0.8463 - classification_loss: 0.1229 490/500 [============================>.] - ETA: 2s - loss: 0.9700 - regression_loss: 0.8471 - classification_loss: 0.1229 491/500 [============================>.] - ETA: 2s - loss: 0.9702 - regression_loss: 0.8471 - classification_loss: 0.1231 492/500 [============================>.] - ETA: 1s - loss: 0.9698 - regression_loss: 0.8468 - classification_loss: 0.1230 493/500 [============================>.] - ETA: 1s - loss: 0.9702 - regression_loss: 0.8472 - classification_loss: 0.1230 494/500 [============================>.] - ETA: 1s - loss: 0.9696 - regression_loss: 0.8467 - classification_loss: 0.1229 495/500 [============================>.] - ETA: 1s - loss: 0.9707 - regression_loss: 0.8478 - classification_loss: 0.1229 496/500 [============================>.] - ETA: 0s - loss: 0.9710 - regression_loss: 0.8481 - classification_loss: 0.1228 497/500 [============================>.] - ETA: 0s - loss: 0.9718 - regression_loss: 0.8488 - classification_loss: 0.1230 498/500 [============================>.] - ETA: 0s - loss: 0.9717 - regression_loss: 0.8488 - classification_loss: 0.1229 499/500 [============================>.] 
500/500 [==============================] - 117s 234ms/step - loss: 0.9694 - regression_loss: 0.8469 - classification_loss: 0.1225
326 instances of class plum with average precision: 0.8486
mAP: 0.8486
Epoch 00034: saving model to ./training/snapshots/resnet50_pascal_34.h5
Epoch 35/150
- ETA: 1:34 - loss: 0.9310 - regression_loss: 0.8298 - classification_loss: 0.1012 95/500 [====>.........................] - ETA: 1:34 - loss: 0.9274 - regression_loss: 0.8270 - classification_loss: 0.1004 96/500 [====>.........................] - ETA: 1:33 - loss: 0.9294 - regression_loss: 0.8282 - classification_loss: 0.1011 97/500 [====>.........................] - ETA: 1:33 - loss: 0.9275 - regression_loss: 0.8263 - classification_loss: 0.1013 98/500 [====>.........................] - ETA: 1:33 - loss: 0.9276 - regression_loss: 0.8263 - classification_loss: 0.1012 99/500 [====>.........................] - ETA: 1:33 - loss: 0.9245 - regression_loss: 0.8237 - classification_loss: 0.1008 100/500 [=====>........................] - ETA: 1:32 - loss: 0.9275 - regression_loss: 0.8265 - classification_loss: 0.1011 101/500 [=====>........................] - ETA: 1:32 - loss: 0.9253 - regression_loss: 0.8244 - classification_loss: 0.1008 102/500 [=====>........................] - ETA: 1:32 - loss: 0.9309 - regression_loss: 0.8294 - classification_loss: 0.1016 103/500 [=====>........................] - ETA: 1:32 - loss: 0.9413 - regression_loss: 0.8384 - classification_loss: 0.1029 104/500 [=====>........................] - ETA: 1:31 - loss: 0.9399 - regression_loss: 0.8375 - classification_loss: 0.1024 105/500 [=====>........................] - ETA: 1:31 - loss: 0.9345 - regression_loss: 0.8329 - classification_loss: 0.1016 106/500 [=====>........................] - ETA: 1:31 - loss: 0.9354 - regression_loss: 0.8337 - classification_loss: 0.1017 107/500 [=====>........................] - ETA: 1:31 - loss: 0.9310 - regression_loss: 0.8300 - classification_loss: 0.1009 108/500 [=====>........................] - ETA: 1:31 - loss: 0.9312 - regression_loss: 0.8307 - classification_loss: 0.1005 109/500 [=====>........................] - ETA: 1:30 - loss: 0.9297 - regression_loss: 0.8295 - classification_loss: 0.1002 110/500 [=====>........................] 
- ETA: 1:30 - loss: 0.9351 - regression_loss: 0.8343 - classification_loss: 0.1009 111/500 [=====>........................] - ETA: 1:30 - loss: 0.9392 - regression_loss: 0.8371 - classification_loss: 0.1022 112/500 [=====>........................] - ETA: 1:30 - loss: 0.9393 - regression_loss: 0.8374 - classification_loss: 0.1019 113/500 [=====>........................] - ETA: 1:30 - loss: 0.9390 - regression_loss: 0.8377 - classification_loss: 0.1013 114/500 [=====>........................] - ETA: 1:29 - loss: 0.9408 - regression_loss: 0.8395 - classification_loss: 0.1013 115/500 [=====>........................] - ETA: 1:29 - loss: 0.9376 - regression_loss: 0.8367 - classification_loss: 0.1009 116/500 [=====>........................] - ETA: 1:29 - loss: 0.9380 - regression_loss: 0.8372 - classification_loss: 0.1008 117/500 [======>.......................] - ETA: 1:29 - loss: 0.9422 - regression_loss: 0.8414 - classification_loss: 0.1007 118/500 [======>.......................] - ETA: 1:29 - loss: 0.9427 - regression_loss: 0.8420 - classification_loss: 0.1008 119/500 [======>.......................] - ETA: 1:28 - loss: 0.9433 - regression_loss: 0.8426 - classification_loss: 0.1007 120/500 [======>.......................] - ETA: 1:28 - loss: 0.9374 - regression_loss: 0.8372 - classification_loss: 0.1002 121/500 [======>.......................] - ETA: 1:28 - loss: 0.9376 - regression_loss: 0.8374 - classification_loss: 0.1002 122/500 [======>.......................] - ETA: 1:28 - loss: 0.9360 - regression_loss: 0.8362 - classification_loss: 0.0997 123/500 [======>.......................] - ETA: 1:28 - loss: 0.9328 - regression_loss: 0.8336 - classification_loss: 0.0992 124/500 [======>.......................] - ETA: 1:27 - loss: 0.9316 - regression_loss: 0.8328 - classification_loss: 0.0989 125/500 [======>.......................] - ETA: 1:27 - loss: 0.9298 - regression_loss: 0.8312 - classification_loss: 0.0986 126/500 [======>.......................] 
- ETA: 1:27 - loss: 0.9287 - regression_loss: 0.8305 - classification_loss: 0.0982 127/500 [======>.......................] - ETA: 1:27 - loss: 0.9254 - regression_loss: 0.8278 - classification_loss: 0.0977 128/500 [======>.......................] - ETA: 1:26 - loss: 0.9216 - regression_loss: 0.8243 - classification_loss: 0.0973 129/500 [======>.......................] - ETA: 1:26 - loss: 0.9236 - regression_loss: 0.8264 - classification_loss: 0.0972 130/500 [======>.......................] - ETA: 1:26 - loss: 0.9245 - regression_loss: 0.8274 - classification_loss: 0.0971 131/500 [======>.......................] - ETA: 1:26 - loss: 0.9253 - regression_loss: 0.8285 - classification_loss: 0.0969 132/500 [======>.......................] - ETA: 1:25 - loss: 0.9230 - regression_loss: 0.8266 - classification_loss: 0.0964 133/500 [======>.......................] - ETA: 1:25 - loss: 0.9207 - regression_loss: 0.8247 - classification_loss: 0.0961 134/500 [=======>......................] - ETA: 1:25 - loss: 0.9207 - regression_loss: 0.8250 - classification_loss: 0.0958 135/500 [=======>......................] - ETA: 1:25 - loss: 0.9232 - regression_loss: 0.8272 - classification_loss: 0.0960 136/500 [=======>......................] - ETA: 1:25 - loss: 0.9232 - regression_loss: 0.8270 - classification_loss: 0.0961 137/500 [=======>......................] - ETA: 1:24 - loss: 0.9248 - regression_loss: 0.8280 - classification_loss: 0.0968 138/500 [=======>......................] - ETA: 1:24 - loss: 0.9209 - regression_loss: 0.8246 - classification_loss: 0.0963 139/500 [=======>......................] - ETA: 1:24 - loss: 0.9199 - regression_loss: 0.8238 - classification_loss: 0.0961 140/500 [=======>......................] - ETA: 1:24 - loss: 0.9173 - regression_loss: 0.8216 - classification_loss: 0.0956 141/500 [=======>......................] - ETA: 1:23 - loss: 0.9189 - regression_loss: 0.8231 - classification_loss: 0.0958 142/500 [=======>......................] 
- ETA: 1:23 - loss: 0.9141 - regression_loss: 0.8187 - classification_loss: 0.0953 143/500 [=======>......................] - ETA: 1:23 - loss: 0.9151 - regression_loss: 0.8199 - classification_loss: 0.0951 144/500 [=======>......................] - ETA: 1:23 - loss: 0.9118 - regression_loss: 0.8170 - classification_loss: 0.0948 145/500 [=======>......................] - ETA: 1:22 - loss: 0.9119 - regression_loss: 0.8169 - classification_loss: 0.0950 146/500 [=======>......................] - ETA: 1:22 - loss: 0.9087 - regression_loss: 0.8141 - classification_loss: 0.0945 147/500 [=======>......................] - ETA: 1:22 - loss: 0.9091 - regression_loss: 0.8141 - classification_loss: 0.0950 148/500 [=======>......................] - ETA: 1:22 - loss: 0.9063 - regression_loss: 0.8116 - classification_loss: 0.0947 149/500 [=======>......................] - ETA: 1:22 - loss: 0.9068 - regression_loss: 0.8121 - classification_loss: 0.0946 150/500 [========>.....................] - ETA: 1:21 - loss: 0.9066 - regression_loss: 0.8122 - classification_loss: 0.0944 151/500 [========>.....................] - ETA: 1:21 - loss: 0.9073 - regression_loss: 0.8128 - classification_loss: 0.0945 152/500 [========>.....................] - ETA: 1:21 - loss: 0.9037 - regression_loss: 0.8096 - classification_loss: 0.0941 153/500 [========>.....................] - ETA: 1:21 - loss: 0.9030 - regression_loss: 0.8093 - classification_loss: 0.0938 154/500 [========>.....................] - ETA: 1:20 - loss: 0.9019 - regression_loss: 0.8082 - classification_loss: 0.0937 155/500 [========>.....................] - ETA: 1:20 - loss: 0.9019 - regression_loss: 0.8083 - classification_loss: 0.0936 156/500 [========>.....................] - ETA: 1:20 - loss: 0.9048 - regression_loss: 0.8107 - classification_loss: 0.0941 157/500 [========>.....................] - ETA: 1:20 - loss: 0.9039 - regression_loss: 0.8100 - classification_loss: 0.0939 158/500 [========>.....................] 
- ETA: 1:19 - loss: 0.9018 - regression_loss: 0.8081 - classification_loss: 0.0938 159/500 [========>.....................] - ETA: 1:19 - loss: 0.8979 - regression_loss: 0.8046 - classification_loss: 0.0932 160/500 [========>.....................] - ETA: 1:19 - loss: 0.8980 - regression_loss: 0.8050 - classification_loss: 0.0930 161/500 [========>.....................] - ETA: 1:19 - loss: 0.8951 - regression_loss: 0.8025 - classification_loss: 0.0926 162/500 [========>.....................] - ETA: 1:19 - loss: 0.8955 - regression_loss: 0.8031 - classification_loss: 0.0924 163/500 [========>.....................] - ETA: 1:18 - loss: 0.8955 - regression_loss: 0.8031 - classification_loss: 0.0924 164/500 [========>.....................] - ETA: 1:18 - loss: 0.8929 - regression_loss: 0.8010 - classification_loss: 0.0919 165/500 [========>.....................] - ETA: 1:18 - loss: 0.8937 - regression_loss: 0.8018 - classification_loss: 0.0920 166/500 [========>.....................] - ETA: 1:17 - loss: 0.8968 - regression_loss: 0.8048 - classification_loss: 0.0920 167/500 [=========>....................] - ETA: 1:17 - loss: 0.8980 - regression_loss: 0.8055 - classification_loss: 0.0925 168/500 [=========>....................] - ETA: 1:17 - loss: 0.9027 - regression_loss: 0.8087 - classification_loss: 0.0940 169/500 [=========>....................] - ETA: 1:17 - loss: 0.9021 - regression_loss: 0.8082 - classification_loss: 0.0939 170/500 [=========>....................] - ETA: 1:17 - loss: 0.9021 - regression_loss: 0.8085 - classification_loss: 0.0936 171/500 [=========>....................] - ETA: 1:16 - loss: 0.8992 - regression_loss: 0.8061 - classification_loss: 0.0931 172/500 [=========>....................] - ETA: 1:16 - loss: 0.8972 - regression_loss: 0.8044 - classification_loss: 0.0928 173/500 [=========>....................] - ETA: 1:16 - loss: 0.8991 - regression_loss: 0.8062 - classification_loss: 0.0929 174/500 [=========>....................] 
- ETA: 1:16 - loss: 0.8990 - regression_loss: 0.8059 - classification_loss: 0.0931 175/500 [=========>....................] - ETA: 1:15 - loss: 0.9014 - regression_loss: 0.8078 - classification_loss: 0.0936 176/500 [=========>....................] - ETA: 1:15 - loss: 0.9024 - regression_loss: 0.8086 - classification_loss: 0.0938 177/500 [=========>....................] - ETA: 1:15 - loss: 0.9032 - regression_loss: 0.8094 - classification_loss: 0.0938 178/500 [=========>....................] - ETA: 1:15 - loss: 0.9030 - regression_loss: 0.8094 - classification_loss: 0.0936 179/500 [=========>....................] - ETA: 1:15 - loss: 0.9019 - regression_loss: 0.8085 - classification_loss: 0.0933 180/500 [=========>....................] - ETA: 1:14 - loss: 0.9050 - regression_loss: 0.8115 - classification_loss: 0.0935 181/500 [=========>....................] - ETA: 1:14 - loss: 0.9038 - regression_loss: 0.8105 - classification_loss: 0.0933 182/500 [=========>....................] - ETA: 1:14 - loss: 0.9041 - regression_loss: 0.8109 - classification_loss: 0.0932 183/500 [=========>....................] - ETA: 1:13 - loss: 0.9034 - regression_loss: 0.8105 - classification_loss: 0.0929 184/500 [==========>...................] - ETA: 1:13 - loss: 0.9023 - regression_loss: 0.8094 - classification_loss: 0.0929 185/500 [==========>...................] - ETA: 1:13 - loss: 0.9012 - regression_loss: 0.8086 - classification_loss: 0.0926 186/500 [==========>...................] - ETA: 1:13 - loss: 0.9020 - regression_loss: 0.8093 - classification_loss: 0.0927 187/500 [==========>...................] - ETA: 1:13 - loss: 0.9033 - regression_loss: 0.8107 - classification_loss: 0.0926 188/500 [==========>...................] - ETA: 1:12 - loss: 0.9030 - regression_loss: 0.8105 - classification_loss: 0.0925 189/500 [==========>...................] - ETA: 1:12 - loss: 0.9022 - regression_loss: 0.8098 - classification_loss: 0.0924 190/500 [==========>...................] 
- ETA: 1:12 - loss: 0.9017 - regression_loss: 0.8094 - classification_loss: 0.0923 191/500 [==========>...................] - ETA: 1:12 - loss: 0.8994 - regression_loss: 0.8075 - classification_loss: 0.0919 192/500 [==========>...................] - ETA: 1:11 - loss: 0.8967 - regression_loss: 0.8051 - classification_loss: 0.0915 193/500 [==========>...................] - ETA: 1:11 - loss: 0.8946 - regression_loss: 0.8033 - classification_loss: 0.0913 194/500 [==========>...................] - ETA: 1:11 - loss: 0.8934 - regression_loss: 0.8025 - classification_loss: 0.0909 195/500 [==========>...................] - ETA: 1:11 - loss: 0.8952 - regression_loss: 0.8039 - classification_loss: 0.0913 196/500 [==========>...................] - ETA: 1:10 - loss: 0.8939 - regression_loss: 0.8029 - classification_loss: 0.0910 197/500 [==========>...................] - ETA: 1:10 - loss: 0.8940 - regression_loss: 0.8027 - classification_loss: 0.0912 198/500 [==========>...................] - ETA: 1:10 - loss: 0.8943 - regression_loss: 0.8032 - classification_loss: 0.0911 199/500 [==========>...................] - ETA: 1:10 - loss: 0.8943 - regression_loss: 0.8033 - classification_loss: 0.0910 200/500 [===========>..................] - ETA: 1:10 - loss: 0.8963 - regression_loss: 0.8048 - classification_loss: 0.0915 201/500 [===========>..................] - ETA: 1:09 - loss: 0.8964 - regression_loss: 0.8049 - classification_loss: 0.0915 202/500 [===========>..................] - ETA: 1:09 - loss: 0.8978 - regression_loss: 0.8064 - classification_loss: 0.0913 203/500 [===========>..................] - ETA: 1:09 - loss: 0.8972 - regression_loss: 0.8060 - classification_loss: 0.0912 204/500 [===========>..................] - ETA: 1:09 - loss: 0.8964 - regression_loss: 0.8053 - classification_loss: 0.0911 205/500 [===========>..................] - ETA: 1:08 - loss: 0.8956 - regression_loss: 0.8045 - classification_loss: 0.0911 206/500 [===========>..................] 
- ETA: 1:08 - loss: 0.8944 - regression_loss: 0.8033 - classification_loss: 0.0911 207/500 [===========>..................] - ETA: 1:08 - loss: 0.9002 - regression_loss: 0.8073 - classification_loss: 0.0929 208/500 [===========>..................] - ETA: 1:08 - loss: 0.8986 - regression_loss: 0.8060 - classification_loss: 0.0926 209/500 [===========>..................] - ETA: 1:08 - loss: 0.8974 - regression_loss: 0.8049 - classification_loss: 0.0925 210/500 [===========>..................] - ETA: 1:07 - loss: 0.8991 - regression_loss: 0.8062 - classification_loss: 0.0928 211/500 [===========>..................] - ETA: 1:07 - loss: 0.8964 - regression_loss: 0.8039 - classification_loss: 0.0925 212/500 [===========>..................] - ETA: 1:07 - loss: 0.8950 - regression_loss: 0.8028 - classification_loss: 0.0922 213/500 [===========>..................] - ETA: 1:07 - loss: 0.8942 - regression_loss: 0.8022 - classification_loss: 0.0920 214/500 [===========>..................] - ETA: 1:06 - loss: 0.8940 - regression_loss: 0.8023 - classification_loss: 0.0917 215/500 [===========>..................] - ETA: 1:06 - loss: 0.8960 - regression_loss: 0.8038 - classification_loss: 0.0922 216/500 [===========>..................] - ETA: 1:06 - loss: 0.8990 - regression_loss: 0.8060 - classification_loss: 0.0930 217/500 [============>.................] - ETA: 1:06 - loss: 0.8994 - regression_loss: 0.8063 - classification_loss: 0.0931 218/500 [============>.................] - ETA: 1:05 - loss: 0.9042 - regression_loss: 0.8107 - classification_loss: 0.0936 219/500 [============>.................] - ETA: 1:05 - loss: 0.9018 - regression_loss: 0.8086 - classification_loss: 0.0933 220/500 [============>.................] - ETA: 1:05 - loss: 0.9032 - regression_loss: 0.8097 - classification_loss: 0.0935 221/500 [============>.................] - ETA: 1:05 - loss: 0.9044 - regression_loss: 0.8107 - classification_loss: 0.0937 222/500 [============>.................] 
- ETA: 1:04 - loss: 0.9056 - regression_loss: 0.8115 - classification_loss: 0.0940 223/500 [============>.................] - ETA: 1:04 - loss: 0.9087 - regression_loss: 0.8137 - classification_loss: 0.0950 224/500 [============>.................] - ETA: 1:04 - loss: 0.9085 - regression_loss: 0.8136 - classification_loss: 0.0949 225/500 [============>.................] - ETA: 1:04 - loss: 0.9078 - regression_loss: 0.8130 - classification_loss: 0.0948 226/500 [============>.................] - ETA: 1:04 - loss: 0.9067 - regression_loss: 0.8121 - classification_loss: 0.0945 227/500 [============>.................] - ETA: 1:03 - loss: 0.9071 - regression_loss: 0.8127 - classification_loss: 0.0944 228/500 [============>.................] - ETA: 1:03 - loss: 0.9060 - regression_loss: 0.8119 - classification_loss: 0.0941 229/500 [============>.................] - ETA: 1:03 - loss: 0.9072 - regression_loss: 0.8129 - classification_loss: 0.0943 230/500 [============>.................] - ETA: 1:03 - loss: 0.9092 - regression_loss: 0.8144 - classification_loss: 0.0947 231/500 [============>.................] - ETA: 1:02 - loss: 0.9089 - regression_loss: 0.8142 - classification_loss: 0.0947 232/500 [============>.................] - ETA: 1:02 - loss: 0.9087 - regression_loss: 0.8142 - classification_loss: 0.0945 233/500 [============>.................] - ETA: 1:02 - loss: 0.9111 - regression_loss: 0.8157 - classification_loss: 0.0954 234/500 [=============>................] - ETA: 1:02 - loss: 0.9124 - regression_loss: 0.8166 - classification_loss: 0.0958 235/500 [=============>................] - ETA: 1:01 - loss: 0.9111 - regression_loss: 0.8154 - classification_loss: 0.0956 236/500 [=============>................] - ETA: 1:01 - loss: 0.9107 - regression_loss: 0.8150 - classification_loss: 0.0957 237/500 [=============>................] - ETA: 1:01 - loss: 0.9117 - regression_loss: 0.8158 - classification_loss: 0.0960 238/500 [=============>................] 
- ETA: 1:01 - loss: 0.9112 - regression_loss: 0.8151 - classification_loss: 0.0961 239/500 [=============>................] - ETA: 1:00 - loss: 0.9113 - regression_loss: 0.8154 - classification_loss: 0.0959 240/500 [=============>................] - ETA: 1:00 - loss: 0.9112 - regression_loss: 0.8155 - classification_loss: 0.0958 241/500 [=============>................] - ETA: 1:00 - loss: 0.9123 - regression_loss: 0.8164 - classification_loss: 0.0959 242/500 [=============>................] - ETA: 1:00 - loss: 0.9156 - regression_loss: 0.8190 - classification_loss: 0.0966 243/500 [=============>................] - ETA: 1:00 - loss: 0.9187 - regression_loss: 0.8219 - classification_loss: 0.0968 244/500 [=============>................] - ETA: 59s - loss: 0.9169 - regression_loss: 0.8203 - classification_loss: 0.0967  245/500 [=============>................] - ETA: 59s - loss: 0.9159 - regression_loss: 0.8192 - classification_loss: 0.0967 246/500 [=============>................] - ETA: 59s - loss: 0.9157 - regression_loss: 0.8192 - classification_loss: 0.0965 247/500 [=============>................] - ETA: 59s - loss: 0.9176 - regression_loss: 0.8210 - classification_loss: 0.0966 248/500 [=============>................] - ETA: 58s - loss: 0.9160 - regression_loss: 0.8196 - classification_loss: 0.0964 249/500 [=============>................] - ETA: 58s - loss: 0.9179 - regression_loss: 0.8214 - classification_loss: 0.0965 250/500 [==============>...............] - ETA: 58s - loss: 0.9212 - regression_loss: 0.8242 - classification_loss: 0.0970 251/500 [==============>...............] - ETA: 58s - loss: 0.9220 - regression_loss: 0.8249 - classification_loss: 0.0972 252/500 [==============>...............] - ETA: 57s - loss: 0.9214 - regression_loss: 0.8244 - classification_loss: 0.0970 253/500 [==============>...............] - ETA: 57s - loss: 0.9207 - regression_loss: 0.8238 - classification_loss: 0.0968 254/500 [==============>...............] 
- ETA: 57s - loss: 0.9205 - regression_loss: 0.8238 - classification_loss: 0.0967 255/500 [==============>...............] - ETA: 57s - loss: 0.9185 - regression_loss: 0.8221 - classification_loss: 0.0965 256/500 [==============>...............] - ETA: 56s - loss: 0.9180 - regression_loss: 0.8216 - classification_loss: 0.0963 257/500 [==============>...............] - ETA: 56s - loss: 0.9178 - regression_loss: 0.8216 - classification_loss: 0.0962 258/500 [==============>...............] - ETA: 56s - loss: 0.9185 - regression_loss: 0.8221 - classification_loss: 0.0964 259/500 [==============>...............] - ETA: 56s - loss: 0.9188 - regression_loss: 0.8223 - classification_loss: 0.0966 260/500 [==============>...............] - ETA: 56s - loss: 0.9190 - regression_loss: 0.8227 - classification_loss: 0.0964 261/500 [==============>...............] - ETA: 55s - loss: 0.9178 - regression_loss: 0.8216 - classification_loss: 0.0962 262/500 [==============>...............] - ETA: 55s - loss: 0.9176 - regression_loss: 0.8216 - classification_loss: 0.0959 263/500 [==============>...............] - ETA: 55s - loss: 0.9164 - regression_loss: 0.8207 - classification_loss: 0.0958 264/500 [==============>...............] - ETA: 55s - loss: 0.9181 - regression_loss: 0.8220 - classification_loss: 0.0961 265/500 [==============>...............] - ETA: 54s - loss: 0.9192 - regression_loss: 0.8231 - classification_loss: 0.0961 266/500 [==============>...............] - ETA: 54s - loss: 0.9193 - regression_loss: 0.8234 - classification_loss: 0.0959 267/500 [===============>..............] - ETA: 54s - loss: 0.9184 - regression_loss: 0.8226 - classification_loss: 0.0958 268/500 [===============>..............] - ETA: 54s - loss: 0.9187 - regression_loss: 0.8230 - classification_loss: 0.0957 269/500 [===============>..............] - ETA: 54s - loss: 0.9178 - regression_loss: 0.8222 - classification_loss: 0.0957 270/500 [===============>..............] 
- ETA: 53s - loss: 0.9171 - regression_loss: 0.8215 - classification_loss: 0.0956 271/500 [===============>..............] - ETA: 53s - loss: 0.9172 - regression_loss: 0.8218 - classification_loss: 0.0954 272/500 [===============>..............] - ETA: 53s - loss: 0.9158 - regression_loss: 0.8206 - classification_loss: 0.0952 273/500 [===============>..............] - ETA: 53s - loss: 0.9152 - regression_loss: 0.8201 - classification_loss: 0.0950 274/500 [===============>..............] - ETA: 52s - loss: 0.9144 - regression_loss: 0.8194 - classification_loss: 0.0950 275/500 [===============>..............] - ETA: 52s - loss: 0.9152 - regression_loss: 0.8199 - classification_loss: 0.0953 276/500 [===============>..............] - ETA: 52s - loss: 0.9143 - regression_loss: 0.8192 - classification_loss: 0.0952 277/500 [===============>..............] - ETA: 52s - loss: 0.9140 - regression_loss: 0.8188 - classification_loss: 0.0951 278/500 [===============>..............] - ETA: 51s - loss: 0.9138 - regression_loss: 0.8186 - classification_loss: 0.0951 279/500 [===============>..............] - ETA: 51s - loss: 0.9119 - regression_loss: 0.8170 - classification_loss: 0.0950 280/500 [===============>..............] - ETA: 51s - loss: 0.9119 - regression_loss: 0.8170 - classification_loss: 0.0949 281/500 [===============>..............] - ETA: 51s - loss: 0.9115 - regression_loss: 0.8166 - classification_loss: 0.0948 282/500 [===============>..............] - ETA: 50s - loss: 0.9112 - regression_loss: 0.8164 - classification_loss: 0.0948 283/500 [===============>..............] - ETA: 50s - loss: 0.9107 - regression_loss: 0.8159 - classification_loss: 0.0947 284/500 [================>.............] - ETA: 50s - loss: 0.9157 - regression_loss: 0.8195 - classification_loss: 0.0962 285/500 [================>.............] - ETA: 50s - loss: 0.9147 - regression_loss: 0.8187 - classification_loss: 0.0960 286/500 [================>.............] 
- ETA: 50s - loss: 0.9126 - regression_loss: 0.8168 - classification_loss: 0.0958 287/500 [================>.............] - ETA: 49s - loss: 0.9130 - regression_loss: 0.8173 - classification_loss: 0.0957 288/500 [================>.............] - ETA: 49s - loss: 0.9149 - regression_loss: 0.8189 - classification_loss: 0.0960 289/500 [================>.............] - ETA: 49s - loss: 0.9141 - regression_loss: 0.8182 - classification_loss: 0.0959 290/500 [================>.............] - ETA: 49s - loss: 0.9125 - regression_loss: 0.8168 - classification_loss: 0.0957 291/500 [================>.............] - ETA: 48s - loss: 0.9133 - regression_loss: 0.8176 - classification_loss: 0.0957 292/500 [================>.............] - ETA: 48s - loss: 0.9126 - regression_loss: 0.8171 - classification_loss: 0.0956 293/500 [================>.............] - ETA: 48s - loss: 0.9129 - regression_loss: 0.8176 - classification_loss: 0.0953 294/500 [================>.............] - ETA: 48s - loss: 0.9118 - regression_loss: 0.8167 - classification_loss: 0.0951 295/500 [================>.............] - ETA: 48s - loss: 0.9127 - regression_loss: 0.8176 - classification_loss: 0.0951 296/500 [================>.............] - ETA: 47s - loss: 0.9145 - regression_loss: 0.8189 - classification_loss: 0.0956 297/500 [================>.............] - ETA: 47s - loss: 0.9131 - regression_loss: 0.8176 - classification_loss: 0.0954 298/500 [================>.............] - ETA: 47s - loss: 0.9148 - regression_loss: 0.8190 - classification_loss: 0.0958 299/500 [================>.............] - ETA: 47s - loss: 0.9144 - regression_loss: 0.8188 - classification_loss: 0.0956 300/500 [=================>............] - ETA: 46s - loss: 0.9156 - regression_loss: 0.8196 - classification_loss: 0.0961 301/500 [=================>............] - ETA: 46s - loss: 0.9180 - regression_loss: 0.8213 - classification_loss: 0.0967 302/500 [=================>............] 
500/500 [==============================] - 117s 234ms/step - loss: 0.9183 - regression_loss: 0.8182 - classification_loss: 0.1001
326 instances of class plum with average precision: 0.8456
mAP: 0.8456
Epoch 00035: saving model to ./training/snapshots/resnet50_pascal_35.h5
Epoch 36/150
- ETA: 1:25 - loss: 0.9429 - regression_loss: 0.8430 - classification_loss: 0.0999 138/500 [=======>......................] - ETA: 1:24 - loss: 0.9445 - regression_loss: 0.8445 - classification_loss: 0.1000 139/500 [=======>......................] - ETA: 1:24 - loss: 0.9449 - regression_loss: 0.8450 - classification_loss: 0.0999 140/500 [=======>......................] - ETA: 1:24 - loss: 0.9454 - regression_loss: 0.8452 - classification_loss: 0.1002 141/500 [=======>......................] - ETA: 1:24 - loss: 0.9469 - regression_loss: 0.8465 - classification_loss: 0.1004 142/500 [=======>......................] - ETA: 1:24 - loss: 0.9436 - regression_loss: 0.8435 - classification_loss: 0.1001 143/500 [=======>......................] - ETA: 1:23 - loss: 0.9421 - regression_loss: 0.8422 - classification_loss: 0.1000 144/500 [=======>......................] - ETA: 1:23 - loss: 0.9390 - regression_loss: 0.8395 - classification_loss: 0.0995 145/500 [=======>......................] - ETA: 1:23 - loss: 0.9397 - regression_loss: 0.8399 - classification_loss: 0.0999 146/500 [=======>......................] - ETA: 1:23 - loss: 0.9401 - regression_loss: 0.8403 - classification_loss: 0.0998 147/500 [=======>......................] - ETA: 1:22 - loss: 0.9363 - regression_loss: 0.8371 - classification_loss: 0.0992 148/500 [=======>......................] - ETA: 1:22 - loss: 0.9402 - regression_loss: 0.8406 - classification_loss: 0.0996 149/500 [=======>......................] - ETA: 1:22 - loss: 0.9402 - regression_loss: 0.8406 - classification_loss: 0.0997 150/500 [========>.....................] - ETA: 1:22 - loss: 0.9397 - regression_loss: 0.8402 - classification_loss: 0.0996 151/500 [========>.....................] - ETA: 1:21 - loss: 0.9475 - regression_loss: 0.8469 - classification_loss: 0.1005 152/500 [========>.....................] - ETA: 1:21 - loss: 0.9471 - regression_loss: 0.8468 - classification_loss: 0.1003 153/500 [========>.....................] 
- ETA: 1:21 - loss: 0.9459 - regression_loss: 0.8456 - classification_loss: 0.1003 154/500 [========>.....................] - ETA: 1:21 - loss: 0.9453 - regression_loss: 0.8453 - classification_loss: 0.1000 155/500 [========>.....................] - ETA: 1:21 - loss: 0.9438 - regression_loss: 0.8442 - classification_loss: 0.0996 156/500 [========>.....................] - ETA: 1:20 - loss: 0.9473 - regression_loss: 0.8474 - classification_loss: 0.0999 157/500 [========>.....................] - ETA: 1:20 - loss: 0.9471 - regression_loss: 0.8472 - classification_loss: 0.0999 158/500 [========>.....................] - ETA: 1:20 - loss: 0.9507 - regression_loss: 0.8506 - classification_loss: 0.1001 159/500 [========>.....................] - ETA: 1:20 - loss: 0.9519 - regression_loss: 0.8515 - classification_loss: 0.1004 160/500 [========>.....................] - ETA: 1:19 - loss: 0.9470 - regression_loss: 0.8471 - classification_loss: 0.0999 161/500 [========>.....................] - ETA: 1:19 - loss: 0.9489 - regression_loss: 0.8485 - classification_loss: 0.1003 162/500 [========>.....................] - ETA: 1:19 - loss: 0.9476 - regression_loss: 0.8477 - classification_loss: 0.0998 163/500 [========>.....................] - ETA: 1:19 - loss: 0.9457 - regression_loss: 0.8462 - classification_loss: 0.0995 164/500 [========>.....................] - ETA: 1:18 - loss: 0.9463 - regression_loss: 0.8469 - classification_loss: 0.0994 165/500 [========>.....................] - ETA: 1:18 - loss: 0.9456 - regression_loss: 0.8465 - classification_loss: 0.0991 166/500 [========>.....................] - ETA: 1:18 - loss: 0.9432 - regression_loss: 0.8445 - classification_loss: 0.0987 167/500 [=========>....................] - ETA: 1:18 - loss: 0.9421 - regression_loss: 0.8436 - classification_loss: 0.0985 168/500 [=========>....................] - ETA: 1:17 - loss: 0.9376 - regression_loss: 0.8396 - classification_loss: 0.0981 169/500 [=========>....................] 
- ETA: 1:17 - loss: 0.9383 - regression_loss: 0.8402 - classification_loss: 0.0980 170/500 [=========>....................] - ETA: 1:17 - loss: 0.9401 - regression_loss: 0.8415 - classification_loss: 0.0985 171/500 [=========>....................] - ETA: 1:17 - loss: 0.9370 - regression_loss: 0.8389 - classification_loss: 0.0981 172/500 [=========>....................] - ETA: 1:17 - loss: 0.9369 - regression_loss: 0.8387 - classification_loss: 0.0982 173/500 [=========>....................] - ETA: 1:16 - loss: 0.9383 - regression_loss: 0.8396 - classification_loss: 0.0986 174/500 [=========>....................] - ETA: 1:16 - loss: 0.9386 - regression_loss: 0.8398 - classification_loss: 0.0988 175/500 [=========>....................] - ETA: 1:16 - loss: 0.9368 - regression_loss: 0.8383 - classification_loss: 0.0985 176/500 [=========>....................] - ETA: 1:16 - loss: 0.9354 - regression_loss: 0.8371 - classification_loss: 0.0982 177/500 [=========>....................] - ETA: 1:15 - loss: 0.9375 - regression_loss: 0.8390 - classification_loss: 0.0986 178/500 [=========>....................] - ETA: 1:15 - loss: 0.9362 - regression_loss: 0.8378 - classification_loss: 0.0983 179/500 [=========>....................] - ETA: 1:15 - loss: 0.9398 - regression_loss: 0.8406 - classification_loss: 0.0992 180/500 [=========>....................] - ETA: 1:15 - loss: 0.9382 - regression_loss: 0.8390 - classification_loss: 0.0993 181/500 [=========>....................] - ETA: 1:14 - loss: 0.9353 - regression_loss: 0.8364 - classification_loss: 0.0989 182/500 [=========>....................] - ETA: 1:14 - loss: 0.9339 - regression_loss: 0.8354 - classification_loss: 0.0986 183/500 [=========>....................] - ETA: 1:14 - loss: 0.9356 - regression_loss: 0.8368 - classification_loss: 0.0988 184/500 [==========>...................] - ETA: 1:14 - loss: 0.9351 - regression_loss: 0.8366 - classification_loss: 0.0985 185/500 [==========>...................] 
- ETA: 1:14 - loss: 0.9344 - regression_loss: 0.8361 - classification_loss: 0.0983 186/500 [==========>...................] - ETA: 1:13 - loss: 0.9344 - regression_loss: 0.8361 - classification_loss: 0.0983 187/500 [==========>...................] - ETA: 1:13 - loss: 0.9331 - regression_loss: 0.8351 - classification_loss: 0.0980 188/500 [==========>...................] - ETA: 1:13 - loss: 0.9339 - regression_loss: 0.8361 - classification_loss: 0.0977 189/500 [==========>...................] - ETA: 1:13 - loss: 0.9298 - regression_loss: 0.8325 - classification_loss: 0.0973 190/500 [==========>...................] - ETA: 1:12 - loss: 0.9300 - regression_loss: 0.8328 - classification_loss: 0.0972 191/500 [==========>...................] - ETA: 1:12 - loss: 0.9287 - regression_loss: 0.8317 - classification_loss: 0.0969 192/500 [==========>...................] - ETA: 1:12 - loss: 0.9306 - regression_loss: 0.8334 - classification_loss: 0.0973 193/500 [==========>...................] - ETA: 1:12 - loss: 0.9295 - regression_loss: 0.8324 - classification_loss: 0.0970 194/500 [==========>...................] - ETA: 1:11 - loss: 0.9282 - regression_loss: 0.8315 - classification_loss: 0.0967 195/500 [==========>...................] - ETA: 1:11 - loss: 0.9270 - regression_loss: 0.8304 - classification_loss: 0.0966 196/500 [==========>...................] - ETA: 1:11 - loss: 0.9286 - regression_loss: 0.8321 - classification_loss: 0.0965 197/500 [==========>...................] - ETA: 1:11 - loss: 0.9338 - regression_loss: 0.8368 - classification_loss: 0.0971 198/500 [==========>...................] - ETA: 1:11 - loss: 0.9325 - regression_loss: 0.8357 - classification_loss: 0.0968 199/500 [==========>...................] - ETA: 1:10 - loss: 0.9343 - regression_loss: 0.8374 - classification_loss: 0.0969 200/500 [===========>..................] - ETA: 1:10 - loss: 0.9360 - regression_loss: 0.8386 - classification_loss: 0.0974 201/500 [===========>..................] 
- ETA: 1:10 - loss: 0.9359 - regression_loss: 0.8384 - classification_loss: 0.0975 202/500 [===========>..................] - ETA: 1:10 - loss: 0.9348 - regression_loss: 0.8376 - classification_loss: 0.0972 203/500 [===========>..................] - ETA: 1:09 - loss: 0.9346 - regression_loss: 0.8374 - classification_loss: 0.0971 204/500 [===========>..................] - ETA: 1:09 - loss: 0.9346 - regression_loss: 0.8375 - classification_loss: 0.0970 205/500 [===========>..................] - ETA: 1:09 - loss: 0.9312 - regression_loss: 0.8346 - classification_loss: 0.0967 206/500 [===========>..................] - ETA: 1:09 - loss: 0.9298 - regression_loss: 0.8331 - classification_loss: 0.0967 207/500 [===========>..................] - ETA: 1:08 - loss: 0.9279 - regression_loss: 0.8314 - classification_loss: 0.0965 208/500 [===========>..................] - ETA: 1:08 - loss: 0.9270 - regression_loss: 0.8306 - classification_loss: 0.0963 209/500 [===========>..................] - ETA: 1:08 - loss: 0.9245 - regression_loss: 0.8285 - classification_loss: 0.0960 210/500 [===========>..................] - ETA: 1:08 - loss: 0.9236 - regression_loss: 0.8278 - classification_loss: 0.0958 211/500 [===========>..................] - ETA: 1:07 - loss: 0.9228 - regression_loss: 0.8271 - classification_loss: 0.0957 212/500 [===========>..................] - ETA: 1:07 - loss: 0.9224 - regression_loss: 0.8266 - classification_loss: 0.0958 213/500 [===========>..................] - ETA: 1:07 - loss: 0.9199 - regression_loss: 0.8239 - classification_loss: 0.0960 214/500 [===========>..................] - ETA: 1:07 - loss: 0.9169 - regression_loss: 0.8212 - classification_loss: 0.0957 215/500 [===========>..................] - ETA: 1:06 - loss: 0.9181 - regression_loss: 0.8223 - classification_loss: 0.0957 216/500 [===========>..................] - ETA: 1:06 - loss: 0.9167 - regression_loss: 0.8210 - classification_loss: 0.0957 217/500 [============>.................] 
- ETA: 1:06 - loss: 0.9168 - regression_loss: 0.8208 - classification_loss: 0.0960 218/500 [============>.................] - ETA: 1:06 - loss: 0.9181 - regression_loss: 0.8219 - classification_loss: 0.0962 219/500 [============>.................] - ETA: 1:06 - loss: 0.9161 - regression_loss: 0.8202 - classification_loss: 0.0958 220/500 [============>.................] - ETA: 1:05 - loss: 0.9162 - regression_loss: 0.8205 - classification_loss: 0.0958 221/500 [============>.................] - ETA: 1:05 - loss: 0.9134 - regression_loss: 0.8179 - classification_loss: 0.0954 222/500 [============>.................] - ETA: 1:05 - loss: 0.9115 - regression_loss: 0.8162 - classification_loss: 0.0953 223/500 [============>.................] - ETA: 1:05 - loss: 0.9094 - regression_loss: 0.8145 - classification_loss: 0.0949 224/500 [============>.................] - ETA: 1:04 - loss: 0.9087 - regression_loss: 0.8139 - classification_loss: 0.0948 225/500 [============>.................] - ETA: 1:04 - loss: 0.9048 - regression_loss: 0.8103 - classification_loss: 0.0946 226/500 [============>.................] - ETA: 1:04 - loss: 0.9036 - regression_loss: 0.8093 - classification_loss: 0.0943 227/500 [============>.................] - ETA: 1:04 - loss: 0.9046 - regression_loss: 0.8102 - classification_loss: 0.0943 228/500 [============>.................] - ETA: 1:03 - loss: 0.9064 - regression_loss: 0.8113 - classification_loss: 0.0950 229/500 [============>.................] - ETA: 1:03 - loss: 0.9055 - regression_loss: 0.8104 - classification_loss: 0.0951 230/500 [============>.................] - ETA: 1:03 - loss: 0.9040 - regression_loss: 0.8092 - classification_loss: 0.0948 231/500 [============>.................] - ETA: 1:03 - loss: 0.9044 - regression_loss: 0.8095 - classification_loss: 0.0949 232/500 [============>.................] - ETA: 1:03 - loss: 0.9049 - regression_loss: 0.8099 - classification_loss: 0.0949 233/500 [============>.................] 
- ETA: 1:02 - loss: 0.9046 - regression_loss: 0.8098 - classification_loss: 0.0948 234/500 [=============>................] - ETA: 1:02 - loss: 0.9051 - regression_loss: 0.8103 - classification_loss: 0.0949 235/500 [=============>................] - ETA: 1:02 - loss: 0.9038 - regression_loss: 0.8093 - classification_loss: 0.0946 236/500 [=============>................] - ETA: 1:02 - loss: 0.9050 - regression_loss: 0.8103 - classification_loss: 0.0947 237/500 [=============>................] - ETA: 1:01 - loss: 0.9050 - regression_loss: 0.8106 - classification_loss: 0.0944 238/500 [=============>................] - ETA: 1:01 - loss: 0.9063 - regression_loss: 0.8117 - classification_loss: 0.0946 239/500 [=============>................] - ETA: 1:01 - loss: 0.9064 - regression_loss: 0.8115 - classification_loss: 0.0949 240/500 [=============>................] - ETA: 1:01 - loss: 0.9056 - regression_loss: 0.8106 - classification_loss: 0.0950 241/500 [=============>................] - ETA: 1:00 - loss: 0.9055 - regression_loss: 0.8106 - classification_loss: 0.0948 242/500 [=============>................] - ETA: 1:00 - loss: 0.9055 - regression_loss: 0.8106 - classification_loss: 0.0950 243/500 [=============>................] - ETA: 1:00 - loss: 0.9046 - regression_loss: 0.8097 - classification_loss: 0.0949 244/500 [=============>................] - ETA: 1:00 - loss: 0.9054 - regression_loss: 0.8107 - classification_loss: 0.0947 245/500 [=============>................] - ETA: 59s - loss: 0.9043 - regression_loss: 0.8098 - classification_loss: 0.0945  246/500 [=============>................] - ETA: 59s - loss: 0.9042 - regression_loss: 0.8098 - classification_loss: 0.0944 247/500 [=============>................] - ETA: 59s - loss: 0.9061 - regression_loss: 0.8115 - classification_loss: 0.0946 248/500 [=============>................] - ETA: 59s - loss: 0.9067 - regression_loss: 0.8122 - classification_loss: 0.0945 249/500 [=============>................] 
- ETA: 59s - loss: 0.9067 - regression_loss: 0.8125 - classification_loss: 0.0942 250/500 [==============>...............] - ETA: 58s - loss: 0.9063 - regression_loss: 0.8122 - classification_loss: 0.0941 251/500 [==============>...............] - ETA: 58s - loss: 0.9071 - regression_loss: 0.8130 - classification_loss: 0.0941 252/500 [==============>...............] - ETA: 58s - loss: 0.9067 - regression_loss: 0.8127 - classification_loss: 0.0939 253/500 [==============>...............] - ETA: 58s - loss: 0.9079 - regression_loss: 0.8140 - classification_loss: 0.0939 254/500 [==============>...............] - ETA: 57s - loss: 0.9085 - regression_loss: 0.8147 - classification_loss: 0.0938 255/500 [==============>...............] - ETA: 57s - loss: 0.9081 - regression_loss: 0.8146 - classification_loss: 0.0936 256/500 [==============>...............] - ETA: 57s - loss: 0.9089 - regression_loss: 0.8155 - classification_loss: 0.0934 257/500 [==============>...............] - ETA: 57s - loss: 0.9067 - regression_loss: 0.8136 - classification_loss: 0.0931 258/500 [==============>...............] - ETA: 57s - loss: 0.9066 - regression_loss: 0.8135 - classification_loss: 0.0931 259/500 [==============>...............] - ETA: 56s - loss: 0.9071 - regression_loss: 0.8141 - classification_loss: 0.0930 260/500 [==============>...............] - ETA: 56s - loss: 0.9067 - regression_loss: 0.8137 - classification_loss: 0.0930 261/500 [==============>...............] - ETA: 56s - loss: 0.9082 - regression_loss: 0.8150 - classification_loss: 0.0932 262/500 [==============>...............] - ETA: 56s - loss: 0.9086 - regression_loss: 0.8154 - classification_loss: 0.0932 263/500 [==============>...............] - ETA: 55s - loss: 0.9072 - regression_loss: 0.8142 - classification_loss: 0.0930 264/500 [==============>...............] - ETA: 55s - loss: 0.9061 - regression_loss: 0.8132 - classification_loss: 0.0929 265/500 [==============>...............] 
- ETA: 55s - loss: 0.9060 - regression_loss: 0.8132 - classification_loss: 0.0929 266/500 [==============>...............] - ETA: 55s - loss: 0.9077 - regression_loss: 0.8145 - classification_loss: 0.0933 267/500 [===============>..............] - ETA: 54s - loss: 0.9070 - regression_loss: 0.8139 - classification_loss: 0.0931 268/500 [===============>..............] - ETA: 54s - loss: 0.9066 - regression_loss: 0.8137 - classification_loss: 0.0930 269/500 [===============>..............] - ETA: 54s - loss: 0.9058 - regression_loss: 0.8131 - classification_loss: 0.0927 270/500 [===============>..............] - ETA: 54s - loss: 0.9032 - regression_loss: 0.8107 - classification_loss: 0.0925 271/500 [===============>..............] - ETA: 53s - loss: 0.9028 - regression_loss: 0.8105 - classification_loss: 0.0924 272/500 [===============>..............] - ETA: 53s - loss: 0.9066 - regression_loss: 0.8138 - classification_loss: 0.0929 273/500 [===============>..............] - ETA: 53s - loss: 0.9066 - regression_loss: 0.8137 - classification_loss: 0.0929 274/500 [===============>..............] - ETA: 53s - loss: 0.9071 - regression_loss: 0.8142 - classification_loss: 0.0929 275/500 [===============>..............] - ETA: 52s - loss: 0.9057 - regression_loss: 0.8130 - classification_loss: 0.0927 276/500 [===============>..............] - ETA: 52s - loss: 0.9045 - regression_loss: 0.8120 - classification_loss: 0.0925 277/500 [===============>..............] - ETA: 52s - loss: 0.9025 - regression_loss: 0.8103 - classification_loss: 0.0922 278/500 [===============>..............] - ETA: 52s - loss: 0.9062 - regression_loss: 0.8124 - classification_loss: 0.0938 279/500 [===============>..............] - ETA: 51s - loss: 0.9070 - regression_loss: 0.8131 - classification_loss: 0.0939 280/500 [===============>..............] - ETA: 51s - loss: 0.9052 - regression_loss: 0.8116 - classification_loss: 0.0937 281/500 [===============>..............] 
- ETA: 51s - loss: 0.9040 - regression_loss: 0.8104 - classification_loss: 0.0935 282/500 [===============>..............] - ETA: 51s - loss: 0.9032 - regression_loss: 0.8098 - classification_loss: 0.0934 283/500 [===============>..............] - ETA: 51s - loss: 0.9044 - regression_loss: 0.8106 - classification_loss: 0.0938 284/500 [================>.............] - ETA: 50s - loss: 0.9063 - regression_loss: 0.8121 - classification_loss: 0.0942 285/500 [================>.............] - ETA: 50s - loss: 0.9064 - regression_loss: 0.8123 - classification_loss: 0.0942 286/500 [================>.............] - ETA: 50s - loss: 0.9049 - regression_loss: 0.8110 - classification_loss: 0.0940 287/500 [================>.............] - ETA: 50s - loss: 0.9040 - regression_loss: 0.8102 - classification_loss: 0.0938 288/500 [================>.............] - ETA: 49s - loss: 0.9045 - regression_loss: 0.8106 - classification_loss: 0.0939 289/500 [================>.............] - ETA: 49s - loss: 0.9039 - regression_loss: 0.8102 - classification_loss: 0.0937 290/500 [================>.............] - ETA: 49s - loss: 0.9025 - regression_loss: 0.8090 - classification_loss: 0.0935 291/500 [================>.............] - ETA: 49s - loss: 0.9032 - regression_loss: 0.8097 - classification_loss: 0.0935 292/500 [================>.............] - ETA: 48s - loss: 0.9029 - regression_loss: 0.8096 - classification_loss: 0.0934 293/500 [================>.............] - ETA: 48s - loss: 0.9020 - regression_loss: 0.8089 - classification_loss: 0.0931 294/500 [================>.............] - ETA: 48s - loss: 0.9042 - regression_loss: 0.8104 - classification_loss: 0.0938 295/500 [================>.............] - ETA: 48s - loss: 0.9059 - regression_loss: 0.8116 - classification_loss: 0.0942 296/500 [================>.............] - ETA: 47s - loss: 0.9056 - regression_loss: 0.8115 - classification_loss: 0.0942 297/500 [================>.............] 
- ETA: 47s - loss: 0.9059 - regression_loss: 0.8118 - classification_loss: 0.0941 298/500 [================>.............] - ETA: 47s - loss: 0.9064 - regression_loss: 0.8123 - classification_loss: 0.0940 299/500 [================>.............] - ETA: 47s - loss: 0.9072 - regression_loss: 0.8128 - classification_loss: 0.0943 300/500 [=================>............] - ETA: 47s - loss: 0.9088 - regression_loss: 0.8142 - classification_loss: 0.0946 301/500 [=================>............] - ETA: 46s - loss: 0.9098 - regression_loss: 0.8148 - classification_loss: 0.0951 302/500 [=================>............] - ETA: 46s - loss: 0.9115 - regression_loss: 0.8163 - classification_loss: 0.0952 303/500 [=================>............] - ETA: 46s - loss: 0.9114 - regression_loss: 0.8162 - classification_loss: 0.0952 304/500 [=================>............] - ETA: 46s - loss: 0.9093 - regression_loss: 0.8144 - classification_loss: 0.0950 305/500 [=================>............] - ETA: 45s - loss: 0.9078 - regression_loss: 0.8131 - classification_loss: 0.0947 306/500 [=================>............] - ETA: 45s - loss: 0.9092 - regression_loss: 0.8143 - classification_loss: 0.0949 307/500 [=================>............] - ETA: 45s - loss: 0.9088 - regression_loss: 0.8140 - classification_loss: 0.0949 308/500 [=================>............] - ETA: 45s - loss: 0.9086 - regression_loss: 0.8138 - classification_loss: 0.0949 309/500 [=================>............] - ETA: 44s - loss: 0.9093 - regression_loss: 0.8145 - classification_loss: 0.0948 310/500 [=================>............] - ETA: 44s - loss: 0.9082 - regression_loss: 0.8135 - classification_loss: 0.0946 311/500 [=================>............] - ETA: 44s - loss: 0.9080 - regression_loss: 0.8132 - classification_loss: 0.0948 312/500 [=================>............] - ETA: 44s - loss: 0.9075 - regression_loss: 0.8127 - classification_loss: 0.0948 313/500 [=================>............] 
- ETA: 44s - loss: 0.9065 - regression_loss: 0.8118 - classification_loss: 0.0947 314/500 [=================>............] - ETA: 43s - loss: 0.9068 - regression_loss: 0.8121 - classification_loss: 0.0947 315/500 [=================>............] - ETA: 43s - loss: 0.9068 - regression_loss: 0.8122 - classification_loss: 0.0946 316/500 [=================>............] - ETA: 43s - loss: 0.9066 - regression_loss: 0.8121 - classification_loss: 0.0946 317/500 [==================>...........] - ETA: 43s - loss: 0.9113 - regression_loss: 0.8157 - classification_loss: 0.0956 318/500 [==================>...........] - ETA: 42s - loss: 0.9123 - regression_loss: 0.8168 - classification_loss: 0.0955 319/500 [==================>...........] - ETA: 42s - loss: 0.9126 - regression_loss: 0.8171 - classification_loss: 0.0955 320/500 [==================>...........] - ETA: 42s - loss: 0.9132 - regression_loss: 0.8176 - classification_loss: 0.0957 321/500 [==================>...........] - ETA: 42s - loss: 0.9117 - regression_loss: 0.8162 - classification_loss: 0.0955 322/500 [==================>...........] - ETA: 41s - loss: 0.9133 - regression_loss: 0.8176 - classification_loss: 0.0957 323/500 [==================>...........] - ETA: 41s - loss: 0.9132 - regression_loss: 0.8176 - classification_loss: 0.0956 324/500 [==================>...........] - ETA: 41s - loss: 0.9162 - regression_loss: 0.8197 - classification_loss: 0.0964 325/500 [==================>...........] - ETA: 41s - loss: 0.9164 - regression_loss: 0.8199 - classification_loss: 0.0966 326/500 [==================>...........] - ETA: 40s - loss: 0.9185 - regression_loss: 0.8213 - classification_loss: 0.0972 327/500 [==================>...........] - ETA: 40s - loss: 0.9170 - regression_loss: 0.8200 - classification_loss: 0.0970 328/500 [==================>...........] - ETA: 40s - loss: 0.9166 - regression_loss: 0.8195 - classification_loss: 0.0971 329/500 [==================>...........] 
- ETA: 40s - loss: 0.9154 - regression_loss: 0.8185 - classification_loss: 0.0969 330/500 [==================>...........] - ETA: 40s - loss: 0.9152 - regression_loss: 0.8183 - classification_loss: 0.0969 331/500 [==================>...........] - ETA: 39s - loss: 0.9153 - regression_loss: 0.8183 - classification_loss: 0.0971 332/500 [==================>...........] - ETA: 39s - loss: 0.9147 - regression_loss: 0.8178 - classification_loss: 0.0969 333/500 [==================>...........] - ETA: 39s - loss: 0.9137 - regression_loss: 0.8170 - classification_loss: 0.0968 334/500 [===================>..........] - ETA: 39s - loss: 0.9135 - regression_loss: 0.8167 - classification_loss: 0.0967 335/500 [===================>..........] - ETA: 38s - loss: 0.9135 - regression_loss: 0.8163 - classification_loss: 0.0971 336/500 [===================>..........] - ETA: 38s - loss: 0.9127 - regression_loss: 0.8158 - classification_loss: 0.0970 337/500 [===================>..........] - ETA: 38s - loss: 0.9125 - regression_loss: 0.8156 - classification_loss: 0.0969 338/500 [===================>..........] - ETA: 38s - loss: 0.9114 - regression_loss: 0.8148 - classification_loss: 0.0966 339/500 [===================>..........] - ETA: 37s - loss: 0.9109 - regression_loss: 0.8143 - classification_loss: 0.0967 340/500 [===================>..........] - ETA: 37s - loss: 0.9138 - regression_loss: 0.8163 - classification_loss: 0.0974 341/500 [===================>..........] - ETA: 37s - loss: 0.9154 - regression_loss: 0.8177 - classification_loss: 0.0977 342/500 [===================>..........] - ETA: 37s - loss: 0.9160 - regression_loss: 0.8181 - classification_loss: 0.0978 343/500 [===================>..........] - ETA: 36s - loss: 0.9156 - regression_loss: 0.8178 - classification_loss: 0.0978 344/500 [===================>..........] - ETA: 36s - loss: 0.9145 - regression_loss: 0.8168 - classification_loss: 0.0977 345/500 [===================>..........] 
[per-batch progress updates for epoch 36 trimmed]
500/500 [==============================] - 118s 235ms/step - loss: 0.9082 - regression_loss: 0.8113 - classification_loss: 0.0969
326 instances of class plum with average precision: 0.8267
mAP: 0.8267
Epoch 00036: saving model to ./training/snapshots/resnet50_pascal_36.h5
Epoch 37/150
[per-batch progress updates for epoch 37 trimmed]
- ETA: 1:18 - loss: 0.9306 - regression_loss: 0.8244 - classification_loss: 0.1062 165/500 [========>.....................] - ETA: 1:17 - loss: 0.9259 - regression_loss: 0.8203 - classification_loss: 0.1056 166/500 [========>.....................] - ETA: 1:17 - loss: 0.9230 - regression_loss: 0.8178 - classification_loss: 0.1052 167/500 [=========>....................] - ETA: 1:17 - loss: 0.9194 - regression_loss: 0.8147 - classification_loss: 0.1047 168/500 [=========>....................] - ETA: 1:17 - loss: 0.9162 - regression_loss: 0.8120 - classification_loss: 0.1042 169/500 [=========>....................] - ETA: 1:17 - loss: 0.9171 - regression_loss: 0.8129 - classification_loss: 0.1043 170/500 [=========>....................] - ETA: 1:16 - loss: 0.9189 - regression_loss: 0.8145 - classification_loss: 0.1045 171/500 [=========>....................] - ETA: 1:16 - loss: 0.9193 - regression_loss: 0.8150 - classification_loss: 0.1043 172/500 [=========>....................] - ETA: 1:16 - loss: 0.9172 - regression_loss: 0.8132 - classification_loss: 0.1040 173/500 [=========>....................] - ETA: 1:16 - loss: 0.9159 - regression_loss: 0.8123 - classification_loss: 0.1036 174/500 [=========>....................] - ETA: 1:15 - loss: 0.9154 - regression_loss: 0.8121 - classification_loss: 0.1033 175/500 [=========>....................] - ETA: 1:15 - loss: 0.9122 - regression_loss: 0.8093 - classification_loss: 0.1029 176/500 [=========>....................] - ETA: 1:15 - loss: 0.9130 - regression_loss: 0.8098 - classification_loss: 0.1032 177/500 [=========>....................] - ETA: 1:15 - loss: 0.9136 - regression_loss: 0.8102 - classification_loss: 0.1034 178/500 [=========>....................] - ETA: 1:14 - loss: 0.9143 - regression_loss: 0.8111 - classification_loss: 0.1031 179/500 [=========>....................] - ETA: 1:14 - loss: 0.9167 - regression_loss: 0.8133 - classification_loss: 0.1034 180/500 [=========>....................] 
- ETA: 1:14 - loss: 0.9161 - regression_loss: 0.8129 - classification_loss: 0.1032 181/500 [=========>....................] - ETA: 1:14 - loss: 0.9157 - regression_loss: 0.8126 - classification_loss: 0.1032 182/500 [=========>....................] - ETA: 1:14 - loss: 0.9160 - regression_loss: 0.8130 - classification_loss: 0.1030 183/500 [=========>....................] - ETA: 1:13 - loss: 0.9153 - regression_loss: 0.8126 - classification_loss: 0.1027 184/500 [==========>...................] - ETA: 1:13 - loss: 0.9165 - regression_loss: 0.8137 - classification_loss: 0.1028 185/500 [==========>...................] - ETA: 1:13 - loss: 0.9189 - regression_loss: 0.8162 - classification_loss: 0.1027 186/500 [==========>...................] - ETA: 1:13 - loss: 0.9178 - regression_loss: 0.8152 - classification_loss: 0.1026 187/500 [==========>...................] - ETA: 1:12 - loss: 0.9199 - regression_loss: 0.8173 - classification_loss: 0.1026 188/500 [==========>...................] - ETA: 1:12 - loss: 0.9195 - regression_loss: 0.8170 - classification_loss: 0.1025 189/500 [==========>...................] - ETA: 1:12 - loss: 0.9218 - regression_loss: 0.8190 - classification_loss: 0.1028 190/500 [==========>...................] - ETA: 1:12 - loss: 0.9202 - regression_loss: 0.8176 - classification_loss: 0.1026 191/500 [==========>...................] - ETA: 1:11 - loss: 0.9198 - regression_loss: 0.8173 - classification_loss: 0.1025 192/500 [==========>...................] - ETA: 1:11 - loss: 0.9181 - regression_loss: 0.8160 - classification_loss: 0.1021 193/500 [==========>...................] - ETA: 1:11 - loss: 0.9202 - regression_loss: 0.8173 - classification_loss: 0.1029 194/500 [==========>...................] - ETA: 1:11 - loss: 0.9204 - regression_loss: 0.8177 - classification_loss: 0.1027 195/500 [==========>...................] - ETA: 1:10 - loss: 0.9189 - regression_loss: 0.8166 - classification_loss: 0.1023 196/500 [==========>...................] 
- ETA: 1:10 - loss: 0.9161 - regression_loss: 0.8142 - classification_loss: 0.1019 197/500 [==========>...................] - ETA: 1:10 - loss: 0.9173 - regression_loss: 0.8152 - classification_loss: 0.1021 198/500 [==========>...................] - ETA: 1:10 - loss: 0.9149 - regression_loss: 0.8132 - classification_loss: 0.1017 199/500 [==========>...................] - ETA: 1:09 - loss: 0.9166 - regression_loss: 0.8148 - classification_loss: 0.1017 200/500 [===========>..................] - ETA: 1:09 - loss: 0.9152 - regression_loss: 0.8138 - classification_loss: 0.1015 201/500 [===========>..................] - ETA: 1:09 - loss: 0.9157 - regression_loss: 0.8143 - classification_loss: 0.1014 202/500 [===========>..................] - ETA: 1:09 - loss: 0.9207 - regression_loss: 0.8180 - classification_loss: 0.1027 203/500 [===========>..................] - ETA: 1:09 - loss: 0.9197 - regression_loss: 0.8171 - classification_loss: 0.1026 204/500 [===========>..................] - ETA: 1:08 - loss: 0.9204 - regression_loss: 0.8177 - classification_loss: 0.1027 205/500 [===========>..................] - ETA: 1:08 - loss: 0.9221 - regression_loss: 0.8190 - classification_loss: 0.1030 206/500 [===========>..................] - ETA: 1:08 - loss: 0.9216 - regression_loss: 0.8187 - classification_loss: 0.1029 207/500 [===========>..................] - ETA: 1:08 - loss: 0.9231 - regression_loss: 0.8200 - classification_loss: 0.1031 208/500 [===========>..................] - ETA: 1:07 - loss: 0.9203 - regression_loss: 0.8175 - classification_loss: 0.1027 209/500 [===========>..................] - ETA: 1:07 - loss: 0.9220 - regression_loss: 0.8188 - classification_loss: 0.1032 210/500 [===========>..................] - ETA: 1:07 - loss: 0.9192 - regression_loss: 0.8164 - classification_loss: 0.1028 211/500 [===========>..................] - ETA: 1:07 - loss: 0.9183 - regression_loss: 0.8157 - classification_loss: 0.1026 212/500 [===========>..................] 
- ETA: 1:07 - loss: 0.9216 - regression_loss: 0.8188 - classification_loss: 0.1028 213/500 [===========>..................] - ETA: 1:06 - loss: 0.9220 - regression_loss: 0.8193 - classification_loss: 0.1028 214/500 [===========>..................] - ETA: 1:06 - loss: 0.9260 - regression_loss: 0.8224 - classification_loss: 0.1035 215/500 [===========>..................] - ETA: 1:06 - loss: 0.9240 - regression_loss: 0.8207 - classification_loss: 0.1033 216/500 [===========>..................] - ETA: 1:06 - loss: 0.9254 - regression_loss: 0.8219 - classification_loss: 0.1035 217/500 [============>.................] - ETA: 1:05 - loss: 0.9237 - regression_loss: 0.8202 - classification_loss: 0.1035 218/500 [============>.................] - ETA: 1:05 - loss: 0.9237 - regression_loss: 0.8202 - classification_loss: 0.1035 219/500 [============>.................] - ETA: 1:05 - loss: 0.9224 - regression_loss: 0.8192 - classification_loss: 0.1033 220/500 [============>.................] - ETA: 1:05 - loss: 0.9250 - regression_loss: 0.8215 - classification_loss: 0.1036 221/500 [============>.................] - ETA: 1:04 - loss: 0.9254 - regression_loss: 0.8218 - classification_loss: 0.1036 222/500 [============>.................] - ETA: 1:04 - loss: 0.9246 - regression_loss: 0.8210 - classification_loss: 0.1037 223/500 [============>.................] - ETA: 1:04 - loss: 0.9289 - regression_loss: 0.8242 - classification_loss: 0.1047 224/500 [============>.................] - ETA: 1:04 - loss: 0.9318 - regression_loss: 0.8263 - classification_loss: 0.1055 225/500 [============>.................] - ETA: 1:03 - loss: 0.9307 - regression_loss: 0.8250 - classification_loss: 0.1056 226/500 [============>.................] - ETA: 1:03 - loss: 0.9295 - regression_loss: 0.8241 - classification_loss: 0.1054 227/500 [============>.................] - ETA: 1:03 - loss: 0.9284 - regression_loss: 0.8232 - classification_loss: 0.1052 228/500 [============>.................] 
- ETA: 1:03 - loss: 0.9282 - regression_loss: 0.8231 - classification_loss: 0.1052 229/500 [============>.................] - ETA: 1:03 - loss: 0.9277 - regression_loss: 0.8226 - classification_loss: 0.1051 230/500 [============>.................] - ETA: 1:02 - loss: 0.9276 - regression_loss: 0.8225 - classification_loss: 0.1051 231/500 [============>.................] - ETA: 1:02 - loss: 0.9279 - regression_loss: 0.8223 - classification_loss: 0.1056 232/500 [============>.................] - ETA: 1:02 - loss: 0.9255 - regression_loss: 0.8202 - classification_loss: 0.1053 233/500 [============>.................] - ETA: 1:02 - loss: 0.9260 - regression_loss: 0.8204 - classification_loss: 0.1056 234/500 [=============>................] - ETA: 1:01 - loss: 0.9243 - regression_loss: 0.8190 - classification_loss: 0.1053 235/500 [=============>................] - ETA: 1:01 - loss: 0.9239 - regression_loss: 0.8188 - classification_loss: 0.1052 236/500 [=============>................] - ETA: 1:01 - loss: 0.9246 - regression_loss: 0.8194 - classification_loss: 0.1052 237/500 [=============>................] - ETA: 1:01 - loss: 0.9221 - regression_loss: 0.8172 - classification_loss: 0.1049 238/500 [=============>................] - ETA: 1:00 - loss: 0.9206 - regression_loss: 0.8160 - classification_loss: 0.1046 239/500 [=============>................] - ETA: 1:00 - loss: 0.9192 - regression_loss: 0.8148 - classification_loss: 0.1044 240/500 [=============>................] - ETA: 1:00 - loss: 0.9181 - regression_loss: 0.8140 - classification_loss: 0.1042 241/500 [=============>................] - ETA: 1:00 - loss: 0.9175 - regression_loss: 0.8135 - classification_loss: 0.1040 242/500 [=============>................] - ETA: 59s - loss: 0.9170 - regression_loss: 0.8131 - classification_loss: 0.1040  243/500 [=============>................] - ETA: 59s - loss: 0.9164 - regression_loss: 0.8126 - classification_loss: 0.1038 244/500 [=============>................] 
- ETA: 59s - loss: 0.9143 - regression_loss: 0.8107 - classification_loss: 0.1036 245/500 [=============>................] - ETA: 59s - loss: 0.9140 - regression_loss: 0.8105 - classification_loss: 0.1035 246/500 [=============>................] - ETA: 59s - loss: 0.9142 - regression_loss: 0.8108 - classification_loss: 0.1033 247/500 [=============>................] - ETA: 58s - loss: 0.9128 - regression_loss: 0.8097 - classification_loss: 0.1032 248/500 [=============>................] - ETA: 58s - loss: 0.9132 - regression_loss: 0.8098 - classification_loss: 0.1034 249/500 [=============>................] - ETA: 58s - loss: 0.9137 - regression_loss: 0.8104 - classification_loss: 0.1033 250/500 [==============>...............] - ETA: 58s - loss: 0.9144 - regression_loss: 0.8110 - classification_loss: 0.1034 251/500 [==============>...............] - ETA: 57s - loss: 0.9141 - regression_loss: 0.8109 - classification_loss: 0.1033 252/500 [==============>...............] - ETA: 57s - loss: 0.9127 - regression_loss: 0.8096 - classification_loss: 0.1031 253/500 [==============>...............] - ETA: 57s - loss: 0.9142 - regression_loss: 0.8106 - classification_loss: 0.1036 254/500 [==============>...............] - ETA: 57s - loss: 0.9136 - regression_loss: 0.8100 - classification_loss: 0.1036 255/500 [==============>...............] - ETA: 56s - loss: 0.9111 - regression_loss: 0.8078 - classification_loss: 0.1033 256/500 [==============>...............] - ETA: 56s - loss: 0.9089 - regression_loss: 0.8060 - classification_loss: 0.1029 257/500 [==============>...............] - ETA: 56s - loss: 0.9082 - regression_loss: 0.8056 - classification_loss: 0.1027 258/500 [==============>...............] - ETA: 56s - loss: 0.9094 - regression_loss: 0.8066 - classification_loss: 0.1028 259/500 [==============>...............] - ETA: 56s - loss: 0.9096 - regression_loss: 0.8067 - classification_loss: 0.1029 260/500 [==============>...............] 
- ETA: 55s - loss: 0.9119 - regression_loss: 0.8087 - classification_loss: 0.1033 261/500 [==============>...............] - ETA: 55s - loss: 0.9094 - regression_loss: 0.8065 - classification_loss: 0.1029 262/500 [==============>...............] - ETA: 55s - loss: 0.9127 - regression_loss: 0.8093 - classification_loss: 0.1033 263/500 [==============>...............] - ETA: 55s - loss: 0.9120 - regression_loss: 0.8088 - classification_loss: 0.1032 264/500 [==============>...............] - ETA: 54s - loss: 0.9116 - regression_loss: 0.8086 - classification_loss: 0.1031 265/500 [==============>...............] - ETA: 54s - loss: 0.9116 - regression_loss: 0.8089 - classification_loss: 0.1028 266/500 [==============>...............] - ETA: 54s - loss: 0.9114 - regression_loss: 0.8087 - classification_loss: 0.1027 267/500 [===============>..............] - ETA: 54s - loss: 0.9119 - regression_loss: 0.8090 - classification_loss: 0.1029 268/500 [===============>..............] - ETA: 53s - loss: 0.9129 - regression_loss: 0.8101 - classification_loss: 0.1028 269/500 [===============>..............] - ETA: 53s - loss: 0.9137 - regression_loss: 0.8109 - classification_loss: 0.1028 270/500 [===============>..............] - ETA: 53s - loss: 0.9125 - regression_loss: 0.8099 - classification_loss: 0.1026 271/500 [===============>..............] - ETA: 53s - loss: 0.9116 - regression_loss: 0.8090 - classification_loss: 0.1026 272/500 [===============>..............] - ETA: 52s - loss: 0.9129 - regression_loss: 0.8104 - classification_loss: 0.1026 273/500 [===============>..............] - ETA: 52s - loss: 0.9110 - regression_loss: 0.8086 - classification_loss: 0.1024 274/500 [===============>..............] - ETA: 52s - loss: 0.9110 - regression_loss: 0.8085 - classification_loss: 0.1024 275/500 [===============>..............] - ETA: 52s - loss: 0.9108 - regression_loss: 0.8086 - classification_loss: 0.1022 276/500 [===============>..............] 
- ETA: 52s - loss: 0.9124 - regression_loss: 0.8098 - classification_loss: 0.1026 277/500 [===============>..............] - ETA: 51s - loss: 0.9115 - regression_loss: 0.8092 - classification_loss: 0.1024 278/500 [===============>..............] - ETA: 51s - loss: 0.9111 - regression_loss: 0.8091 - classification_loss: 0.1021 279/500 [===============>..............] - ETA: 51s - loss: 0.9122 - regression_loss: 0.8100 - classification_loss: 0.1023 280/500 [===============>..............] - ETA: 51s - loss: 0.9129 - regression_loss: 0.8106 - classification_loss: 0.1022 281/500 [===============>..............] - ETA: 50s - loss: 0.9116 - regression_loss: 0.8096 - classification_loss: 0.1019 282/500 [===============>..............] - ETA: 50s - loss: 0.9113 - regression_loss: 0.8093 - classification_loss: 0.1020 283/500 [===============>..............] - ETA: 50s - loss: 0.9118 - regression_loss: 0.8098 - classification_loss: 0.1020 284/500 [================>.............] - ETA: 50s - loss: 0.9120 - regression_loss: 0.8101 - classification_loss: 0.1019 285/500 [================>.............] - ETA: 49s - loss: 0.9112 - regression_loss: 0.8094 - classification_loss: 0.1017 286/500 [================>.............] - ETA: 49s - loss: 0.9102 - regression_loss: 0.8087 - classification_loss: 0.1015 287/500 [================>.............] - ETA: 49s - loss: 0.9083 - regression_loss: 0.8071 - classification_loss: 0.1012 288/500 [================>.............] - ETA: 49s - loss: 0.9105 - regression_loss: 0.8089 - classification_loss: 0.1017 289/500 [================>.............] - ETA: 49s - loss: 0.9091 - regression_loss: 0.8077 - classification_loss: 0.1013 290/500 [================>.............] - ETA: 48s - loss: 0.9069 - regression_loss: 0.8059 - classification_loss: 0.1010 291/500 [================>.............] - ETA: 48s - loss: 0.9070 - regression_loss: 0.8060 - classification_loss: 0.1010 292/500 [================>.............] 
- ETA: 48s - loss: 0.9077 - regression_loss: 0.8067 - classification_loss: 0.1010 293/500 [================>.............] - ETA: 48s - loss: 0.9103 - regression_loss: 0.8084 - classification_loss: 0.1019 294/500 [================>.............] - ETA: 47s - loss: 0.9106 - regression_loss: 0.8087 - classification_loss: 0.1019 295/500 [================>.............] - ETA: 47s - loss: 0.9094 - regression_loss: 0.8077 - classification_loss: 0.1017 296/500 [================>.............] - ETA: 47s - loss: 0.9094 - regression_loss: 0.8077 - classification_loss: 0.1017 297/500 [================>.............] - ETA: 47s - loss: 0.9081 - regression_loss: 0.8066 - classification_loss: 0.1016 298/500 [================>.............] - ETA: 46s - loss: 0.9067 - regression_loss: 0.8053 - classification_loss: 0.1013 299/500 [================>.............] - ETA: 46s - loss: 0.9063 - regression_loss: 0.8050 - classification_loss: 0.1013 300/500 [=================>............] - ETA: 46s - loss: 0.9057 - regression_loss: 0.8046 - classification_loss: 0.1012 301/500 [=================>............] - ETA: 46s - loss: 0.9063 - regression_loss: 0.8051 - classification_loss: 0.1012 302/500 [=================>............] - ETA: 46s - loss: 0.9053 - regression_loss: 0.8042 - classification_loss: 0.1010 303/500 [=================>............] - ETA: 45s - loss: 0.9063 - regression_loss: 0.8051 - classification_loss: 0.1012 304/500 [=================>............] - ETA: 45s - loss: 0.9087 - regression_loss: 0.8069 - classification_loss: 0.1018 305/500 [=================>............] - ETA: 45s - loss: 0.9085 - regression_loss: 0.8067 - classification_loss: 0.1018 306/500 [=================>............] - ETA: 45s - loss: 0.9078 - regression_loss: 0.8062 - classification_loss: 0.1016 307/500 [=================>............] - ETA: 44s - loss: 0.9115 - regression_loss: 0.8092 - classification_loss: 0.1023 308/500 [=================>............] 
- ETA: 44s - loss: 0.9111 - regression_loss: 0.8089 - classification_loss: 0.1022 309/500 [=================>............] - ETA: 44s - loss: 0.9109 - regression_loss: 0.8087 - classification_loss: 0.1022 310/500 [=================>............] - ETA: 44s - loss: 0.9115 - regression_loss: 0.8094 - classification_loss: 0.1021 311/500 [=================>............] - ETA: 43s - loss: 0.9114 - regression_loss: 0.8093 - classification_loss: 0.1021 312/500 [=================>............] - ETA: 43s - loss: 0.9117 - regression_loss: 0.8097 - classification_loss: 0.1020 313/500 [=================>............] - ETA: 43s - loss: 0.9114 - regression_loss: 0.8095 - classification_loss: 0.1019 314/500 [=================>............] - ETA: 43s - loss: 0.9122 - regression_loss: 0.8102 - classification_loss: 0.1020 315/500 [=================>............] - ETA: 42s - loss: 0.9134 - regression_loss: 0.8116 - classification_loss: 0.1019 316/500 [=================>............] - ETA: 42s - loss: 0.9127 - regression_loss: 0.8110 - classification_loss: 0.1017 317/500 [==================>...........] - ETA: 42s - loss: 0.9123 - regression_loss: 0.8107 - classification_loss: 0.1016 318/500 [==================>...........] - ETA: 42s - loss: 0.9135 - regression_loss: 0.8116 - classification_loss: 0.1019 319/500 [==================>...........] - ETA: 42s - loss: 0.9126 - regression_loss: 0.8109 - classification_loss: 0.1017 320/500 [==================>...........] - ETA: 41s - loss: 0.9127 - regression_loss: 0.8111 - classification_loss: 0.1016 321/500 [==================>...........] - ETA: 41s - loss: 0.9123 - regression_loss: 0.8107 - classification_loss: 0.1017 322/500 [==================>...........] - ETA: 41s - loss: 0.9140 - regression_loss: 0.8122 - classification_loss: 0.1018 323/500 [==================>...........] - ETA: 41s - loss: 0.9151 - regression_loss: 0.8133 - classification_loss: 0.1018 324/500 [==================>...........] 
- ETA: 40s - loss: 0.9140 - regression_loss: 0.8121 - classification_loss: 0.1020 325/500 [==================>...........] - ETA: 40s - loss: 0.9144 - regression_loss: 0.8124 - classification_loss: 0.1020 326/500 [==================>...........] - ETA: 40s - loss: 0.9153 - regression_loss: 0.8133 - classification_loss: 0.1020 327/500 [==================>...........] - ETA: 40s - loss: 0.9140 - regression_loss: 0.8122 - classification_loss: 0.1018 328/500 [==================>...........] - ETA: 39s - loss: 0.9146 - regression_loss: 0.8126 - classification_loss: 0.1020 329/500 [==================>...........] - ETA: 39s - loss: 0.9128 - regression_loss: 0.8111 - classification_loss: 0.1017 330/500 [==================>...........] - ETA: 39s - loss: 0.9125 - regression_loss: 0.8108 - classification_loss: 0.1017 331/500 [==================>...........] - ETA: 39s - loss: 0.9131 - regression_loss: 0.8113 - classification_loss: 0.1018 332/500 [==================>...........] - ETA: 39s - loss: 0.9129 - regression_loss: 0.8111 - classification_loss: 0.1018 333/500 [==================>...........] - ETA: 38s - loss: 0.9111 - regression_loss: 0.8096 - classification_loss: 0.1015 334/500 [===================>..........] - ETA: 38s - loss: 0.9099 - regression_loss: 0.8086 - classification_loss: 0.1013 335/500 [===================>..........] - ETA: 38s - loss: 0.9099 - regression_loss: 0.8087 - classification_loss: 0.1012 336/500 [===================>..........] - ETA: 38s - loss: 0.9092 - regression_loss: 0.8082 - classification_loss: 0.1010 337/500 [===================>..........] - ETA: 37s - loss: 0.9109 - regression_loss: 0.8098 - classification_loss: 0.1010 338/500 [===================>..........] - ETA: 37s - loss: 0.9092 - regression_loss: 0.8084 - classification_loss: 0.1008 339/500 [===================>..........] - ETA: 37s - loss: 0.9070 - regression_loss: 0.8064 - classification_loss: 0.1005 340/500 [===================>..........] 
- ETA: 37s - loss: 0.9066 - regression_loss: 0.8062 - classification_loss: 0.1005 341/500 [===================>..........] - ETA: 36s - loss: 0.9076 - regression_loss: 0.8070 - classification_loss: 0.1007 342/500 [===================>..........] - ETA: 36s - loss: 0.9080 - regression_loss: 0.8073 - classification_loss: 0.1007 343/500 [===================>..........] - ETA: 36s - loss: 0.9072 - regression_loss: 0.8065 - classification_loss: 0.1007 344/500 [===================>..........] - ETA: 36s - loss: 0.9067 - regression_loss: 0.8063 - classification_loss: 0.1004 345/500 [===================>..........] - ETA: 36s - loss: 0.9068 - regression_loss: 0.8065 - classification_loss: 0.1003 346/500 [===================>..........] - ETA: 35s - loss: 0.9088 - regression_loss: 0.8086 - classification_loss: 0.1002 347/500 [===================>..........] - ETA: 35s - loss: 0.9088 - regression_loss: 0.8086 - classification_loss: 0.1002 348/500 [===================>..........] - ETA: 35s - loss: 0.9097 - regression_loss: 0.8094 - classification_loss: 0.1003 349/500 [===================>..........] - ETA: 35s - loss: 0.9093 - regression_loss: 0.8092 - classification_loss: 0.1001 350/500 [====================>.........] - ETA: 34s - loss: 0.9091 - regression_loss: 0.8090 - classification_loss: 0.1001 351/500 [====================>.........] - ETA: 34s - loss: 0.9071 - regression_loss: 0.8073 - classification_loss: 0.0999 352/500 [====================>.........] - ETA: 34s - loss: 0.9069 - regression_loss: 0.8071 - classification_loss: 0.0998 353/500 [====================>.........] - ETA: 34s - loss: 0.9091 - regression_loss: 0.8090 - classification_loss: 0.1001 354/500 [====================>.........] - ETA: 33s - loss: 0.9101 - regression_loss: 0.8099 - classification_loss: 0.1002 355/500 [====================>.........] - ETA: 33s - loss: 0.9108 - regression_loss: 0.8104 - classification_loss: 0.1004 356/500 [====================>.........] 
- ETA: 33s - loss: 0.9098 - regression_loss: 0.8095 - classification_loss: 0.1003 357/500 [====================>.........] - ETA: 33s - loss: 0.9081 - regression_loss: 0.8081 - classification_loss: 0.1000 358/500 [====================>.........] - ETA: 33s - loss: 0.9079 - regression_loss: 0.8080 - classification_loss: 0.0999 359/500 [====================>.........] - ETA: 32s - loss: 0.9068 - regression_loss: 0.8071 - classification_loss: 0.0997 360/500 [====================>.........] - ETA: 32s - loss: 0.9057 - regression_loss: 0.8062 - classification_loss: 0.0995 361/500 [====================>.........] - ETA: 32s - loss: 0.9061 - regression_loss: 0.8065 - classification_loss: 0.0996 362/500 [====================>.........] - ETA: 32s - loss: 0.9059 - regression_loss: 0.8064 - classification_loss: 0.0995 363/500 [====================>.........] - ETA: 31s - loss: 0.9052 - regression_loss: 0.8058 - classification_loss: 0.0994 364/500 [====================>.........] - ETA: 31s - loss: 0.9052 - regression_loss: 0.8058 - classification_loss: 0.0994 365/500 [====================>.........] - ETA: 31s - loss: 0.9070 - regression_loss: 0.8071 - classification_loss: 0.0999 366/500 [====================>.........] - ETA: 31s - loss: 0.9053 - regression_loss: 0.8057 - classification_loss: 0.0997 367/500 [=====================>........] - ETA: 30s - loss: 0.9058 - regression_loss: 0.8061 - classification_loss: 0.0997 368/500 [=====================>........] - ETA: 30s - loss: 0.9051 - regression_loss: 0.8055 - classification_loss: 0.0996 369/500 [=====================>........] - ETA: 30s - loss: 0.9068 - regression_loss: 0.8070 - classification_loss: 0.0998 370/500 [=====================>........] - ETA: 30s - loss: 0.9079 - regression_loss: 0.8080 - classification_loss: 0.0999 371/500 [=====================>........] - ETA: 30s - loss: 0.9085 - regression_loss: 0.8086 - classification_loss: 0.0999 372/500 [=====================>........] 
- ETA: 29s - loss: 0.9090 - regression_loss: 0.8090 - classification_loss: 0.0999 373/500 [=====================>........] - ETA: 29s - loss: 0.9077 - regression_loss: 0.8080 - classification_loss: 0.0997 374/500 [=====================>........] - ETA: 29s - loss: 0.9076 - regression_loss: 0.8080 - classification_loss: 0.0996 375/500 [=====================>........] - ETA: 29s - loss: 0.9064 - regression_loss: 0.8070 - classification_loss: 0.0994 376/500 [=====================>........] - ETA: 28s - loss: 0.9056 - regression_loss: 0.8063 - classification_loss: 0.0993 377/500 [=====================>........] - ETA: 28s - loss: 0.9059 - regression_loss: 0.8066 - classification_loss: 0.0993 378/500 [=====================>........] - ETA: 28s - loss: 0.9051 - regression_loss: 0.8056 - classification_loss: 0.0995 379/500 [=====================>........] - ETA: 28s - loss: 0.9052 - regression_loss: 0.8057 - classification_loss: 0.0995 380/500 [=====================>........] - ETA: 27s - loss: 0.9047 - regression_loss: 0.8054 - classification_loss: 0.0993 381/500 [=====================>........] - ETA: 27s - loss: 0.9036 - regression_loss: 0.8044 - classification_loss: 0.0991 382/500 [=====================>........] - ETA: 27s - loss: 0.9051 - regression_loss: 0.8056 - classification_loss: 0.0995 383/500 [=====================>........] - ETA: 27s - loss: 0.9036 - regression_loss: 0.8043 - classification_loss: 0.0993 384/500 [======================>.......] - ETA: 27s - loss: 0.9032 - regression_loss: 0.8040 - classification_loss: 0.0992 385/500 [======================>.......] - ETA: 26s - loss: 0.9063 - regression_loss: 0.8069 - classification_loss: 0.0995 386/500 [======================>.......] - ETA: 26s - loss: 0.9058 - regression_loss: 0.8065 - classification_loss: 0.0993 387/500 [======================>.......] - ETA: 26s - loss: 0.9051 - regression_loss: 0.8059 - classification_loss: 0.0992 388/500 [======================>.......] 
[per-batch progress updates for epoch 37, batches 389-499, trimmed; running loss held near 0.90-0.92]
500/500 [==============================] - 117s 233ms/step - loss: 0.9157 - regression_loss: 0.8167 - classification_loss: 0.0990
326 instances of class plum with average precision: 0.8455
mAP: 0.8455
Epoch 00037: saving model to ./training/snapshots/resnet50_pascal_37.h5
Epoch 38/150
[per-batch progress updates for epoch 38, batches 1-222 of 500, trimmed; running loss settling around 0.88]
- ETA: 1:04 - loss: 0.8810 - regression_loss: 0.7878 - classification_loss: 0.0932 223/500 [============>.................] - ETA: 1:04 - loss: 0.8791 - regression_loss: 0.7862 - classification_loss: 0.0929 224/500 [============>.................] - ETA: 1:04 - loss: 0.8796 - regression_loss: 0.7866 - classification_loss: 0.0929 225/500 [============>.................] - ETA: 1:03 - loss: 0.8772 - regression_loss: 0.7846 - classification_loss: 0.0926 226/500 [============>.................] - ETA: 1:03 - loss: 0.8801 - regression_loss: 0.7874 - classification_loss: 0.0927 227/500 [============>.................] - ETA: 1:03 - loss: 0.8800 - regression_loss: 0.7872 - classification_loss: 0.0928 228/500 [============>.................] - ETA: 1:03 - loss: 0.8804 - regression_loss: 0.7876 - classification_loss: 0.0928 229/500 [============>.................] - ETA: 1:03 - loss: 0.8814 - regression_loss: 0.7886 - classification_loss: 0.0928 230/500 [============>.................] - ETA: 1:02 - loss: 0.8829 - regression_loss: 0.7902 - classification_loss: 0.0928 231/500 [============>.................] - ETA: 1:02 - loss: 0.8873 - regression_loss: 0.7942 - classification_loss: 0.0930 232/500 [============>.................] - ETA: 1:02 - loss: 0.8870 - regression_loss: 0.7942 - classification_loss: 0.0928 233/500 [============>.................] - ETA: 1:02 - loss: 0.8870 - regression_loss: 0.7943 - classification_loss: 0.0927 234/500 [=============>................] - ETA: 1:01 - loss: 0.8874 - regression_loss: 0.7946 - classification_loss: 0.0928 235/500 [=============>................] - ETA: 1:01 - loss: 0.8858 - regression_loss: 0.7933 - classification_loss: 0.0925 236/500 [=============>................] - ETA: 1:01 - loss: 0.8856 - regression_loss: 0.7933 - classification_loss: 0.0923 237/500 [=============>................] - ETA: 1:01 - loss: 0.8867 - regression_loss: 0.7943 - classification_loss: 0.0924 238/500 [=============>................] 
- ETA: 1:01 - loss: 0.8867 - regression_loss: 0.7944 - classification_loss: 0.0924 239/500 [=============>................] - ETA: 1:00 - loss: 0.8866 - regression_loss: 0.7942 - classification_loss: 0.0924 240/500 [=============>................] - ETA: 1:00 - loss: 0.8896 - regression_loss: 0.7972 - classification_loss: 0.0924 241/500 [=============>................] - ETA: 1:00 - loss: 0.8901 - regression_loss: 0.7977 - classification_loss: 0.0924 242/500 [=============>................] - ETA: 1:00 - loss: 0.8945 - regression_loss: 0.8013 - classification_loss: 0.0931 243/500 [=============>................] - ETA: 59s - loss: 0.8953 - regression_loss: 0.8022 - classification_loss: 0.0930  244/500 [=============>................] - ETA: 59s - loss: 0.8965 - regression_loss: 0.8032 - classification_loss: 0.0933 245/500 [=============>................] - ETA: 59s - loss: 0.8979 - regression_loss: 0.8039 - classification_loss: 0.0940 246/500 [=============>................] - ETA: 59s - loss: 0.8975 - regression_loss: 0.8037 - classification_loss: 0.0938 247/500 [=============>................] - ETA: 59s - loss: 0.8977 - regression_loss: 0.8039 - classification_loss: 0.0937 248/500 [=============>................] - ETA: 58s - loss: 0.8960 - regression_loss: 0.8025 - classification_loss: 0.0935 249/500 [=============>................] - ETA: 58s - loss: 0.8952 - regression_loss: 0.8019 - classification_loss: 0.0933 250/500 [==============>...............] - ETA: 58s - loss: 0.8954 - regression_loss: 0.8020 - classification_loss: 0.0934 251/500 [==============>...............] - ETA: 58s - loss: 0.8954 - regression_loss: 0.8020 - classification_loss: 0.0934 252/500 [==============>...............] - ETA: 57s - loss: 0.8955 - regression_loss: 0.8022 - classification_loss: 0.0933 253/500 [==============>...............] - ETA: 57s - loss: 0.8965 - regression_loss: 0.8028 - classification_loss: 0.0937 254/500 [==============>...............] 
- ETA: 57s - loss: 0.8999 - regression_loss: 0.8052 - classification_loss: 0.0947 255/500 [==============>...............] - ETA: 57s - loss: 0.8992 - regression_loss: 0.8047 - classification_loss: 0.0945 256/500 [==============>...............] - ETA: 56s - loss: 0.9008 - regression_loss: 0.8059 - classification_loss: 0.0949 257/500 [==============>...............] - ETA: 56s - loss: 0.9025 - regression_loss: 0.8075 - classification_loss: 0.0951 258/500 [==============>...............] - ETA: 56s - loss: 0.9003 - regression_loss: 0.8052 - classification_loss: 0.0951 259/500 [==============>...............] - ETA: 56s - loss: 0.9007 - regression_loss: 0.8056 - classification_loss: 0.0951 260/500 [==============>...............] - ETA: 55s - loss: 0.9019 - regression_loss: 0.8068 - classification_loss: 0.0951 261/500 [==============>...............] - ETA: 55s - loss: 0.9008 - regression_loss: 0.8060 - classification_loss: 0.0949 262/500 [==============>...............] - ETA: 55s - loss: 0.9013 - regression_loss: 0.8063 - classification_loss: 0.0950 263/500 [==============>...............] - ETA: 55s - loss: 0.8997 - regression_loss: 0.8050 - classification_loss: 0.0947 264/500 [==============>...............] - ETA: 55s - loss: 0.8963 - regression_loss: 0.8019 - classification_loss: 0.0944 265/500 [==============>...............] - ETA: 54s - loss: 0.8970 - regression_loss: 0.8025 - classification_loss: 0.0945 266/500 [==============>...............] - ETA: 54s - loss: 0.8983 - regression_loss: 0.8036 - classification_loss: 0.0946 267/500 [===============>..............] - ETA: 54s - loss: 0.8994 - regression_loss: 0.8043 - classification_loss: 0.0951 268/500 [===============>..............] - ETA: 54s - loss: 0.8986 - regression_loss: 0.8037 - classification_loss: 0.0949 269/500 [===============>..............] - ETA: 53s - loss: 0.8991 - regression_loss: 0.8041 - classification_loss: 0.0950 270/500 [===============>..............] 
- ETA: 53s - loss: 0.8995 - regression_loss: 0.8040 - classification_loss: 0.0955 271/500 [===============>..............] - ETA: 53s - loss: 0.9001 - regression_loss: 0.8047 - classification_loss: 0.0954 272/500 [===============>..............] - ETA: 53s - loss: 0.8996 - regression_loss: 0.8044 - classification_loss: 0.0952 273/500 [===============>..............] - ETA: 52s - loss: 0.9016 - regression_loss: 0.8065 - classification_loss: 0.0951 274/500 [===============>..............] - ETA: 52s - loss: 0.9017 - regression_loss: 0.8067 - classification_loss: 0.0949 275/500 [===============>..............] - ETA: 52s - loss: 0.9028 - regression_loss: 0.8079 - classification_loss: 0.0949 276/500 [===============>..............] - ETA: 52s - loss: 0.9032 - regression_loss: 0.8084 - classification_loss: 0.0949 277/500 [===============>..............] - ETA: 52s - loss: 0.9058 - regression_loss: 0.8107 - classification_loss: 0.0951 278/500 [===============>..............] - ETA: 51s - loss: 0.9071 - regression_loss: 0.8117 - classification_loss: 0.0954 279/500 [===============>..............] - ETA: 51s - loss: 0.9054 - regression_loss: 0.8103 - classification_loss: 0.0951 280/500 [===============>..............] - ETA: 51s - loss: 0.9075 - regression_loss: 0.8121 - classification_loss: 0.0954 281/500 [===============>..............] - ETA: 51s - loss: 0.9063 - regression_loss: 0.8111 - classification_loss: 0.0952 282/500 [===============>..............] - ETA: 50s - loss: 0.9062 - regression_loss: 0.8110 - classification_loss: 0.0952 283/500 [===============>..............] - ETA: 50s - loss: 0.9054 - regression_loss: 0.8102 - classification_loss: 0.0952 284/500 [================>.............] - ETA: 50s - loss: 0.9060 - regression_loss: 0.8107 - classification_loss: 0.0953 285/500 [================>.............] - ETA: 50s - loss: 0.9061 - regression_loss: 0.8107 - classification_loss: 0.0955 286/500 [================>.............] 
- ETA: 49s - loss: 0.9064 - regression_loss: 0.8107 - classification_loss: 0.0956 287/500 [================>.............] - ETA: 49s - loss: 0.9062 - regression_loss: 0.8106 - classification_loss: 0.0956 288/500 [================>.............] - ETA: 49s - loss: 0.9049 - regression_loss: 0.8095 - classification_loss: 0.0954 289/500 [================>.............] - ETA: 49s - loss: 0.9085 - regression_loss: 0.8131 - classification_loss: 0.0954 290/500 [================>.............] - ETA: 49s - loss: 0.9093 - regression_loss: 0.8137 - classification_loss: 0.0956 291/500 [================>.............] - ETA: 48s - loss: 0.9075 - regression_loss: 0.8122 - classification_loss: 0.0953 292/500 [================>.............] - ETA: 48s - loss: 0.9089 - regression_loss: 0.8135 - classification_loss: 0.0954 293/500 [================>.............] - ETA: 48s - loss: 0.9107 - regression_loss: 0.8151 - classification_loss: 0.0956 294/500 [================>.............] - ETA: 48s - loss: 0.9113 - regression_loss: 0.8156 - classification_loss: 0.0957 295/500 [================>.............] - ETA: 47s - loss: 0.9115 - regression_loss: 0.8159 - classification_loss: 0.0956 296/500 [================>.............] - ETA: 47s - loss: 0.9121 - regression_loss: 0.8166 - classification_loss: 0.0955 297/500 [================>.............] - ETA: 47s - loss: 0.9111 - regression_loss: 0.8157 - classification_loss: 0.0954 298/500 [================>.............] - ETA: 47s - loss: 0.9118 - regression_loss: 0.8163 - classification_loss: 0.0954 299/500 [================>.............] - ETA: 46s - loss: 0.9117 - regression_loss: 0.8164 - classification_loss: 0.0953 300/500 [=================>............] - ETA: 46s - loss: 0.9098 - regression_loss: 0.8147 - classification_loss: 0.0951 301/500 [=================>............] - ETA: 46s - loss: 0.9088 - regression_loss: 0.8138 - classification_loss: 0.0949 302/500 [=================>............] 
- ETA: 46s - loss: 0.9082 - regression_loss: 0.8134 - classification_loss: 0.0948 303/500 [=================>............] - ETA: 46s - loss: 0.9092 - regression_loss: 0.8143 - classification_loss: 0.0949 304/500 [=================>............] - ETA: 45s - loss: 0.9116 - regression_loss: 0.8163 - classification_loss: 0.0953 305/500 [=================>............] - ETA: 45s - loss: 0.9111 - regression_loss: 0.8158 - classification_loss: 0.0953 306/500 [=================>............] - ETA: 45s - loss: 0.9111 - regression_loss: 0.8156 - classification_loss: 0.0955 307/500 [=================>............] - ETA: 45s - loss: 0.9111 - regression_loss: 0.8153 - classification_loss: 0.0958 308/500 [=================>............] - ETA: 44s - loss: 0.9096 - regression_loss: 0.8140 - classification_loss: 0.0956 309/500 [=================>............] - ETA: 44s - loss: 0.9082 - regression_loss: 0.8128 - classification_loss: 0.0954 310/500 [=================>............] - ETA: 44s - loss: 0.9087 - regression_loss: 0.8133 - classification_loss: 0.0955 311/500 [=================>............] - ETA: 44s - loss: 0.9098 - regression_loss: 0.8139 - classification_loss: 0.0959 312/500 [=================>............] - ETA: 43s - loss: 0.9101 - regression_loss: 0.8141 - classification_loss: 0.0960 313/500 [=================>............] - ETA: 43s - loss: 0.9100 - regression_loss: 0.8141 - classification_loss: 0.0960 314/500 [=================>............] - ETA: 43s - loss: 0.9092 - regression_loss: 0.8131 - classification_loss: 0.0961 315/500 [=================>............] - ETA: 43s - loss: 0.9084 - regression_loss: 0.8125 - classification_loss: 0.0959 316/500 [=================>............] - ETA: 43s - loss: 0.9074 - regression_loss: 0.8117 - classification_loss: 0.0957 317/500 [==================>...........] - ETA: 42s - loss: 0.9094 - regression_loss: 0.8132 - classification_loss: 0.0962 318/500 [==================>...........] 
- ETA: 42s - loss: 0.9086 - regression_loss: 0.8125 - classification_loss: 0.0961 319/500 [==================>...........] - ETA: 42s - loss: 0.9080 - regression_loss: 0.8121 - classification_loss: 0.0959 320/500 [==================>...........] - ETA: 42s - loss: 0.9082 - regression_loss: 0.8125 - classification_loss: 0.0958 321/500 [==================>...........] - ETA: 41s - loss: 0.9066 - regression_loss: 0.8111 - classification_loss: 0.0955 322/500 [==================>...........] - ETA: 41s - loss: 0.9067 - regression_loss: 0.8110 - classification_loss: 0.0957 323/500 [==================>...........] - ETA: 41s - loss: 0.9055 - regression_loss: 0.8100 - classification_loss: 0.0955 324/500 [==================>...........] - ETA: 41s - loss: 0.9059 - regression_loss: 0.8102 - classification_loss: 0.0956 325/500 [==================>...........] - ETA: 40s - loss: 0.9059 - regression_loss: 0.8102 - classification_loss: 0.0957 326/500 [==================>...........] - ETA: 40s - loss: 0.9049 - regression_loss: 0.8093 - classification_loss: 0.0956 327/500 [==================>...........] - ETA: 40s - loss: 0.9054 - regression_loss: 0.8100 - classification_loss: 0.0954 328/500 [==================>...........] - ETA: 40s - loss: 0.9057 - regression_loss: 0.8103 - classification_loss: 0.0953 329/500 [==================>...........] - ETA: 39s - loss: 0.9065 - regression_loss: 0.8112 - classification_loss: 0.0954 330/500 [==================>...........] - ETA: 39s - loss: 0.9067 - regression_loss: 0.8112 - classification_loss: 0.0956 331/500 [==================>...........] - ETA: 39s - loss: 0.9091 - regression_loss: 0.8132 - classification_loss: 0.0958 332/500 [==================>...........] - ETA: 39s - loss: 0.9079 - regression_loss: 0.8122 - classification_loss: 0.0956 333/500 [==================>...........] - ETA: 38s - loss: 0.9083 - regression_loss: 0.8127 - classification_loss: 0.0956 334/500 [===================>..........] 
- ETA: 38s - loss: 0.9083 - regression_loss: 0.8127 - classification_loss: 0.0957 335/500 [===================>..........] - ETA: 38s - loss: 0.9069 - regression_loss: 0.8114 - classification_loss: 0.0955 336/500 [===================>..........] - ETA: 38s - loss: 0.9051 - regression_loss: 0.8099 - classification_loss: 0.0952 337/500 [===================>..........] - ETA: 38s - loss: 0.9062 - regression_loss: 0.8111 - classification_loss: 0.0950 338/500 [===================>..........] - ETA: 37s - loss: 0.9067 - regression_loss: 0.8111 - classification_loss: 0.0956 339/500 [===================>..........] - ETA: 37s - loss: 0.9075 - regression_loss: 0.8118 - classification_loss: 0.0957 340/500 [===================>..........] - ETA: 37s - loss: 0.9061 - regression_loss: 0.8104 - classification_loss: 0.0957 341/500 [===================>..........] - ETA: 37s - loss: 0.9056 - regression_loss: 0.8099 - classification_loss: 0.0957 342/500 [===================>..........] - ETA: 36s - loss: 0.9050 - regression_loss: 0.8094 - classification_loss: 0.0956 343/500 [===================>..........] - ETA: 36s - loss: 0.9046 - regression_loss: 0.8091 - classification_loss: 0.0956 344/500 [===================>..........] - ETA: 36s - loss: 0.9043 - regression_loss: 0.8089 - classification_loss: 0.0954 345/500 [===================>..........] - ETA: 36s - loss: 0.9046 - regression_loss: 0.8094 - classification_loss: 0.0952 346/500 [===================>..........] - ETA: 35s - loss: 0.9053 - regression_loss: 0.8100 - classification_loss: 0.0953 347/500 [===================>..........] - ETA: 35s - loss: 0.9045 - regression_loss: 0.8093 - classification_loss: 0.0952 348/500 [===================>..........] - ETA: 35s - loss: 0.9044 - regression_loss: 0.8093 - classification_loss: 0.0952 349/500 [===================>..........] - ETA: 35s - loss: 0.9046 - regression_loss: 0.8093 - classification_loss: 0.0953 350/500 [====================>.........] 
- ETA: 35s - loss: 0.9047 - regression_loss: 0.8094 - classification_loss: 0.0953 351/500 [====================>.........] - ETA: 34s - loss: 0.9046 - regression_loss: 0.8092 - classification_loss: 0.0954 352/500 [====================>.........] - ETA: 34s - loss: 0.9049 - regression_loss: 0.8095 - classification_loss: 0.0955 353/500 [====================>.........] - ETA: 34s - loss: 0.9054 - regression_loss: 0.8097 - classification_loss: 0.0957 354/500 [====================>.........] - ETA: 34s - loss: 0.9042 - regression_loss: 0.8088 - classification_loss: 0.0955 355/500 [====================>.........] - ETA: 33s - loss: 0.9025 - regression_loss: 0.8072 - classification_loss: 0.0952 356/500 [====================>.........] - ETA: 33s - loss: 0.9011 - regression_loss: 0.8061 - classification_loss: 0.0950 357/500 [====================>.........] - ETA: 33s - loss: 0.9005 - regression_loss: 0.8056 - classification_loss: 0.0950 358/500 [====================>.........] - ETA: 33s - loss: 0.9002 - regression_loss: 0.8053 - classification_loss: 0.0949 359/500 [====================>.........] - ETA: 32s - loss: 0.8992 - regression_loss: 0.8045 - classification_loss: 0.0947 360/500 [====================>.........] - ETA: 32s - loss: 0.9040 - regression_loss: 0.8082 - classification_loss: 0.0958 361/500 [====================>.........] - ETA: 32s - loss: 0.9027 - regression_loss: 0.8071 - classification_loss: 0.0956 362/500 [====================>.........] - ETA: 32s - loss: 0.9028 - regression_loss: 0.8073 - classification_loss: 0.0956 363/500 [====================>.........] - ETA: 31s - loss: 0.9035 - regression_loss: 0.8080 - classification_loss: 0.0955 364/500 [====================>.........] - ETA: 31s - loss: 0.9026 - regression_loss: 0.8072 - classification_loss: 0.0954 365/500 [====================>.........] - ETA: 31s - loss: 0.9031 - regression_loss: 0.8077 - classification_loss: 0.0954 366/500 [====================>.........] 
- ETA: 31s - loss: 0.9026 - regression_loss: 0.8073 - classification_loss: 0.0953 367/500 [=====================>........] - ETA: 31s - loss: 0.9026 - regression_loss: 0.8074 - classification_loss: 0.0952 368/500 [=====================>........] - ETA: 30s - loss: 0.9048 - regression_loss: 0.8089 - classification_loss: 0.0958 369/500 [=====================>........] - ETA: 30s - loss: 0.9051 - regression_loss: 0.8092 - classification_loss: 0.0958 370/500 [=====================>........] - ETA: 30s - loss: 0.9044 - regression_loss: 0.8087 - classification_loss: 0.0957 371/500 [=====================>........] - ETA: 30s - loss: 0.9040 - regression_loss: 0.8084 - classification_loss: 0.0957 372/500 [=====================>........] - ETA: 29s - loss: 0.9038 - regression_loss: 0.8082 - classification_loss: 0.0956 373/500 [=====================>........] - ETA: 29s - loss: 0.9073 - regression_loss: 0.8106 - classification_loss: 0.0966 374/500 [=====================>........] - ETA: 29s - loss: 0.9085 - regression_loss: 0.8116 - classification_loss: 0.0970 375/500 [=====================>........] - ETA: 29s - loss: 0.9082 - regression_loss: 0.8114 - classification_loss: 0.0968 376/500 [=====================>........] - ETA: 28s - loss: 0.9070 - regression_loss: 0.8104 - classification_loss: 0.0966 377/500 [=====================>........] - ETA: 28s - loss: 0.9059 - regression_loss: 0.8095 - classification_loss: 0.0965 378/500 [=====================>........] - ETA: 28s - loss: 0.9058 - regression_loss: 0.8094 - classification_loss: 0.0964 379/500 [=====================>........] - ETA: 28s - loss: 0.9055 - regression_loss: 0.8091 - classification_loss: 0.0964 380/500 [=====================>........] - ETA: 28s - loss: 0.9059 - regression_loss: 0.8092 - classification_loss: 0.0966 381/500 [=====================>........] - ETA: 27s - loss: 0.9056 - regression_loss: 0.8091 - classification_loss: 0.0965 382/500 [=====================>........] 
- ETA: 27s - loss: 0.9041 - regression_loss: 0.8078 - classification_loss: 0.0963 383/500 [=====================>........] - ETA: 27s - loss: 0.9029 - regression_loss: 0.8067 - classification_loss: 0.0961 384/500 [======================>.......] - ETA: 27s - loss: 0.9034 - regression_loss: 0.8073 - classification_loss: 0.0961 385/500 [======================>.......] - ETA: 26s - loss: 0.9023 - regression_loss: 0.8064 - classification_loss: 0.0960 386/500 [======================>.......] - ETA: 26s - loss: 0.9031 - regression_loss: 0.8068 - classification_loss: 0.0962 387/500 [======================>.......] - ETA: 26s - loss: 0.9026 - regression_loss: 0.8065 - classification_loss: 0.0961 388/500 [======================>.......] - ETA: 26s - loss: 0.9029 - regression_loss: 0.8068 - classification_loss: 0.0961 389/500 [======================>.......] - ETA: 25s - loss: 0.9036 - regression_loss: 0.8072 - classification_loss: 0.0964 390/500 [======================>.......] - ETA: 25s - loss: 0.9027 - regression_loss: 0.8065 - classification_loss: 0.0962 391/500 [======================>.......] - ETA: 25s - loss: 0.9026 - regression_loss: 0.8063 - classification_loss: 0.0963 392/500 [======================>.......] - ETA: 25s - loss: 0.9032 - regression_loss: 0.8068 - classification_loss: 0.0963 393/500 [======================>.......] - ETA: 24s - loss: 0.9013 - regression_loss: 0.8051 - classification_loss: 0.0961 394/500 [======================>.......] - ETA: 24s - loss: 0.9007 - regression_loss: 0.8046 - classification_loss: 0.0961 395/500 [======================>.......] - ETA: 24s - loss: 0.9007 - regression_loss: 0.8047 - classification_loss: 0.0960 396/500 [======================>.......] - ETA: 24s - loss: 0.9013 - regression_loss: 0.8050 - classification_loss: 0.0962 397/500 [======================>.......] - ETA: 24s - loss: 0.9010 - regression_loss: 0.8049 - classification_loss: 0.0962 398/500 [======================>.......] 
- ETA: 23s - loss: 0.9020 - regression_loss: 0.8057 - classification_loss: 0.0962 399/500 [======================>.......] - ETA: 23s - loss: 0.9028 - regression_loss: 0.8067 - classification_loss: 0.0961 400/500 [=======================>......] - ETA: 23s - loss: 0.9028 - regression_loss: 0.8068 - classification_loss: 0.0960 401/500 [=======================>......] - ETA: 23s - loss: 0.9023 - regression_loss: 0.8064 - classification_loss: 0.0959 402/500 [=======================>......] - ETA: 22s - loss: 0.9027 - regression_loss: 0.8066 - classification_loss: 0.0962 403/500 [=======================>......] - ETA: 22s - loss: 0.9040 - regression_loss: 0.8075 - classification_loss: 0.0965 404/500 [=======================>......] - ETA: 22s - loss: 0.9030 - regression_loss: 0.8067 - classification_loss: 0.0963 405/500 [=======================>......] - ETA: 22s - loss: 0.9029 - regression_loss: 0.8067 - classification_loss: 0.0962 406/500 [=======================>......] - ETA: 21s - loss: 0.9032 - regression_loss: 0.8070 - classification_loss: 0.0961 407/500 [=======================>......] - ETA: 21s - loss: 0.9029 - regression_loss: 0.8067 - classification_loss: 0.0962 408/500 [=======================>......] - ETA: 21s - loss: 0.9029 - regression_loss: 0.8067 - classification_loss: 0.0962 409/500 [=======================>......] - ETA: 21s - loss: 0.9030 - regression_loss: 0.8068 - classification_loss: 0.0962 410/500 [=======================>......] - ETA: 21s - loss: 0.9032 - regression_loss: 0.8071 - classification_loss: 0.0961 411/500 [=======================>......] - ETA: 20s - loss: 0.9028 - regression_loss: 0.8069 - classification_loss: 0.0960 412/500 [=======================>......] - ETA: 20s - loss: 0.9023 - regression_loss: 0.8065 - classification_loss: 0.0958 413/500 [=======================>......] - ETA: 20s - loss: 0.9017 - regression_loss: 0.8060 - classification_loss: 0.0958 414/500 [=======================>......] 
- ETA: 20s - loss: 0.9005 - regression_loss: 0.8050 - classification_loss: 0.0956 415/500 [=======================>......] - ETA: 19s - loss: 0.9021 - regression_loss: 0.8065 - classification_loss: 0.0956 416/500 [=======================>......] - ETA: 19s - loss: 0.9041 - regression_loss: 0.8084 - classification_loss: 0.0957 417/500 [========================>.....] - ETA: 19s - loss: 0.9040 - regression_loss: 0.8084 - classification_loss: 0.0956 418/500 [========================>.....] - ETA: 19s - loss: 0.9034 - regression_loss: 0.8079 - classification_loss: 0.0954 419/500 [========================>.....] - ETA: 18s - loss: 0.9027 - regression_loss: 0.8074 - classification_loss: 0.0953 420/500 [========================>.....] - ETA: 18s - loss: 0.9035 - regression_loss: 0.8082 - classification_loss: 0.0953 421/500 [========================>.....] - ETA: 18s - loss: 0.9034 - regression_loss: 0.8081 - classification_loss: 0.0953 422/500 [========================>.....] - ETA: 18s - loss: 0.9038 - regression_loss: 0.8085 - classification_loss: 0.0953 423/500 [========================>.....] - ETA: 17s - loss: 0.9027 - regression_loss: 0.8075 - classification_loss: 0.0951 424/500 [========================>.....] - ETA: 17s - loss: 0.9028 - regression_loss: 0.8077 - classification_loss: 0.0951 425/500 [========================>.....] - ETA: 17s - loss: 0.9022 - regression_loss: 0.8072 - classification_loss: 0.0950 426/500 [========================>.....] - ETA: 17s - loss: 0.9020 - regression_loss: 0.8071 - classification_loss: 0.0949 427/500 [========================>.....] - ETA: 17s - loss: 0.9021 - regression_loss: 0.8071 - classification_loss: 0.0950 428/500 [========================>.....] - ETA: 16s - loss: 0.9025 - regression_loss: 0.8075 - classification_loss: 0.0950 429/500 [========================>.....] - ETA: 16s - loss: 0.9018 - regression_loss: 0.8070 - classification_loss: 0.0948 430/500 [========================>.....] 
431/500 [========================>.....] - ETA: 16s - loss: 0.9001 - regression_loss: 0.8055 - classification_loss: 0.0946
[... per-batch progress updates 432-499 elided ...]
500/500 [==============================] - 117s 234ms/step - loss: 0.8984 - regression_loss: 0.8038 - classification_loss: 0.0946
326 instances of class plum with average precision: 0.8515
mAP: 0.8515
Epoch 00038: saving model to ./training/snapshots/resnet50_pascal_38.h5
Epoch 39/150
  1/500 [..............................] - ETA: 1:58 - loss: 0.5836 - regression_loss: 0.5413 - classification_loss: 0.0423
[... per-batch progress updates 2-263 elided ...]
264/500 [==============>...............] - ETA: 55s - loss: 0.8689 - regression_loss: 0.7838 - classification_loss: 0.0852
- ETA: 55s - loss: 0.8685 - regression_loss: 0.7834 - classification_loss: 0.0851 266/500 [==============>...............] - ETA: 54s - loss: 0.8664 - regression_loss: 0.7815 - classification_loss: 0.0849 267/500 [===============>..............] - ETA: 54s - loss: 0.8664 - regression_loss: 0.7818 - classification_loss: 0.0846 268/500 [===============>..............] - ETA: 54s - loss: 0.8662 - regression_loss: 0.7816 - classification_loss: 0.0846 269/500 [===============>..............] - ETA: 54s - loss: 0.8658 - regression_loss: 0.7813 - classification_loss: 0.0845 270/500 [===============>..............] - ETA: 53s - loss: 0.8664 - regression_loss: 0.7818 - classification_loss: 0.0846 271/500 [===============>..............] - ETA: 53s - loss: 0.8661 - regression_loss: 0.7815 - classification_loss: 0.0846 272/500 [===============>..............] - ETA: 53s - loss: 0.8661 - regression_loss: 0.7814 - classification_loss: 0.0846 273/500 [===============>..............] - ETA: 53s - loss: 0.8670 - regression_loss: 0.7823 - classification_loss: 0.0847 274/500 [===============>..............] - ETA: 52s - loss: 0.8643 - regression_loss: 0.7798 - classification_loss: 0.0845 275/500 [===============>..............] - ETA: 52s - loss: 0.8651 - regression_loss: 0.7805 - classification_loss: 0.0846 276/500 [===============>..............] - ETA: 52s - loss: 0.8666 - regression_loss: 0.7815 - classification_loss: 0.0850 277/500 [===============>..............] - ETA: 52s - loss: 0.8656 - regression_loss: 0.7808 - classification_loss: 0.0848 278/500 [===============>..............] - ETA: 51s - loss: 0.8665 - regression_loss: 0.7816 - classification_loss: 0.0849 279/500 [===============>..............] - ETA: 51s - loss: 0.8692 - regression_loss: 0.7839 - classification_loss: 0.0853 280/500 [===============>..............] - ETA: 51s - loss: 0.8680 - regression_loss: 0.7829 - classification_loss: 0.0851 281/500 [===============>..............] 
- ETA: 51s - loss: 0.8695 - regression_loss: 0.7843 - classification_loss: 0.0852 282/500 [===============>..............] - ETA: 51s - loss: 0.8693 - regression_loss: 0.7842 - classification_loss: 0.0852 283/500 [===============>..............] - ETA: 50s - loss: 0.8683 - regression_loss: 0.7832 - classification_loss: 0.0851 284/500 [================>.............] - ETA: 50s - loss: 0.8678 - regression_loss: 0.7828 - classification_loss: 0.0850 285/500 [================>.............] - ETA: 50s - loss: 0.8672 - regression_loss: 0.7824 - classification_loss: 0.0849 286/500 [================>.............] - ETA: 50s - loss: 0.8657 - regression_loss: 0.7810 - classification_loss: 0.0847 287/500 [================>.............] - ETA: 49s - loss: 0.8648 - regression_loss: 0.7803 - classification_loss: 0.0846 288/500 [================>.............] - ETA: 49s - loss: 0.8660 - regression_loss: 0.7811 - classification_loss: 0.0849 289/500 [================>.............] - ETA: 49s - loss: 0.8654 - regression_loss: 0.7807 - classification_loss: 0.0847 290/500 [================>.............] - ETA: 49s - loss: 0.8656 - regression_loss: 0.7811 - classification_loss: 0.0846 291/500 [================>.............] - ETA: 48s - loss: 0.8656 - regression_loss: 0.7810 - classification_loss: 0.0846 292/500 [================>.............] - ETA: 48s - loss: 0.8663 - regression_loss: 0.7817 - classification_loss: 0.0847 293/500 [================>.............] - ETA: 48s - loss: 0.8677 - regression_loss: 0.7828 - classification_loss: 0.0849 294/500 [================>.............] - ETA: 48s - loss: 0.8681 - regression_loss: 0.7831 - classification_loss: 0.0850 295/500 [================>.............] - ETA: 47s - loss: 0.8680 - regression_loss: 0.7828 - classification_loss: 0.0852 296/500 [================>.............] - ETA: 47s - loss: 0.8675 - regression_loss: 0.7823 - classification_loss: 0.0852 297/500 [================>.............] 
- ETA: 47s - loss: 0.8675 - regression_loss: 0.7823 - classification_loss: 0.0852 298/500 [================>.............] - ETA: 47s - loss: 0.8669 - regression_loss: 0.7818 - classification_loss: 0.0851 299/500 [================>.............] - ETA: 47s - loss: 0.8664 - regression_loss: 0.7814 - classification_loss: 0.0850 300/500 [=================>............] - ETA: 46s - loss: 0.8649 - regression_loss: 0.7801 - classification_loss: 0.0848 301/500 [=================>............] - ETA: 46s - loss: 0.8645 - regression_loss: 0.7797 - classification_loss: 0.0847 302/500 [=================>............] - ETA: 46s - loss: 0.8640 - regression_loss: 0.7792 - classification_loss: 0.0848 303/500 [=================>............] - ETA: 46s - loss: 0.8644 - regression_loss: 0.7795 - classification_loss: 0.0849 304/500 [=================>............] - ETA: 45s - loss: 0.8649 - regression_loss: 0.7799 - classification_loss: 0.0850 305/500 [=================>............] - ETA: 45s - loss: 0.8651 - regression_loss: 0.7801 - classification_loss: 0.0850 306/500 [=================>............] - ETA: 45s - loss: 0.8644 - regression_loss: 0.7794 - classification_loss: 0.0850 307/500 [=================>............] - ETA: 45s - loss: 0.8637 - regression_loss: 0.7787 - classification_loss: 0.0850 308/500 [=================>............] - ETA: 44s - loss: 0.8644 - regression_loss: 0.7792 - classification_loss: 0.0852 309/500 [=================>............] - ETA: 44s - loss: 0.8643 - regression_loss: 0.7793 - classification_loss: 0.0850 310/500 [=================>............] - ETA: 44s - loss: 0.8633 - regression_loss: 0.7784 - classification_loss: 0.0849 311/500 [=================>............] - ETA: 44s - loss: 0.8633 - regression_loss: 0.7784 - classification_loss: 0.0849 312/500 [=================>............] - ETA: 44s - loss: 0.8634 - regression_loss: 0.7786 - classification_loss: 0.0848 313/500 [=================>............] 
- ETA: 43s - loss: 0.8644 - regression_loss: 0.7794 - classification_loss: 0.0850 314/500 [=================>............] - ETA: 43s - loss: 0.8635 - regression_loss: 0.7786 - classification_loss: 0.0848 315/500 [=================>............] - ETA: 43s - loss: 0.8638 - regression_loss: 0.7788 - classification_loss: 0.0849 316/500 [=================>............] - ETA: 43s - loss: 0.8633 - regression_loss: 0.7783 - classification_loss: 0.0850 317/500 [==================>...........] - ETA: 42s - loss: 0.8625 - regression_loss: 0.7776 - classification_loss: 0.0849 318/500 [==================>...........] - ETA: 42s - loss: 0.8621 - regression_loss: 0.7774 - classification_loss: 0.0847 319/500 [==================>...........] - ETA: 42s - loss: 0.8612 - regression_loss: 0.7767 - classification_loss: 0.0845 320/500 [==================>...........] - ETA: 42s - loss: 0.8625 - regression_loss: 0.7779 - classification_loss: 0.0846 321/500 [==================>...........] - ETA: 41s - loss: 0.8630 - regression_loss: 0.7783 - classification_loss: 0.0847 322/500 [==================>...........] - ETA: 41s - loss: 0.8638 - regression_loss: 0.7791 - classification_loss: 0.0847 323/500 [==================>...........] - ETA: 41s - loss: 0.8622 - regression_loss: 0.7777 - classification_loss: 0.0845 324/500 [==================>...........] - ETA: 41s - loss: 0.8628 - regression_loss: 0.7781 - classification_loss: 0.0847 325/500 [==================>...........] - ETA: 41s - loss: 0.8631 - regression_loss: 0.7784 - classification_loss: 0.0847 326/500 [==================>...........] - ETA: 40s - loss: 0.8656 - regression_loss: 0.7803 - classification_loss: 0.0853 327/500 [==================>...........] - ETA: 40s - loss: 0.8652 - regression_loss: 0.7801 - classification_loss: 0.0852 328/500 [==================>...........] - ETA: 40s - loss: 0.8669 - regression_loss: 0.7811 - classification_loss: 0.0858 329/500 [==================>...........] 
- ETA: 40s - loss: 0.8657 - regression_loss: 0.7802 - classification_loss: 0.0855 330/500 [==================>...........] - ETA: 39s - loss: 0.8655 - regression_loss: 0.7798 - classification_loss: 0.0857 331/500 [==================>...........] - ETA: 39s - loss: 0.8651 - regression_loss: 0.7795 - classification_loss: 0.0856 332/500 [==================>...........] - ETA: 39s - loss: 0.8647 - regression_loss: 0.7792 - classification_loss: 0.0855 333/500 [==================>...........] - ETA: 39s - loss: 0.8673 - regression_loss: 0.7812 - classification_loss: 0.0861 334/500 [===================>..........] - ETA: 38s - loss: 0.8664 - regression_loss: 0.7804 - classification_loss: 0.0860 335/500 [===================>..........] - ETA: 38s - loss: 0.8664 - regression_loss: 0.7804 - classification_loss: 0.0859 336/500 [===================>..........] - ETA: 38s - loss: 0.8665 - regression_loss: 0.7805 - classification_loss: 0.0860 337/500 [===================>..........] - ETA: 38s - loss: 0.8719 - regression_loss: 0.7845 - classification_loss: 0.0874 338/500 [===================>..........] - ETA: 37s - loss: 0.8742 - regression_loss: 0.7861 - classification_loss: 0.0881 339/500 [===================>..........] - ETA: 37s - loss: 0.8729 - regression_loss: 0.7850 - classification_loss: 0.0879 340/500 [===================>..........] - ETA: 37s - loss: 0.8740 - regression_loss: 0.7861 - classification_loss: 0.0879 341/500 [===================>..........] - ETA: 37s - loss: 0.8745 - regression_loss: 0.7865 - classification_loss: 0.0880 342/500 [===================>..........] - ETA: 37s - loss: 0.8749 - regression_loss: 0.7868 - classification_loss: 0.0881 343/500 [===================>..........] - ETA: 36s - loss: 0.8756 - regression_loss: 0.7874 - classification_loss: 0.0882 344/500 [===================>..........] - ETA: 36s - loss: 0.8758 - regression_loss: 0.7876 - classification_loss: 0.0881 345/500 [===================>..........] 
- ETA: 36s - loss: 0.8757 - regression_loss: 0.7876 - classification_loss: 0.0881 346/500 [===================>..........] - ETA: 36s - loss: 0.8777 - regression_loss: 0.7890 - classification_loss: 0.0887 347/500 [===================>..........] - ETA: 35s - loss: 0.8805 - regression_loss: 0.7913 - classification_loss: 0.0892 348/500 [===================>..........] - ETA: 35s - loss: 0.8806 - regression_loss: 0.7915 - classification_loss: 0.0891 349/500 [===================>..........] - ETA: 35s - loss: 0.8803 - regression_loss: 0.7911 - classification_loss: 0.0892 350/500 [====================>.........] - ETA: 35s - loss: 0.8791 - regression_loss: 0.7901 - classification_loss: 0.0890 351/500 [====================>.........] - ETA: 34s - loss: 0.8805 - regression_loss: 0.7914 - classification_loss: 0.0891 352/500 [====================>.........] - ETA: 34s - loss: 0.8804 - regression_loss: 0.7913 - classification_loss: 0.0891 353/500 [====================>.........] - ETA: 34s - loss: 0.8814 - regression_loss: 0.7921 - classification_loss: 0.0894 354/500 [====================>.........] - ETA: 34s - loss: 0.8811 - regression_loss: 0.7918 - classification_loss: 0.0893 355/500 [====================>.........] - ETA: 34s - loss: 0.8805 - regression_loss: 0.7912 - classification_loss: 0.0893 356/500 [====================>.........] - ETA: 33s - loss: 0.8808 - regression_loss: 0.7915 - classification_loss: 0.0893 357/500 [====================>.........] - ETA: 33s - loss: 0.8813 - regression_loss: 0.7920 - classification_loss: 0.0893 358/500 [====================>.........] - ETA: 33s - loss: 0.8825 - regression_loss: 0.7932 - classification_loss: 0.0893 359/500 [====================>.........] - ETA: 33s - loss: 0.8821 - regression_loss: 0.7928 - classification_loss: 0.0892 360/500 [====================>.........] - ETA: 32s - loss: 0.8838 - regression_loss: 0.7944 - classification_loss: 0.0894 361/500 [====================>.........] 
- ETA: 32s - loss: 0.8823 - regression_loss: 0.7929 - classification_loss: 0.0894 362/500 [====================>.........] - ETA: 32s - loss: 0.8826 - regression_loss: 0.7932 - classification_loss: 0.0894 363/500 [====================>.........] - ETA: 32s - loss: 0.8806 - regression_loss: 0.7914 - classification_loss: 0.0892 364/500 [====================>.........] - ETA: 31s - loss: 0.8804 - regression_loss: 0.7912 - classification_loss: 0.0891 365/500 [====================>.........] - ETA: 31s - loss: 0.8809 - regression_loss: 0.7917 - classification_loss: 0.0892 366/500 [====================>.........] - ETA: 31s - loss: 0.8799 - regression_loss: 0.7909 - classification_loss: 0.0890 367/500 [=====================>........] - ETA: 31s - loss: 0.8803 - regression_loss: 0.7913 - classification_loss: 0.0891 368/500 [=====================>........] - ETA: 30s - loss: 0.8795 - regression_loss: 0.7904 - classification_loss: 0.0891 369/500 [=====================>........] - ETA: 30s - loss: 0.8800 - regression_loss: 0.7907 - classification_loss: 0.0892 370/500 [=====================>........] - ETA: 30s - loss: 0.8790 - regression_loss: 0.7900 - classification_loss: 0.0891 371/500 [=====================>........] - ETA: 30s - loss: 0.8783 - regression_loss: 0.7894 - classification_loss: 0.0889 372/500 [=====================>........] - ETA: 30s - loss: 0.8795 - regression_loss: 0.7904 - classification_loss: 0.0890 373/500 [=====================>........] - ETA: 29s - loss: 0.8808 - regression_loss: 0.7917 - classification_loss: 0.0891 374/500 [=====================>........] - ETA: 29s - loss: 0.8816 - regression_loss: 0.7926 - classification_loss: 0.0890 375/500 [=====================>........] - ETA: 29s - loss: 0.8816 - regression_loss: 0.7927 - classification_loss: 0.0889 376/500 [=====================>........] - ETA: 29s - loss: 0.8817 - regression_loss: 0.7929 - classification_loss: 0.0888 377/500 [=====================>........] 
- ETA: 28s - loss: 0.8810 - regression_loss: 0.7923 - classification_loss: 0.0887 378/500 [=====================>........] - ETA: 28s - loss: 0.8796 - regression_loss: 0.7911 - classification_loss: 0.0885 379/500 [=====================>........] - ETA: 28s - loss: 0.8790 - regression_loss: 0.7905 - classification_loss: 0.0884 380/500 [=====================>........] - ETA: 28s - loss: 0.8799 - regression_loss: 0.7913 - classification_loss: 0.0887 381/500 [=====================>........] - ETA: 27s - loss: 0.8806 - regression_loss: 0.7918 - classification_loss: 0.0887 382/500 [=====================>........] - ETA: 27s - loss: 0.8807 - regression_loss: 0.7920 - classification_loss: 0.0887 383/500 [=====================>........] - ETA: 27s - loss: 0.8788 - regression_loss: 0.7903 - classification_loss: 0.0885 384/500 [======================>.......] - ETA: 27s - loss: 0.8787 - regression_loss: 0.7902 - classification_loss: 0.0885 385/500 [======================>.......] - ETA: 26s - loss: 0.8796 - regression_loss: 0.7910 - classification_loss: 0.0887 386/500 [======================>.......] - ETA: 26s - loss: 0.8790 - regression_loss: 0.7904 - classification_loss: 0.0886 387/500 [======================>.......] - ETA: 26s - loss: 0.8788 - regression_loss: 0.7903 - classification_loss: 0.0885 388/500 [======================>.......] - ETA: 26s - loss: 0.8784 - regression_loss: 0.7899 - classification_loss: 0.0884 389/500 [======================>.......] - ETA: 26s - loss: 0.8784 - regression_loss: 0.7900 - classification_loss: 0.0884 390/500 [======================>.......] - ETA: 25s - loss: 0.8784 - regression_loss: 0.7900 - classification_loss: 0.0884 391/500 [======================>.......] - ETA: 25s - loss: 0.8782 - regression_loss: 0.7899 - classification_loss: 0.0883 392/500 [======================>.......] - ETA: 25s - loss: 0.8785 - regression_loss: 0.7901 - classification_loss: 0.0884 393/500 [======================>.......] 
- ETA: 25s - loss: 0.8769 - regression_loss: 0.7887 - classification_loss: 0.0882 394/500 [======================>.......] - ETA: 24s - loss: 0.8770 - regression_loss: 0.7888 - classification_loss: 0.0881 395/500 [======================>.......] - ETA: 24s - loss: 0.8779 - regression_loss: 0.7897 - classification_loss: 0.0881 396/500 [======================>.......] - ETA: 24s - loss: 0.8782 - regression_loss: 0.7901 - classification_loss: 0.0881 397/500 [======================>.......] - ETA: 24s - loss: 0.8786 - regression_loss: 0.7905 - classification_loss: 0.0881 398/500 [======================>.......] - ETA: 23s - loss: 0.8780 - regression_loss: 0.7900 - classification_loss: 0.0880 399/500 [======================>.......] - ETA: 23s - loss: 0.8790 - regression_loss: 0.7909 - classification_loss: 0.0881 400/500 [=======================>......] - ETA: 23s - loss: 0.8774 - regression_loss: 0.7895 - classification_loss: 0.0879 401/500 [=======================>......] - ETA: 23s - loss: 0.8766 - regression_loss: 0.7887 - classification_loss: 0.0878 402/500 [=======================>......] - ETA: 22s - loss: 0.8754 - regression_loss: 0.7875 - classification_loss: 0.0879 403/500 [=======================>......] - ETA: 22s - loss: 0.8761 - regression_loss: 0.7878 - classification_loss: 0.0883 404/500 [=======================>......] - ETA: 22s - loss: 0.8759 - regression_loss: 0.7876 - classification_loss: 0.0883 405/500 [=======================>......] - ETA: 22s - loss: 0.8751 - regression_loss: 0.7869 - classification_loss: 0.0882 406/500 [=======================>......] - ETA: 22s - loss: 0.8748 - regression_loss: 0.7866 - classification_loss: 0.0882 407/500 [=======================>......] - ETA: 21s - loss: 0.8746 - regression_loss: 0.7865 - classification_loss: 0.0882 408/500 [=======================>......] - ETA: 21s - loss: 0.8742 - regression_loss: 0.7861 - classification_loss: 0.0881 409/500 [=======================>......] 
- ETA: 21s - loss: 0.8748 - regression_loss: 0.7865 - classification_loss: 0.0883 410/500 [=======================>......] - ETA: 21s - loss: 0.8749 - regression_loss: 0.7867 - classification_loss: 0.0882 411/500 [=======================>......] - ETA: 20s - loss: 0.8758 - regression_loss: 0.7876 - classification_loss: 0.0882 412/500 [=======================>......] - ETA: 20s - loss: 0.8773 - regression_loss: 0.7886 - classification_loss: 0.0887 413/500 [=======================>......] - ETA: 20s - loss: 0.8773 - regression_loss: 0.7886 - classification_loss: 0.0886 414/500 [=======================>......] - ETA: 20s - loss: 0.8764 - regression_loss: 0.7879 - classification_loss: 0.0885 415/500 [=======================>......] - ETA: 19s - loss: 0.8780 - regression_loss: 0.7890 - classification_loss: 0.0890 416/500 [=======================>......] - ETA: 19s - loss: 0.8774 - regression_loss: 0.7886 - classification_loss: 0.0889 417/500 [========================>.....] - ETA: 19s - loss: 0.8772 - regression_loss: 0.7883 - classification_loss: 0.0889 418/500 [========================>.....] - ETA: 19s - loss: 0.8779 - regression_loss: 0.7890 - classification_loss: 0.0889 419/500 [========================>.....] - ETA: 18s - loss: 0.8788 - regression_loss: 0.7898 - classification_loss: 0.0890 420/500 [========================>.....] - ETA: 18s - loss: 0.8808 - regression_loss: 0.7917 - classification_loss: 0.0891 421/500 [========================>.....] - ETA: 18s - loss: 0.8817 - regression_loss: 0.7924 - classification_loss: 0.0893 422/500 [========================>.....] - ETA: 18s - loss: 0.8820 - regression_loss: 0.7927 - classification_loss: 0.0893 423/500 [========================>.....] - ETA: 18s - loss: 0.8809 - regression_loss: 0.7917 - classification_loss: 0.0892 424/500 [========================>.....] - ETA: 17s - loss: 0.8796 - regression_loss: 0.7905 - classification_loss: 0.0890 425/500 [========================>.....] 
- ETA: 17s - loss: 0.8793 - regression_loss: 0.7902 - classification_loss: 0.0891 426/500 [========================>.....] - ETA: 17s - loss: 0.8795 - regression_loss: 0.7902 - classification_loss: 0.0892 427/500 [========================>.....] - ETA: 17s - loss: 0.8797 - regression_loss: 0.7904 - classification_loss: 0.0893 428/500 [========================>.....] - ETA: 16s - loss: 0.8791 - regression_loss: 0.7899 - classification_loss: 0.0892 429/500 [========================>.....] - ETA: 16s - loss: 0.8798 - regression_loss: 0.7906 - classification_loss: 0.0892 430/500 [========================>.....] - ETA: 16s - loss: 0.8797 - regression_loss: 0.7906 - classification_loss: 0.0891 431/500 [========================>.....] - ETA: 16s - loss: 0.8795 - regression_loss: 0.7904 - classification_loss: 0.0891 432/500 [========================>.....] - ETA: 15s - loss: 0.8794 - regression_loss: 0.7903 - classification_loss: 0.0891 433/500 [========================>.....] - ETA: 15s - loss: 0.8800 - regression_loss: 0.7908 - classification_loss: 0.0892 434/500 [=========================>....] - ETA: 15s - loss: 0.8797 - regression_loss: 0.7906 - classification_loss: 0.0891 435/500 [=========================>....] - ETA: 15s - loss: 0.8798 - regression_loss: 0.7907 - classification_loss: 0.0891 436/500 [=========================>....] - ETA: 15s - loss: 0.8801 - regression_loss: 0.7910 - classification_loss: 0.0891 437/500 [=========================>....] - ETA: 14s - loss: 0.8794 - regression_loss: 0.7905 - classification_loss: 0.0890 438/500 [=========================>....] - ETA: 14s - loss: 0.8791 - regression_loss: 0.7901 - classification_loss: 0.0889 439/500 [=========================>....] - ETA: 14s - loss: 0.8804 - regression_loss: 0.7913 - classification_loss: 0.0891 440/500 [=========================>....] - ETA: 14s - loss: 0.8825 - regression_loss: 0.7928 - classification_loss: 0.0897 441/500 [=========================>....] 
- ETA: 13s - loss: 0.8825 - regression_loss: 0.7929 - classification_loss: 0.0896 442/500 [=========================>....] - ETA: 13s - loss: 0.8816 - regression_loss: 0.7921 - classification_loss: 0.0895 443/500 [=========================>....] - ETA: 13s - loss: 0.8810 - regression_loss: 0.7916 - classification_loss: 0.0894 444/500 [=========================>....] - ETA: 13s - loss: 0.8803 - regression_loss: 0.7910 - classification_loss: 0.0893 445/500 [=========================>....] - ETA: 12s - loss: 0.8798 - regression_loss: 0.7905 - classification_loss: 0.0892 446/500 [=========================>....] - ETA: 12s - loss: 0.8799 - regression_loss: 0.7907 - classification_loss: 0.0893 447/500 [=========================>....] - ETA: 12s - loss: 0.8790 - regression_loss: 0.7899 - classification_loss: 0.0891 448/500 [=========================>....] - ETA: 12s - loss: 0.8794 - regression_loss: 0.7902 - classification_loss: 0.0892 449/500 [=========================>....] - ETA: 11s - loss: 0.8795 - regression_loss: 0.7901 - classification_loss: 0.0894 450/500 [==========================>...] - ETA: 11s - loss: 0.8807 - regression_loss: 0.7912 - classification_loss: 0.0896 451/500 [==========================>...] - ETA: 11s - loss: 0.8818 - regression_loss: 0.7920 - classification_loss: 0.0898 452/500 [==========================>...] - ETA: 11s - loss: 0.8812 - regression_loss: 0.7915 - classification_loss: 0.0897 453/500 [==========================>...] - ETA: 11s - loss: 0.8831 - regression_loss: 0.7930 - classification_loss: 0.0901 454/500 [==========================>...] - ETA: 10s - loss: 0.8826 - regression_loss: 0.7925 - classification_loss: 0.0901 455/500 [==========================>...] - ETA: 10s - loss: 0.8829 - regression_loss: 0.7927 - classification_loss: 0.0901 456/500 [==========================>...] - ETA: 10s - loss: 0.8830 - regression_loss: 0.7928 - classification_loss: 0.0902 457/500 [==========================>...] 
- ETA: 10s - loss: 0.8832 - regression_loss: 0.7930 - classification_loss: 0.0902 458/500 [==========================>...] - ETA: 9s - loss: 0.8833 - regression_loss: 0.7931 - classification_loss: 0.0903  459/500 [==========================>...] - ETA: 9s - loss: 0.8832 - regression_loss: 0.7929 - classification_loss: 0.0903 460/500 [==========================>...] - ETA: 9s - loss: 0.8831 - regression_loss: 0.7928 - classification_loss: 0.0903 461/500 [==========================>...] - ETA: 9s - loss: 0.8831 - regression_loss: 0.7930 - classification_loss: 0.0901 462/500 [==========================>...] - ETA: 8s - loss: 0.8833 - regression_loss: 0.7931 - classification_loss: 0.0902 463/500 [==========================>...] - ETA: 8s - loss: 0.8830 - regression_loss: 0.7929 - classification_loss: 0.0901 464/500 [==========================>...] - ETA: 8s - loss: 0.8844 - regression_loss: 0.7940 - classification_loss: 0.0904 465/500 [==========================>...] - ETA: 8s - loss: 0.8849 - regression_loss: 0.7944 - classification_loss: 0.0905 466/500 [==========================>...] - ETA: 7s - loss: 0.8848 - regression_loss: 0.7944 - classification_loss: 0.0904 467/500 [===========================>..] - ETA: 7s - loss: 0.8845 - regression_loss: 0.7941 - classification_loss: 0.0904 468/500 [===========================>..] - ETA: 7s - loss: 0.8849 - regression_loss: 0.7945 - classification_loss: 0.0904 469/500 [===========================>..] - ETA: 7s - loss: 0.8847 - regression_loss: 0.7943 - classification_loss: 0.0903 470/500 [===========================>..] - ETA: 7s - loss: 0.8852 - regression_loss: 0.7948 - classification_loss: 0.0904 471/500 [===========================>..] - ETA: 6s - loss: 0.8838 - regression_loss: 0.7936 - classification_loss: 0.0902 472/500 [===========================>..] - ETA: 6s - loss: 0.8839 - regression_loss: 0.7936 - classification_loss: 0.0903 473/500 [===========================>..] 
- ETA: 6s - loss: 0.8837 - regression_loss: 0.7934 - classification_loss: 0.0903 474/500 [===========================>..] - ETA: 6s - loss: 0.8828 - regression_loss: 0.7925 - classification_loss: 0.0903 475/500 [===========================>..] - ETA: 5s - loss: 0.8819 - regression_loss: 0.7918 - classification_loss: 0.0902 476/500 [===========================>..] - ETA: 5s - loss: 0.8817 - regression_loss: 0.7914 - classification_loss: 0.0902 477/500 [===========================>..] - ETA: 5s - loss: 0.8816 - regression_loss: 0.7914 - classification_loss: 0.0902 478/500 [===========================>..] - ETA: 5s - loss: 0.8818 - regression_loss: 0.7917 - classification_loss: 0.0902 479/500 [===========================>..] - ETA: 4s - loss: 0.8816 - regression_loss: 0.7915 - classification_loss: 0.0901 480/500 [===========================>..] - ETA: 4s - loss: 0.8815 - regression_loss: 0.7915 - classification_loss: 0.0900 481/500 [===========================>..] - ETA: 4s - loss: 0.8817 - regression_loss: 0.7917 - classification_loss: 0.0900 482/500 [===========================>..] - ETA: 4s - loss: 0.8819 - regression_loss: 0.7919 - classification_loss: 0.0900 483/500 [===========================>..] - ETA: 3s - loss: 0.8806 - regression_loss: 0.7907 - classification_loss: 0.0899 484/500 [============================>.] - ETA: 3s - loss: 0.8814 - regression_loss: 0.7915 - classification_loss: 0.0899 485/500 [============================>.] - ETA: 3s - loss: 0.8822 - regression_loss: 0.7921 - classification_loss: 0.0901 486/500 [============================>.] - ETA: 3s - loss: 0.8813 - regression_loss: 0.7913 - classification_loss: 0.0900 487/500 [============================>.] - ETA: 3s - loss: 0.8825 - regression_loss: 0.7922 - classification_loss: 0.0903 488/500 [============================>.] - ETA: 2s - loss: 0.8820 - regression_loss: 0.7918 - classification_loss: 0.0902 489/500 [============================>.] 
[... remaining per-step progress output for epoch 39 elided ...]
500/500 [==============================] - 117s 234ms/step - loss: 0.8783 - regression_loss: 0.7885 - classification_loss: 0.0898
326 instances of class plum with average precision: 0.8491
mAP: 0.8491
Epoch 00039: saving model to ./training/snapshots/resnet50_pascal_39.h5
Epoch 40/150
[... steps 1-4/500 of epoch 40 elided ...]
- ETA: 1:51 - loss: 0.7344 - regression_loss: 0.6770 - classification_loss: 0.0574 5/500 [..............................] - ETA: 1:51 - loss: 0.7611 - regression_loss: 0.6994 - classification_loss: 0.0617 6/500 [..............................] - ETA: 1:51 - loss: 0.7556 - regression_loss: 0.6974 - classification_loss: 0.0582 7/500 [..............................] - ETA: 1:52 - loss: 0.7719 - regression_loss: 0.7153 - classification_loss: 0.0566 8/500 [..............................] - ETA: 1:51 - loss: 0.8194 - regression_loss: 0.7624 - classification_loss: 0.0570 9/500 [..............................] - ETA: 1:52 - loss: 0.8294 - regression_loss: 0.7705 - classification_loss: 0.0590 10/500 [..............................] - ETA: 1:52 - loss: 0.8740 - regression_loss: 0.8166 - classification_loss: 0.0573 11/500 [..............................] - ETA: 1:52 - loss: 0.8261 - regression_loss: 0.7723 - classification_loss: 0.0538 12/500 [..............................] - ETA: 1:52 - loss: 0.8491 - regression_loss: 0.7840 - classification_loss: 0.0652 13/500 [..............................] - ETA: 1:52 - loss: 0.8894 - regression_loss: 0.8136 - classification_loss: 0.0758 14/500 [..............................] - ETA: 1:52 - loss: 0.9066 - regression_loss: 0.8286 - classification_loss: 0.0780 15/500 [..............................] - ETA: 1:52 - loss: 0.9296 - regression_loss: 0.8474 - classification_loss: 0.0821 16/500 [..............................] - ETA: 1:52 - loss: 0.9083 - regression_loss: 0.8297 - classification_loss: 0.0787 17/500 [>.............................] - ETA: 1:51 - loss: 0.9486 - regression_loss: 0.8644 - classification_loss: 0.0842 18/500 [>.............................] - ETA: 1:51 - loss: 0.9432 - regression_loss: 0.8608 - classification_loss: 0.0824 19/500 [>.............................] - ETA: 1:51 - loss: 0.9404 - regression_loss: 0.8602 - classification_loss: 0.0802 20/500 [>.............................] 
- ETA: 1:51 - loss: 0.9289 - regression_loss: 0.8505 - classification_loss: 0.0784 21/500 [>.............................] - ETA: 1:51 - loss: 0.9153 - regression_loss: 0.8373 - classification_loss: 0.0781 22/500 [>.............................] - ETA: 1:51 - loss: 0.9070 - regression_loss: 0.8304 - classification_loss: 0.0766 23/500 [>.............................] - ETA: 1:51 - loss: 0.9236 - regression_loss: 0.8410 - classification_loss: 0.0825 24/500 [>.............................] - ETA: 1:51 - loss: 0.9362 - regression_loss: 0.8526 - classification_loss: 0.0835 25/500 [>.............................] - ETA: 1:51 - loss: 0.9093 - regression_loss: 0.8285 - classification_loss: 0.0808 26/500 [>.............................] - ETA: 1:51 - loss: 0.8973 - regression_loss: 0.8182 - classification_loss: 0.0791 27/500 [>.............................] - ETA: 1:50 - loss: 0.9010 - regression_loss: 0.8216 - classification_loss: 0.0794 28/500 [>.............................] - ETA: 1:50 - loss: 0.9176 - regression_loss: 0.8349 - classification_loss: 0.0827 29/500 [>.............................] - ETA: 1:50 - loss: 0.9297 - regression_loss: 0.8465 - classification_loss: 0.0832 30/500 [>.............................] - ETA: 1:50 - loss: 0.9139 - regression_loss: 0.8323 - classification_loss: 0.0816 31/500 [>.............................] - ETA: 1:49 - loss: 0.9113 - regression_loss: 0.8310 - classification_loss: 0.0803 32/500 [>.............................] - ETA: 1:50 - loss: 0.8993 - regression_loss: 0.8211 - classification_loss: 0.0782 33/500 [>.............................] - ETA: 1:50 - loss: 0.9344 - regression_loss: 0.8500 - classification_loss: 0.0844 34/500 [=>............................] - ETA: 1:49 - loss: 0.9450 - regression_loss: 0.8583 - classification_loss: 0.0867 35/500 [=>............................] - ETA: 1:49 - loss: 0.9466 - regression_loss: 0.8588 - classification_loss: 0.0878 36/500 [=>............................] 
- ETA: 1:49 - loss: 0.9441 - regression_loss: 0.8566 - classification_loss: 0.0875 37/500 [=>............................] - ETA: 1:49 - loss: 0.9479 - regression_loss: 0.8599 - classification_loss: 0.0880 38/500 [=>............................] - ETA: 1:48 - loss: 0.9440 - regression_loss: 0.8566 - classification_loss: 0.0874 39/500 [=>............................] - ETA: 1:48 - loss: 0.9284 - regression_loss: 0.8426 - classification_loss: 0.0858 40/500 [=>............................] - ETA: 1:48 - loss: 0.9246 - regression_loss: 0.8402 - classification_loss: 0.0844 41/500 [=>............................] - ETA: 1:47 - loss: 0.9299 - regression_loss: 0.8451 - classification_loss: 0.0849 42/500 [=>............................] - ETA: 1:47 - loss: 0.9484 - regression_loss: 0.8590 - classification_loss: 0.0894 43/500 [=>............................] - ETA: 1:47 - loss: 0.9444 - regression_loss: 0.8565 - classification_loss: 0.0879 44/500 [=>............................] - ETA: 1:46 - loss: 0.9374 - regression_loss: 0.8502 - classification_loss: 0.0872 45/500 [=>............................] - ETA: 1:46 - loss: 0.9327 - regression_loss: 0.8460 - classification_loss: 0.0866 46/500 [=>............................] - ETA: 1:46 - loss: 0.9259 - regression_loss: 0.8404 - classification_loss: 0.0855 47/500 [=>............................] - ETA: 1:46 - loss: 0.9272 - regression_loss: 0.8418 - classification_loss: 0.0854 48/500 [=>............................] - ETA: 1:45 - loss: 0.9208 - regression_loss: 0.8359 - classification_loss: 0.0849 49/500 [=>............................] - ETA: 1:45 - loss: 0.9224 - regression_loss: 0.8363 - classification_loss: 0.0860 50/500 [==>...........................] - ETA: 1:45 - loss: 0.9199 - regression_loss: 0.8341 - classification_loss: 0.0857 51/500 [==>...........................] - ETA: 1:45 - loss: 0.9085 - regression_loss: 0.8236 - classification_loss: 0.0849 52/500 [==>...........................] 
- ETA: 1:45 - loss: 0.9103 - regression_loss: 0.8255 - classification_loss: 0.0848 53/500 [==>...........................] - ETA: 1:45 - loss: 0.9106 - regression_loss: 0.8260 - classification_loss: 0.0846 54/500 [==>...........................] - ETA: 1:44 - loss: 0.9003 - regression_loss: 0.8163 - classification_loss: 0.0840 55/500 [==>...........................] - ETA: 1:44 - loss: 0.8938 - regression_loss: 0.8106 - classification_loss: 0.0833 56/500 [==>...........................] - ETA: 1:44 - loss: 0.8896 - regression_loss: 0.8071 - classification_loss: 0.0825 57/500 [==>...........................] - ETA: 1:44 - loss: 0.8869 - regression_loss: 0.8048 - classification_loss: 0.0822 58/500 [==>...........................] - ETA: 1:44 - loss: 0.8845 - regression_loss: 0.8022 - classification_loss: 0.0823 59/500 [==>...........................] - ETA: 1:43 - loss: 0.8831 - regression_loss: 0.8016 - classification_loss: 0.0815 60/500 [==>...........................] - ETA: 1:43 - loss: 0.8852 - regression_loss: 0.8040 - classification_loss: 0.0812 61/500 [==>...........................] - ETA: 1:43 - loss: 0.8869 - regression_loss: 0.8049 - classification_loss: 0.0821 62/500 [==>...........................] - ETA: 1:43 - loss: 0.8877 - regression_loss: 0.8055 - classification_loss: 0.0823 63/500 [==>...........................] - ETA: 1:42 - loss: 0.8865 - regression_loss: 0.8045 - classification_loss: 0.0820 64/500 [==>...........................] - ETA: 1:42 - loss: 0.8898 - regression_loss: 0.8079 - classification_loss: 0.0819 65/500 [==>...........................] - ETA: 1:42 - loss: 0.8833 - regression_loss: 0.8024 - classification_loss: 0.0810 66/500 [==>...........................] - ETA: 1:42 - loss: 0.8780 - regression_loss: 0.7978 - classification_loss: 0.0802 67/500 [===>..........................] - ETA: 1:41 - loss: 0.8797 - regression_loss: 0.7994 - classification_loss: 0.0802 68/500 [===>..........................] 
- ETA: 1:41 - loss: 0.8735 - regression_loss: 0.7938 - classification_loss: 0.0797 69/500 [===>..........................] - ETA: 1:41 - loss: 0.8694 - regression_loss: 0.7891 - classification_loss: 0.0803 70/500 [===>..........................] - ETA: 1:40 - loss: 0.8842 - regression_loss: 0.8014 - classification_loss: 0.0828 71/500 [===>..........................] - ETA: 1:40 - loss: 0.8806 - regression_loss: 0.7985 - classification_loss: 0.0821 72/500 [===>..........................] - ETA: 1:40 - loss: 0.8844 - regression_loss: 0.8015 - classification_loss: 0.0829 73/500 [===>..........................] - ETA: 1:40 - loss: 0.8814 - regression_loss: 0.7987 - classification_loss: 0.0827 74/500 [===>..........................] - ETA: 1:39 - loss: 0.8750 - regression_loss: 0.7931 - classification_loss: 0.0819 75/500 [===>..........................] - ETA: 1:39 - loss: 0.8760 - regression_loss: 0.7942 - classification_loss: 0.0818 76/500 [===>..........................] - ETA: 1:39 - loss: 0.8688 - regression_loss: 0.7875 - classification_loss: 0.0813 77/500 [===>..........................] - ETA: 1:39 - loss: 0.8724 - regression_loss: 0.7908 - classification_loss: 0.0815 78/500 [===>..........................] - ETA: 1:38 - loss: 0.8739 - regression_loss: 0.7907 - classification_loss: 0.0831 79/500 [===>..........................] - ETA: 1:38 - loss: 0.8769 - regression_loss: 0.7933 - classification_loss: 0.0835 80/500 [===>..........................] - ETA: 1:38 - loss: 0.8742 - regression_loss: 0.7913 - classification_loss: 0.0830 81/500 [===>..........................] - ETA: 1:38 - loss: 0.8728 - regression_loss: 0.7900 - classification_loss: 0.0828 82/500 [===>..........................] - ETA: 1:37 - loss: 0.8808 - regression_loss: 0.7957 - classification_loss: 0.0851 83/500 [===>..........................] - ETA: 1:37 - loss: 0.8834 - regression_loss: 0.7981 - classification_loss: 0.0853 84/500 [====>.........................] 
- ETA: 1:37 - loss: 0.8763 - regression_loss: 0.7919 - classification_loss: 0.0845 85/500 [====>.........................] - ETA: 1:37 - loss: 0.8734 - regression_loss: 0.7892 - classification_loss: 0.0842 86/500 [====>.........................] - ETA: 1:36 - loss: 0.8694 - regression_loss: 0.7858 - classification_loss: 0.0836 87/500 [====>.........................] - ETA: 1:36 - loss: 0.8737 - regression_loss: 0.7897 - classification_loss: 0.0839 88/500 [====>.........................] - ETA: 1:36 - loss: 0.8735 - regression_loss: 0.7898 - classification_loss: 0.0837 89/500 [====>.........................] - ETA: 1:36 - loss: 0.8766 - regression_loss: 0.7923 - classification_loss: 0.0843 90/500 [====>.........................] - ETA: 1:36 - loss: 0.8821 - regression_loss: 0.7963 - classification_loss: 0.0858 91/500 [====>.........................] - ETA: 1:35 - loss: 0.9095 - regression_loss: 0.8148 - classification_loss: 0.0948 92/500 [====>.........................] - ETA: 1:35 - loss: 0.9090 - regression_loss: 0.8146 - classification_loss: 0.0944 93/500 [====>.........................] - ETA: 1:35 - loss: 0.9058 - regression_loss: 0.8121 - classification_loss: 0.0937 94/500 [====>.........................] - ETA: 1:35 - loss: 0.9096 - regression_loss: 0.8153 - classification_loss: 0.0944 95/500 [====>.........................] - ETA: 1:35 - loss: 0.9083 - regression_loss: 0.8141 - classification_loss: 0.0941 96/500 [====>.........................] - ETA: 1:35 - loss: 0.9077 - regression_loss: 0.8138 - classification_loss: 0.0939 97/500 [====>.........................] - ETA: 1:34 - loss: 0.9052 - regression_loss: 0.8119 - classification_loss: 0.0933 98/500 [====>.........................] - ETA: 1:34 - loss: 0.9014 - regression_loss: 0.8087 - classification_loss: 0.0927 99/500 [====>.........................] - ETA: 1:34 - loss: 0.9002 - regression_loss: 0.8082 - classification_loss: 0.0920 100/500 [=====>........................] 
- ETA: 1:34 - loss: 0.8972 - regression_loss: 0.8057 - classification_loss: 0.0916 101/500 [=====>........................] - ETA: 1:33 - loss: 0.8956 - regression_loss: 0.8041 - classification_loss: 0.0915 102/500 [=====>........................] - ETA: 1:33 - loss: 0.8935 - regression_loss: 0.8025 - classification_loss: 0.0911 103/500 [=====>........................] - ETA: 1:33 - loss: 0.8951 - regression_loss: 0.8040 - classification_loss: 0.0911 104/500 [=====>........................] - ETA: 1:33 - loss: 0.8962 - regression_loss: 0.8047 - classification_loss: 0.0915 105/500 [=====>........................] - ETA: 1:33 - loss: 0.8935 - regression_loss: 0.8024 - classification_loss: 0.0911 106/500 [=====>........................] - ETA: 1:32 - loss: 0.8936 - regression_loss: 0.8022 - classification_loss: 0.0913 107/500 [=====>........................] - ETA: 1:32 - loss: 0.8899 - regression_loss: 0.7991 - classification_loss: 0.0908 108/500 [=====>........................] - ETA: 1:32 - loss: 0.8909 - regression_loss: 0.7993 - classification_loss: 0.0916 109/500 [=====>........................] - ETA: 1:32 - loss: 0.8930 - regression_loss: 0.8012 - classification_loss: 0.0918 110/500 [=====>........................] - ETA: 1:31 - loss: 0.8927 - regression_loss: 0.8011 - classification_loss: 0.0916 111/500 [=====>........................] - ETA: 1:31 - loss: 0.8963 - regression_loss: 0.8037 - classification_loss: 0.0926 112/500 [=====>........................] - ETA: 1:31 - loss: 0.8962 - regression_loss: 0.8042 - classification_loss: 0.0920 113/500 [=====>........................] - ETA: 1:31 - loss: 0.8983 - regression_loss: 0.8062 - classification_loss: 0.0920 114/500 [=====>........................] - ETA: 1:30 - loss: 0.8951 - regression_loss: 0.8032 - classification_loss: 0.0919 115/500 [=====>........................] - ETA: 1:30 - loss: 0.8982 - regression_loss: 0.8057 - classification_loss: 0.0925 116/500 [=====>........................] 
- ETA: 1:30 - loss: 0.8984 - regression_loss: 0.8061 - classification_loss: 0.0923 117/500 [======>.......................] - ETA: 1:30 - loss: 0.8980 - regression_loss: 0.8062 - classification_loss: 0.0918 118/500 [======>.......................] - ETA: 1:29 - loss: 0.8994 - regression_loss: 0.8072 - classification_loss: 0.0922 119/500 [======>.......................] - ETA: 1:29 - loss: 0.8997 - regression_loss: 0.8072 - classification_loss: 0.0925 120/500 [======>.......................] - ETA: 1:29 - loss: 0.8991 - regression_loss: 0.8067 - classification_loss: 0.0924 121/500 [======>.......................] - ETA: 1:29 - loss: 0.8985 - regression_loss: 0.8064 - classification_loss: 0.0921 122/500 [======>.......................] - ETA: 1:29 - loss: 0.8960 - regression_loss: 0.8041 - classification_loss: 0.0919 123/500 [======>.......................] - ETA: 1:28 - loss: 0.8945 - regression_loss: 0.8028 - classification_loss: 0.0917 124/500 [======>.......................] - ETA: 1:28 - loss: 0.8904 - regression_loss: 0.7994 - classification_loss: 0.0910 125/500 [======>.......................] - ETA: 1:28 - loss: 0.8870 - regression_loss: 0.7965 - classification_loss: 0.0905 126/500 [======>.......................] - ETA: 1:28 - loss: 0.8860 - regression_loss: 0.7958 - classification_loss: 0.0902 127/500 [======>.......................] - ETA: 1:27 - loss: 0.8885 - regression_loss: 0.7977 - classification_loss: 0.0908 128/500 [======>.......................] - ETA: 1:27 - loss: 0.8872 - regression_loss: 0.7966 - classification_loss: 0.0906 129/500 [======>.......................] - ETA: 1:27 - loss: 0.8891 - regression_loss: 0.7981 - classification_loss: 0.0911 130/500 [======>.......................] - ETA: 1:27 - loss: 0.8888 - regression_loss: 0.7980 - classification_loss: 0.0907 131/500 [======>.......................] - ETA: 1:27 - loss: 0.8859 - regression_loss: 0.7956 - classification_loss: 0.0903 132/500 [======>.......................] 
- ETA: 1:26 - loss: 0.8837 - regression_loss: 0.7939 - classification_loss: 0.0899 133/500 [======>.......................] - ETA: 1:26 - loss: 0.8827 - regression_loss: 0.7928 - classification_loss: 0.0899 134/500 [=======>......................] - ETA: 1:26 - loss: 0.8850 - regression_loss: 0.7951 - classification_loss: 0.0899 135/500 [=======>......................] - ETA: 1:26 - loss: 0.8849 - regression_loss: 0.7952 - classification_loss: 0.0896 136/500 [=======>......................] - ETA: 1:25 - loss: 0.8848 - regression_loss: 0.7950 - classification_loss: 0.0898 137/500 [=======>......................] - ETA: 1:25 - loss: 0.8792 - regression_loss: 0.7899 - classification_loss: 0.0893 138/500 [=======>......................] - ETA: 1:25 - loss: 0.8841 - regression_loss: 0.7935 - classification_loss: 0.0906 139/500 [=======>......................] - ETA: 1:25 - loss: 0.8808 - regression_loss: 0.7908 - classification_loss: 0.0901 140/500 [=======>......................] - ETA: 1:24 - loss: 0.8818 - regression_loss: 0.7922 - classification_loss: 0.0896 141/500 [=======>......................] - ETA: 1:24 - loss: 0.8795 - regression_loss: 0.7903 - classification_loss: 0.0892 142/500 [=======>......................] - ETA: 1:24 - loss: 0.8809 - regression_loss: 0.7916 - classification_loss: 0.0893 143/500 [=======>......................] - ETA: 1:24 - loss: 0.8843 - regression_loss: 0.7941 - classification_loss: 0.0902 144/500 [=======>......................] - ETA: 1:23 - loss: 0.8853 - regression_loss: 0.7952 - classification_loss: 0.0902 145/500 [=======>......................] - ETA: 1:23 - loss: 0.8834 - regression_loss: 0.7935 - classification_loss: 0.0899 146/500 [=======>......................] - ETA: 1:23 - loss: 0.8840 - regression_loss: 0.7943 - classification_loss: 0.0897 147/500 [=======>......................] - ETA: 1:23 - loss: 0.8820 - regression_loss: 0.7927 - classification_loss: 0.0893 148/500 [=======>......................] 
- ETA: 1:22 - loss: 0.8859 - regression_loss: 0.7966 - classification_loss: 0.0893 149/500 [=======>......................] - ETA: 1:22 - loss: 0.8872 - regression_loss: 0.7979 - classification_loss: 0.0892 150/500 [========>.....................] - ETA: 1:22 - loss: 0.8912 - regression_loss: 0.8012 - classification_loss: 0.0899 151/500 [========>.....................] - ETA: 1:22 - loss: 0.8909 - regression_loss: 0.8010 - classification_loss: 0.0898 152/500 [========>.....................] - ETA: 1:21 - loss: 0.8910 - regression_loss: 0.8014 - classification_loss: 0.0896 153/500 [========>.....................] - ETA: 1:21 - loss: 0.8913 - regression_loss: 0.8022 - classification_loss: 0.0891 154/500 [========>.....................] - ETA: 1:21 - loss: 0.8920 - regression_loss: 0.8030 - classification_loss: 0.0890 155/500 [========>.....................] - ETA: 1:21 - loss: 0.8952 - regression_loss: 0.8056 - classification_loss: 0.0897 156/500 [========>.....................] - ETA: 1:20 - loss: 0.8992 - regression_loss: 0.8088 - classification_loss: 0.0904 157/500 [========>.....................] - ETA: 1:20 - loss: 0.8994 - regression_loss: 0.8089 - classification_loss: 0.0905 158/500 [========>.....................] - ETA: 1:20 - loss: 0.8976 - regression_loss: 0.8074 - classification_loss: 0.0902 159/500 [========>.....................] - ETA: 1:20 - loss: 0.8969 - regression_loss: 0.8063 - classification_loss: 0.0905 160/500 [========>.....................] - ETA: 1:19 - loss: 0.8952 - regression_loss: 0.8047 - classification_loss: 0.0906 161/500 [========>.....................] - ETA: 1:19 - loss: 0.8964 - regression_loss: 0.8058 - classification_loss: 0.0906 162/500 [========>.....................] - ETA: 1:19 - loss: 0.8956 - regression_loss: 0.8054 - classification_loss: 0.0902 163/500 [========>.....................] - ETA: 1:19 - loss: 0.9012 - regression_loss: 0.8106 - classification_loss: 0.0905 164/500 [========>.....................] 
- ETA: 1:19 - loss: 0.9005 - regression_loss: 0.8101 - classification_loss: 0.0904 165/500 [========>.....................] - ETA: 1:18 - loss: 0.8999 - regression_loss: 0.8098 - classification_loss: 0.0901 166/500 [========>.....................] - ETA: 1:18 - loss: 0.8988 - regression_loss: 0.8091 - classification_loss: 0.0897 167/500 [=========>....................] - ETA: 1:18 - loss: 0.8980 - regression_loss: 0.8087 - classification_loss: 0.0893 168/500 [=========>....................] - ETA: 1:18 - loss: 0.8966 - regression_loss: 0.8074 - classification_loss: 0.0892 169/500 [=========>....................] - ETA: 1:17 - loss: 0.8954 - regression_loss: 0.8064 - classification_loss: 0.0890 170/500 [=========>....................] - ETA: 1:17 - loss: 0.8947 - regression_loss: 0.8053 - classification_loss: 0.0893 171/500 [=========>....................] - ETA: 1:17 - loss: 0.8933 - regression_loss: 0.8042 - classification_loss: 0.0891 172/500 [=========>....................] - ETA: 1:17 - loss: 0.8930 - regression_loss: 0.8041 - classification_loss: 0.0889 173/500 [=========>....................] - ETA: 1:16 - loss: 0.8880 - regression_loss: 0.7995 - classification_loss: 0.0885 174/500 [=========>....................] - ETA: 1:16 - loss: 0.8873 - regression_loss: 0.7991 - classification_loss: 0.0882 175/500 [=========>....................] - ETA: 1:16 - loss: 0.8838 - regression_loss: 0.7960 - classification_loss: 0.0878 176/500 [=========>....................] - ETA: 1:16 - loss: 0.8848 - regression_loss: 0.7968 - classification_loss: 0.0880 177/500 [=========>....................] - ETA: 1:15 - loss: 0.8844 - regression_loss: 0.7968 - classification_loss: 0.0877 178/500 [=========>....................] - ETA: 1:15 - loss: 0.8860 - regression_loss: 0.7983 - classification_loss: 0.0877 179/500 [=========>....................] - ETA: 1:15 - loss: 0.8840 - regression_loss: 0.7965 - classification_loss: 0.0875 180/500 [=========>....................] 
- ETA: 1:15 - loss: 0.8868 - regression_loss: 0.7988 - classification_loss: 0.0880 181/500 [=========>....................] - ETA: 1:15 - loss: 0.8870 - regression_loss: 0.7986 - classification_loss: 0.0884 182/500 [=========>....................] - ETA: 1:14 - loss: 0.8884 - regression_loss: 0.7998 - classification_loss: 0.0886 183/500 [=========>....................] - ETA: 1:14 - loss: 0.8856 - regression_loss: 0.7974 - classification_loss: 0.0882 184/500 [==========>...................] - ETA: 1:14 - loss: 0.8852 - regression_loss: 0.7972 - classification_loss: 0.0880 185/500 [==========>...................] - ETA: 1:14 - loss: 0.8867 - regression_loss: 0.7985 - classification_loss: 0.0882 186/500 [==========>...................] - ETA: 1:13 - loss: 0.8846 - regression_loss: 0.7968 - classification_loss: 0.0878 187/500 [==========>...................] - ETA: 1:13 - loss: 0.8846 - regression_loss: 0.7968 - classification_loss: 0.0878 188/500 [==========>...................] - ETA: 1:13 - loss: 0.8851 - regression_loss: 0.7973 - classification_loss: 0.0878 189/500 [==========>...................] - ETA: 1:13 - loss: 0.8838 - regression_loss: 0.7962 - classification_loss: 0.0876 190/500 [==========>...................] - ETA: 1:13 - loss: 0.8842 - regression_loss: 0.7969 - classification_loss: 0.0873 191/500 [==========>...................] - ETA: 1:12 - loss: 0.8863 - regression_loss: 0.7987 - classification_loss: 0.0876 192/500 [==========>...................] - ETA: 1:12 - loss: 0.8894 - regression_loss: 0.8011 - classification_loss: 0.0883 193/500 [==========>...................] - ETA: 1:12 - loss: 0.8876 - regression_loss: 0.7996 - classification_loss: 0.0879 194/500 [==========>...................] - ETA: 1:12 - loss: 0.8880 - regression_loss: 0.8000 - classification_loss: 0.0880 195/500 [==========>...................] - ETA: 1:11 - loss: 0.8873 - regression_loss: 0.7994 - classification_loss: 0.0878 196/500 [==========>...................] 
- ETA: 1:11 - loss: 0.8907 - regression_loss: 0.8024 - classification_loss: 0.0883 197/500 [==========>...................] - ETA: 1:11 - loss: 0.8907 - regression_loss: 0.8024 - classification_loss: 0.0883 198/500 [==========>...................] - ETA: 1:11 - loss: 0.8891 - regression_loss: 0.8010 - classification_loss: 0.0881 199/500 [==========>...................] - ETA: 1:10 - loss: 0.8870 - regression_loss: 0.7992 - classification_loss: 0.0878 200/500 [===========>..................] - ETA: 1:10 - loss: 0.8886 - regression_loss: 0.8006 - classification_loss: 0.0880 201/500 [===========>..................] - ETA: 1:10 - loss: 0.8907 - regression_loss: 0.8020 - classification_loss: 0.0887 202/500 [===========>..................] - ETA: 1:10 - loss: 0.8904 - regression_loss: 0.8019 - classification_loss: 0.0885 203/500 [===========>..................] - ETA: 1:10 - loss: 0.8901 - regression_loss: 0.8013 - classification_loss: 0.0888 204/500 [===========>..................] - ETA: 1:09 - loss: 0.8940 - regression_loss: 0.8038 - classification_loss: 0.0902 205/500 [===========>..................] - ETA: 1:09 - loss: 0.8936 - regression_loss: 0.8034 - classification_loss: 0.0902 206/500 [===========>..................] - ETA: 1:09 - loss: 0.8923 - regression_loss: 0.8022 - classification_loss: 0.0900 207/500 [===========>..................] - ETA: 1:09 - loss: 0.8906 - regression_loss: 0.8009 - classification_loss: 0.0897 208/500 [===========>..................] - ETA: 1:08 - loss: 0.8893 - regression_loss: 0.7997 - classification_loss: 0.0896 209/500 [===========>..................] - ETA: 1:08 - loss: 0.8910 - regression_loss: 0.8013 - classification_loss: 0.0897 210/500 [===========>..................] - ETA: 1:08 - loss: 0.8930 - regression_loss: 0.8026 - classification_loss: 0.0903 211/500 [===========>..................] - ETA: 1:08 - loss: 0.8916 - regression_loss: 0.8014 - classification_loss: 0.0902 212/500 [===========>..................] 
- ETA: 1:07 - loss: 0.8953 - regression_loss: 0.8025 - classification_loss: 0.0928 213/500 [===========>..................] - ETA: 1:07 - loss: 0.8956 - regression_loss: 0.8027 - classification_loss: 0.0929 214/500 [===========>..................] - ETA: 1:07 - loss: 0.8955 - regression_loss: 0.8028 - classification_loss: 0.0927 215/500 [===========>..................] - ETA: 1:07 - loss: 0.8939 - regression_loss: 0.8016 - classification_loss: 0.0923 216/500 [===========>..................] - ETA: 1:06 - loss: 0.8935 - regression_loss: 0.8012 - classification_loss: 0.0923 217/500 [============>.................] - ETA: 1:06 - loss: 0.8949 - regression_loss: 0.8023 - classification_loss: 0.0926 218/500 [============>.................] - ETA: 1:06 - loss: 0.8969 - regression_loss: 0.8037 - classification_loss: 0.0932 219/500 [============>.................] - ETA: 1:06 - loss: 0.8967 - regression_loss: 0.8037 - classification_loss: 0.0930 220/500 [============>.................] - ETA: 1:05 - loss: 0.8953 - regression_loss: 0.8024 - classification_loss: 0.0929 221/500 [============>.................] - ETA: 1:05 - loss: 0.8958 - regression_loss: 0.8031 - classification_loss: 0.0927 222/500 [============>.................] - ETA: 1:05 - loss: 0.8977 - regression_loss: 0.8045 - classification_loss: 0.0932 223/500 [============>.................] - ETA: 1:05 - loss: 0.8988 - regression_loss: 0.8055 - classification_loss: 0.0933 224/500 [============>.................] - ETA: 1:04 - loss: 0.8973 - regression_loss: 0.8043 - classification_loss: 0.0930 225/500 [============>.................] - ETA: 1:04 - loss: 0.8961 - regression_loss: 0.8032 - classification_loss: 0.0929 226/500 [============>.................] - ETA: 1:04 - loss: 0.8971 - regression_loss: 0.8041 - classification_loss: 0.0930 227/500 [============>.................] - ETA: 1:04 - loss: 0.8963 - regression_loss: 0.8035 - classification_loss: 0.0928 228/500 [============>.................] 
- ETA: 1:04 - loss: 0.8951 - regression_loss: 0.8025 - classification_loss: 0.0926 229/500 [============>.................] - ETA: 1:03 - loss: 0.8967 - regression_loss: 0.8035 - classification_loss: 0.0932 230/500 [============>.................] - ETA: 1:03 - loss: 0.8954 - regression_loss: 0.8024 - classification_loss: 0.0930 231/500 [============>.................] - ETA: 1:03 - loss: 0.8943 - regression_loss: 0.8013 - classification_loss: 0.0930 232/500 [============>.................] - ETA: 1:03 - loss: 0.8959 - regression_loss: 0.8023 - classification_loss: 0.0936 233/500 [============>.................] - ETA: 1:02 - loss: 0.8963 - regression_loss: 0.8028 - classification_loss: 0.0935 234/500 [=============>................] - ETA: 1:02 - loss: 0.8934 - regression_loss: 0.8002 - classification_loss: 0.0932 235/500 [=============>................] - ETA: 1:02 - loss: 0.8943 - regression_loss: 0.8012 - classification_loss: 0.0931 236/500 [=============>................] - ETA: 1:02 - loss: 0.8948 - regression_loss: 0.8017 - classification_loss: 0.0932 237/500 [=============>................] - ETA: 1:01 - loss: 0.8944 - regression_loss: 0.8013 - classification_loss: 0.0931 238/500 [=============>................] - ETA: 1:01 - loss: 0.8959 - regression_loss: 0.8027 - classification_loss: 0.0933 239/500 [=============>................] - ETA: 1:01 - loss: 0.8980 - regression_loss: 0.8048 - classification_loss: 0.0932 240/500 [=============>................] - ETA: 1:01 - loss: 0.8963 - regression_loss: 0.8033 - classification_loss: 0.0930 241/500 [=============>................] - ETA: 1:00 - loss: 0.8972 - regression_loss: 0.8042 - classification_loss: 0.0930 242/500 [=============>................] - ETA: 1:00 - loss: 0.8956 - regression_loss: 0.8028 - classification_loss: 0.0927 243/500 [=============>................] - ETA: 1:00 - loss: 0.8958 - regression_loss: 0.8032 - classification_loss: 0.0926 244/500 [=============>................] 
[epoch 40 per-step progress updates elided]
500/500 [==============================] - 117s 235ms/step - loss: 0.8835 - regression_loss: 0.7947 - classification_loss: 0.0888
326 instances of class plum with average precision: 0.8420
mAP: 0.8420
Epoch 00040: saving model to ./training/snapshots/resnet50_pascal_40.h5
Epoch 41/150
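A note on reading these progress lines: the `loss` value Keras reports is simply the sum of the two named component losses (the box-regression term and the focal classification term). As a minimal sketch, assuming nothing beyond the numbers printed in the epoch 40 summary above:

```python
def total_loss(regression_loss: float, classification_loss: float) -> float:
    """Sketch: the reported total loss is the sum of its two components."""
    return regression_loss + classification_loss

# Values taken from the epoch 40 summary line above:
# loss: 0.8835 - regression_loss: 0.7947 - classification_loss: 0.0888
print(round(total_loss(0.7947, 0.0888), 4))  # -> 0.8835
```

This is only a reading aid for the log, not the keras-retinanet implementation; internally the two terms come from separate loss functions attached to the regression and classification heads.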
[epoch 41 per-step progress updates elided]
- ETA: 1:39 - loss: 0.8227 - regression_loss: 0.7392 - classification_loss: 0.0835 79/500 [===>..........................] - ETA: 1:38 - loss: 0.8236 - regression_loss: 0.7371 - classification_loss: 0.0865 80/500 [===>..........................] - ETA: 1:38 - loss: 0.8298 - regression_loss: 0.7421 - classification_loss: 0.0877 81/500 [===>..........................] - ETA: 1:38 - loss: 0.8318 - regression_loss: 0.7439 - classification_loss: 0.0879 82/500 [===>..........................] - ETA: 1:38 - loss: 0.8282 - regression_loss: 0.7409 - classification_loss: 0.0872 83/500 [===>..........................] - ETA: 1:37 - loss: 0.8279 - regression_loss: 0.7412 - classification_loss: 0.0867 84/500 [====>.........................] - ETA: 1:37 - loss: 0.8317 - regression_loss: 0.7453 - classification_loss: 0.0864 85/500 [====>.........................] - ETA: 1:37 - loss: 0.8283 - regression_loss: 0.7424 - classification_loss: 0.0859 86/500 [====>.........................] - ETA: 1:37 - loss: 0.8238 - regression_loss: 0.7388 - classification_loss: 0.0851 87/500 [====>.........................] - ETA: 1:37 - loss: 0.8259 - regression_loss: 0.7402 - classification_loss: 0.0857 88/500 [====>.........................] - ETA: 1:36 - loss: 0.8227 - regression_loss: 0.7372 - classification_loss: 0.0855 89/500 [====>.........................] - ETA: 1:36 - loss: 0.8223 - regression_loss: 0.7369 - classification_loss: 0.0854 90/500 [====>.........................] - ETA: 1:36 - loss: 0.8225 - regression_loss: 0.7373 - classification_loss: 0.0853 91/500 [====>.........................] - ETA: 1:36 - loss: 0.8267 - regression_loss: 0.7405 - classification_loss: 0.0861 92/500 [====>.........................] - ETA: 1:35 - loss: 0.8209 - regression_loss: 0.7356 - classification_loss: 0.0854 93/500 [====>.........................] - ETA: 1:35 - loss: 0.8216 - regression_loss: 0.7364 - classification_loss: 0.0852 94/500 [====>.........................] 
- ETA: 1:35 - loss: 0.8213 - regression_loss: 0.7362 - classification_loss: 0.0851 95/500 [====>.........................] - ETA: 1:35 - loss: 0.8189 - regression_loss: 0.7344 - classification_loss: 0.0845 96/500 [====>.........................] - ETA: 1:34 - loss: 0.8196 - regression_loss: 0.7352 - classification_loss: 0.0843 97/500 [====>.........................] - ETA: 1:34 - loss: 0.8212 - regression_loss: 0.7368 - classification_loss: 0.0844 98/500 [====>.........................] - ETA: 1:34 - loss: 0.8184 - regression_loss: 0.7344 - classification_loss: 0.0840 99/500 [====>.........................] - ETA: 1:34 - loss: 0.8238 - regression_loss: 0.7395 - classification_loss: 0.0843 100/500 [=====>........................] - ETA: 1:33 - loss: 0.8241 - regression_loss: 0.7400 - classification_loss: 0.0841 101/500 [=====>........................] - ETA: 1:33 - loss: 0.8261 - regression_loss: 0.7419 - classification_loss: 0.0842 102/500 [=====>........................] - ETA: 1:33 - loss: 0.8258 - regression_loss: 0.7417 - classification_loss: 0.0840 103/500 [=====>........................] - ETA: 1:33 - loss: 0.8267 - regression_loss: 0.7432 - classification_loss: 0.0835 104/500 [=====>........................] - ETA: 1:32 - loss: 0.8290 - regression_loss: 0.7453 - classification_loss: 0.0838 105/500 [=====>........................] - ETA: 1:32 - loss: 0.8255 - regression_loss: 0.7423 - classification_loss: 0.0832 106/500 [=====>........................] - ETA: 1:32 - loss: 0.8212 - regression_loss: 0.7386 - classification_loss: 0.0826 107/500 [=====>........................] - ETA: 1:32 - loss: 0.8221 - regression_loss: 0.7393 - classification_loss: 0.0828 108/500 [=====>........................] - ETA: 1:32 - loss: 0.8241 - regression_loss: 0.7414 - classification_loss: 0.0827 109/500 [=====>........................] - ETA: 1:31 - loss: 0.8229 - regression_loss: 0.7398 - classification_loss: 0.0831 110/500 [=====>........................] 
- ETA: 1:31 - loss: 0.8256 - regression_loss: 0.7426 - classification_loss: 0.0830 111/500 [=====>........................] - ETA: 1:31 - loss: 0.8226 - regression_loss: 0.7399 - classification_loss: 0.0827 112/500 [=====>........................] - ETA: 1:31 - loss: 0.8227 - regression_loss: 0.7396 - classification_loss: 0.0830 113/500 [=====>........................] - ETA: 1:30 - loss: 0.8289 - regression_loss: 0.7449 - classification_loss: 0.0840 114/500 [=====>........................] - ETA: 1:30 - loss: 0.8295 - regression_loss: 0.7458 - classification_loss: 0.0836 115/500 [=====>........................] - ETA: 1:30 - loss: 0.8294 - regression_loss: 0.7457 - classification_loss: 0.0836 116/500 [=====>........................] - ETA: 1:30 - loss: 0.8280 - regression_loss: 0.7447 - classification_loss: 0.0833 117/500 [======>.......................] - ETA: 1:29 - loss: 0.8282 - regression_loss: 0.7453 - classification_loss: 0.0829 118/500 [======>.......................] - ETA: 1:29 - loss: 0.8326 - regression_loss: 0.7495 - classification_loss: 0.0831 119/500 [======>.......................] - ETA: 1:29 - loss: 0.8379 - regression_loss: 0.7533 - classification_loss: 0.0846 120/500 [======>.......................] - ETA: 1:29 - loss: 0.8384 - regression_loss: 0.7539 - classification_loss: 0.0845 121/500 [======>.......................] - ETA: 1:29 - loss: 0.8354 - regression_loss: 0.7513 - classification_loss: 0.0841 122/500 [======>.......................] - ETA: 1:28 - loss: 0.8359 - regression_loss: 0.7517 - classification_loss: 0.0842 123/500 [======>.......................] - ETA: 1:28 - loss: 0.8330 - regression_loss: 0.7491 - classification_loss: 0.0838 124/500 [======>.......................] - ETA: 1:28 - loss: 0.8304 - regression_loss: 0.7472 - classification_loss: 0.0833 125/500 [======>.......................] - ETA: 1:28 - loss: 0.8331 - regression_loss: 0.7494 - classification_loss: 0.0838 126/500 [======>.......................] 
- ETA: 1:27 - loss: 0.8409 - regression_loss: 0.7564 - classification_loss: 0.0845 127/500 [======>.......................] - ETA: 1:27 - loss: 0.8419 - regression_loss: 0.7573 - classification_loss: 0.0846 128/500 [======>.......................] - ETA: 1:27 - loss: 0.8425 - regression_loss: 0.7581 - classification_loss: 0.0844 129/500 [======>.......................] - ETA: 1:27 - loss: 0.8407 - regression_loss: 0.7566 - classification_loss: 0.0841 130/500 [======>.......................] - ETA: 1:27 - loss: 0.8437 - regression_loss: 0.7593 - classification_loss: 0.0843 131/500 [======>.......................] - ETA: 1:26 - loss: 0.8433 - regression_loss: 0.7589 - classification_loss: 0.0845 132/500 [======>.......................] - ETA: 1:26 - loss: 0.8456 - regression_loss: 0.7612 - classification_loss: 0.0844 133/500 [======>.......................] - ETA: 1:26 - loss: 0.8455 - regression_loss: 0.7613 - classification_loss: 0.0842 134/500 [=======>......................] - ETA: 1:26 - loss: 0.8487 - regression_loss: 0.7639 - classification_loss: 0.0849 135/500 [=======>......................] - ETA: 1:25 - loss: 0.8503 - regression_loss: 0.7649 - classification_loss: 0.0854 136/500 [=======>......................] - ETA: 1:25 - loss: 0.8475 - regression_loss: 0.7624 - classification_loss: 0.0851 137/500 [=======>......................] - ETA: 1:25 - loss: 0.8442 - regression_loss: 0.7596 - classification_loss: 0.0846 138/500 [=======>......................] - ETA: 1:25 - loss: 0.8550 - regression_loss: 0.7541 - classification_loss: 0.1009 139/500 [=======>......................] - ETA: 1:25 - loss: 0.8535 - regression_loss: 0.7523 - classification_loss: 0.1012 140/500 [=======>......................] - ETA: 1:24 - loss: 0.8519 - regression_loss: 0.7512 - classification_loss: 0.1008 141/500 [=======>......................] - ETA: 1:24 - loss: 0.8500 - regression_loss: 0.7497 - classification_loss: 0.1003 142/500 [=======>......................] 
- ETA: 1:24 - loss: 0.8465 - regression_loss: 0.7467 - classification_loss: 0.0998 143/500 [=======>......................] - ETA: 1:24 - loss: 0.8469 - regression_loss: 0.7473 - classification_loss: 0.0996 144/500 [=======>......................] - ETA: 1:24 - loss: 0.8493 - regression_loss: 0.7498 - classification_loss: 0.0995 145/500 [=======>......................] - ETA: 1:23 - loss: 0.8481 - regression_loss: 0.7492 - classification_loss: 0.0989 146/500 [=======>......................] - ETA: 1:23 - loss: 0.8469 - regression_loss: 0.7483 - classification_loss: 0.0985 147/500 [=======>......................] - ETA: 1:23 - loss: 0.8502 - regression_loss: 0.7513 - classification_loss: 0.0990 148/500 [=======>......................] - ETA: 1:22 - loss: 0.8507 - regression_loss: 0.7518 - classification_loss: 0.0989 149/500 [=======>......................] - ETA: 1:22 - loss: 0.8489 - regression_loss: 0.7502 - classification_loss: 0.0987 150/500 [========>.....................] - ETA: 1:22 - loss: 0.8472 - regression_loss: 0.7486 - classification_loss: 0.0986 151/500 [========>.....................] - ETA: 1:22 - loss: 0.8485 - regression_loss: 0.7497 - classification_loss: 0.0988 152/500 [========>.....................] - ETA: 1:22 - loss: 0.8469 - regression_loss: 0.7486 - classification_loss: 0.0983 153/500 [========>.....................] - ETA: 1:21 - loss: 0.8472 - regression_loss: 0.7489 - classification_loss: 0.0983 154/500 [========>.....................] - ETA: 1:21 - loss: 0.8460 - regression_loss: 0.7481 - classification_loss: 0.0979 155/500 [========>.....................] - ETA: 1:21 - loss: 0.8474 - regression_loss: 0.7489 - classification_loss: 0.0986 156/500 [========>.....................] - ETA: 1:21 - loss: 0.8449 - regression_loss: 0.7469 - classification_loss: 0.0980 157/500 [========>.....................] - ETA: 1:20 - loss: 0.8449 - regression_loss: 0.7472 - classification_loss: 0.0976 158/500 [========>.....................] 
- ETA: 1:20 - loss: 0.8434 - regression_loss: 0.7462 - classification_loss: 0.0972 159/500 [========>.....................] - ETA: 1:20 - loss: 0.8437 - regression_loss: 0.7467 - classification_loss: 0.0970 160/500 [========>.....................] - ETA: 1:20 - loss: 0.8408 - regression_loss: 0.7441 - classification_loss: 0.0967 161/500 [========>.....................] - ETA: 1:19 - loss: 0.8408 - regression_loss: 0.7440 - classification_loss: 0.0969 162/500 [========>.....................] - ETA: 1:19 - loss: 0.8384 - regression_loss: 0.7419 - classification_loss: 0.0965 163/500 [========>.....................] - ETA: 1:19 - loss: 0.8396 - regression_loss: 0.7432 - classification_loss: 0.0964 164/500 [========>.....................] - ETA: 1:19 - loss: 0.8400 - regression_loss: 0.7437 - classification_loss: 0.0962 165/500 [========>.....................] - ETA: 1:18 - loss: 0.8399 - regression_loss: 0.7435 - classification_loss: 0.0964 166/500 [========>.....................] - ETA: 1:18 - loss: 0.8402 - regression_loss: 0.7439 - classification_loss: 0.0963 167/500 [=========>....................] - ETA: 1:18 - loss: 0.8417 - regression_loss: 0.7450 - classification_loss: 0.0967 168/500 [=========>....................] - ETA: 1:18 - loss: 0.8414 - regression_loss: 0.7447 - classification_loss: 0.0967 169/500 [=========>....................] - ETA: 1:18 - loss: 0.8407 - regression_loss: 0.7442 - classification_loss: 0.0965 170/500 [=========>....................] - ETA: 1:17 - loss: 0.8391 - regression_loss: 0.7430 - classification_loss: 0.0962 171/500 [=========>....................] - ETA: 1:17 - loss: 0.8391 - regression_loss: 0.7433 - classification_loss: 0.0958 172/500 [=========>....................] - ETA: 1:17 - loss: 0.8380 - regression_loss: 0.7425 - classification_loss: 0.0955 173/500 [=========>....................] - ETA: 1:17 - loss: 0.8374 - regression_loss: 0.7423 - classification_loss: 0.0951 174/500 [=========>....................] 
- ETA: 1:16 - loss: 0.8352 - regression_loss: 0.7403 - classification_loss: 0.0948 175/500 [=========>....................] - ETA: 1:16 - loss: 0.8328 - regression_loss: 0.7383 - classification_loss: 0.0945 176/500 [=========>....................] - ETA: 1:16 - loss: 0.8306 - regression_loss: 0.7365 - classification_loss: 0.0941 177/500 [=========>....................] - ETA: 1:16 - loss: 0.8312 - regression_loss: 0.7371 - classification_loss: 0.0941 178/500 [=========>....................] - ETA: 1:15 - loss: 0.8317 - regression_loss: 0.7376 - classification_loss: 0.0940 179/500 [=========>....................] - ETA: 1:15 - loss: 0.8332 - regression_loss: 0.7388 - classification_loss: 0.0944 180/500 [=========>....................] - ETA: 1:15 - loss: 0.8309 - regression_loss: 0.7370 - classification_loss: 0.0940 181/500 [=========>....................] - ETA: 1:15 - loss: 0.8337 - regression_loss: 0.7395 - classification_loss: 0.0943 182/500 [=========>....................] - ETA: 1:14 - loss: 0.8375 - regression_loss: 0.7421 - classification_loss: 0.0954 183/500 [=========>....................] - ETA: 1:14 - loss: 0.8386 - regression_loss: 0.7435 - classification_loss: 0.0951 184/500 [==========>...................] - ETA: 1:14 - loss: 0.8371 - regression_loss: 0.7423 - classification_loss: 0.0948 185/500 [==========>...................] - ETA: 1:14 - loss: 0.8363 - regression_loss: 0.7415 - classification_loss: 0.0948 186/500 [==========>...................] - ETA: 1:13 - loss: 0.8345 - regression_loss: 0.7401 - classification_loss: 0.0945 187/500 [==========>...................] - ETA: 1:13 - loss: 0.8341 - regression_loss: 0.7398 - classification_loss: 0.0943 188/500 [==========>...................] - ETA: 1:13 - loss: 0.8341 - regression_loss: 0.7401 - classification_loss: 0.0940 189/500 [==========>...................] - ETA: 1:13 - loss: 0.8363 - regression_loss: 0.7422 - classification_loss: 0.0941 190/500 [==========>...................] 
- ETA: 1:12 - loss: 0.8366 - regression_loss: 0.7423 - classification_loss: 0.0943 191/500 [==========>...................] - ETA: 1:12 - loss: 0.8377 - regression_loss: 0.7436 - classification_loss: 0.0941 192/500 [==========>...................] - ETA: 1:12 - loss: 0.8371 - regression_loss: 0.7430 - classification_loss: 0.0941 193/500 [==========>...................] - ETA: 1:12 - loss: 0.8372 - regression_loss: 0.7433 - classification_loss: 0.0939 194/500 [==========>...................] - ETA: 1:12 - loss: 0.8395 - regression_loss: 0.7454 - classification_loss: 0.0940 195/500 [==========>...................] - ETA: 1:11 - loss: 0.8385 - regression_loss: 0.7448 - classification_loss: 0.0937 196/500 [==========>...................] - ETA: 1:11 - loss: 0.8364 - regression_loss: 0.7430 - classification_loss: 0.0934 197/500 [==========>...................] - ETA: 1:11 - loss: 0.8383 - regression_loss: 0.7446 - classification_loss: 0.0937 198/500 [==========>...................] - ETA: 1:11 - loss: 0.8381 - regression_loss: 0.7446 - classification_loss: 0.0935 199/500 [==========>...................] - ETA: 1:10 - loss: 0.8386 - regression_loss: 0.7452 - classification_loss: 0.0935 200/500 [===========>..................] - ETA: 1:10 - loss: 0.8373 - regression_loss: 0.7441 - classification_loss: 0.0932 201/500 [===========>..................] - ETA: 1:10 - loss: 0.8353 - regression_loss: 0.7425 - classification_loss: 0.0928 202/500 [===========>..................] - ETA: 1:10 - loss: 0.8358 - regression_loss: 0.7430 - classification_loss: 0.0928 203/500 [===========>..................] - ETA: 1:09 - loss: 0.8383 - regression_loss: 0.7451 - classification_loss: 0.0933 204/500 [===========>..................] - ETA: 1:09 - loss: 0.8388 - regression_loss: 0.7456 - classification_loss: 0.0931 205/500 [===========>..................] - ETA: 1:09 - loss: 0.8415 - regression_loss: 0.7476 - classification_loss: 0.0939 206/500 [===========>..................] 
- ETA: 1:09 - loss: 0.8428 - regression_loss: 0.7488 - classification_loss: 0.0940 207/500 [===========>..................] - ETA: 1:08 - loss: 0.8409 - regression_loss: 0.7471 - classification_loss: 0.0937 208/500 [===========>..................] - ETA: 1:08 - loss: 0.8411 - regression_loss: 0.7474 - classification_loss: 0.0937 209/500 [===========>..................] - ETA: 1:08 - loss: 0.8400 - regression_loss: 0.7464 - classification_loss: 0.0935 210/500 [===========>..................] - ETA: 1:08 - loss: 0.8414 - regression_loss: 0.7479 - classification_loss: 0.0935 211/500 [===========>..................] - ETA: 1:07 - loss: 0.8396 - regression_loss: 0.7464 - classification_loss: 0.0932 212/500 [===========>..................] - ETA: 1:07 - loss: 0.8384 - regression_loss: 0.7454 - classification_loss: 0.0929 213/500 [===========>..................] - ETA: 1:07 - loss: 0.8377 - regression_loss: 0.7449 - classification_loss: 0.0928 214/500 [===========>..................] - ETA: 1:07 - loss: 0.8370 - regression_loss: 0.7443 - classification_loss: 0.0927 215/500 [===========>..................] - ETA: 1:07 - loss: 0.8363 - regression_loss: 0.7438 - classification_loss: 0.0925 216/500 [===========>..................] - ETA: 1:06 - loss: 0.8361 - regression_loss: 0.7440 - classification_loss: 0.0921 217/500 [============>.................] - ETA: 1:06 - loss: 0.8388 - regression_loss: 0.7464 - classification_loss: 0.0924 218/500 [============>.................] - ETA: 1:06 - loss: 0.8381 - regression_loss: 0.7457 - classification_loss: 0.0924 219/500 [============>.................] - ETA: 1:06 - loss: 0.8373 - regression_loss: 0.7451 - classification_loss: 0.0922 220/500 [============>.................] - ETA: 1:05 - loss: 0.8379 - regression_loss: 0.7459 - classification_loss: 0.0921 221/500 [============>.................] - ETA: 1:05 - loss: 0.8389 - regression_loss: 0.7468 - classification_loss: 0.0921 222/500 [============>.................] 
- ETA: 1:05 - loss: 0.8370 - regression_loss: 0.7452 - classification_loss: 0.0918 223/500 [============>.................] - ETA: 1:05 - loss: 0.8389 - regression_loss: 0.7468 - classification_loss: 0.0921 224/500 [============>.................] - ETA: 1:04 - loss: 0.8377 - regression_loss: 0.7459 - classification_loss: 0.0918 225/500 [============>.................] - ETA: 1:04 - loss: 0.8402 - regression_loss: 0.7483 - classification_loss: 0.0919 226/500 [============>.................] - ETA: 1:04 - loss: 0.8403 - regression_loss: 0.7486 - classification_loss: 0.0917 227/500 [============>.................] - ETA: 1:04 - loss: 0.8380 - regression_loss: 0.7467 - classification_loss: 0.0914 228/500 [============>.................] - ETA: 1:03 - loss: 0.8390 - regression_loss: 0.7474 - classification_loss: 0.0915 229/500 [============>.................] - ETA: 1:03 - loss: 0.8390 - regression_loss: 0.7476 - classification_loss: 0.0914 230/500 [============>.................] - ETA: 1:03 - loss: 0.8401 - regression_loss: 0.7484 - classification_loss: 0.0916 231/500 [============>.................] - ETA: 1:03 - loss: 0.8417 - regression_loss: 0.7501 - classification_loss: 0.0916 232/500 [============>.................] - ETA: 1:02 - loss: 0.8415 - regression_loss: 0.7502 - classification_loss: 0.0914 233/500 [============>.................] - ETA: 1:02 - loss: 0.8419 - regression_loss: 0.7505 - classification_loss: 0.0914 234/500 [=============>................] - ETA: 1:02 - loss: 0.8427 - regression_loss: 0.7513 - classification_loss: 0.0914 235/500 [=============>................] - ETA: 1:02 - loss: 0.8428 - regression_loss: 0.7515 - classification_loss: 0.0913 236/500 [=============>................] - ETA: 1:01 - loss: 0.8438 - regression_loss: 0.7524 - classification_loss: 0.0914 237/500 [=============>................] - ETA: 1:01 - loss: 0.8469 - regression_loss: 0.7552 - classification_loss: 0.0917 238/500 [=============>................] 
- ETA: 1:01 - loss: 0.8484 - regression_loss: 0.7565 - classification_loss: 0.0918 239/500 [=============>................] - ETA: 1:01 - loss: 0.8508 - regression_loss: 0.7584 - classification_loss: 0.0923 240/500 [=============>................] - ETA: 1:01 - loss: 0.8494 - regression_loss: 0.7573 - classification_loss: 0.0921 241/500 [=============>................] - ETA: 1:00 - loss: 0.8472 - regression_loss: 0.7554 - classification_loss: 0.0918 242/500 [=============>................] - ETA: 1:00 - loss: 0.8456 - regression_loss: 0.7541 - classification_loss: 0.0915 243/500 [=============>................] - ETA: 1:00 - loss: 0.8456 - regression_loss: 0.7543 - classification_loss: 0.0914 244/500 [=============>................] - ETA: 1:00 - loss: 0.8446 - regression_loss: 0.7534 - classification_loss: 0.0912 245/500 [=============>................] - ETA: 59s - loss: 0.8451 - regression_loss: 0.7540 - classification_loss: 0.0911  246/500 [=============>................] - ETA: 59s - loss: 0.8445 - regression_loss: 0.7534 - classification_loss: 0.0911 247/500 [=============>................] - ETA: 59s - loss: 0.8445 - regression_loss: 0.7531 - classification_loss: 0.0914 248/500 [=============>................] - ETA: 59s - loss: 0.8430 - regression_loss: 0.7518 - classification_loss: 0.0912 249/500 [=============>................] - ETA: 58s - loss: 0.8454 - regression_loss: 0.7539 - classification_loss: 0.0915 250/500 [==============>...............] - ETA: 58s - loss: 0.8446 - regression_loss: 0.7533 - classification_loss: 0.0913 251/500 [==============>...............] - ETA: 58s - loss: 0.8438 - regression_loss: 0.7527 - classification_loss: 0.0912 252/500 [==============>...............] - ETA: 58s - loss: 0.8426 - regression_loss: 0.7517 - classification_loss: 0.0909 253/500 [==============>...............] - ETA: 57s - loss: 0.8444 - regression_loss: 0.7532 - classification_loss: 0.0911 254/500 [==============>...............] 
- ETA: 57s - loss: 0.8460 - regression_loss: 0.7547 - classification_loss: 0.0913 255/500 [==============>...............] - ETA: 57s - loss: 0.8460 - regression_loss: 0.7548 - classification_loss: 0.0912 256/500 [==============>...............] - ETA: 57s - loss: 0.8469 - regression_loss: 0.7558 - classification_loss: 0.0911 257/500 [==============>...............] - ETA: 57s - loss: 0.8463 - regression_loss: 0.7555 - classification_loss: 0.0908 258/500 [==============>...............] - ETA: 56s - loss: 0.8466 - regression_loss: 0.7555 - classification_loss: 0.0912 259/500 [==============>...............] - ETA: 56s - loss: 0.8452 - regression_loss: 0.7541 - classification_loss: 0.0910 260/500 [==============>...............] - ETA: 56s - loss: 0.8454 - regression_loss: 0.7543 - classification_loss: 0.0911 261/500 [==============>...............] - ETA: 56s - loss: 0.8450 - regression_loss: 0.7541 - classification_loss: 0.0910 262/500 [==============>...............] - ETA: 55s - loss: 0.8442 - regression_loss: 0.7534 - classification_loss: 0.0907 263/500 [==============>...............] - ETA: 55s - loss: 0.8437 - regression_loss: 0.7530 - classification_loss: 0.0906 264/500 [==============>...............] - ETA: 55s - loss: 0.8443 - regression_loss: 0.7536 - classification_loss: 0.0907 265/500 [==============>...............] - ETA: 55s - loss: 0.8438 - regression_loss: 0.7532 - classification_loss: 0.0906 266/500 [==============>...............] - ETA: 54s - loss: 0.8456 - regression_loss: 0.7551 - classification_loss: 0.0905 267/500 [===============>..............] - ETA: 54s - loss: 0.8466 - regression_loss: 0.7558 - classification_loss: 0.0908 268/500 [===============>..............] - ETA: 54s - loss: 0.8479 - regression_loss: 0.7567 - classification_loss: 0.0912 269/500 [===============>..............] - ETA: 54s - loss: 0.8474 - regression_loss: 0.7563 - classification_loss: 0.0911 270/500 [===============>..............] 
- ETA: 53s - loss: 0.8477 - regression_loss: 0.7567 - classification_loss: 0.0910 271/500 [===============>..............] - ETA: 53s - loss: 0.8500 - regression_loss: 0.7589 - classification_loss: 0.0911 272/500 [===============>..............] - ETA: 53s - loss: 0.8502 - regression_loss: 0.7592 - classification_loss: 0.0910 273/500 [===============>..............] - ETA: 53s - loss: 0.8506 - regression_loss: 0.7596 - classification_loss: 0.0910 274/500 [===============>..............] - ETA: 52s - loss: 0.8517 - regression_loss: 0.7607 - classification_loss: 0.0910 275/500 [===============>..............] - ETA: 52s - loss: 0.8568 - regression_loss: 0.7647 - classification_loss: 0.0921 276/500 [===============>..............] - ETA: 52s - loss: 0.8574 - regression_loss: 0.7654 - classification_loss: 0.0920 277/500 [===============>..............] - ETA: 52s - loss: 0.8573 - regression_loss: 0.7653 - classification_loss: 0.0920 278/500 [===============>..............] - ETA: 52s - loss: 0.8578 - regression_loss: 0.7655 - classification_loss: 0.0923 279/500 [===============>..............] - ETA: 51s - loss: 0.8577 - regression_loss: 0.7655 - classification_loss: 0.0922 280/500 [===============>..............] - ETA: 51s - loss: 0.8572 - regression_loss: 0.7652 - classification_loss: 0.0920 281/500 [===============>..............] - ETA: 51s - loss: 0.8565 - regression_loss: 0.7646 - classification_loss: 0.0919 282/500 [===============>..............] - ETA: 51s - loss: 0.8573 - regression_loss: 0.7655 - classification_loss: 0.0917 283/500 [===============>..............] - ETA: 50s - loss: 0.8597 - regression_loss: 0.7678 - classification_loss: 0.0919 284/500 [================>.............] - ETA: 50s - loss: 0.8594 - regression_loss: 0.7675 - classification_loss: 0.0919 285/500 [================>.............] - ETA: 50s - loss: 0.8600 - regression_loss: 0.7683 - classification_loss: 0.0917 286/500 [================>.............] 
[per-batch progress output truncated: epoch 41, steps 287–499, loss hovering around 0.86]
500/500 [==============================] - 118s 235ms/step - loss: 0.8662 - regression_loss: 0.7777 - classification_loss: 0.0885
326 instances of class plum with average precision: 0.8515
mAP: 0.8515
Epoch 00041: saving model to ./training/snapshots/resnet50_pascal_41.h5
Epoch 42/150
[per-batch progress output truncated: epoch 42, steps 1–121, loss converging toward ~0.88]
- ETA: 1:29 - loss: 0.8840 - regression_loss: 0.7958 - classification_loss: 0.0881 122/500 [======>.......................] - ETA: 1:29 - loss: 0.8829 - regression_loss: 0.7950 - classification_loss: 0.0879 123/500 [======>.......................] - ETA: 1:28 - loss: 0.8822 - regression_loss: 0.7946 - classification_loss: 0.0876 124/500 [======>.......................] - ETA: 1:28 - loss: 0.8826 - regression_loss: 0.7948 - classification_loss: 0.0877 125/500 [======>.......................] - ETA: 1:28 - loss: 0.8838 - regression_loss: 0.7954 - classification_loss: 0.0884 126/500 [======>.......................] - ETA: 1:28 - loss: 0.8835 - regression_loss: 0.7952 - classification_loss: 0.0883 127/500 [======>.......................] - ETA: 1:27 - loss: 0.8849 - regression_loss: 0.7964 - classification_loss: 0.0884 128/500 [======>.......................] - ETA: 1:27 - loss: 0.8817 - regression_loss: 0.7937 - classification_loss: 0.0880 129/500 [======>.......................] - ETA: 1:27 - loss: 0.8844 - regression_loss: 0.7954 - classification_loss: 0.0889 130/500 [======>.......................] - ETA: 1:27 - loss: 0.8859 - regression_loss: 0.7966 - classification_loss: 0.0893 131/500 [======>.......................] - ETA: 1:26 - loss: 0.8861 - regression_loss: 0.7966 - classification_loss: 0.0894 132/500 [======>.......................] - ETA: 1:26 - loss: 0.8832 - regression_loss: 0.7942 - classification_loss: 0.0889 133/500 [======>.......................] - ETA: 1:26 - loss: 0.8803 - regression_loss: 0.7919 - classification_loss: 0.0885 134/500 [=======>......................] - ETA: 1:26 - loss: 0.8822 - regression_loss: 0.7938 - classification_loss: 0.0884 135/500 [=======>......................] - ETA: 1:26 - loss: 0.8854 - regression_loss: 0.7969 - classification_loss: 0.0886 136/500 [=======>......................] - ETA: 1:25 - loss: 0.8864 - regression_loss: 0.7973 - classification_loss: 0.0891 137/500 [=======>......................] 
- ETA: 1:25 - loss: 0.8879 - regression_loss: 0.7987 - classification_loss: 0.0893 138/500 [=======>......................] - ETA: 1:25 - loss: 0.8839 - regression_loss: 0.7951 - classification_loss: 0.0887 139/500 [=======>......................] - ETA: 1:25 - loss: 0.8841 - regression_loss: 0.7954 - classification_loss: 0.0887 140/500 [=======>......................] - ETA: 1:24 - loss: 0.8828 - regression_loss: 0.7946 - classification_loss: 0.0883 141/500 [=======>......................] - ETA: 1:24 - loss: 0.8858 - regression_loss: 0.7974 - classification_loss: 0.0884 142/500 [=======>......................] - ETA: 1:24 - loss: 0.8833 - regression_loss: 0.7952 - classification_loss: 0.0881 143/500 [=======>......................] - ETA: 1:24 - loss: 0.8848 - regression_loss: 0.7970 - classification_loss: 0.0878 144/500 [=======>......................] - ETA: 1:23 - loss: 0.8836 - regression_loss: 0.7960 - classification_loss: 0.0876 145/500 [=======>......................] - ETA: 1:23 - loss: 0.8807 - regression_loss: 0.7935 - classification_loss: 0.0872 146/500 [=======>......................] - ETA: 1:23 - loss: 0.8825 - regression_loss: 0.7948 - classification_loss: 0.0877 147/500 [=======>......................] - ETA: 1:23 - loss: 0.8914 - regression_loss: 0.8014 - classification_loss: 0.0900 148/500 [=======>......................] - ETA: 1:22 - loss: 0.8911 - regression_loss: 0.8013 - classification_loss: 0.0898 149/500 [=======>......................] - ETA: 1:22 - loss: 0.8923 - regression_loss: 0.8022 - classification_loss: 0.0901 150/500 [========>.....................] - ETA: 1:22 - loss: 0.8888 - regression_loss: 0.7991 - classification_loss: 0.0897 151/500 [========>.....................] - ETA: 1:22 - loss: 0.8914 - regression_loss: 0.8015 - classification_loss: 0.0899 152/500 [========>.....................] - ETA: 1:21 - loss: 0.8890 - regression_loss: 0.7994 - classification_loss: 0.0897 153/500 [========>.....................] 
- ETA: 1:21 - loss: 0.8888 - regression_loss: 0.7993 - classification_loss: 0.0895 154/500 [========>.....................] - ETA: 1:21 - loss: 0.8913 - regression_loss: 0.8015 - classification_loss: 0.0898 155/500 [========>.....................] - ETA: 1:21 - loss: 0.8923 - regression_loss: 0.8023 - classification_loss: 0.0900 156/500 [========>.....................] - ETA: 1:20 - loss: 0.8944 - regression_loss: 0.8041 - classification_loss: 0.0903 157/500 [========>.....................] - ETA: 1:20 - loss: 0.8904 - regression_loss: 0.8005 - classification_loss: 0.0898 158/500 [========>.....................] - ETA: 1:20 - loss: 0.8931 - regression_loss: 0.8027 - classification_loss: 0.0904 159/500 [========>.....................] - ETA: 1:20 - loss: 0.8925 - regression_loss: 0.8023 - classification_loss: 0.0902 160/500 [========>.....................] - ETA: 1:19 - loss: 0.8926 - regression_loss: 0.8023 - classification_loss: 0.0903 161/500 [========>.....................] - ETA: 1:19 - loss: 0.8918 - regression_loss: 0.8017 - classification_loss: 0.0900 162/500 [========>.....................] - ETA: 1:19 - loss: 0.8934 - regression_loss: 0.8029 - classification_loss: 0.0905 163/500 [========>.....................] - ETA: 1:19 - loss: 0.8983 - regression_loss: 0.8071 - classification_loss: 0.0912 164/500 [========>.....................] - ETA: 1:19 - loss: 0.8999 - regression_loss: 0.8084 - classification_loss: 0.0915 165/500 [========>.....................] - ETA: 1:18 - loss: 0.8998 - regression_loss: 0.8084 - classification_loss: 0.0914 166/500 [========>.....................] - ETA: 1:18 - loss: 0.8961 - regression_loss: 0.8052 - classification_loss: 0.0909 167/500 [=========>....................] - ETA: 1:18 - loss: 0.8940 - regression_loss: 0.8028 - classification_loss: 0.0912 168/500 [=========>....................] - ETA: 1:18 - loss: 0.8920 - regression_loss: 0.8010 - classification_loss: 0.0909 169/500 [=========>....................] 
- ETA: 1:17 - loss: 0.8914 - regression_loss: 0.7999 - classification_loss: 0.0915 170/500 [=========>....................] - ETA: 1:17 - loss: 0.8947 - regression_loss: 0.8018 - classification_loss: 0.0929 171/500 [=========>....................] - ETA: 1:17 - loss: 0.8906 - regression_loss: 0.7981 - classification_loss: 0.0925 172/500 [=========>....................] - ETA: 1:17 - loss: 0.8901 - regression_loss: 0.7978 - classification_loss: 0.0923 173/500 [=========>....................] - ETA: 1:16 - loss: 0.8901 - regression_loss: 0.7973 - classification_loss: 0.0928 174/500 [=========>....................] - ETA: 1:16 - loss: 0.8889 - regression_loss: 0.7961 - classification_loss: 0.0927 175/500 [=========>....................] - ETA: 1:16 - loss: 0.8856 - regression_loss: 0.7932 - classification_loss: 0.0924 176/500 [=========>....................] - ETA: 1:16 - loss: 0.8848 - regression_loss: 0.7927 - classification_loss: 0.0922 177/500 [=========>....................] - ETA: 1:16 - loss: 0.8827 - regression_loss: 0.7908 - classification_loss: 0.0919 178/500 [=========>....................] - ETA: 1:15 - loss: 0.8834 - regression_loss: 0.7917 - classification_loss: 0.0917 179/500 [=========>....................] - ETA: 1:15 - loss: 0.8828 - regression_loss: 0.7911 - classification_loss: 0.0917 180/500 [=========>....................] - ETA: 1:15 - loss: 0.8851 - regression_loss: 0.7927 - classification_loss: 0.0924 181/500 [=========>....................] - ETA: 1:15 - loss: 0.8826 - regression_loss: 0.7904 - classification_loss: 0.0922 182/500 [=========>....................] - ETA: 1:14 - loss: 0.8830 - regression_loss: 0.7904 - classification_loss: 0.0925 183/500 [=========>....................] - ETA: 1:14 - loss: 0.8810 - regression_loss: 0.7888 - classification_loss: 0.0922 184/500 [==========>...................] - ETA: 1:14 - loss: 0.8794 - regression_loss: 0.7874 - classification_loss: 0.0921 185/500 [==========>...................] 
- ETA: 1:14 - loss: 0.8784 - regression_loss: 0.7866 - classification_loss: 0.0918 186/500 [==========>...................] - ETA: 1:13 - loss: 0.8779 - regression_loss: 0.7863 - classification_loss: 0.0917 187/500 [==========>...................] - ETA: 1:13 - loss: 0.8777 - regression_loss: 0.7862 - classification_loss: 0.0915 188/500 [==========>...................] - ETA: 1:13 - loss: 0.8781 - regression_loss: 0.7867 - classification_loss: 0.0914 189/500 [==========>...................] - ETA: 1:13 - loss: 0.8788 - regression_loss: 0.7873 - classification_loss: 0.0915 190/500 [==========>...................] - ETA: 1:12 - loss: 0.8764 - regression_loss: 0.7853 - classification_loss: 0.0911 191/500 [==========>...................] - ETA: 1:12 - loss: 0.8746 - regression_loss: 0.7836 - classification_loss: 0.0909 192/500 [==========>...................] - ETA: 1:12 - loss: 0.8734 - regression_loss: 0.7828 - classification_loss: 0.0906 193/500 [==========>...................] - ETA: 1:12 - loss: 0.8728 - regression_loss: 0.7825 - classification_loss: 0.0903 194/500 [==========>...................] - ETA: 1:11 - loss: 0.8761 - regression_loss: 0.7847 - classification_loss: 0.0914 195/500 [==========>...................] - ETA: 1:11 - loss: 0.8749 - regression_loss: 0.7838 - classification_loss: 0.0911 196/500 [==========>...................] - ETA: 1:11 - loss: 0.8721 - regression_loss: 0.7814 - classification_loss: 0.0907 197/500 [==========>...................] - ETA: 1:11 - loss: 0.8734 - regression_loss: 0.7824 - classification_loss: 0.0909 198/500 [==========>...................] - ETA: 1:10 - loss: 0.8762 - regression_loss: 0.7848 - classification_loss: 0.0914 199/500 [==========>...................] - ETA: 1:10 - loss: 0.8789 - regression_loss: 0.7866 - classification_loss: 0.0923 200/500 [===========>..................] - ETA: 1:10 - loss: 0.8788 - regression_loss: 0.7865 - classification_loss: 0.0923 201/500 [===========>..................] 
- ETA: 1:10 - loss: 0.8790 - regression_loss: 0.7865 - classification_loss: 0.0925 202/500 [===========>..................] - ETA: 1:10 - loss: 0.8775 - regression_loss: 0.7852 - classification_loss: 0.0923 203/500 [===========>..................] - ETA: 1:09 - loss: 0.8753 - regression_loss: 0.7834 - classification_loss: 0.0919 204/500 [===========>..................] - ETA: 1:09 - loss: 0.8747 - regression_loss: 0.7827 - classification_loss: 0.0921 205/500 [===========>..................] - ETA: 1:09 - loss: 0.8712 - regression_loss: 0.7795 - classification_loss: 0.0917 206/500 [===========>..................] - ETA: 1:09 - loss: 0.8723 - regression_loss: 0.7802 - classification_loss: 0.0920 207/500 [===========>..................] - ETA: 1:08 - loss: 0.8716 - regression_loss: 0.7798 - classification_loss: 0.0918 208/500 [===========>..................] - ETA: 1:08 - loss: 0.8724 - regression_loss: 0.7803 - classification_loss: 0.0920 209/500 [===========>..................] - ETA: 1:08 - loss: 0.8711 - regression_loss: 0.7793 - classification_loss: 0.0918 210/500 [===========>..................] - ETA: 1:08 - loss: 0.8717 - regression_loss: 0.7800 - classification_loss: 0.0917 211/500 [===========>..................] - ETA: 1:07 - loss: 0.8698 - regression_loss: 0.7785 - classification_loss: 0.0914 212/500 [===========>..................] - ETA: 1:07 - loss: 0.8676 - regression_loss: 0.7765 - classification_loss: 0.0912 213/500 [===========>..................] - ETA: 1:07 - loss: 0.8685 - regression_loss: 0.7772 - classification_loss: 0.0913 214/500 [===========>..................] - ETA: 1:07 - loss: 0.8695 - regression_loss: 0.7780 - classification_loss: 0.0915 215/500 [===========>..................] - ETA: 1:06 - loss: 0.8692 - regression_loss: 0.7777 - classification_loss: 0.0915 216/500 [===========>..................] - ETA: 1:06 - loss: 0.8680 - regression_loss: 0.7767 - classification_loss: 0.0913 217/500 [============>.................] 
- ETA: 1:06 - loss: 0.8669 - regression_loss: 0.7758 - classification_loss: 0.0910 218/500 [============>.................] - ETA: 1:06 - loss: 0.8688 - regression_loss: 0.7780 - classification_loss: 0.0908 219/500 [============>.................] - ETA: 1:05 - loss: 0.8693 - regression_loss: 0.7785 - classification_loss: 0.0908 220/500 [============>.................] - ETA: 1:05 - loss: 0.8674 - regression_loss: 0.7769 - classification_loss: 0.0905 221/500 [============>.................] - ETA: 1:05 - loss: 0.8661 - regression_loss: 0.7734 - classification_loss: 0.0927 222/500 [============>.................] - ETA: 1:05 - loss: 0.8665 - regression_loss: 0.7739 - classification_loss: 0.0926 223/500 [============>.................] - ETA: 1:05 - loss: 0.8639 - regression_loss: 0.7716 - classification_loss: 0.0923 224/500 [============>.................] - ETA: 1:04 - loss: 0.8643 - regression_loss: 0.7719 - classification_loss: 0.0925 225/500 [============>.................] - ETA: 1:04 - loss: 0.8632 - regression_loss: 0.7709 - classification_loss: 0.0923 226/500 [============>.................] - ETA: 1:04 - loss: 0.8628 - regression_loss: 0.7706 - classification_loss: 0.0922 227/500 [============>.................] - ETA: 1:04 - loss: 0.8606 - regression_loss: 0.7687 - classification_loss: 0.0919 228/500 [============>.................] - ETA: 1:03 - loss: 0.8579 - regression_loss: 0.7664 - classification_loss: 0.0915 229/500 [============>.................] - ETA: 1:03 - loss: 0.8581 - regression_loss: 0.7665 - classification_loss: 0.0916 230/500 [============>.................] - ETA: 1:03 - loss: 0.8578 - regression_loss: 0.7663 - classification_loss: 0.0915 231/500 [============>.................] - ETA: 1:03 - loss: 0.8603 - regression_loss: 0.7685 - classification_loss: 0.0917 232/500 [============>.................] - ETA: 1:02 - loss: 0.8621 - regression_loss: 0.7701 - classification_loss: 0.0920 233/500 [============>.................] 
- ETA: 1:02 - loss: 0.8614 - regression_loss: 0.7696 - classification_loss: 0.0918 234/500 [=============>................] - ETA: 1:02 - loss: 0.8618 - regression_loss: 0.7701 - classification_loss: 0.0917 235/500 [=============>................] - ETA: 1:02 - loss: 0.8624 - regression_loss: 0.7704 - classification_loss: 0.0920 236/500 [=============>................] - ETA: 1:01 - loss: 0.8642 - regression_loss: 0.7713 - classification_loss: 0.0929 237/500 [=============>................] - ETA: 1:01 - loss: 0.8634 - regression_loss: 0.7706 - classification_loss: 0.0928 238/500 [=============>................] - ETA: 1:01 - loss: 0.8620 - regression_loss: 0.7693 - classification_loss: 0.0927 239/500 [=============>................] - ETA: 1:01 - loss: 0.8604 - regression_loss: 0.7679 - classification_loss: 0.0925 240/500 [=============>................] - ETA: 1:00 - loss: 0.8606 - regression_loss: 0.7681 - classification_loss: 0.0925 241/500 [=============>................] - ETA: 1:00 - loss: 0.8599 - regression_loss: 0.7675 - classification_loss: 0.0924 242/500 [=============>................] - ETA: 1:00 - loss: 0.8614 - regression_loss: 0.7688 - classification_loss: 0.0927 243/500 [=============>................] - ETA: 1:00 - loss: 0.8635 - regression_loss: 0.7701 - classification_loss: 0.0934 244/500 [=============>................] - ETA: 1:00 - loss: 0.8658 - regression_loss: 0.7721 - classification_loss: 0.0936 245/500 [=============>................] - ETA: 59s - loss: 0.8663 - regression_loss: 0.7726 - classification_loss: 0.0937  246/500 [=============>................] - ETA: 59s - loss: 0.8691 - regression_loss: 0.7753 - classification_loss: 0.0939 247/500 [=============>................] - ETA: 59s - loss: 0.8702 - regression_loss: 0.7762 - classification_loss: 0.0940 248/500 [=============>................] - ETA: 59s - loss: 0.8714 - regression_loss: 0.7775 - classification_loss: 0.0939 249/500 [=============>................] 
- ETA: 58s - loss: 0.8729 - regression_loss: 0.7790 - classification_loss: 0.0939 250/500 [==============>...............] - ETA: 58s - loss: 0.8757 - regression_loss: 0.7816 - classification_loss: 0.0940 251/500 [==============>...............] - ETA: 58s - loss: 0.8733 - regression_loss: 0.7796 - classification_loss: 0.0937 252/500 [==============>...............] - ETA: 58s - loss: 0.8708 - regression_loss: 0.7774 - classification_loss: 0.0934 253/500 [==============>...............] - ETA: 58s - loss: 0.8690 - regression_loss: 0.7759 - classification_loss: 0.0932 254/500 [==============>...............] - ETA: 57s - loss: 0.8682 - regression_loss: 0.7752 - classification_loss: 0.0930 255/500 [==============>...............] - ETA: 57s - loss: 0.8665 - regression_loss: 0.7738 - classification_loss: 0.0927 256/500 [==============>...............] - ETA: 57s - loss: 0.8695 - regression_loss: 0.7759 - classification_loss: 0.0936 257/500 [==============>...............] - ETA: 57s - loss: 0.8711 - regression_loss: 0.7771 - classification_loss: 0.0940 258/500 [==============>...............] - ETA: 56s - loss: 0.8719 - regression_loss: 0.7778 - classification_loss: 0.0941 259/500 [==============>...............] - ETA: 56s - loss: 0.8719 - regression_loss: 0.7779 - classification_loss: 0.0940 260/500 [==============>...............] - ETA: 56s - loss: 0.8752 - regression_loss: 0.7810 - classification_loss: 0.0942 261/500 [==============>...............] - ETA: 56s - loss: 0.8749 - regression_loss: 0.7808 - classification_loss: 0.0941 262/500 [==============>...............] - ETA: 55s - loss: 0.8738 - regression_loss: 0.7797 - classification_loss: 0.0942 263/500 [==============>...............] - ETA: 55s - loss: 0.8748 - regression_loss: 0.7805 - classification_loss: 0.0943 264/500 [==============>...............] - ETA: 55s - loss: 0.8744 - regression_loss: 0.7803 - classification_loss: 0.0941 265/500 [==============>...............] 
- ETA: 55s - loss: 0.8744 - regression_loss: 0.7798 - classification_loss: 0.0946 266/500 [==============>...............] - ETA: 54s - loss: 0.8772 - regression_loss: 0.7819 - classification_loss: 0.0953 267/500 [===============>..............] - ETA: 54s - loss: 0.8789 - regression_loss: 0.7836 - classification_loss: 0.0953 268/500 [===============>..............] - ETA: 54s - loss: 0.8768 - regression_loss: 0.7817 - classification_loss: 0.0951 269/500 [===============>..............] - ETA: 54s - loss: 0.8762 - regression_loss: 0.7811 - classification_loss: 0.0951 270/500 [===============>..............] - ETA: 54s - loss: 0.8777 - regression_loss: 0.7823 - classification_loss: 0.0954 271/500 [===============>..............] - ETA: 53s - loss: 0.8781 - regression_loss: 0.7826 - classification_loss: 0.0955 272/500 [===============>..............] - ETA: 53s - loss: 0.8774 - regression_loss: 0.7820 - classification_loss: 0.0954 273/500 [===============>..............] - ETA: 53s - loss: 0.8767 - regression_loss: 0.7816 - classification_loss: 0.0951 274/500 [===============>..............] - ETA: 53s - loss: 0.8754 - regression_loss: 0.7804 - classification_loss: 0.0949 275/500 [===============>..............] - ETA: 52s - loss: 0.8760 - regression_loss: 0.7810 - classification_loss: 0.0950 276/500 [===============>..............] - ETA: 52s - loss: 0.8758 - regression_loss: 0.7810 - classification_loss: 0.0948 277/500 [===============>..............] - ETA: 52s - loss: 0.8761 - regression_loss: 0.7812 - classification_loss: 0.0949 278/500 [===============>..............] - ETA: 52s - loss: 0.8755 - regression_loss: 0.7808 - classification_loss: 0.0947 279/500 [===============>..............] - ETA: 51s - loss: 0.8828 - regression_loss: 0.7856 - classification_loss: 0.0972 280/500 [===============>..............] - ETA: 51s - loss: 0.8814 - regression_loss: 0.7845 - classification_loss: 0.0969 281/500 [===============>..............] 
- ETA: 51s - loss: 0.8811 - regression_loss: 0.7843 - classification_loss: 0.0968 282/500 [===============>..............] - ETA: 51s - loss: 0.8797 - regression_loss: 0.7830 - classification_loss: 0.0967 283/500 [===============>..............] - ETA: 50s - loss: 0.8797 - regression_loss: 0.7830 - classification_loss: 0.0967 284/500 [================>.............] - ETA: 50s - loss: 0.8780 - regression_loss: 0.7815 - classification_loss: 0.0965 285/500 [================>.............] - ETA: 50s - loss: 0.8766 - regression_loss: 0.7804 - classification_loss: 0.0962 286/500 [================>.............] - ETA: 50s - loss: 0.8758 - regression_loss: 0.7798 - classification_loss: 0.0960 287/500 [================>.............] - ETA: 49s - loss: 0.8761 - regression_loss: 0.7802 - classification_loss: 0.0959 288/500 [================>.............] - ETA: 49s - loss: 0.8761 - regression_loss: 0.7804 - classification_loss: 0.0957 289/500 [================>.............] - ETA: 49s - loss: 0.8764 - regression_loss: 0.7807 - classification_loss: 0.0957 290/500 [================>.............] - ETA: 49s - loss: 0.8760 - regression_loss: 0.7804 - classification_loss: 0.0955 291/500 [================>.............] - ETA: 49s - loss: 0.8761 - regression_loss: 0.7806 - classification_loss: 0.0955 292/500 [================>.............] - ETA: 48s - loss: 0.8748 - regression_loss: 0.7796 - classification_loss: 0.0953 293/500 [================>.............] - ETA: 48s - loss: 0.8740 - regression_loss: 0.7789 - classification_loss: 0.0951 294/500 [================>.............] - ETA: 48s - loss: 0.8775 - regression_loss: 0.7820 - classification_loss: 0.0955 295/500 [================>.............] - ETA: 48s - loss: 0.8772 - regression_loss: 0.7818 - classification_loss: 0.0954 296/500 [================>.............] - ETA: 47s - loss: 0.8789 - regression_loss: 0.7830 - classification_loss: 0.0959 297/500 [================>.............] 
- ETA: 47s - loss: 0.8794 - regression_loss: 0.7835 - classification_loss: 0.0959 298/500 [================>.............] - ETA: 47s - loss: 0.8781 - regression_loss: 0.7824 - classification_loss: 0.0957 299/500 [================>.............] - ETA: 47s - loss: 0.8778 - regression_loss: 0.7824 - classification_loss: 0.0954 300/500 [=================>............] - ETA: 46s - loss: 0.8765 - regression_loss: 0.7813 - classification_loss: 0.0952 301/500 [=================>............] - ETA: 46s - loss: 0.8775 - regression_loss: 0.7822 - classification_loss: 0.0953 302/500 [=================>............] - ETA: 46s - loss: 0.8760 - regression_loss: 0.7810 - classification_loss: 0.0950 303/500 [=================>............] - ETA: 46s - loss: 0.8814 - regression_loss: 0.7858 - classification_loss: 0.0956 304/500 [=================>............] - ETA: 45s - loss: 0.8793 - regression_loss: 0.7839 - classification_loss: 0.0954 305/500 [=================>............] - ETA: 45s - loss: 0.8795 - regression_loss: 0.7843 - classification_loss: 0.0952 306/500 [=================>............] - ETA: 45s - loss: 0.8786 - regression_loss: 0.7836 - classification_loss: 0.0950 307/500 [=================>............] - ETA: 45s - loss: 0.8787 - regression_loss: 0.7835 - classification_loss: 0.0953 308/500 [=================>............] - ETA: 45s - loss: 0.8778 - regression_loss: 0.7828 - classification_loss: 0.0951 309/500 [=================>............] - ETA: 44s - loss: 0.8774 - regression_loss: 0.7825 - classification_loss: 0.0949 310/500 [=================>............] - ETA: 44s - loss: 0.8772 - regression_loss: 0.7825 - classification_loss: 0.0947 311/500 [=================>............] - ETA: 44s - loss: 0.8785 - regression_loss: 0.7837 - classification_loss: 0.0948 312/500 [=================>............] - ETA: 44s - loss: 0.8776 - regression_loss: 0.7829 - classification_loss: 0.0947 313/500 [=================>............] 
- ETA: 43s - loss: 0.8757 - regression_loss: 0.7812 - classification_loss: 0.0945 314/500 [=================>............] - ETA: 43s - loss: 0.8746 - regression_loss: 0.7803 - classification_loss: 0.0943 315/500 [=================>............] - ETA: 43s - loss: 0.8746 - regression_loss: 0.7805 - classification_loss: 0.0941 316/500 [=================>............] - ETA: 43s - loss: 0.8737 - regression_loss: 0.7797 - classification_loss: 0.0939 317/500 [==================>...........] - ETA: 42s - loss: 0.8734 - regression_loss: 0.7795 - classification_loss: 0.0939 318/500 [==================>...........] - ETA: 42s - loss: 0.8732 - regression_loss: 0.7792 - classification_loss: 0.0939 319/500 [==================>...........] - ETA: 42s - loss: 0.8725 - regression_loss: 0.7787 - classification_loss: 0.0938 320/500 [==================>...........] - ETA: 42s - loss: 0.8718 - regression_loss: 0.7782 - classification_loss: 0.0937 321/500 [==================>...........] - ETA: 42s - loss: 0.8715 - regression_loss: 0.7779 - classification_loss: 0.0936 322/500 [==================>...........] - ETA: 41s - loss: 0.8723 - regression_loss: 0.7786 - classification_loss: 0.0937 323/500 [==================>...........] - ETA: 41s - loss: 0.8734 - regression_loss: 0.7795 - classification_loss: 0.0939 324/500 [==================>...........] - ETA: 41s - loss: 0.8728 - regression_loss: 0.7789 - classification_loss: 0.0939 325/500 [==================>...........] - ETA: 41s - loss: 0.8712 - regression_loss: 0.7776 - classification_loss: 0.0936 326/500 [==================>...........] - ETA: 40s - loss: 0.8725 - regression_loss: 0.7789 - classification_loss: 0.0937 327/500 [==================>...........] - ETA: 40s - loss: 0.8727 - regression_loss: 0.7792 - classification_loss: 0.0935 328/500 [==================>...........] - ETA: 40s - loss: 0.8719 - regression_loss: 0.7786 - classification_loss: 0.0933 329/500 [==================>...........] 
[per-batch progress-bar frames for steps 330–499 elided; running loss moved from ~0.876 to ~0.866 over these steps]
500/500 [==============================] - 117s 235ms/step - loss: 0.8650 - regression_loss: 0.7740 - classification_loss: 0.0909
326 instances of class plum with average precision: 0.8501
mAP: 0.8501
Epoch 00042: saving model to ./training/snapshots/resnet50_pascal_42.h5