CHECK: Is CUDA the right version (10)?
Creating model, this may take a second...
tracking anchors
tracking anchors
tracking anchors
tracking anchors
tracking anchors
Model: "retinanet"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
input_1 (InputLayer)            (None, None, None, 3 0
__________________________________________________________________________________________________
padding_conv1 (ZeroPadding2D)   (None, None, None, 3 0           input_1[0][0]
__________________________________________________________________________________________________
conv1 (Conv2D)                  (None, None, None, 6 9408        padding_conv1[0][0]
__________________________________________________________________________________________________
bn_conv1 (BatchNormalization)   (None, None, None, 6 256         conv1[0][0]
__________________________________________________________________________________________________
conv1_relu (Activation)         (None, None, None, 6 0           bn_conv1[0][0]
__________________________________________________________________________________________________
pool1 (MaxPooling2D)            (None, None, None, 6 0           conv1_relu[0][0]
__________________________________________________________________________________________________
res2a_branch2a (Conv2D)         (None, None, None, 6 4096        pool1[0][0]
__________________________________________________________________________________________________
bn2a_branch2a (BatchNormalizati (None, None, None, 6 256         res2a_branch2a[0][0]
__________________________________________________________________________________________________
res2a_branch2a_relu (Activation (None, None, None, 6 0           bn2a_branch2a[0][0]
__________________________________________________________________________________________________
padding2a_branch2b (ZeroPadding (None, None, None, 6 0           res2a_branch2a_relu[0][0]
__________________________________________________________________________________________________
res2a_branch2b (Conv2D)         (None, None, None, 6 36864       padding2a_branch2b[0][0]
__________________________________________________________________________________________________
bn2a_branch2b (BatchNormalizati (None, None, None, 6 256         res2a_branch2b[0][0]
__________________________________________________________________________________________________
res2a_branch2b_relu (Activation (None, None, None, 6 0           bn2a_branch2b[0][0]
__________________________________________________________________________________________________
res2a_branch2c (Conv2D)         (None, None, None, 2 16384       res2a_branch2b_relu[0][0]
__________________________________________________________________________________________________
res2a_branch1 (Conv2D)          (None, None, None, 2 16384       pool1[0][0]
__________________________________________________________________________________________________
bn2a_branch2c (BatchNormalizati (None, None, None, 2 1024        res2a_branch2c[0][0]
__________________________________________________________________________________________________
bn2a_branch1 (BatchNormalizatio (None, None, None, 2 1024        res2a_branch1[0][0]
__________________________________________________________________________________________________
res2a (Add)                     (None, None, None, 2 0           bn2a_branch2c[0][0]
                                                                 bn2a_branch1[0][0]
__________________________________________________________________________________________________
res2a_relu (Activation)         (None, None, None, 2 0           res2a[0][0]
__________________________________________________________________________________________________
res2b_branch2a (Conv2D)         (None, None, None, 6 16384       res2a_relu[0][0]
__________________________________________________________________________________________________
bn2b_branch2a (BatchNormalizati (None, None, None, 6 256         res2b_branch2a[0][0]
__________________________________________________________________________________________________
res2b_branch2a_relu (Activation (None, None, None, 6 0           bn2b_branch2a[0][0]
__________________________________________________________________________________________________
padding2b_branch2b (ZeroPadding (None, None, None, 6 0           res2b_branch2a_relu[0][0]
__________________________________________________________________________________________________
res2b_branch2b (Conv2D)         (None, None, None, 6 36864       padding2b_branch2b[0][0]
__________________________________________________________________________________________________
bn2b_branch2b (BatchNormalizati (None, None, None, 6 256         res2b_branch2b[0][0]
__________________________________________________________________________________________________
res2b_branch2b_relu (Activation (None, None, None, 6 0           bn2b_branch2b[0][0]
__________________________________________________________________________________________________
res2b_branch2c (Conv2D)         (None, None, None, 2 16384       res2b_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn2b_branch2c (BatchNormalizati (None, None, None, 2 1024        res2b_branch2c[0][0]
__________________________________________________________________________________________________
res2b (Add)                     (None, None, None, 2 0           bn2b_branch2c[0][0]
                                                                 res2a_relu[0][0]
__________________________________________________________________________________________________
res2b_relu (Activation)         (None, None, None, 2 0           res2b[0][0]
__________________________________________________________________________________________________
res2c_branch2a (Conv2D)         (None, None, None, 6 16384       res2b_relu[0][0]
__________________________________________________________________________________________________
bn2c_branch2a (BatchNormalizati (None, None, None, 6 256         res2c_branch2a[0][0]
__________________________________________________________________________________________________
res2c_branch2a_relu (Activation (None, None, None, 6 0           bn2c_branch2a[0][0]
__________________________________________________________________________________________________
padding2c_branch2b (ZeroPadding (None, None, None, 6 0           res2c_branch2a_relu[0][0]
__________________________________________________________________________________________________
res2c_branch2b (Conv2D)         (None, None, None, 6 36864       padding2c_branch2b[0][0]
__________________________________________________________________________________________________
bn2c_branch2b (BatchNormalizati (None, None, None, 6 256         res2c_branch2b[0][0]
__________________________________________________________________________________________________
res2c_branch2b_relu (Activation (None, None, None, 6 0           bn2c_branch2b[0][0]
__________________________________________________________________________________________________
res2c_branch2c (Conv2D)         (None, None, None, 2 16384       res2c_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn2c_branch2c (BatchNormalizati (None, None, None, 2 1024        res2c_branch2c[0][0]
__________________________________________________________________________________________________
res2c (Add)                     (None, None, None, 2 0           bn2c_branch2c[0][0]
                                                                 res2b_relu[0][0]
__________________________________________________________________________________________________
res2c_relu (Activation)         (None, None, None, 2 0           res2c[0][0]
__________________________________________________________________________________________________
res3a_branch2a (Conv2D)         (None, None, None, 1 32768       res2c_relu[0][0]
__________________________________________________________________________________________________
bn3a_branch2a (BatchNormalizati (None, None, None, 1 512         res3a_branch2a[0][0]
__________________________________________________________________________________________________
res3a_branch2a_relu (Activation (None, None, None, 1 0           bn3a_branch2a[0][0]
__________________________________________________________________________________________________
padding3a_branch2b (ZeroPadding (None, None, None, 1 0           res3a_branch2a_relu[0][0]
__________________________________________________________________________________________________
res3a_branch2b (Conv2D)         (None, None, None, 1 147456      padding3a_branch2b[0][0]
__________________________________________________________________________________________________
bn3a_branch2b (BatchNormalizati (None, None, None, 1 512         res3a_branch2b[0][0]
__________________________________________________________________________________________________
res3a_branch2b_relu (Activation (None, None, None, 1 0           bn3a_branch2b[0][0]
__________________________________________________________________________________________________
res3a_branch2c (Conv2D)         (None, None, None, 5 65536       res3a_branch2b_relu[0][0]
__________________________________________________________________________________________________
res3a_branch1 (Conv2D)          (None, None, None, 5 131072      res2c_relu[0][0]
__________________________________________________________________________________________________
bn3a_branch2c (BatchNormalizati (None, None, None, 5 2048        res3a_branch2c[0][0]
__________________________________________________________________________________________________
bn3a_branch1 (BatchNormalizatio (None, None, None, 5 2048        res3a_branch1[0][0]
__________________________________________________________________________________________________
res3a (Add)                     (None, None, None, 5 0           bn3a_branch2c[0][0]
                                                                 bn3a_branch1[0][0]
__________________________________________________________________________________________________
res3a_relu (Activation)         (None, None, None, 5 0           res3a[0][0]
__________________________________________________________________________________________________
res3b_branch2a (Conv2D)         (None, None, None, 1 65536       res3a_relu[0][0]
__________________________________________________________________________________________________
bn3b_branch2a (BatchNormalizati (None, None, None, 1 512         res3b_branch2a[0][0]
__________________________________________________________________________________________________
res3b_branch2a_relu (Activation (None, None, None, 1 0           bn3b_branch2a[0][0]
__________________________________________________________________________________________________
padding3b_branch2b (ZeroPadding (None, None, None, 1 0           res3b_branch2a_relu[0][0]
__________________________________________________________________________________________________
res3b_branch2b (Conv2D)         (None, None, None, 1 147456      padding3b_branch2b[0][0]
__________________________________________________________________________________________________
bn3b_branch2b (BatchNormalizati (None, None, None, 1 512         res3b_branch2b[0][0]
__________________________________________________________________________________________________
res3b_branch2b_relu (Activation (None, None, None, 1 0           bn3b_branch2b[0][0]
__________________________________________________________________________________________________
res3b_branch2c (Conv2D)         (None, None, None, 5 65536       res3b_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn3b_branch2c (BatchNormalizati (None, None, None, 5 2048        res3b_branch2c[0][0]
__________________________________________________________________________________________________
res3b (Add)                     (None, None, None, 5 0           bn3b_branch2c[0][0]
                                                                 res3a_relu[0][0]
__________________________________________________________________________________________________
res3b_relu (Activation)         (None, None, None, 5 0           res3b[0][0]
__________________________________________________________________________________________________
res3c_branch2a (Conv2D)         (None, None, None, 1 65536       res3b_relu[0][0]
__________________________________________________________________________________________________
bn3c_branch2a (BatchNormalizati (None, None, None, 1 512         res3c_branch2a[0][0]
__________________________________________________________________________________________________
res3c_branch2a_relu (Activation (None, None, None, 1 0           bn3c_branch2a[0][0]
__________________________________________________________________________________________________
padding3c_branch2b (ZeroPadding (None, None, None, 1 0           res3c_branch2a_relu[0][0]
__________________________________________________________________________________________________
res3c_branch2b (Conv2D)         (None, None, None, 1 147456      padding3c_branch2b[0][0]
__________________________________________________________________________________________________
bn3c_branch2b (BatchNormalizati (None, None, None, 1 512         res3c_branch2b[0][0]
__________________________________________________________________________________________________
res3c_branch2b_relu (Activation (None, None, None, 1 0           bn3c_branch2b[0][0]
__________________________________________________________________________________________________
res3c_branch2c (Conv2D)         (None, None, None, 5 65536       res3c_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn3c_branch2c (BatchNormalizati (None, None, None, 5 2048        res3c_branch2c[0][0]
__________________________________________________________________________________________________
res3c (Add)                     (None, None, None, 5 0           bn3c_branch2c[0][0]
                                                                 res3b_relu[0][0]
__________________________________________________________________________________________________
res3c_relu (Activation)         (None, None, None, 5 0           res3c[0][0]
__________________________________________________________________________________________________
res3d_branch2a (Conv2D)         (None, None, None, 1 65536       res3c_relu[0][0]
__________________________________________________________________________________________________
bn3d_branch2a (BatchNormalizati (None, None, None, 1 512         res3d_branch2a[0][0]
__________________________________________________________________________________________________
res3d_branch2a_relu (Activation (None, None, None, 1 0           bn3d_branch2a[0][0]
__________________________________________________________________________________________________
padding3d_branch2b (ZeroPadding (None, None, None, 1 0           res3d_branch2a_relu[0][0]
__________________________________________________________________________________________________
res3d_branch2b (Conv2D)         (None, None, None, 1 147456      padding3d_branch2b[0][0]
__________________________________________________________________________________________________
bn3d_branch2b (BatchNormalizati (None, None, None, 1 512         res3d_branch2b[0][0]
__________________________________________________________________________________________________
res3d_branch2b_relu (Activation (None, None, None, 1 0           bn3d_branch2b[0][0]
__________________________________________________________________________________________________
res3d_branch2c (Conv2D)         (None, None, None, 5 65536       res3d_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn3d_branch2c (BatchNormalizati (None, None, None, 5 2048        res3d_branch2c[0][0]
__________________________________________________________________________________________________
res3d (Add)                     (None, None, None, 5 0           bn3d_branch2c[0][0]
                                                                 res3c_relu[0][0]
__________________________________________________________________________________________________
res3d_relu (Activation)         (None, None, None, 5 0           res3d[0][0]
__________________________________________________________________________________________________
res4a_branch2a (Conv2D)         (None, None, None, 2 131072      res3d_relu[0][0]
__________________________________________________________________________________________________
bn4a_branch2a (BatchNormalizati (None, None, None, 2 1024        res4a_branch2a[0][0]
__________________________________________________________________________________________________
res4a_branch2a_relu (Activation (None, None, None, 2 0           bn4a_branch2a[0][0]
__________________________________________________________________________________________________
padding4a_branch2b (ZeroPadding (None, None, None, 2 0           res4a_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4a_branch2b (Conv2D)         (None, None, None, 2 589824      padding4a_branch2b[0][0]
__________________________________________________________________________________________________
bn4a_branch2b (BatchNormalizati (None, None, None, 2 1024        res4a_branch2b[0][0]
__________________________________________________________________________________________________
res4a_branch2b_relu (Activation (None, None, None, 2 0           bn4a_branch2b[0][0]
__________________________________________________________________________________________________
res4a_branch2c (Conv2D)         (None, None, None, 1 262144      res4a_branch2b_relu[0][0]
__________________________________________________________________________________________________
res4a_branch1 (Conv2D)          (None, None, None, 1 524288      res3d_relu[0][0]
__________________________________________________________________________________________________
bn4a_branch2c (BatchNormalizati (None, None, None, 1 4096        res4a_branch2c[0][0]
__________________________________________________________________________________________________
bn4a_branch1 (BatchNormalizatio (None, None, None, 1 4096        res4a_branch1[0][0]
__________________________________________________________________________________________________
res4a (Add)                     (None, None, None, 1 0           bn4a_branch2c[0][0]
                                                                 bn4a_branch1[0][0]
__________________________________________________________________________________________________
res4a_relu (Activation)         (None, None, None, 1 0           res4a[0][0]
__________________________________________________________________________________________________
res4b_branch2a (Conv2D)         (None, None, None, 2 262144      res4a_relu[0][0]
__________________________________________________________________________________________________
bn4b_branch2a (BatchNormalizati (None, None, None, 2 1024        res4b_branch2a[0][0]
__________________________________________________________________________________________________
res4b_branch2a_relu (Activation (None, None, None, 2 0           bn4b_branch2a[0][0]
__________________________________________________________________________________________________
padding4b_branch2b (ZeroPadding (None, None, None, 2 0           res4b_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4b_branch2b (Conv2D)         (None, None, None, 2 589824      padding4b_branch2b[0][0]
__________________________________________________________________________________________________
bn4b_branch2b (BatchNormalizati (None, None, None, 2 1024        res4b_branch2b[0][0]
__________________________________________________________________________________________________
res4b_branch2b_relu (Activation (None, None, None, 2 0           bn4b_branch2b[0][0]
__________________________________________________________________________________________________
res4b_branch2c (Conv2D)         (None, None, None, 1 262144      res4b_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4b_branch2c (BatchNormalizati (None, None, None, 1 4096        res4b_branch2c[0][0]
__________________________________________________________________________________________________
res4b (Add)                     (None, None, None, 1 0           bn4b_branch2c[0][0]
                                                                 res4a_relu[0][0]
__________________________________________________________________________________________________
res4b_relu (Activation)         (None, None, None, 1 0           res4b[0][0]
__________________________________________________________________________________________________
res4c_branch2a (Conv2D)         (None, None, None, 2 262144      res4b_relu[0][0]
__________________________________________________________________________________________________
bn4c_branch2a (BatchNormalizati (None, None, None, 2 1024        res4c_branch2a[0][0]
__________________________________________________________________________________________________
res4c_branch2a_relu (Activation (None, None, None, 2 0           bn4c_branch2a[0][0]
__________________________________________________________________________________________________
padding4c_branch2b (ZeroPadding (None, None, None, 2 0           res4c_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4c_branch2b (Conv2D)         (None, None, None, 2 589824      padding4c_branch2b[0][0]
__________________________________________________________________________________________________
bn4c_branch2b (BatchNormalizati (None, None, None, 2 1024        res4c_branch2b[0][0]
__________________________________________________________________________________________________
res4c_branch2b_relu (Activation (None, None, None, 2 0           bn4c_branch2b[0][0]
__________________________________________________________________________________________________
res4c_branch2c (Conv2D)         (None, None, None, 1 262144      res4c_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4c_branch2c (BatchNormalizati (None, None, None, 1 4096        res4c_branch2c[0][0]
__________________________________________________________________________________________________
res4c (Add)                     (None, None, None, 1 0           bn4c_branch2c[0][0]
                                                                 res4b_relu[0][0]
__________________________________________________________________________________________________
res4c_relu (Activation)         (None, None, None, 1 0           res4c[0][0]
__________________________________________________________________________________________________
res4d_branch2a (Conv2D)         (None, None, None, 2 262144      res4c_relu[0][0]
__________________________________________________________________________________________________
bn4d_branch2a (BatchNormalizati (None, None, None, 2 1024        res4d_branch2a[0][0]
__________________________________________________________________________________________________
res4d_branch2a_relu (Activation (None, None, None, 2 0           bn4d_branch2a[0][0]
__________________________________________________________________________________________________
padding4d_branch2b (ZeroPadding (None, None, None, 2 0           res4d_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4d_branch2b (Conv2D)         (None, None, None, 2 589824      padding4d_branch2b[0][0]
__________________________________________________________________________________________________
bn4d_branch2b (BatchNormalizati (None, None, None, 2 1024        res4d_branch2b[0][0]
__________________________________________________________________________________________________
res4d_branch2b_relu (Activation (None, None, None, 2 0           bn4d_branch2b[0][0]
__________________________________________________________________________________________________
res4d_branch2c (Conv2D)         (None, None, None, 1 262144      res4d_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4d_branch2c (BatchNormalizati (None, None, None, 1 4096        res4d_branch2c[0][0]
__________________________________________________________________________________________________
res4d (Add)                     (None, None, None, 1 0           bn4d_branch2c[0][0]
                                                                 res4c_relu[0][0]
__________________________________________________________________________________________________
res4d_relu (Activation)         (None, None, None, 1 0           res4d[0][0]
__________________________________________________________________________________________________
res4e_branch2a (Conv2D)         (None, None, None, 2 262144      res4d_relu[0][0]
__________________________________________________________________________________________________
bn4e_branch2a (BatchNormalizati (None, None, None, 2 1024        res4e_branch2a[0][0]
__________________________________________________________________________________________________
res4e_branch2a_relu (Activation (None, None, None, 2 0           bn4e_branch2a[0][0]
__________________________________________________________________________________________________
padding4e_branch2b (ZeroPadding (None, None, None, 2 0           res4e_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4e_branch2b (Conv2D)         (None, None, None, 2 589824      padding4e_branch2b[0][0]
__________________________________________________________________________________________________
bn4e_branch2b (BatchNormalizati (None, None, None, 2 1024        res4e_branch2b[0][0]
__________________________________________________________________________________________________
res4e_branch2b_relu (Activation (None, None, None, 2 0           bn4e_branch2b[0][0]
__________________________________________________________________________________________________
res4e_branch2c (Conv2D)         (None, None, None, 1 262144      res4e_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4e_branch2c (BatchNormalizati (None, None, None, 1 4096        res4e_branch2c[0][0]
__________________________________________________________________________________________________
res4e (Add)                     (None, None, None, 1 0           bn4e_branch2c[0][0]
                                                                 res4d_relu[0][0]
__________________________________________________________________________________________________
res4e_relu (Activation)         (None, None, None, 1 0           res4e[0][0]
__________________________________________________________________________________________________
res4f_branch2a (Conv2D)         (None, None, None, 2 262144      res4e_relu[0][0]
__________________________________________________________________________________________________
bn4f_branch2a (BatchNormalizati (None, None, None, 2 1024        res4f_branch2a[0][0]
__________________________________________________________________________________________________
res4f_branch2a_relu (Activation (None, None, None, 2 0           bn4f_branch2a[0][0]
__________________________________________________________________________________________________
padding4f_branch2b (ZeroPadding (None, None, None, 2 0           res4f_branch2a_relu[0][0]
__________________________________________________________________________________________________
res4f_branch2b (Conv2D)         (None, None, None, 2 589824      padding4f_branch2b[0][0]
__________________________________________________________________________________________________
bn4f_branch2b (BatchNormalizati (None, None, None, 2 1024        res4f_branch2b[0][0]
__________________________________________________________________________________________________
res4f_branch2b_relu (Activation (None, None, None, 2 0           bn4f_branch2b[0][0]
__________________________________________________________________________________________________
res4f_branch2c (Conv2D)         (None, None, None, 1 262144      res4f_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn4f_branch2c (BatchNormalizati (None, None, None, 1 4096        res4f_branch2c[0][0]
__________________________________________________________________________________________________
res4f (Add)                     (None, None, None, 1 0           bn4f_branch2c[0][0]
                                                                 res4e_relu[0][0]
__________________________________________________________________________________________________
res4f_relu (Activation)         (None, None, None, 1 0           res4f[0][0]
__________________________________________________________________________________________________
res5a_branch2a (Conv2D)         (None, None, None, 5 524288      res4f_relu[0][0]
__________________________________________________________________________________________________
bn5a_branch2a (BatchNormalizati (None, None, None, 5 2048        res5a_branch2a[0][0]
__________________________________________________________________________________________________
res5a_branch2a_relu (Activation (None, None, None, 5 0           bn5a_branch2a[0][0]
__________________________________________________________________________________________________
padding5a_branch2b (ZeroPadding (None, None, None, 5 0           res5a_branch2a_relu[0][0]
__________________________________________________________________________________________________
res5a_branch2b (Conv2D)         (None, None, None, 5 2359296     padding5a_branch2b[0][0]
__________________________________________________________________________________________________
bn5a_branch2b (BatchNormalizati (None, None, None, 5 2048        res5a_branch2b[0][0]
__________________________________________________________________________________________________
res5a_branch2b_relu (Activation (None, None, None, 5 0           bn5a_branch2b[0][0]
__________________________________________________________________________________________________
res5a_branch2c (Conv2D)         (None, None, None, 2 1048576     res5a_branch2b_relu[0][0]
__________________________________________________________________________________________________
res5a_branch1 (Conv2D)          (None, None, None, 2 2097152     res4f_relu[0][0]
__________________________________________________________________________________________________
bn5a_branch2c (BatchNormalizati (None, None, None, 2 8192        res5a_branch2c[0][0]
__________________________________________________________________________________________________
bn5a_branch1 (BatchNormalizatio (None, None, None, 2 8192        res5a_branch1[0][0]
__________________________________________________________________________________________________
res5a (Add)                     (None, None, None, 2 0           bn5a_branch2c[0][0]
                                                                 bn5a_branch1[0][0]
__________________________________________________________________________________________________
res5a_relu (Activation)         (None, None, None, 2 0           res5a[0][0]
__________________________________________________________________________________________________
res5b_branch2a (Conv2D)         (None, None, None, 5 1048576     res5a_relu[0][0]
__________________________________________________________________________________________________
bn5b_branch2a (BatchNormalizati (None, None, None, 5 2048        res5b_branch2a[0][0]
__________________________________________________________________________________________________
res5b_branch2a_relu (Activation (None, None, None, 5 0           bn5b_branch2a[0][0]
__________________________________________________________________________________________________
padding5b_branch2b (ZeroPadding (None, None, None, 5 0           res5b_branch2a_relu[0][0]
__________________________________________________________________________________________________
res5b_branch2b (Conv2D)         (None, None, None, 5 2359296     padding5b_branch2b[0][0]
__________________________________________________________________________________________________
bn5b_branch2b (BatchNormalizati (None, None, None, 5 2048        res5b_branch2b[0][0]
__________________________________________________________________________________________________
res5b_branch2b_relu (Activation (None, None, None, 5 0           bn5b_branch2b[0][0]
__________________________________________________________________________________________________
res5b_branch2c (Conv2D)         (None, None, None, 2 1048576     res5b_branch2b_relu[0][0]
__________________________________________________________________________________________________
bn5b_branch2c (BatchNormalizati (None, None, None, 2 8192        res5b_branch2c[0][0]
__________________________________________________________________________________________________
res5b (Add)                     (None, None, None, 2 0           bn5b_branch2c[0][0]
                                                                 res5a_relu[0][0]
__________________________________________________________________________________________________
res5b_relu (Activation)         (None, None, None, 2 0           res5b[0][0]
__________________________________________________________________________________________________
res5c_branch2a (Conv2D)         (None, None, None, 5 1048576     res5b_relu[0][0]
__________________________________________________________________________________________________
bn5c_branch2a (BatchNormalizati (None, None, None, 5 2048        res5c_branch2a[0][0]
__________________________________________________________________________________________________
res5c_branch2a_relu (Activation (None, None, None, 5 0           bn5c_branch2a[0][0]
__________________________________________________________________________________________________
padding5c_branch2b (ZeroPadding (None, None, None, 5 0           res5c_branch2a_relu[0][0]
__________________________________________________________________________________________________
res5c_branch2b (Conv2D)         (None, None, None, 5 2359296     padding5c_branch2b[0][0]
__________________________________________________________________________________________________
bn5c_branch2b (BatchNormalizati (None, None, None, 5 2048        res5c_branch2b[0][0]
__________________________________________________________________________________________________
res5c_branch2b_relu (Activation (None, None, None, 5 0           bn5c_branch2b[0][0]
__________________________________________________________________________________________________
res5c_branch2c (Conv2D)         (None, None, None, 2 1048576     res5c_branch2b_relu[0][0]
__________________________________________________________________________________________________ bn5c_branch2c (BatchNormalizati (None, None, None, 2 8192 res5c_branch2c[0][0] __________________________________________________________________________________________________ res5c (Add) (None, None, None, 2 0 bn5c_branch2c[0][0] res5b_relu[0][0] __________________________________________________________________________________________________ res5c_relu (Activation) (None, None, None, 2 0 res5c[0][0] __________________________________________________________________________________________________ C5_reduced (Conv2D) (None, None, None, 2 524544 res5c_relu[0][0] __________________________________________________________________________________________________ P5_upsampled (UpsampleLike) (None, None, None, 2 0 C5_reduced[0][0] res4f_relu[0][0] __________________________________________________________________________________________________ C4_reduced (Conv2D) (None, None, None, 2 262400 res4f_relu[0][0] __________________________________________________________________________________________________ P4_merged (Add) (None, None, None, 2 0 P5_upsampled[0][0] C4_reduced[0][0] __________________________________________________________________________________________________ P4_upsampled (UpsampleLike) (None, None, None, 2 0 P4_merged[0][0] res3d_relu[0][0] __________________________________________________________________________________________________ C3_reduced (Conv2D) (None, None, None, 2 131328 res3d_relu[0][0] __________________________________________________________________________________________________ P6 (Conv2D) (None, None, None, 2 4718848 res5c_relu[0][0] __________________________________________________________________________________________________ P3_merged (Add) (None, None, None, 2 0 P4_upsampled[0][0] C3_reduced[0][0] __________________________________________________________________________________________________ C6_relu (Activation) (None, 
None, None, 2 0 P6[0][0] __________________________________________________________________________________________________ P3 (Conv2D) (None, None, None, 2 590080 P3_merged[0][0] __________________________________________________________________________________________________ P4 (Conv2D) (None, None, None, 2 590080 P4_merged[0][0] __________________________________________________________________________________________________ P5 (Conv2D) (None, None, None, 2 590080 C5_reduced[0][0] __________________________________________________________________________________________________ P7 (Conv2D) (None, None, None, 2 590080 C6_relu[0][0] __________________________________________________________________________________________________ regression_submodel (Model) (None, None, 4) 2443300 P3[0][0] P4[0][0] P5[0][0] P6[0][0] P7[0][0] __________________________________________________________________________________________________ classification_submodel (Model) (None, None, 1) 2381065 P3[0][0] P4[0][0] P5[0][0] P6[0][0] P7[0][0] __________________________________________________________________________________________________ regression (Concatenate) (None, None, 4) 0 regression_submodel[1][0] regression_submodel[2][0] regression_submodel[3][0] regression_submodel[4][0] regression_submodel[5][0] __________________________________________________________________________________________________ classification (Concatenate) (None, None, 1) 0 classification_submodel[1][0] classification_submodel[2][0] classification_submodel[3][0] classification_submodel[4][0] classification_submodel[5][0] ================================================================================================== Total params: 36,382,957 Trainable params: 36,276,717 Non-trainable params: 106,240 __________________________________________________________________________________________________ None Epoch 1/150 1/500 [..............................] 
- ETA: 44:56 - loss: 1725.4568 - regression_loss: 143.5611 - classification_loss: 1581.8958
  2/500 [..............................] - ETA: 23:29 - loss: 11933.3055 - regression_loss: 135.0789 - classification_loss: 11798.2266
 10/500 [..............................] - ETA: 6:15 - loss: 6162.0256 - regression_loss: 136.4854 - classification_loss: 6025.5400
 50/500 [==>...........................] - ETA: 2:38 - loss: 2202.0732 - regression_loss: 68.4973 - classification_loss: 2133.5757
100/500 [=====>........................] - ETA: 1:58 - loss: 1107.9402 - regression_loss: 39.2666 - classification_loss: 1068.6735
150/500 [========>.....................] - ETA: 1:34 - loss: 741.1720 - regression_loss: 27.7576 - classification_loss: 713.4142
200/500 [===========>..................] - ETA: 1:17 - loss: 557.1488 - regression_loss: 21.6381 - classification_loss: 535.5105
250/500 [==============>...............] - ETA: 1:03 - loss: 446.5428 - regression_loss: 17.8983 - classification_loss: 428.6444
255/500 [==============>...............] - ETA: 1:02 - loss: 437.8586 - regression_loss: 17.6008 - classification_loss: 420.2577
256/500 [==============>...............]
- ETA: 1:01 - loss: 436.1614 - regression_loss: 17.5419 - classification_loss: 418.6194 257/500 [==============>...............] - ETA: 1:01 - loss: 434.4777 - regression_loss: 17.4838 - classification_loss: 416.9938 258/500 [==============>...............] - ETA: 1:01 - loss: 432.8076 - regression_loss: 17.4267 - classification_loss: 415.3808 259/500 [==============>...............] - ETA: 1:01 - loss: 431.1503 - regression_loss: 17.3699 - classification_loss: 413.7802 260/500 [==============>...............] - ETA: 1:00 - loss: 429.5051 - regression_loss: 17.3129 - classification_loss: 412.1920 261/500 [==============>...............] - ETA: 1:00 - loss: 427.8750 - regression_loss: 17.2581 - classification_loss: 410.6168 262/500 [==============>...............] - ETA: 1:00 - loss: 426.2543 - regression_loss: 17.2018 - classification_loss: 409.0524 263/500 [==============>...............] - ETA: 59s - loss: 424.6497 - regression_loss: 17.1479 - classification_loss: 407.5017 264/500 [==============>...............] - ETA: 59s - loss: 423.0545 - regression_loss: 17.0928 - classification_loss: 405.9615 265/500 [==============>...............] - ETA: 59s - loss: 421.4708 - regression_loss: 17.0381 - classification_loss: 404.4326 266/500 [==============>...............] - ETA: 59s - loss: 419.9010 - regression_loss: 16.9839 - classification_loss: 402.9170 267/500 [===============>..............] - ETA: 58s - loss: 418.3416 - regression_loss: 16.9303 - classification_loss: 401.4113 268/500 [===============>..............] - ETA: 58s - loss: 416.8129 - regression_loss: 16.8830 - classification_loss: 399.9297 269/500 [===============>..............] - ETA: 58s - loss: 415.2766 - regression_loss: 16.8307 - classification_loss: 398.4458 270/500 [===============>..............] - ETA: 58s - loss: 413.7526 - regression_loss: 16.7776 - classification_loss: 396.9750 271/500 [===============>..............] 
- ETA: 57s - loss: 412.2386 - regression_loss: 16.7254 - classification_loss: 395.5131 272/500 [===============>..............] - ETA: 57s - loss: 410.7367 - regression_loss: 16.6745 - classification_loss: 394.0622 273/500 [===============>..............] - ETA: 57s - loss: 409.2457 - regression_loss: 16.6233 - classification_loss: 392.6224 274/500 [===============>..............] - ETA: 57s - loss: 407.7656 - regression_loss: 16.5725 - classification_loss: 391.1930 275/500 [===============>..............] - ETA: 56s - loss: 406.2972 - regression_loss: 16.5234 - classification_loss: 389.7737 276/500 [===============>..............] - ETA: 56s - loss: 404.8379 - regression_loss: 16.4731 - classification_loss: 388.3647 277/500 [===============>..............] - ETA: 56s - loss: 403.3888 - regression_loss: 16.4228 - classification_loss: 386.9660 278/500 [===============>..............] - ETA: 56s - loss: 401.9525 - regression_loss: 16.3750 - classification_loss: 385.5775 279/500 [===============>..............] - ETA: 55s - loss: 400.5257 - regression_loss: 16.3268 - classification_loss: 384.1989 280/500 [===============>..............] - ETA: 55s - loss: 399.1083 - regression_loss: 16.2783 - classification_loss: 382.8299 281/500 [===============>..............] - ETA: 55s - loss: 397.6998 - regression_loss: 16.2293 - classification_loss: 381.4705 282/500 [===============>..............] - ETA: 55s - loss: 396.3010 - regression_loss: 16.1801 - classification_loss: 380.1208 283/500 [===============>..............] - ETA: 54s - loss: 394.9130 - regression_loss: 16.1322 - classification_loss: 378.7807 284/500 [================>.............] - ETA: 54s - loss: 393.5346 - regression_loss: 16.0845 - classification_loss: 377.4500 285/500 [================>.............] - ETA: 54s - loss: 392.1654 - regression_loss: 16.0367 - classification_loss: 376.1286 286/500 [================>.............] 
- ETA: 54s - loss: 390.8102 - regression_loss: 15.9933 - classification_loss: 374.8168 287/500 [================>.............] - ETA: 53s - loss: 389.4603 - regression_loss: 15.9469 - classification_loss: 373.5133 288/500 [================>.............] - ETA: 53s - loss: 388.1193 - regression_loss: 15.9003 - classification_loss: 372.2189 289/500 [================>.............] - ETA: 53s - loss: 386.7889 - regression_loss: 15.8548 - classification_loss: 370.9341 290/500 [================>.............] - ETA: 53s - loss: 385.4673 - regression_loss: 15.8094 - classification_loss: 369.6579 291/500 [================>.............] - ETA: 52s - loss: 384.1553 - regression_loss: 15.7647 - classification_loss: 368.3905 292/500 [================>.............] - ETA: 52s - loss: 382.8507 - regression_loss: 15.7194 - classification_loss: 367.1313 293/500 [================>.............] - ETA: 52s - loss: 381.5548 - regression_loss: 15.6736 - classification_loss: 365.8811 294/500 [================>.............] - ETA: 51s - loss: 380.2681 - regression_loss: 15.6290 - classification_loss: 364.6391 295/500 [================>.............] - ETA: 51s - loss: 378.9925 - regression_loss: 15.5858 - classification_loss: 363.4066 296/500 [================>.............] - ETA: 51s - loss: 377.7230 - regression_loss: 15.5414 - classification_loss: 362.1814 297/500 [================>.............] - ETA: 51s - loss: 376.4620 - regression_loss: 15.4976 - classification_loss: 360.9643 298/500 [================>.............] - ETA: 50s - loss: 375.2102 - regression_loss: 15.4543 - classification_loss: 359.7558 299/500 [================>.............] - ETA: 50s - loss: 373.9684 - regression_loss: 15.4132 - classification_loss: 358.5552 300/500 [=================>............] - ETA: 50s - loss: 372.7334 - regression_loss: 15.3706 - classification_loss: 357.3627 301/500 [=================>............] 
- ETA: 50s - loss: 371.5079 - regression_loss: 15.3294 - classification_loss: 356.1784 302/500 [=================>............] - ETA: 49s - loss: 370.2893 - regression_loss: 15.2876 - classification_loss: 355.0016 303/500 [=================>............] - ETA: 49s - loss: 369.0787 - regression_loss: 15.2464 - classification_loss: 353.8322 304/500 [=================>............] - ETA: 49s - loss: 367.8761 - regression_loss: 15.2051 - classification_loss: 352.6709 305/500 [=================>............] - ETA: 49s - loss: 366.6821 - regression_loss: 15.1641 - classification_loss: 351.5179 306/500 [=================>............] - ETA: 48s - loss: 365.4962 - regression_loss: 15.1239 - classification_loss: 350.3722 307/500 [=================>............] - ETA: 48s - loss: 364.3219 - regression_loss: 15.0834 - classification_loss: 349.2384 308/500 [=================>............] - ETA: 48s - loss: 363.1497 - regression_loss: 15.0426 - classification_loss: 348.1070 309/500 [=================>............] - ETA: 48s - loss: 361.9857 - regression_loss: 15.0023 - classification_loss: 346.9834 310/500 [=================>............] - ETA: 47s - loss: 360.8302 - regression_loss: 14.9631 - classification_loss: 345.8670 311/500 [=================>............] - ETA: 47s - loss: 359.6813 - regression_loss: 14.9235 - classification_loss: 344.7578 312/500 [=================>............] - ETA: 47s - loss: 358.5394 - regression_loss: 14.8838 - classification_loss: 343.6555 313/500 [=================>............] - ETA: 47s - loss: 357.4048 - regression_loss: 14.8445 - classification_loss: 342.5602 314/500 [=================>............] - ETA: 46s - loss: 356.2765 - regression_loss: 14.8045 - classification_loss: 341.4719 315/500 [=================>............] - ETA: 46s - loss: 355.1565 - regression_loss: 14.7661 - classification_loss: 340.3904 316/500 [=================>............] 
- ETA: 46s - loss: 354.0439 - regression_loss: 14.7281 - classification_loss: 339.3157 317/500 [==================>...........] - ETA: 46s - loss: 352.9421 - regression_loss: 14.6911 - classification_loss: 338.2510 318/500 [==================>...........] - ETA: 45s - loss: 351.8424 - regression_loss: 14.6524 - classification_loss: 337.1899 319/500 [==================>...........] - ETA: 45s - loss: 350.7499 - regression_loss: 14.6144 - classification_loss: 336.1353 320/500 [==================>...........] - ETA: 45s - loss: 349.6644 - regression_loss: 14.5767 - classification_loss: 335.0876 321/500 [==================>...........] - ETA: 45s - loss: 348.5855 - regression_loss: 14.5391 - classification_loss: 334.0463 322/500 [==================>...........] - ETA: 44s - loss: 347.5139 - regression_loss: 14.5025 - classification_loss: 333.0114 323/500 [==================>...........] - ETA: 44s - loss: 346.4502 - regression_loss: 14.4674 - classification_loss: 331.9827 324/500 [==================>...........] - ETA: 44s - loss: 345.3925 - regression_loss: 14.4312 - classification_loss: 330.9612 325/500 [==================>...........] - ETA: 44s - loss: 344.3399 - regression_loss: 14.3942 - classification_loss: 329.9456 326/500 [==================>...........] - ETA: 43s - loss: 343.2933 - regression_loss: 14.3574 - classification_loss: 328.9358 327/500 [==================>...........] - ETA: 43s - loss: 342.2549 - regression_loss: 14.3221 - classification_loss: 327.9327 328/500 [==================>...........] - ETA: 43s - loss: 341.2228 - regression_loss: 14.2874 - classification_loss: 326.9353 329/500 [==================>...........] - ETA: 43s - loss: 340.1963 - regression_loss: 14.2524 - classification_loss: 325.9438 330/500 [==================>...........] - ETA: 42s - loss: 339.1827 - regression_loss: 14.2194 - classification_loss: 324.9633 331/500 [==================>...........] 
- ETA: 42s - loss: 338.1681 - regression_loss: 14.1841 - classification_loss: 323.9839 332/500 [==================>...........] - ETA: 42s - loss: 337.1593 - regression_loss: 14.1489 - classification_loss: 323.0103 333/500 [==================>...........] - ETA: 42s - loss: 336.1573 - regression_loss: 14.1144 - classification_loss: 322.0429 334/500 [===================>..........] - ETA: 41s - loss: 335.1601 - regression_loss: 14.0794 - classification_loss: 321.0806 335/500 [===================>..........] - ETA: 41s - loss: 334.1820 - regression_loss: 14.0473 - classification_loss: 320.1346 336/500 [===================>..........] - ETA: 41s - loss: 333.1982 - regression_loss: 14.0136 - classification_loss: 319.1845 337/500 [===================>..........] - ETA: 41s - loss: 332.2190 - regression_loss: 13.9792 - classification_loss: 318.2396 338/500 [===================>..........] - ETA: 40s - loss: 331.2457 - regression_loss: 13.9455 - classification_loss: 317.3002 339/500 [===================>..........] - ETA: 40s - loss: 330.2785 - regression_loss: 13.9118 - classification_loss: 316.3666 340/500 [===================>..........] - ETA: 40s - loss: 329.3174 - regression_loss: 13.8786 - classification_loss: 315.4387 341/500 [===================>..........] - ETA: 40s - loss: 328.3630 - regression_loss: 13.8464 - classification_loss: 314.5166 342/500 [===================>..........] - ETA: 39s - loss: 327.4121 - regression_loss: 13.8127 - classification_loss: 313.5993 343/500 [===================>..........] - ETA: 39s - loss: 326.4668 - regression_loss: 13.7796 - classification_loss: 312.6872 344/500 [===================>..........] - ETA: 39s - loss: 325.5277 - regression_loss: 13.7469 - classification_loss: 311.7807 345/500 [===================>..........] - ETA: 39s - loss: 324.5944 - regression_loss: 13.7148 - classification_loss: 310.8794 346/500 [===================>..........] 
- ETA: 38s - loss: 323.6658 - regression_loss: 13.6827 - classification_loss: 309.9830 347/500 [===================>..........] - ETA: 38s - loss: 322.7424 - regression_loss: 13.6505 - classification_loss: 309.0919 348/500 [===================>..........] - ETA: 38s - loss: 321.8254 - regression_loss: 13.6190 - classification_loss: 308.2063 349/500 [===================>..........] - ETA: 37s - loss: 320.9148 - regression_loss: 13.5888 - classification_loss: 307.3258 350/500 [====================>.........] - ETA: 37s - loss: 320.0074 - regression_loss: 13.5571 - classification_loss: 306.4502 351/500 [====================>.........] - ETA: 37s - loss: 319.1063 - regression_loss: 13.5265 - classification_loss: 305.5797 352/500 [====================>.........] - ETA: 37s - loss: 318.2094 - regression_loss: 13.4954 - classification_loss: 304.7139 353/500 [====================>.........] - ETA: 36s - loss: 317.3159 - regression_loss: 13.4629 - classification_loss: 303.8529 354/500 [====================>.........] - ETA: 36s - loss: 316.4299 - regression_loss: 13.4324 - classification_loss: 302.9973 355/500 [====================>.........] - ETA: 36s - loss: 315.5485 - regression_loss: 13.4021 - classification_loss: 302.1462 356/500 [====================>.........] - ETA: 36s - loss: 314.6726 - regression_loss: 13.3726 - classification_loss: 301.3000 357/500 [====================>.........] - ETA: 35s - loss: 313.8001 - regression_loss: 13.3420 - classification_loss: 300.4579 358/500 [====================>.........] - ETA: 35s - loss: 312.9327 - regression_loss: 13.3117 - classification_loss: 299.6209 359/500 [====================>.........] - ETA: 35s - loss: 312.0703 - regression_loss: 13.2819 - classification_loss: 298.7883 360/500 [====================>.........] - ETA: 35s - loss: 311.2121 - regression_loss: 13.2517 - classification_loss: 297.9603 361/500 [====================>.........] 
- ETA: 34s - loss: 310.3591 - regression_loss: 13.2221 - classification_loss: 297.1369 362/500 [====================>.........] - ETA: 34s - loss: 309.5163 - regression_loss: 13.1934 - classification_loss: 296.3228 363/500 [====================>.........] - ETA: 34s - loss: 308.6725 - regression_loss: 13.1635 - classification_loss: 295.5089 364/500 [====================>.........] - ETA: 34s - loss: 307.8336 - regression_loss: 13.1344 - classification_loss: 294.6990 365/500 [====================>.........] - ETA: 33s - loss: 306.9990 - regression_loss: 13.1054 - classification_loss: 293.8935 366/500 [====================>.........] - ETA: 33s - loss: 306.1689 - regression_loss: 13.0764 - classification_loss: 293.0924 367/500 [=====================>........] - ETA: 33s - loss: 305.3442 - regression_loss: 13.0481 - classification_loss: 292.2959 368/500 [=====================>........] - ETA: 33s - loss: 304.5229 - regression_loss: 13.0193 - classification_loss: 291.5035 369/500 [=====================>........] - ETA: 32s - loss: 303.7061 - regression_loss: 12.9906 - classification_loss: 290.7154 370/500 [=====================>........] - ETA: 32s - loss: 302.8936 - regression_loss: 12.9619 - classification_loss: 289.9316 371/500 [=====================>........] - ETA: 32s - loss: 302.0861 - regression_loss: 12.9338 - classification_loss: 289.1522 372/500 [=====================>........] - ETA: 32s - loss: 301.2835 - regression_loss: 12.9064 - classification_loss: 288.3770 373/500 [=====================>........] - ETA: 31s - loss: 300.4844 - regression_loss: 12.8782 - classification_loss: 287.6061 374/500 [=====================>........] - ETA: 31s - loss: 299.6890 - regression_loss: 12.8500 - classification_loss: 286.8389 375/500 [=====================>........] - ETA: 31s - loss: 298.8992 - regression_loss: 12.8230 - classification_loss: 286.0761 376/500 [=====================>........] 
- ETA: 31s - loss: 298.1134 - regression_loss: 12.7959 - classification_loss: 285.3174 377/500 [=====================>........] - ETA: 30s - loss: 297.3331 - regression_loss: 12.7698 - classification_loss: 284.5631 378/500 [=====================>........] - ETA: 30s - loss: 296.5553 - regression_loss: 12.7429 - classification_loss: 283.8123 379/500 [=====================>........] - ETA: 30s - loss: 295.7804 - regression_loss: 12.7152 - classification_loss: 283.0650 380/500 [=====================>........] - ETA: 30s - loss: 295.0104 - regression_loss: 12.6883 - classification_loss: 282.3220 381/500 [=====================>........] - ETA: 29s - loss: 294.2459 - regression_loss: 12.6617 - classification_loss: 281.5840 382/500 [=====================>........] - ETA: 29s - loss: 293.4845 - regression_loss: 12.6352 - classification_loss: 280.8491 383/500 [=====================>........] - ETA: 29s - loss: 292.7268 - regression_loss: 12.6088 - classification_loss: 280.1179 384/500 [======================>.......] - ETA: 29s - loss: 291.9739 - regression_loss: 12.5830 - classification_loss: 279.3908 385/500 [======================>.......] - ETA: 28s - loss: 291.2241 - regression_loss: 12.5566 - classification_loss: 278.6673 386/500 [======================>.......] - ETA: 28s - loss: 290.4797 - regression_loss: 12.5314 - classification_loss: 277.9482 387/500 [======================>.......] - ETA: 28s - loss: 289.7435 - regression_loss: 12.5072 - classification_loss: 277.2361 388/500 [======================>.......] - ETA: 28s - loss: 289.0050 - regression_loss: 12.4814 - classification_loss: 276.5235 389/500 [======================>.......] - ETA: 27s - loss: 288.2707 - regression_loss: 12.4558 - classification_loss: 275.8148 390/500 [======================>.......] - ETA: 27s - loss: 287.5425 - regression_loss: 12.4313 - classification_loss: 275.1110 391/500 [======================>.......] 
- ETA: 27s - loss: 286.8154 - regression_loss: 12.4057 - classification_loss: 274.4095 392/500 [======================>.......] - ETA: 27s - loss: 286.0924 - regression_loss: 12.3807 - classification_loss: 273.7115 393/500 [======================>.......] - ETA: 26s - loss: 285.3738 - regression_loss: 12.3567 - classification_loss: 273.0169 394/500 [======================>.......] - ETA: 26s - loss: 284.6575 - regression_loss: 12.3313 - classification_loss: 272.3260 395/500 [======================>.......] - ETA: 26s - loss: 283.9459 - regression_loss: 12.3070 - classification_loss: 271.6388 396/500 [======================>.......] - ETA: 26s - loss: 283.2365 - regression_loss: 12.2817 - classification_loss: 270.9547 397/500 [======================>.......] - ETA: 25s - loss: 282.5313 - regression_loss: 12.2570 - classification_loss: 270.2742 398/500 [======================>.......] - ETA: 25s - loss: 281.8295 - regression_loss: 12.2323 - classification_loss: 269.5971 399/500 [======================>.......] - ETA: 25s - loss: 281.1324 - regression_loss: 12.2087 - classification_loss: 268.9236 400/500 [=======================>......] - ETA: 25s - loss: 280.4388 - regression_loss: 12.1852 - classification_loss: 268.2534 401/500 [=======================>......] - ETA: 24s - loss: 279.7481 - regression_loss: 12.1613 - classification_loss: 267.5867 402/500 [=======================>......] - ETA: 24s - loss: 279.0610 - regression_loss: 12.1376 - classification_loss: 266.9232 403/500 [=======================>......] - ETA: 24s - loss: 278.3783 - regression_loss: 12.1144 - classification_loss: 266.2638 404/500 [=======================>......] - ETA: 24s - loss: 277.6969 - regression_loss: 12.0902 - classification_loss: 265.6067 405/500 [=======================>......] - ETA: 23s - loss: 277.0196 - regression_loss: 12.0667 - classification_loss: 264.9528 406/500 [=======================>......] 
- ETA: 23s - loss: 276.3476 - regression_loss: 12.0447 - classification_loss: 264.3028 407/500 [=======================>......] - ETA: 23s - loss: 275.6768 - regression_loss: 12.0214 - classification_loss: 263.6553 408/500 [=======================>......] - ETA: 23s - loss: 275.0091 - regression_loss: 11.9981 - classification_loss: 263.0109 409/500 [=======================>......] - ETA: 22s - loss: 274.3443 - regression_loss: 11.9746 - classification_loss: 262.3696 410/500 [=======================>......] - ETA: 22s - loss: 273.6832 - regression_loss: 11.9516 - classification_loss: 261.7316 411/500 [=======================>......] - ETA: 22s - loss: 273.0253 - regression_loss: 11.9286 - classification_loss: 261.0966 412/500 [=======================>......] - ETA: 22s - loss: 272.3712 - regression_loss: 11.9062 - classification_loss: 260.4648 413/500 [=======================>......] - ETA: 21s - loss: 271.7186 - regression_loss: 11.8829 - classification_loss: 259.8355 414/500 [=======================>......] - ETA: 21s - loss: 271.0699 - regression_loss: 11.8601 - classification_loss: 259.2096 415/500 [=======================>......] - ETA: 21s - loss: 270.4244 - regression_loss: 11.8370 - classification_loss: 258.5873 416/500 [=======================>......] - ETA: 21s - loss: 269.7837 - regression_loss: 11.8155 - classification_loss: 257.9680 417/500 [========================>.....] - ETA: 20s - loss: 269.1448 - regression_loss: 11.7929 - classification_loss: 257.3518 418/500 [========================>.....] - ETA: 20s - loss: 268.5074 - regression_loss: 11.7698 - classification_loss: 256.7375 419/500 [========================>.....] - ETA: 20s - loss: 267.8742 - regression_loss: 11.7473 - classification_loss: 256.1267 420/500 [========================>.....] - ETA: 20s - loss: 267.2447 - regression_loss: 11.7255 - classification_loss: 255.5190 421/500 [========================>.....] 
- ETA: 19s - loss: 266.6174 - regression_loss: 11.7034 - classification_loss: 254.9138 422/500 [========================>.....] - ETA: 19s - loss: 265.9931 - regression_loss: 11.6814 - classification_loss: 254.3116 423/500 [========================>.....] - ETA: 19s - loss: 265.3726 - regression_loss: 11.6602 - classification_loss: 253.7123 424/500 [========================>.....] - ETA: 19s - loss: 264.7546 - regression_loss: 11.6387 - classification_loss: 253.1158 425/500 [========================>.....] - ETA: 18s - loss: 264.1391 - regression_loss: 11.6169 - classification_loss: 252.5220 426/500 [========================>.....] - ETA: 18s - loss: 263.5270 - regression_loss: 11.5960 - classification_loss: 251.9309 427/500 [========================>.....] - ETA: 18s - loss: 262.9176 - regression_loss: 11.5748 - classification_loss: 251.3426 428/500 [========================>.....] - ETA: 18s - loss: 262.3107 - regression_loss: 11.5532 - classification_loss: 250.7575 429/500 [========================>.....] - ETA: 17s - loss: 261.7073 - regression_loss: 11.5323 - classification_loss: 250.1750 430/500 [========================>.....] - ETA: 17s - loss: 261.1050 - regression_loss: 11.5105 - classification_loss: 249.5944 431/500 [========================>.....] - ETA: 17s - loss: 260.5059 - regression_loss: 11.4891 - classification_loss: 249.0168 432/500 [========================>.....] - ETA: 17s - loss: 259.9104 - regression_loss: 11.4683 - classification_loss: 248.4420 433/500 [========================>.....] - ETA: 16s - loss: 259.3178 - regression_loss: 11.4477 - classification_loss: 247.8700 434/500 [=========================>....] - ETA: 16s - loss: 258.7403 - regression_loss: 11.4288 - classification_loss: 247.3114 435/500 [=========================>....] - ETA: 16s - loss: 258.1538 - regression_loss: 11.4086 - classification_loss: 246.7451 436/500 [=========================>....] 
- ETA: 16s - loss: 257.5698 - regression_loss: 11.3890 - classification_loss: 246.1808 437/500 [=========================>....] - ETA: 15s - loss: 256.9876 - regression_loss: 11.3685 - classification_loss: 245.6190 438/500 [=========================>....] - ETA: 15s - loss: 256.4081 - regression_loss: 11.3480 - classification_loss: 245.0600 439/500 [=========================>....] - ETA: 15s - loss: 255.8323 - regression_loss: 11.3282 - classification_loss: 244.5040 440/500 [=========================>....] - ETA: 15s - loss: 255.2582 - regression_loss: 11.3078 - classification_loss: 243.9503 441/500 [=========================>....] - ETA: 14s - loss: 254.6872 - regression_loss: 11.2882 - classification_loss: 243.3989 442/500 [=========================>....] - ETA: 14s - loss: 254.1188 - regression_loss: 11.2685 - classification_loss: 242.8502 443/500 [=========================>....] - ETA: 14s - loss: 253.5531 - regression_loss: 11.2494 - classification_loss: 242.3036 444/500 [=========================>....] - ETA: 14s - loss: 252.9899 - regression_loss: 11.2307 - classification_loss: 241.7592 445/500 [=========================>....] - ETA: 13s - loss: 252.4292 - regression_loss: 11.2112 - classification_loss: 241.2179 446/500 [=========================>....] - ETA: 13s - loss: 251.8719 - regression_loss: 11.1917 - classification_loss: 240.6802 447/500 [=========================>....] - ETA: 13s - loss: 251.3164 - regression_loss: 11.1726 - classification_loss: 240.1437 448/500 [=========================>....] - ETA: 13s - loss: 250.7624 - regression_loss: 11.1531 - classification_loss: 239.6092 449/500 [=========================>....] - ETA: 12s - loss: 250.2112 - regression_loss: 11.1338 - classification_loss: 239.0774 450/500 [==========================>...] - ETA: 12s - loss: 249.6623 - regression_loss: 11.1144 - classification_loss: 238.5478 451/500 [==========================>...] 
[per-batch progress for steps 451–500 of epoch 1 truncated; total loss fell from 249.12 to 225.04 over these steps]
500/500 [==============================] - 125s 249ms/step - loss: 225.0416 - regression_loss: 10.2635 - classification_loss: 214.7780
326 instances of class plum with average precision: 0.0104
mAP: 0.0104
Epoch 00001: saving model to ./training/snapshots/resnet50_pascal_01.h5
Epoch 2/150
[per-batch progress for steps 1–282 of epoch 2 truncated; total loss fell from 3.5107 at step 1/500 to about 3.35 by step 281/500, with regression_loss near 2.48 and classification_loss declining from 0.89 toward 0.87]
- ETA: 49s - loss: 3.3479 - regression_loss: 2.4832 - classification_loss: 0.8647 283/500 [===============>..............] - ETA: 49s - loss: 3.3492 - regression_loss: 2.4846 - classification_loss: 0.8646 284/500 [================>.............] - ETA: 49s - loss: 3.3507 - regression_loss: 2.4862 - classification_loss: 0.8644 285/500 [================>.............] - ETA: 48s - loss: 3.3488 - regression_loss: 2.4852 - classification_loss: 0.8636 286/500 [================>.............] - ETA: 48s - loss: 3.3477 - regression_loss: 2.4846 - classification_loss: 0.8631 287/500 [================>.............] - ETA: 48s - loss: 3.3483 - regression_loss: 2.4859 - classification_loss: 0.8624 288/500 [================>.............] - ETA: 48s - loss: 3.3462 - regression_loss: 2.4849 - classification_loss: 0.8613 289/500 [================>.............] - ETA: 48s - loss: 3.3452 - regression_loss: 2.4845 - classification_loss: 0.8607 290/500 [================>.............] - ETA: 47s - loss: 3.3448 - regression_loss: 2.4848 - classification_loss: 0.8600 291/500 [================>.............] - ETA: 47s - loss: 3.3421 - regression_loss: 2.4835 - classification_loss: 0.8586 292/500 [================>.............] - ETA: 47s - loss: 3.3410 - regression_loss: 2.4835 - classification_loss: 0.8574 293/500 [================>.............] - ETA: 47s - loss: 3.3420 - regression_loss: 2.4850 - classification_loss: 0.8569 294/500 [================>.............] - ETA: 47s - loss: 3.3398 - regression_loss: 2.4839 - classification_loss: 0.8558 295/500 [================>.............] - ETA: 46s - loss: 3.3415 - regression_loss: 2.4848 - classification_loss: 0.8567 296/500 [================>.............] - ETA: 46s - loss: 3.3417 - regression_loss: 2.4847 - classification_loss: 0.8570 297/500 [================>.............] - ETA: 46s - loss: 3.3393 - regression_loss: 2.4838 - classification_loss: 0.8555 298/500 [================>.............] 
- ETA: 46s - loss: 3.3355 - regression_loss: 2.4818 - classification_loss: 0.8536 299/500 [================>.............] - ETA: 45s - loss: 3.3359 - regression_loss: 2.4826 - classification_loss: 0.8533 300/500 [=================>............] - ETA: 45s - loss: 3.3339 - regression_loss: 2.4815 - classification_loss: 0.8524 301/500 [=================>............] - ETA: 45s - loss: 3.3357 - regression_loss: 2.4831 - classification_loss: 0.8527 302/500 [=================>............] - ETA: 45s - loss: 3.3352 - regression_loss: 2.4830 - classification_loss: 0.8521 303/500 [=================>............] - ETA: 44s - loss: 3.3334 - regression_loss: 2.4823 - classification_loss: 0.8511 304/500 [=================>............] - ETA: 44s - loss: 3.3320 - regression_loss: 2.4819 - classification_loss: 0.8501 305/500 [=================>............] - ETA: 44s - loss: 3.3305 - regression_loss: 2.4810 - classification_loss: 0.8495 306/500 [=================>............] - ETA: 44s - loss: 3.3298 - regression_loss: 2.4809 - classification_loss: 0.8489 307/500 [=================>............] - ETA: 44s - loss: 3.3259 - regression_loss: 2.4788 - classification_loss: 0.8471 308/500 [=================>............] - ETA: 43s - loss: 3.3238 - regression_loss: 2.4779 - classification_loss: 0.8459 309/500 [=================>............] - ETA: 43s - loss: 3.3210 - regression_loss: 2.4767 - classification_loss: 0.8443 310/500 [=================>............] - ETA: 43s - loss: 3.3181 - regression_loss: 2.4752 - classification_loss: 0.8429 311/500 [=================>............] - ETA: 43s - loss: 3.3169 - regression_loss: 2.4743 - classification_loss: 0.8425 312/500 [=================>............] - ETA: 42s - loss: 3.3152 - regression_loss: 2.4737 - classification_loss: 0.8415 313/500 [=================>............] - ETA: 42s - loss: 3.3147 - regression_loss: 2.4732 - classification_loss: 0.8415 314/500 [=================>............] 
- ETA: 42s - loss: 3.3135 - regression_loss: 2.4725 - classification_loss: 0.8409 315/500 [=================>............] - ETA: 42s - loss: 3.3185 - regression_loss: 2.4734 - classification_loss: 0.8450 316/500 [=================>............] - ETA: 42s - loss: 3.3163 - regression_loss: 2.4725 - classification_loss: 0.8439 317/500 [==================>...........] - ETA: 41s - loss: 3.3164 - regression_loss: 2.4731 - classification_loss: 0.8432 318/500 [==================>...........] - ETA: 41s - loss: 3.3162 - regression_loss: 2.4733 - classification_loss: 0.8429 319/500 [==================>...........] - ETA: 41s - loss: 3.3163 - regression_loss: 2.4739 - classification_loss: 0.8425 320/500 [==================>...........] - ETA: 41s - loss: 3.3144 - regression_loss: 2.4735 - classification_loss: 0.8410 321/500 [==================>...........] - ETA: 41s - loss: 3.3121 - regression_loss: 2.4724 - classification_loss: 0.8397 322/500 [==================>...........] - ETA: 40s - loss: 3.3133 - regression_loss: 2.4737 - classification_loss: 0.8396 323/500 [==================>...........] - ETA: 40s - loss: 3.3126 - regression_loss: 2.4740 - classification_loss: 0.8386 324/500 [==================>...........] - ETA: 40s - loss: 3.3107 - regression_loss: 2.4733 - classification_loss: 0.8374 325/500 [==================>...........] - ETA: 40s - loss: 3.3092 - regression_loss: 2.4722 - classification_loss: 0.8370 326/500 [==================>...........] - ETA: 39s - loss: 3.3079 - regression_loss: 2.4717 - classification_loss: 0.8362 327/500 [==================>...........] - ETA: 39s - loss: 3.3065 - regression_loss: 2.4711 - classification_loss: 0.8354 328/500 [==================>...........] - ETA: 39s - loss: 3.3037 - regression_loss: 2.4694 - classification_loss: 0.8343 329/500 [==================>...........] - ETA: 39s - loss: 3.3017 - regression_loss: 2.4683 - classification_loss: 0.8334 330/500 [==================>...........] 
- ETA: 38s - loss: 3.3008 - regression_loss: 2.4678 - classification_loss: 0.8330 331/500 [==================>...........] - ETA: 38s - loss: 3.2995 - regression_loss: 2.4671 - classification_loss: 0.8324 332/500 [==================>...........] - ETA: 38s - loss: 3.3017 - regression_loss: 2.4687 - classification_loss: 0.8330 333/500 [==================>...........] - ETA: 38s - loss: 3.3008 - regression_loss: 2.4685 - classification_loss: 0.8323 334/500 [===================>..........] - ETA: 38s - loss: 3.2980 - regression_loss: 2.4670 - classification_loss: 0.8311 335/500 [===================>..........] - ETA: 37s - loss: 3.2979 - regression_loss: 2.4675 - classification_loss: 0.8303 336/500 [===================>..........] - ETA: 37s - loss: 3.2987 - regression_loss: 2.4684 - classification_loss: 0.8303 337/500 [===================>..........] - ETA: 37s - loss: 3.2970 - regression_loss: 2.4678 - classification_loss: 0.8292 338/500 [===================>..........] - ETA: 37s - loss: 3.2953 - regression_loss: 2.4670 - classification_loss: 0.8282 339/500 [===================>..........] - ETA: 36s - loss: 3.2959 - regression_loss: 2.4675 - classification_loss: 0.8284 340/500 [===================>..........] - ETA: 36s - loss: 3.2941 - regression_loss: 2.4668 - classification_loss: 0.8273 341/500 [===================>..........] - ETA: 36s - loss: 3.2933 - regression_loss: 2.4665 - classification_loss: 0.8268 342/500 [===================>..........] - ETA: 36s - loss: 3.2917 - regression_loss: 2.4659 - classification_loss: 0.8258 343/500 [===================>..........] - ETA: 36s - loss: 3.2925 - regression_loss: 2.4662 - classification_loss: 0.8263 344/500 [===================>..........] - ETA: 35s - loss: 3.2901 - regression_loss: 2.4650 - classification_loss: 0.8252 345/500 [===================>..........] - ETA: 35s - loss: 3.2890 - regression_loss: 2.4645 - classification_loss: 0.8245 346/500 [===================>..........] 
- ETA: 35s - loss: 3.2883 - regression_loss: 2.4646 - classification_loss: 0.8237 347/500 [===================>..........] - ETA: 35s - loss: 3.2860 - regression_loss: 2.4633 - classification_loss: 0.8227 348/500 [===================>..........] - ETA: 34s - loss: 3.2837 - regression_loss: 2.4618 - classification_loss: 0.8219 349/500 [===================>..........] - ETA: 34s - loss: 3.2826 - regression_loss: 2.4613 - classification_loss: 0.8213 350/500 [====================>.........] - ETA: 34s - loss: 3.2815 - regression_loss: 2.4610 - classification_loss: 0.8206 351/500 [====================>.........] - ETA: 34s - loss: 3.2800 - regression_loss: 2.4604 - classification_loss: 0.8196 352/500 [====================>.........] - ETA: 34s - loss: 3.2771 - regression_loss: 2.4586 - classification_loss: 0.8184 353/500 [====================>.........] - ETA: 33s - loss: 3.2765 - regression_loss: 2.4590 - classification_loss: 0.8176 354/500 [====================>.........] - ETA: 33s - loss: 3.2778 - regression_loss: 2.4604 - classification_loss: 0.8174 355/500 [====================>.........] - ETA: 33s - loss: 3.2781 - regression_loss: 2.4616 - classification_loss: 0.8166 356/500 [====================>.........] - ETA: 33s - loss: 3.2744 - regression_loss: 2.4591 - classification_loss: 0.8153 357/500 [====================>.........] - ETA: 32s - loss: 3.2758 - regression_loss: 2.4606 - classification_loss: 0.8151 358/500 [====================>.........] - ETA: 32s - loss: 3.2750 - regression_loss: 2.4604 - classification_loss: 0.8145 359/500 [====================>.........] - ETA: 32s - loss: 3.2731 - regression_loss: 2.4594 - classification_loss: 0.8138 360/500 [====================>.........] - ETA: 32s - loss: 3.2726 - regression_loss: 2.4594 - classification_loss: 0.8132 361/500 [====================>.........] - ETA: 31s - loss: 3.2721 - regression_loss: 2.4597 - classification_loss: 0.8123 362/500 [====================>.........] 
- ETA: 31s - loss: 3.2714 - regression_loss: 2.4600 - classification_loss: 0.8114 363/500 [====================>.........] - ETA: 31s - loss: 3.2704 - regression_loss: 2.4597 - classification_loss: 0.8107 364/500 [====================>.........] - ETA: 31s - loss: 3.2697 - regression_loss: 2.4595 - classification_loss: 0.8102 365/500 [====================>.........] - ETA: 31s - loss: 3.2709 - regression_loss: 2.4607 - classification_loss: 0.8101 366/500 [====================>.........] - ETA: 30s - loss: 3.2687 - regression_loss: 2.4599 - classification_loss: 0.8088 367/500 [=====================>........] - ETA: 30s - loss: 3.2657 - regression_loss: 2.4581 - classification_loss: 0.8076 368/500 [=====================>........] - ETA: 30s - loss: 3.2646 - regression_loss: 2.4577 - classification_loss: 0.8070 369/500 [=====================>........] - ETA: 30s - loss: 3.2625 - regression_loss: 2.4566 - classification_loss: 0.8058 370/500 [=====================>........] - ETA: 29s - loss: 3.2621 - regression_loss: 2.4557 - classification_loss: 0.8064 371/500 [=====================>........] - ETA: 29s - loss: 3.2602 - regression_loss: 2.4548 - classification_loss: 0.8054 372/500 [=====================>........] - ETA: 29s - loss: 3.2594 - regression_loss: 2.4545 - classification_loss: 0.8049 373/500 [=====================>........] - ETA: 29s - loss: 3.2577 - regression_loss: 2.4538 - classification_loss: 0.8039 374/500 [=====================>........] - ETA: 29s - loss: 3.2570 - regression_loss: 2.4535 - classification_loss: 0.8035 375/500 [=====================>........] - ETA: 28s - loss: 3.2540 - regression_loss: 2.4517 - classification_loss: 0.8022 376/500 [=====================>........] - ETA: 28s - loss: 3.2558 - regression_loss: 2.4509 - classification_loss: 0.8048 377/500 [=====================>........] - ETA: 28s - loss: 3.2581 - regression_loss: 2.4514 - classification_loss: 0.8068 378/500 [=====================>........] 
- ETA: 28s - loss: 3.2553 - regression_loss: 2.4498 - classification_loss: 0.8054 379/500 [=====================>........] - ETA: 27s - loss: 3.2537 - regression_loss: 2.4490 - classification_loss: 0.8047 380/500 [=====================>........] - ETA: 27s - loss: 3.2528 - regression_loss: 2.4487 - classification_loss: 0.8041 381/500 [=====================>........] - ETA: 27s - loss: 3.2490 - regression_loss: 2.4463 - classification_loss: 0.8027 382/500 [=====================>........] - ETA: 27s - loss: 3.2472 - regression_loss: 2.4452 - classification_loss: 0.8020 383/500 [=====================>........] - ETA: 26s - loss: 3.2464 - regression_loss: 2.4453 - classification_loss: 0.8011 384/500 [======================>.......] - ETA: 26s - loss: 3.2459 - regression_loss: 2.4452 - classification_loss: 0.8006 385/500 [======================>.......] - ETA: 26s - loss: 3.2460 - regression_loss: 2.4460 - classification_loss: 0.8000 386/500 [======================>.......] - ETA: 26s - loss: 3.2445 - regression_loss: 2.4451 - classification_loss: 0.7995 387/500 [======================>.......] - ETA: 26s - loss: 3.2402 - regression_loss: 2.4421 - classification_loss: 0.7981 388/500 [======================>.......] - ETA: 25s - loss: 3.2407 - regression_loss: 2.4432 - classification_loss: 0.7976 389/500 [======================>.......] - ETA: 25s - loss: 3.2417 - regression_loss: 2.4438 - classification_loss: 0.7979 390/500 [======================>.......] - ETA: 25s - loss: 3.2405 - regression_loss: 2.4432 - classification_loss: 0.7974 391/500 [======================>.......] - ETA: 25s - loss: 3.2407 - regression_loss: 2.4439 - classification_loss: 0.7967 392/500 [======================>.......] - ETA: 24s - loss: 3.2408 - regression_loss: 2.4446 - classification_loss: 0.7962 393/500 [======================>.......] - ETA: 24s - loss: 3.2405 - regression_loss: 2.4447 - classification_loss: 0.7958 394/500 [======================>.......] 
- ETA: 24s - loss: 3.2416 - regression_loss: 2.4461 - classification_loss: 0.7955 395/500 [======================>.......] - ETA: 24s - loss: 3.2415 - regression_loss: 2.4470 - classification_loss: 0.7945 396/500 [======================>.......] - ETA: 24s - loss: 3.2409 - regression_loss: 2.4466 - classification_loss: 0.7943 397/500 [======================>.......] - ETA: 23s - loss: 3.2422 - regression_loss: 2.4471 - classification_loss: 0.7952 398/500 [======================>.......] - ETA: 23s - loss: 3.2419 - regression_loss: 2.4473 - classification_loss: 0.7946 399/500 [======================>.......] - ETA: 23s - loss: 3.2427 - regression_loss: 2.4477 - classification_loss: 0.7950 400/500 [=======================>......] - ETA: 23s - loss: 3.2418 - regression_loss: 2.4475 - classification_loss: 0.7943 401/500 [=======================>......] - ETA: 22s - loss: 3.2410 - regression_loss: 2.4472 - classification_loss: 0.7938 402/500 [=======================>......] - ETA: 22s - loss: 3.2414 - regression_loss: 2.4478 - classification_loss: 0.7937 403/500 [=======================>......] - ETA: 22s - loss: 3.2404 - regression_loss: 2.4474 - classification_loss: 0.7931 404/500 [=======================>......] - ETA: 22s - loss: 3.2398 - regression_loss: 2.4469 - classification_loss: 0.7929 405/500 [=======================>......] - ETA: 21s - loss: 3.2391 - regression_loss: 2.4469 - classification_loss: 0.7922 406/500 [=======================>......] - ETA: 21s - loss: 3.2377 - regression_loss: 2.4462 - classification_loss: 0.7914 407/500 [=======================>......] - ETA: 21s - loss: 3.2370 - regression_loss: 2.4460 - classification_loss: 0.7910 408/500 [=======================>......] - ETA: 21s - loss: 3.2351 - regression_loss: 2.4449 - classification_loss: 0.7902 409/500 [=======================>......] - ETA: 21s - loss: 3.2343 - regression_loss: 2.4449 - classification_loss: 0.7894 410/500 [=======================>......] 
- ETA: 20s - loss: 3.2351 - regression_loss: 2.4448 - classification_loss: 0.7902 411/500 [=======================>......] - ETA: 20s - loss: 3.2320 - regression_loss: 2.4430 - classification_loss: 0.7890 412/500 [=======================>......] - ETA: 20s - loss: 3.2314 - regression_loss: 2.4428 - classification_loss: 0.7886 413/500 [=======================>......] - ETA: 20s - loss: 3.2310 - regression_loss: 2.4425 - classification_loss: 0.7885 414/500 [=======================>......] - ETA: 19s - loss: 3.2302 - regression_loss: 2.4424 - classification_loss: 0.7878 415/500 [=======================>......] - ETA: 19s - loss: 3.2281 - regression_loss: 2.4413 - classification_loss: 0.7868 416/500 [=======================>......] - ETA: 19s - loss: 3.2281 - regression_loss: 2.4418 - classification_loss: 0.7863 417/500 [========================>.....] - ETA: 19s - loss: 3.2311 - regression_loss: 2.4449 - classification_loss: 0.7863 418/500 [========================>.....] - ETA: 18s - loss: 3.2299 - regression_loss: 2.4442 - classification_loss: 0.7857 419/500 [========================>.....] - ETA: 18s - loss: 3.2298 - regression_loss: 2.4444 - classification_loss: 0.7854 420/500 [========================>.....] - ETA: 18s - loss: 3.2293 - regression_loss: 2.4441 - classification_loss: 0.7852 421/500 [========================>.....] - ETA: 18s - loss: 3.2282 - regression_loss: 2.4436 - classification_loss: 0.7846 422/500 [========================>.....] - ETA: 18s - loss: 3.2288 - regression_loss: 2.4437 - classification_loss: 0.7851 423/500 [========================>.....] - ETA: 17s - loss: 3.2287 - regression_loss: 2.4440 - classification_loss: 0.7847 424/500 [========================>.....] - ETA: 17s - loss: 3.2284 - regression_loss: 2.4441 - classification_loss: 0.7842 425/500 [========================>.....] - ETA: 17s - loss: 3.2268 - regression_loss: 2.4434 - classification_loss: 0.7834 426/500 [========================>.....] 
- ETA: 17s - loss: 3.2269 - regression_loss: 2.4439 - classification_loss: 0.7831 427/500 [========================>.....] - ETA: 16s - loss: 3.2259 - regression_loss: 2.4434 - classification_loss: 0.7825 428/500 [========================>.....] - ETA: 16s - loss: 3.2263 - regression_loss: 2.4439 - classification_loss: 0.7824 429/500 [========================>.....] - ETA: 16s - loss: 3.2259 - regression_loss: 2.4438 - classification_loss: 0.7822 430/500 [========================>.....] - ETA: 16s - loss: 3.2272 - regression_loss: 2.4448 - classification_loss: 0.7824 431/500 [========================>.....] - ETA: 15s - loss: 3.2257 - regression_loss: 2.4442 - classification_loss: 0.7816 432/500 [========================>.....] - ETA: 15s - loss: 3.2258 - regression_loss: 2.4447 - classification_loss: 0.7811 433/500 [========================>.....] - ETA: 15s - loss: 3.2247 - regression_loss: 2.4442 - classification_loss: 0.7805 434/500 [=========================>....] - ETA: 15s - loss: 3.2243 - regression_loss: 2.4440 - classification_loss: 0.7803 435/500 [=========================>....] - ETA: 15s - loss: 3.2235 - regression_loss: 2.4437 - classification_loss: 0.7798 436/500 [=========================>....] - ETA: 14s - loss: 3.2229 - regression_loss: 2.4434 - classification_loss: 0.7795 437/500 [=========================>....] - ETA: 14s - loss: 3.2211 - regression_loss: 2.4426 - classification_loss: 0.7785 438/500 [=========================>....] - ETA: 14s - loss: 3.2197 - regression_loss: 2.4418 - classification_loss: 0.7779 439/500 [=========================>....] - ETA: 14s - loss: 3.2183 - regression_loss: 2.4410 - classification_loss: 0.7773 440/500 [=========================>....] - ETA: 13s - loss: 3.2169 - regression_loss: 2.4402 - classification_loss: 0.7766 441/500 [=========================>....] - ETA: 13s - loss: 3.2163 - regression_loss: 2.4402 - classification_loss: 0.7761 442/500 [=========================>....] 
- ETA: 13s - loss: 3.2153 - regression_loss: 2.4397 - classification_loss: 0.7757 443/500 [=========================>....] - ETA: 13s - loss: 3.2140 - regression_loss: 2.4390 - classification_loss: 0.7750 444/500 [=========================>....] - ETA: 12s - loss: 3.2139 - regression_loss: 2.4396 - classification_loss: 0.7743 445/500 [=========================>....] - ETA: 12s - loss: 3.2136 - regression_loss: 2.4393 - classification_loss: 0.7742 446/500 [=========================>....] - ETA: 12s - loss: 3.2127 - regression_loss: 2.4391 - classification_loss: 0.7735 447/500 [=========================>....] - ETA: 12s - loss: 3.2110 - regression_loss: 2.4384 - classification_loss: 0.7726 448/500 [=========================>....] - ETA: 12s - loss: 3.2095 - regression_loss: 2.4376 - classification_loss: 0.7719 449/500 [=========================>....] - ETA: 11s - loss: 3.2080 - regression_loss: 2.4370 - classification_loss: 0.7710 450/500 [==========================>...] - ETA: 11s - loss: 3.2062 - regression_loss: 2.4361 - classification_loss: 0.7701 451/500 [==========================>...] - ETA: 11s - loss: 3.2060 - regression_loss: 2.4361 - classification_loss: 0.7699 452/500 [==========================>...] - ETA: 11s - loss: 3.2050 - regression_loss: 2.4355 - classification_loss: 0.7695 453/500 [==========================>...] - ETA: 10s - loss: 3.2052 - regression_loss: 2.4361 - classification_loss: 0.7691 454/500 [==========================>...] - ETA: 10s - loss: 3.2039 - regression_loss: 2.4357 - classification_loss: 0.7682 455/500 [==========================>...] - ETA: 10s - loss: 3.2041 - regression_loss: 2.4361 - classification_loss: 0.7680 456/500 [==========================>...] - ETA: 10s - loss: 3.2027 - regression_loss: 2.4355 - classification_loss: 0.7672 457/500 [==========================>...] - ETA: 9s - loss: 3.2016 - regression_loss: 2.4351 - classification_loss: 0.7665  458/500 [==========================>...] 
- ETA: 9s - loss: 3.2029 - regression_loss: 2.4360 - classification_loss: 0.7669 459/500 [==========================>...] - ETA: 9s - loss: 3.2028 - regression_loss: 2.4364 - classification_loss: 0.7663 460/500 [==========================>...] - ETA: 9s - loss: 3.2033 - regression_loss: 2.4372 - classification_loss: 0.7661 461/500 [==========================>...] - ETA: 9s - loss: 3.2027 - regression_loss: 2.4368 - classification_loss: 0.7659 462/500 [==========================>...] - ETA: 8s - loss: 3.2016 - regression_loss: 2.4362 - classification_loss: 0.7653 463/500 [==========================>...] - ETA: 8s - loss: 3.2003 - regression_loss: 2.4355 - classification_loss: 0.7647 464/500 [==========================>...] - ETA: 8s - loss: 3.1967 - regression_loss: 2.4333 - classification_loss: 0.7634 465/500 [==========================>...] - ETA: 8s - loss: 3.1975 - regression_loss: 2.4338 - classification_loss: 0.7637 466/500 [==========================>...] - ETA: 7s - loss: 3.1966 - regression_loss: 2.4335 - classification_loss: 0.7631 467/500 [===========================>..] - ETA: 7s - loss: 3.1964 - regression_loss: 2.4335 - classification_loss: 0.7629 468/500 [===========================>..] - ETA: 7s - loss: 3.1960 - regression_loss: 2.4335 - classification_loss: 0.7625 469/500 [===========================>..] - ETA: 7s - loss: 3.1950 - regression_loss: 2.4328 - classification_loss: 0.7622 470/500 [===========================>..] - ETA: 6s - loss: 3.1937 - regression_loss: 2.4325 - classification_loss: 0.7613 471/500 [===========================>..] - ETA: 6s - loss: 3.1940 - regression_loss: 2.4329 - classification_loss: 0.7611 472/500 [===========================>..] - ETA: 6s - loss: 3.1929 - regression_loss: 2.4325 - classification_loss: 0.7604 473/500 [===========================>..] - ETA: 6s - loss: 3.1926 - regression_loss: 2.4327 - classification_loss: 0.7599 474/500 [===========================>..] 
- ETA: 6s - loss: 3.1935 - regression_loss: 2.4335 - classification_loss: 0.7600 475/500 [===========================>..] - ETA: 5s - loss: 3.1933 - regression_loss: 2.4336 - classification_loss: 0.7597 476/500 [===========================>..] - ETA: 5s - loss: 3.1939 - regression_loss: 2.4337 - classification_loss: 0.7602 477/500 [===========================>..] - ETA: 5s - loss: 3.1928 - regression_loss: 2.4331 - classification_loss: 0.7596 478/500 [===========================>..] - ETA: 5s - loss: 3.1915 - regression_loss: 2.4325 - classification_loss: 0.7590 479/500 [===========================>..] - ETA: 4s - loss: 3.1914 - regression_loss: 2.4333 - classification_loss: 0.7581 480/500 [===========================>..] - ETA: 4s - loss: 3.1904 - regression_loss: 2.4327 - classification_loss: 0.7576 481/500 [===========================>..] - ETA: 4s - loss: 3.1897 - regression_loss: 2.4326 - classification_loss: 0.7571 482/500 [===========================>..] - ETA: 4s - loss: 3.1906 - regression_loss: 2.4337 - classification_loss: 0.7569 483/500 [===========================>..] - ETA: 3s - loss: 3.1892 - regression_loss: 2.4329 - classification_loss: 0.7563 484/500 [============================>.] - ETA: 3s - loss: 3.1878 - regression_loss: 2.4321 - classification_loss: 0.7558 485/500 [============================>.] - ETA: 3s - loss: 3.1884 - regression_loss: 2.4325 - classification_loss: 0.7559 486/500 [============================>.] - ETA: 3s - loss: 3.1882 - regression_loss: 2.4324 - classification_loss: 0.7558 487/500 [============================>.] - ETA: 3s - loss: 3.1866 - regression_loss: 2.4315 - classification_loss: 0.7551 488/500 [============================>.] - ETA: 2s - loss: 3.1839 - regression_loss: 2.4298 - classification_loss: 0.7541 489/500 [============================>.] - ETA: 2s - loss: 3.1844 - regression_loss: 2.4306 - classification_loss: 0.7538 490/500 [============================>.] 
- ETA: 2s - loss: 3.1831 - regression_loss: 2.4299 - classification_loss: 0.7532
491/500 [============================>.] - ETA: 2s - loss: 3.1820 - regression_loss: 2.4294 - classification_loss: 0.7527
[... per-batch progress updates for batches 492-499 omitted; running loss held near 3.18 ...]
500/500 [==============================] - 116s 232ms/step - loss: 3.1777 - regression_loss: 2.4285 - classification_loss: 0.7491
326 instances of class plum with average precision: 0.2432
mAP: 0.2432
Epoch 00002: saving model to ./training/snapshots/resnet50_pascal_02.h5
Epoch 3/150
  1/500 [..............................] - ETA: 1:59 - loss: 3.2481 - regression_loss: 2.4900 - classification_loss: 0.7580
[... per-batch progress updates for batches 2-323 omitted; running loss declined from ~3.25 to ~2.79 ...]
324/500 [==================>...........] - ETA: 41s - loss: 2.7869 - regression_loss: 2.2390 - classification_loss: 0.5479
325/500 [==================>...........] 
- ETA: 41s - loss: 2.7877 - regression_loss: 2.2398 - classification_loss: 0.5479 326/500 [==================>...........] - ETA: 41s - loss: 2.7885 - regression_loss: 2.2399 - classification_loss: 0.5487 327/500 [==================>...........] - ETA: 41s - loss: 2.7877 - regression_loss: 2.2392 - classification_loss: 0.5485 328/500 [==================>...........] - ETA: 40s - loss: 2.7852 - regression_loss: 2.2375 - classification_loss: 0.5477 329/500 [==================>...........] - ETA: 40s - loss: 2.7846 - regression_loss: 2.2373 - classification_loss: 0.5473 330/500 [==================>...........] - ETA: 40s - loss: 2.7853 - regression_loss: 2.2375 - classification_loss: 0.5477 331/500 [==================>...........] - ETA: 40s - loss: 2.7879 - regression_loss: 2.2398 - classification_loss: 0.5481 332/500 [==================>...........] - ETA: 39s - loss: 2.7889 - regression_loss: 2.2402 - classification_loss: 0.5487 333/500 [==================>...........] - ETA: 39s - loss: 2.7881 - regression_loss: 2.2395 - classification_loss: 0.5486 334/500 [===================>..........] - ETA: 39s - loss: 2.7873 - regression_loss: 2.2390 - classification_loss: 0.5483 335/500 [===================>..........] - ETA: 39s - loss: 2.7869 - regression_loss: 2.2390 - classification_loss: 0.5479 336/500 [===================>..........] - ETA: 39s - loss: 2.7851 - regression_loss: 2.2377 - classification_loss: 0.5475 337/500 [===================>..........] - ETA: 38s - loss: 2.7850 - regression_loss: 2.2375 - classification_loss: 0.5475 338/500 [===================>..........] - ETA: 38s - loss: 2.7864 - regression_loss: 2.2385 - classification_loss: 0.5480 339/500 [===================>..........] - ETA: 38s - loss: 2.7870 - regression_loss: 2.2388 - classification_loss: 0.5482 340/500 [===================>..........] - ETA: 38s - loss: 2.7872 - regression_loss: 2.2386 - classification_loss: 0.5486 341/500 [===================>..........] 
- ETA: 37s - loss: 2.7880 - regression_loss: 2.2395 - classification_loss: 0.5485 342/500 [===================>..........] - ETA: 37s - loss: 2.7850 - regression_loss: 2.2371 - classification_loss: 0.5479 343/500 [===================>..........] - ETA: 37s - loss: 2.7850 - regression_loss: 2.2372 - classification_loss: 0.5479 344/500 [===================>..........] - ETA: 37s - loss: 2.7842 - regression_loss: 2.2362 - classification_loss: 0.5479 345/500 [===================>..........] - ETA: 36s - loss: 2.7804 - regression_loss: 2.2330 - classification_loss: 0.5474 346/500 [===================>..........] - ETA: 36s - loss: 2.7792 - regression_loss: 2.2323 - classification_loss: 0.5469 347/500 [===================>..........] - ETA: 36s - loss: 2.7795 - regression_loss: 2.2325 - classification_loss: 0.5469 348/500 [===================>..........] - ETA: 36s - loss: 2.7770 - regression_loss: 2.2306 - classification_loss: 0.5464 349/500 [===================>..........] - ETA: 35s - loss: 2.7753 - regression_loss: 2.2296 - classification_loss: 0.5457 350/500 [====================>.........] - ETA: 35s - loss: 2.7806 - regression_loss: 2.2328 - classification_loss: 0.5477 351/500 [====================>.........] - ETA: 35s - loss: 2.7806 - regression_loss: 2.2329 - classification_loss: 0.5476 352/500 [====================>.........] - ETA: 35s - loss: 2.7800 - regression_loss: 2.2326 - classification_loss: 0.5474 353/500 [====================>.........] - ETA: 34s - loss: 2.7808 - regression_loss: 2.2333 - classification_loss: 0.5474 354/500 [====================>.........] - ETA: 34s - loss: 2.7824 - regression_loss: 2.2348 - classification_loss: 0.5476 355/500 [====================>.........] - ETA: 34s - loss: 2.7844 - regression_loss: 2.2363 - classification_loss: 0.5481 356/500 [====================>.........] - ETA: 34s - loss: 2.7835 - regression_loss: 2.2357 - classification_loss: 0.5478 357/500 [====================>.........] 
- ETA: 34s - loss: 2.7822 - regression_loss: 2.2345 - classification_loss: 0.5477 358/500 [====================>.........] - ETA: 33s - loss: 2.7812 - regression_loss: 2.2338 - classification_loss: 0.5474 359/500 [====================>.........] - ETA: 33s - loss: 2.7773 - regression_loss: 2.2310 - classification_loss: 0.5463 360/500 [====================>.........] - ETA: 33s - loss: 2.7739 - regression_loss: 2.2283 - classification_loss: 0.5456 361/500 [====================>.........] - ETA: 33s - loss: 2.7741 - regression_loss: 2.2284 - classification_loss: 0.5457 362/500 [====================>.........] - ETA: 32s - loss: 2.7731 - regression_loss: 2.2280 - classification_loss: 0.5452 363/500 [====================>.........] - ETA: 32s - loss: 2.7702 - regression_loss: 2.2257 - classification_loss: 0.5444 364/500 [====================>.........] - ETA: 32s - loss: 2.7698 - regression_loss: 2.2257 - classification_loss: 0.5441 365/500 [====================>.........] - ETA: 32s - loss: 2.7700 - regression_loss: 2.2261 - classification_loss: 0.5439 366/500 [====================>.........] - ETA: 31s - loss: 2.7695 - regression_loss: 2.2259 - classification_loss: 0.5436 367/500 [=====================>........] - ETA: 31s - loss: 2.7706 - regression_loss: 2.2271 - classification_loss: 0.5434 368/500 [=====================>........] - ETA: 31s - loss: 2.7703 - regression_loss: 2.2269 - classification_loss: 0.5434 369/500 [=====================>........] - ETA: 31s - loss: 2.7719 - regression_loss: 2.2284 - classification_loss: 0.5435 370/500 [=====================>........] - ETA: 30s - loss: 2.7727 - regression_loss: 2.2291 - classification_loss: 0.5436 371/500 [=====================>........] - ETA: 30s - loss: 2.7737 - regression_loss: 2.2297 - classification_loss: 0.5440 372/500 [=====================>........] - ETA: 30s - loss: 2.7758 - regression_loss: 2.2313 - classification_loss: 0.5445 373/500 [=====================>........] 
- ETA: 30s - loss: 2.7769 - regression_loss: 2.2322 - classification_loss: 0.5447 374/500 [=====================>........] - ETA: 30s - loss: 2.7766 - regression_loss: 2.2323 - classification_loss: 0.5443 375/500 [=====================>........] - ETA: 29s - loss: 2.7762 - regression_loss: 2.2324 - classification_loss: 0.5438 376/500 [=====================>........] - ETA: 29s - loss: 2.7800 - regression_loss: 2.2343 - classification_loss: 0.5456 377/500 [=====================>........] - ETA: 29s - loss: 2.7797 - regression_loss: 2.2343 - classification_loss: 0.5453 378/500 [=====================>........] - ETA: 29s - loss: 2.7766 - regression_loss: 2.2322 - classification_loss: 0.5444 379/500 [=====================>........] - ETA: 28s - loss: 2.7770 - regression_loss: 2.2325 - classification_loss: 0.5445 380/500 [=====================>........] - ETA: 28s - loss: 2.7770 - regression_loss: 2.2324 - classification_loss: 0.5447 381/500 [=====================>........] - ETA: 28s - loss: 2.7773 - regression_loss: 2.2330 - classification_loss: 0.5444 382/500 [=====================>........] - ETA: 28s - loss: 2.7745 - regression_loss: 2.2312 - classification_loss: 0.5433 383/500 [=====================>........] - ETA: 27s - loss: 2.7752 - regression_loss: 2.2316 - classification_loss: 0.5436 384/500 [======================>.......] - ETA: 27s - loss: 2.7758 - regression_loss: 2.2317 - classification_loss: 0.5440 385/500 [======================>.......] - ETA: 27s - loss: 2.7754 - regression_loss: 2.2316 - classification_loss: 0.5438 386/500 [======================>.......] - ETA: 27s - loss: 2.7765 - regression_loss: 2.2330 - classification_loss: 0.5435 387/500 [======================>.......] - ETA: 26s - loss: 2.7767 - regression_loss: 2.2334 - classification_loss: 0.5433 388/500 [======================>.......] - ETA: 26s - loss: 2.7766 - regression_loss: 2.2331 - classification_loss: 0.5436 389/500 [======================>.......] 
- ETA: 26s - loss: 2.7774 - regression_loss: 2.2338 - classification_loss: 0.5437 390/500 [======================>.......] - ETA: 26s - loss: 2.7764 - regression_loss: 2.2330 - classification_loss: 0.5434 391/500 [======================>.......] - ETA: 25s - loss: 2.7768 - regression_loss: 2.2330 - classification_loss: 0.5438 392/500 [======================>.......] - ETA: 25s - loss: 2.7762 - regression_loss: 2.2326 - classification_loss: 0.5435 393/500 [======================>.......] - ETA: 25s - loss: 2.7746 - regression_loss: 2.2316 - classification_loss: 0.5430 394/500 [======================>.......] - ETA: 25s - loss: 2.7740 - regression_loss: 2.2314 - classification_loss: 0.5426 395/500 [======================>.......] - ETA: 25s - loss: 2.7743 - regression_loss: 2.2315 - classification_loss: 0.5429 396/500 [======================>.......] - ETA: 24s - loss: 2.7735 - regression_loss: 2.2306 - classification_loss: 0.5429 397/500 [======================>.......] - ETA: 24s - loss: 2.7735 - regression_loss: 2.2308 - classification_loss: 0.5426 398/500 [======================>.......] - ETA: 24s - loss: 2.7753 - regression_loss: 2.2325 - classification_loss: 0.5428 399/500 [======================>.......] - ETA: 24s - loss: 2.7759 - regression_loss: 2.2328 - classification_loss: 0.5431 400/500 [=======================>......] - ETA: 23s - loss: 2.7774 - regression_loss: 2.2338 - classification_loss: 0.5437 401/500 [=======================>......] - ETA: 23s - loss: 2.7781 - regression_loss: 2.2345 - classification_loss: 0.5436 402/500 [=======================>......] - ETA: 23s - loss: 2.7788 - regression_loss: 2.2349 - classification_loss: 0.5439 403/500 [=======================>......] - ETA: 23s - loss: 2.7781 - regression_loss: 2.2346 - classification_loss: 0.5435 404/500 [=======================>......] - ETA: 22s - loss: 2.7784 - regression_loss: 2.2347 - classification_loss: 0.5437 405/500 [=======================>......] 
- ETA: 22s - loss: 2.7780 - regression_loss: 2.2343 - classification_loss: 0.5437 406/500 [=======================>......] - ETA: 22s - loss: 2.7780 - regression_loss: 2.2344 - classification_loss: 0.5437 407/500 [=======================>......] - ETA: 22s - loss: 2.7793 - regression_loss: 2.2356 - classification_loss: 0.5437 408/500 [=======================>......] - ETA: 21s - loss: 2.7779 - regression_loss: 2.2348 - classification_loss: 0.5431 409/500 [=======================>......] - ETA: 21s - loss: 2.7772 - regression_loss: 2.2345 - classification_loss: 0.5427 410/500 [=======================>......] - ETA: 21s - loss: 2.7791 - regression_loss: 2.2355 - classification_loss: 0.5436 411/500 [=======================>......] - ETA: 21s - loss: 2.7794 - regression_loss: 2.2361 - classification_loss: 0.5432 412/500 [=======================>......] - ETA: 20s - loss: 2.7789 - regression_loss: 2.2359 - classification_loss: 0.5430 413/500 [=======================>......] - ETA: 20s - loss: 2.7789 - regression_loss: 2.2362 - classification_loss: 0.5428 414/500 [=======================>......] - ETA: 20s - loss: 2.7779 - regression_loss: 2.2357 - classification_loss: 0.5422 415/500 [=======================>......] - ETA: 20s - loss: 2.7755 - regression_loss: 2.2340 - classification_loss: 0.5415 416/500 [=======================>......] - ETA: 19s - loss: 2.7756 - regression_loss: 2.2341 - classification_loss: 0.5414 417/500 [========================>.....] - ETA: 19s - loss: 2.7731 - regression_loss: 2.2326 - classification_loss: 0.5406 418/500 [========================>.....] - ETA: 19s - loss: 2.7734 - regression_loss: 2.2330 - classification_loss: 0.5404 419/500 [========================>.....] - ETA: 19s - loss: 2.7719 - regression_loss: 2.2319 - classification_loss: 0.5400 420/500 [========================>.....] - ETA: 19s - loss: 2.7737 - regression_loss: 2.2338 - classification_loss: 0.5399 421/500 [========================>.....] 
- ETA: 18s - loss: 2.7724 - regression_loss: 2.2328 - classification_loss: 0.5396 422/500 [========================>.....] - ETA: 18s - loss: 2.7703 - regression_loss: 2.2310 - classification_loss: 0.5393 423/500 [========================>.....] - ETA: 18s - loss: 2.7707 - regression_loss: 2.2310 - classification_loss: 0.5396 424/500 [========================>.....] - ETA: 18s - loss: 2.7721 - regression_loss: 2.2321 - classification_loss: 0.5400 425/500 [========================>.....] - ETA: 17s - loss: 2.7724 - regression_loss: 2.2324 - classification_loss: 0.5400 426/500 [========================>.....] - ETA: 17s - loss: 2.7716 - regression_loss: 2.2318 - classification_loss: 0.5398 427/500 [========================>.....] - ETA: 17s - loss: 2.7709 - regression_loss: 2.2314 - classification_loss: 0.5394 428/500 [========================>.....] - ETA: 17s - loss: 2.7705 - regression_loss: 2.2310 - classification_loss: 0.5395 429/500 [========================>.....] - ETA: 16s - loss: 2.7706 - regression_loss: 2.2309 - classification_loss: 0.5397 430/500 [========================>.....] - ETA: 16s - loss: 2.7698 - regression_loss: 2.2305 - classification_loss: 0.5394 431/500 [========================>.....] - ETA: 16s - loss: 2.7692 - regression_loss: 2.2301 - classification_loss: 0.5390 432/500 [========================>.....] - ETA: 16s - loss: 2.7685 - regression_loss: 2.2300 - classification_loss: 0.5385 433/500 [========================>.....] - ETA: 15s - loss: 2.7698 - regression_loss: 2.2308 - classification_loss: 0.5390 434/500 [=========================>....] - ETA: 15s - loss: 2.7694 - regression_loss: 2.2307 - classification_loss: 0.5387 435/500 [=========================>....] - ETA: 15s - loss: 2.7712 - regression_loss: 2.2322 - classification_loss: 0.5390 436/500 [=========================>....] - ETA: 15s - loss: 2.7711 - regression_loss: 2.2319 - classification_loss: 0.5392 437/500 [=========================>....] 
- ETA: 14s - loss: 2.7716 - regression_loss: 2.2321 - classification_loss: 0.5394 438/500 [=========================>....] - ETA: 14s - loss: 2.7720 - regression_loss: 2.2329 - classification_loss: 0.5391 439/500 [=========================>....] - ETA: 14s - loss: 2.7715 - regression_loss: 2.2327 - classification_loss: 0.5388 440/500 [=========================>....] - ETA: 14s - loss: 2.7704 - regression_loss: 2.2319 - classification_loss: 0.5385 441/500 [=========================>....] - ETA: 14s - loss: 2.7689 - regression_loss: 2.2310 - classification_loss: 0.5380 442/500 [=========================>....] - ETA: 13s - loss: 2.7692 - regression_loss: 2.2308 - classification_loss: 0.5384 443/500 [=========================>....] - ETA: 13s - loss: 2.7691 - regression_loss: 2.2311 - classification_loss: 0.5380 444/500 [=========================>....] - ETA: 13s - loss: 2.7677 - regression_loss: 2.2303 - classification_loss: 0.5374 445/500 [=========================>....] - ETA: 13s - loss: 2.7688 - regression_loss: 2.2310 - classification_loss: 0.5378 446/500 [=========================>....] - ETA: 12s - loss: 2.7714 - regression_loss: 2.2328 - classification_loss: 0.5387 447/500 [=========================>....] - ETA: 12s - loss: 2.7711 - regression_loss: 2.2323 - classification_loss: 0.5389 448/500 [=========================>....] - ETA: 12s - loss: 2.7711 - regression_loss: 2.2322 - classification_loss: 0.5389 449/500 [=========================>....] - ETA: 12s - loss: 2.7697 - regression_loss: 2.2313 - classification_loss: 0.5385 450/500 [==========================>...] - ETA: 11s - loss: 2.7685 - regression_loss: 2.2304 - classification_loss: 0.5381 451/500 [==========================>...] - ETA: 11s - loss: 2.7677 - regression_loss: 2.2297 - classification_loss: 0.5380 452/500 [==========================>...] - ETA: 11s - loss: 2.7665 - regression_loss: 2.2292 - classification_loss: 0.5373 453/500 [==========================>...] 
- ETA: 11s - loss: 2.7663 - regression_loss: 2.2290 - classification_loss: 0.5373 454/500 [==========================>...] - ETA: 10s - loss: 2.7648 - regression_loss: 2.2280 - classification_loss: 0.5368 455/500 [==========================>...] - ETA: 10s - loss: 2.7647 - regression_loss: 2.2281 - classification_loss: 0.5366 456/500 [==========================>...] - ETA: 10s - loss: 2.7654 - regression_loss: 2.2285 - classification_loss: 0.5369 457/500 [==========================>...] - ETA: 10s - loss: 2.7674 - regression_loss: 2.2295 - classification_loss: 0.5379 458/500 [==========================>...] - ETA: 9s - loss: 2.7664 - regression_loss: 2.2291 - classification_loss: 0.5373  459/500 [==========================>...] - ETA: 9s - loss: 2.7648 - regression_loss: 2.2279 - classification_loss: 0.5369 460/500 [==========================>...] - ETA: 9s - loss: 2.7646 - regression_loss: 2.2279 - classification_loss: 0.5367 461/500 [==========================>...] - ETA: 9s - loss: 2.7641 - regression_loss: 2.2273 - classification_loss: 0.5369 462/500 [==========================>...] - ETA: 9s - loss: 2.7657 - regression_loss: 2.2286 - classification_loss: 0.5371 463/500 [==========================>...] - ETA: 8s - loss: 2.7632 - regression_loss: 2.2266 - classification_loss: 0.5365 464/500 [==========================>...] - ETA: 8s - loss: 2.7614 - regression_loss: 2.2255 - classification_loss: 0.5359 465/500 [==========================>...] - ETA: 8s - loss: 2.7601 - regression_loss: 2.2246 - classification_loss: 0.5355 466/500 [==========================>...] - ETA: 8s - loss: 2.7599 - regression_loss: 2.2232 - classification_loss: 0.5366 467/500 [===========================>..] - ETA: 7s - loss: 2.7569 - regression_loss: 2.2211 - classification_loss: 0.5358 468/500 [===========================>..] - ETA: 7s - loss: 2.7559 - regression_loss: 2.2205 - classification_loss: 0.5354 469/500 [===========================>..] 
- ETA: 7s - loss: 2.7557 - regression_loss: 2.2204 - classification_loss: 0.5353 470/500 [===========================>..] - ETA: 7s - loss: 2.7555 - regression_loss: 2.2203 - classification_loss: 0.5352 471/500 [===========================>..] - ETA: 6s - loss: 2.7563 - regression_loss: 2.2206 - classification_loss: 0.5357 472/500 [===========================>..] - ETA: 6s - loss: 2.7554 - regression_loss: 2.2199 - classification_loss: 0.5355 473/500 [===========================>..] - ETA: 6s - loss: 2.7571 - regression_loss: 2.2212 - classification_loss: 0.5359 474/500 [===========================>..] - ETA: 6s - loss: 2.7568 - regression_loss: 2.2210 - classification_loss: 0.5358 475/500 [===========================>..] - ETA: 5s - loss: 2.7566 - regression_loss: 2.2208 - classification_loss: 0.5358 476/500 [===========================>..] - ETA: 5s - loss: 2.7543 - regression_loss: 2.2192 - classification_loss: 0.5351 477/500 [===========================>..] - ETA: 5s - loss: 2.7542 - regression_loss: 2.2192 - classification_loss: 0.5349 478/500 [===========================>..] - ETA: 5s - loss: 2.7542 - regression_loss: 2.2195 - classification_loss: 0.5347 479/500 [===========================>..] - ETA: 4s - loss: 2.7554 - regression_loss: 2.2202 - classification_loss: 0.5352 480/500 [===========================>..] - ETA: 4s - loss: 2.7551 - regression_loss: 2.2201 - classification_loss: 0.5350 481/500 [===========================>..] - ETA: 4s - loss: 2.7551 - regression_loss: 2.2201 - classification_loss: 0.5349 482/500 [===========================>..] - ETA: 4s - loss: 2.7545 - regression_loss: 2.2199 - classification_loss: 0.5346 483/500 [===========================>..] - ETA: 4s - loss: 2.7530 - regression_loss: 2.2189 - classification_loss: 0.5341 484/500 [============================>.] - ETA: 3s - loss: 2.7522 - regression_loss: 2.2181 - classification_loss: 0.5341 485/500 [============================>.] 
- ETA: 3s - loss: 2.7501 - regression_loss: 2.2166 - classification_loss: 0.5335 486/500 [============================>.] - ETA: 3s - loss: 2.7504 - regression_loss: 2.2168 - classification_loss: 0.5336 487/500 [============================>.] - ETA: 3s - loss: 2.7518 - regression_loss: 2.2174 - classification_loss: 0.5344 488/500 [============================>.] - ETA: 2s - loss: 2.7509 - regression_loss: 2.2168 - classification_loss: 0.5341 489/500 [============================>.] - ETA: 2s - loss: 2.7508 - regression_loss: 2.2167 - classification_loss: 0.5342 490/500 [============================>.] - ETA: 2s - loss: 2.7499 - regression_loss: 2.2161 - classification_loss: 0.5338 491/500 [============================>.] - ETA: 2s - loss: 2.7496 - regression_loss: 2.2159 - classification_loss: 0.5337 492/500 [============================>.] - ETA: 1s - loss: 2.7494 - regression_loss: 2.2157 - classification_loss: 0.5337 493/500 [============================>.] - ETA: 1s - loss: 2.7483 - regression_loss: 2.2150 - classification_loss: 0.5333 494/500 [============================>.] - ETA: 1s - loss: 2.7486 - regression_loss: 2.2154 - classification_loss: 0.5332 495/500 [============================>.] - ETA: 1s - loss: 2.7482 - regression_loss: 2.2148 - classification_loss: 0.5334 496/500 [============================>.] - ETA: 0s - loss: 2.7480 - regression_loss: 2.2148 - classification_loss: 0.5332 497/500 [============================>.] - ETA: 0s - loss: 2.7479 - regression_loss: 2.2148 - classification_loss: 0.5331 498/500 [============================>.] - ETA: 0s - loss: 2.7475 - regression_loss: 2.2149 - classification_loss: 0.5326 499/500 [============================>.] 
500/500 [==============================] - 119s 238ms/step - loss: 2.7470 - regression_loss: 2.2144 - classification_loss: 0.5326
326 instances of class plum with average precision: 0.3225
mAP: 0.3225
Epoch 00003: saving model to ./training/snapshots/resnet50_pascal_03.h5
Epoch 4/150
[progress-bar updates for steps 1-94 of epoch 4 elided; running loss settled near 2.71 (regression_loss ~2.19, classification_loss ~0.52)]
- ETA: 1:34 - loss: 2.7131 - regression_loss: 2.1948 - classification_loss: 0.5183 95/500 [====>.........................] - ETA: 1:34 - loss: 2.7105 - regression_loss: 2.1930 - classification_loss: 0.5174 96/500 [====>.........................] - ETA: 1:33 - loss: 2.7092 - regression_loss: 2.1912 - classification_loss: 0.5181 97/500 [====>.........................] - ETA: 1:33 - loss: 2.7062 - regression_loss: 2.1883 - classification_loss: 0.5179 98/500 [====>.........................] - ETA: 1:33 - loss: 2.7047 - regression_loss: 2.1868 - classification_loss: 0.5179 99/500 [====>.........................] - ETA: 1:33 - loss: 2.7040 - regression_loss: 2.1867 - classification_loss: 0.5174 100/500 [=====>........................] - ETA: 1:32 - loss: 2.6981 - regression_loss: 2.1822 - classification_loss: 0.5159 101/500 [=====>........................] - ETA: 1:32 - loss: 2.6988 - regression_loss: 2.1831 - classification_loss: 0.5156 102/500 [=====>........................] - ETA: 1:32 - loss: 2.6940 - regression_loss: 2.1796 - classification_loss: 0.5144 103/500 [=====>........................] - ETA: 1:32 - loss: 2.7016 - regression_loss: 2.1829 - classification_loss: 0.5188 104/500 [=====>........................] - ETA: 1:32 - loss: 2.7029 - regression_loss: 2.1839 - classification_loss: 0.5190 105/500 [=====>........................] - ETA: 1:31 - loss: 2.7183 - regression_loss: 2.1978 - classification_loss: 0.5204 106/500 [=====>........................] - ETA: 1:31 - loss: 2.7195 - regression_loss: 2.1997 - classification_loss: 0.5198 107/500 [=====>........................] - ETA: 1:31 - loss: 2.7166 - regression_loss: 2.1971 - classification_loss: 0.5195 108/500 [=====>........................] - ETA: 1:31 - loss: 2.7195 - regression_loss: 2.2004 - classification_loss: 0.5192 109/500 [=====>........................] - ETA: 1:30 - loss: 2.7183 - regression_loss: 2.1995 - classification_loss: 0.5188 110/500 [=====>........................] 
- ETA: 1:30 - loss: 2.7097 - regression_loss: 2.1930 - classification_loss: 0.5168 111/500 [=====>........................] - ETA: 1:30 - loss: 2.7076 - regression_loss: 2.1916 - classification_loss: 0.5161 112/500 [=====>........................] - ETA: 1:30 - loss: 2.7076 - regression_loss: 2.1916 - classification_loss: 0.5159 113/500 [=====>........................] - ETA: 1:29 - loss: 2.7026 - regression_loss: 2.1891 - classification_loss: 0.5135 114/500 [=====>........................] - ETA: 1:29 - loss: 2.7018 - regression_loss: 2.1887 - classification_loss: 0.5131 115/500 [=====>........................] - ETA: 1:29 - loss: 2.7035 - regression_loss: 2.1907 - classification_loss: 0.5128 116/500 [=====>........................] - ETA: 1:29 - loss: 2.6982 - regression_loss: 2.1878 - classification_loss: 0.5104 117/500 [======>.......................] - ETA: 1:28 - loss: 2.6961 - regression_loss: 2.1863 - classification_loss: 0.5097 118/500 [======>.......................] - ETA: 1:28 - loss: 2.6947 - regression_loss: 2.1857 - classification_loss: 0.5091 119/500 [======>.......................] - ETA: 1:28 - loss: 2.6977 - regression_loss: 2.1880 - classification_loss: 0.5098 120/500 [======>.......................] - ETA: 1:28 - loss: 2.6997 - regression_loss: 2.1894 - classification_loss: 0.5103 121/500 [======>.......................] - ETA: 1:27 - loss: 2.7033 - regression_loss: 2.1923 - classification_loss: 0.5110 122/500 [======>.......................] - ETA: 1:27 - loss: 2.7060 - regression_loss: 2.1954 - classification_loss: 0.5106 123/500 [======>.......................] - ETA: 1:27 - loss: 2.7091 - regression_loss: 2.1989 - classification_loss: 0.5103 124/500 [======>.......................] - ETA: 1:27 - loss: 2.7107 - regression_loss: 2.1996 - classification_loss: 0.5111 125/500 [======>.......................] - ETA: 1:27 - loss: 2.7038 - regression_loss: 2.1944 - classification_loss: 0.5093 126/500 [======>.......................] 
- ETA: 1:26 - loss: 2.7083 - regression_loss: 2.1985 - classification_loss: 0.5098 127/500 [======>.......................] - ETA: 1:26 - loss: 2.7076 - regression_loss: 2.1983 - classification_loss: 0.5094 128/500 [======>.......................] - ETA: 1:26 - loss: 2.7028 - regression_loss: 2.1955 - classification_loss: 0.5073 129/500 [======>.......................] - ETA: 1:26 - loss: 2.7048 - regression_loss: 2.1987 - classification_loss: 0.5060 130/500 [======>.......................] - ETA: 1:25 - loss: 2.7092 - regression_loss: 2.2014 - classification_loss: 0.5078 131/500 [======>.......................] - ETA: 1:25 - loss: 2.7091 - regression_loss: 2.2003 - classification_loss: 0.5088 132/500 [======>.......................] - ETA: 1:25 - loss: 2.7114 - regression_loss: 2.2015 - classification_loss: 0.5099 133/500 [======>.......................] - ETA: 1:25 - loss: 2.7097 - regression_loss: 2.1996 - classification_loss: 0.5100 134/500 [=======>......................] - ETA: 1:24 - loss: 2.7034 - regression_loss: 2.1948 - classification_loss: 0.5086 135/500 [=======>......................] - ETA: 1:24 - loss: 2.7042 - regression_loss: 2.1948 - classification_loss: 0.5094 136/500 [=======>......................] - ETA: 1:24 - loss: 2.6997 - regression_loss: 2.1913 - classification_loss: 0.5084 137/500 [=======>......................] - ETA: 1:24 - loss: 2.7022 - regression_loss: 2.1925 - classification_loss: 0.5097 138/500 [=======>......................] - ETA: 1:24 - loss: 2.6997 - regression_loss: 2.1913 - classification_loss: 0.5083 139/500 [=======>......................] - ETA: 1:23 - loss: 2.6973 - regression_loss: 2.1904 - classification_loss: 0.5069 140/500 [=======>......................] - ETA: 1:23 - loss: 2.6999 - regression_loss: 2.1929 - classification_loss: 0.5070 141/500 [=======>......................] - ETA: 1:23 - loss: 2.7068 - regression_loss: 2.1975 - classification_loss: 0.5093 142/500 [=======>......................] 
- ETA: 1:23 - loss: 2.7037 - regression_loss: 2.1948 - classification_loss: 0.5089 143/500 [=======>......................] - ETA: 1:22 - loss: 2.7030 - regression_loss: 2.1943 - classification_loss: 0.5087 144/500 [=======>......................] - ETA: 1:22 - loss: 2.7019 - regression_loss: 2.1936 - classification_loss: 0.5083 145/500 [=======>......................] - ETA: 1:22 - loss: 2.7006 - regression_loss: 2.1936 - classification_loss: 0.5070 146/500 [=======>......................] - ETA: 1:22 - loss: 2.7033 - regression_loss: 2.1941 - classification_loss: 0.5092 147/500 [=======>......................] - ETA: 1:22 - loss: 2.6993 - regression_loss: 2.1915 - classification_loss: 0.5078 148/500 [=======>......................] - ETA: 1:21 - loss: 2.7033 - regression_loss: 2.1934 - classification_loss: 0.5099 149/500 [=======>......................] - ETA: 1:21 - loss: 2.6976 - regression_loss: 2.1890 - classification_loss: 0.5086 150/500 [========>.....................] - ETA: 1:21 - loss: 2.6998 - regression_loss: 2.1902 - classification_loss: 0.5096 151/500 [========>.....................] - ETA: 1:21 - loss: 2.7083 - regression_loss: 2.1944 - classification_loss: 0.5139 152/500 [========>.....................] - ETA: 1:21 - loss: 2.7135 - regression_loss: 2.1988 - classification_loss: 0.5147 153/500 [========>.....................] - ETA: 1:20 - loss: 2.7115 - regression_loss: 2.1982 - classification_loss: 0.5134 154/500 [========>.....................] - ETA: 1:20 - loss: 2.7101 - regression_loss: 2.1969 - classification_loss: 0.5132 155/500 [========>.....................] - ETA: 1:20 - loss: 2.7201 - regression_loss: 2.2072 - classification_loss: 0.5130 156/500 [========>.....................] - ETA: 1:20 - loss: 2.7240 - regression_loss: 2.2100 - classification_loss: 0.5139 157/500 [========>.....................] - ETA: 1:19 - loss: 2.7242 - regression_loss: 2.2104 - classification_loss: 0.5138 158/500 [========>.....................] 
- ETA: 1:19 - loss: 2.7205 - regression_loss: 2.2081 - classification_loss: 0.5124 159/500 [========>.....................] - ETA: 1:19 - loss: 2.7145 - regression_loss: 2.2037 - classification_loss: 0.5108 160/500 [========>.....................] - ETA: 1:19 - loss: 2.7110 - regression_loss: 2.2013 - classification_loss: 0.5097 161/500 [========>.....................] - ETA: 1:18 - loss: 2.7099 - regression_loss: 2.2005 - classification_loss: 0.5094 162/500 [========>.....................] - ETA: 1:18 - loss: 2.7086 - regression_loss: 2.1991 - classification_loss: 0.5094 163/500 [========>.....................] - ETA: 1:18 - loss: 2.7121 - regression_loss: 2.2024 - classification_loss: 0.5097 164/500 [========>.....................] - ETA: 1:18 - loss: 2.7103 - regression_loss: 2.2014 - classification_loss: 0.5090 165/500 [========>.....................] - ETA: 1:17 - loss: 2.7108 - regression_loss: 2.2029 - classification_loss: 0.5079 166/500 [========>.....................] - ETA: 1:17 - loss: 2.7100 - regression_loss: 2.2021 - classification_loss: 0.5079 167/500 [=========>....................] - ETA: 1:17 - loss: 2.7079 - regression_loss: 2.2005 - classification_loss: 0.5074 168/500 [=========>....................] - ETA: 1:17 - loss: 2.7062 - regression_loss: 2.1994 - classification_loss: 0.5068 169/500 [=========>....................] - ETA: 1:17 - loss: 2.7053 - regression_loss: 2.1985 - classification_loss: 0.5068 170/500 [=========>....................] - ETA: 1:16 - loss: 2.7063 - regression_loss: 2.1985 - classification_loss: 0.5078 171/500 [=========>....................] - ETA: 1:16 - loss: 2.7056 - regression_loss: 2.1983 - classification_loss: 0.5073 172/500 [=========>....................] - ETA: 1:16 - loss: 2.7052 - regression_loss: 2.1983 - classification_loss: 0.5069 173/500 [=========>....................] - ETA: 1:16 - loss: 2.7063 - regression_loss: 2.1989 - classification_loss: 0.5073 174/500 [=========>....................] 
- ETA: 1:15 - loss: 2.7008 - regression_loss: 2.1950 - classification_loss: 0.5058 175/500 [=========>....................] - ETA: 1:15 - loss: 2.6977 - regression_loss: 2.1929 - classification_loss: 0.5048 176/500 [=========>....................] - ETA: 1:15 - loss: 2.7024 - regression_loss: 2.1955 - classification_loss: 0.5069 177/500 [=========>....................] - ETA: 1:15 - loss: 2.7023 - regression_loss: 2.1955 - classification_loss: 0.5068 178/500 [=========>....................] - ETA: 1:14 - loss: 2.6981 - regression_loss: 2.1921 - classification_loss: 0.5060 179/500 [=========>....................] - ETA: 1:14 - loss: 2.7009 - regression_loss: 2.1939 - classification_loss: 0.5070 180/500 [=========>....................] - ETA: 1:14 - loss: 2.6942 - regression_loss: 2.1891 - classification_loss: 0.5051 181/500 [=========>....................] - ETA: 1:14 - loss: 2.6926 - regression_loss: 2.1881 - classification_loss: 0.5045 182/500 [=========>....................] - ETA: 1:14 - loss: 2.6987 - regression_loss: 2.1929 - classification_loss: 0.5057 183/500 [=========>....................] - ETA: 1:13 - loss: 2.6979 - regression_loss: 2.1920 - classification_loss: 0.5059 184/500 [==========>...................] - ETA: 1:13 - loss: 2.6949 - regression_loss: 2.1897 - classification_loss: 0.5052 185/500 [==========>...................] - ETA: 1:13 - loss: 2.6946 - regression_loss: 2.1896 - classification_loss: 0.5051 186/500 [==========>...................] - ETA: 1:13 - loss: 2.6942 - regression_loss: 2.1894 - classification_loss: 0.5049 187/500 [==========>...................] - ETA: 1:12 - loss: 2.6956 - regression_loss: 2.1902 - classification_loss: 0.5055 188/500 [==========>...................] - ETA: 1:12 - loss: 2.6914 - regression_loss: 2.1869 - classification_loss: 0.5045 189/500 [==========>...................] - ETA: 1:12 - loss: 2.6956 - regression_loss: 2.1909 - classification_loss: 0.5047 190/500 [==========>...................] 
- ETA: 1:12 - loss: 2.6956 - regression_loss: 2.1908 - classification_loss: 0.5048 191/500 [==========>...................] - ETA: 1:11 - loss: 2.6989 - regression_loss: 2.1936 - classification_loss: 0.5053 192/500 [==========>...................] - ETA: 1:11 - loss: 2.7013 - regression_loss: 2.1956 - classification_loss: 0.5057 193/500 [==========>...................] - ETA: 1:11 - loss: 2.7002 - regression_loss: 2.1951 - classification_loss: 0.5052 194/500 [==========>...................] - ETA: 1:11 - loss: 2.6995 - regression_loss: 2.1947 - classification_loss: 0.5048 195/500 [==========>...................] - ETA: 1:11 - loss: 2.7037 - regression_loss: 2.1983 - classification_loss: 0.5054 196/500 [==========>...................] - ETA: 1:10 - loss: 2.7043 - regression_loss: 2.1982 - classification_loss: 0.5060 197/500 [==========>...................] - ETA: 1:10 - loss: 2.7056 - regression_loss: 2.1986 - classification_loss: 0.5070 198/500 [==========>...................] - ETA: 1:10 - loss: 2.7100 - regression_loss: 2.1875 - classification_loss: 0.5224 199/500 [==========>...................] - ETA: 1:10 - loss: 2.7098 - regression_loss: 2.1875 - classification_loss: 0.5223 200/500 [===========>..................] - ETA: 1:09 - loss: 2.7125 - regression_loss: 2.1890 - classification_loss: 0.5235 201/500 [===========>..................] - ETA: 1:09 - loss: 2.7147 - regression_loss: 2.1906 - classification_loss: 0.5242 202/500 [===========>..................] - ETA: 1:09 - loss: 2.7157 - regression_loss: 2.1912 - classification_loss: 0.5245 203/500 [===========>..................] - ETA: 1:09 - loss: 2.7117 - regression_loss: 2.1884 - classification_loss: 0.5233 204/500 [===========>..................] - ETA: 1:08 - loss: 2.7142 - regression_loss: 2.1909 - classification_loss: 0.5232 205/500 [===========>..................] - ETA: 1:08 - loss: 2.7105 - regression_loss: 2.1885 - classification_loss: 0.5221 206/500 [===========>..................] 
- ETA: 1:08 - loss: 2.7091 - regression_loss: 2.1872 - classification_loss: 0.5218 207/500 [===========>..................] - ETA: 1:08 - loss: 2.7054 - regression_loss: 2.1846 - classification_loss: 0.5208 208/500 [===========>..................] - ETA: 1:07 - loss: 2.7027 - regression_loss: 2.1829 - classification_loss: 0.5198 209/500 [===========>..................] - ETA: 1:07 - loss: 2.7029 - regression_loss: 2.1837 - classification_loss: 0.5192 210/500 [===========>..................] - ETA: 1:07 - loss: 2.7044 - regression_loss: 2.1851 - classification_loss: 0.5193 211/500 [===========>..................] - ETA: 1:07 - loss: 2.7035 - regression_loss: 2.1846 - classification_loss: 0.5189 212/500 [===========>..................] - ETA: 1:07 - loss: 2.7036 - regression_loss: 2.1847 - classification_loss: 0.5189 213/500 [===========>..................] - ETA: 1:06 - loss: 2.7034 - regression_loss: 2.1846 - classification_loss: 0.5188 214/500 [===========>..................] - ETA: 1:06 - loss: 2.7040 - regression_loss: 2.1851 - classification_loss: 0.5190 215/500 [===========>..................] - ETA: 1:06 - loss: 2.7062 - regression_loss: 2.1877 - classification_loss: 0.5185 216/500 [===========>..................] - ETA: 1:06 - loss: 2.6991 - regression_loss: 2.1823 - classification_loss: 0.5168 217/500 [============>.................] - ETA: 1:05 - loss: 2.6978 - regression_loss: 2.1815 - classification_loss: 0.5163 218/500 [============>.................] - ETA: 1:05 - loss: 2.6954 - regression_loss: 2.1795 - classification_loss: 0.5160 219/500 [============>.................] - ETA: 1:05 - loss: 2.6929 - regression_loss: 2.1778 - classification_loss: 0.5150 220/500 [============>.................] - ETA: 1:05 - loss: 2.6942 - regression_loss: 2.1787 - classification_loss: 0.5155 221/500 [============>.................] - ETA: 1:04 - loss: 2.6908 - regression_loss: 2.1764 - classification_loss: 0.5144 222/500 [============>.................] 
- ETA: 1:04 - loss: 2.6955 - regression_loss: 2.1803 - classification_loss: 0.5152 223/500 [============>.................] - ETA: 1:04 - loss: 2.6916 - regression_loss: 2.1775 - classification_loss: 0.5141 224/500 [============>.................] - ETA: 1:04 - loss: 2.6900 - regression_loss: 2.1768 - classification_loss: 0.5132 225/500 [============>.................] - ETA: 1:04 - loss: 2.6897 - regression_loss: 2.1767 - classification_loss: 0.5131 226/500 [============>.................] - ETA: 1:03 - loss: 2.6881 - regression_loss: 2.1753 - classification_loss: 0.5129 227/500 [============>.................] - ETA: 1:03 - loss: 2.6855 - regression_loss: 2.1736 - classification_loss: 0.5119 228/500 [============>.................] - ETA: 1:03 - loss: 2.6859 - regression_loss: 2.1739 - classification_loss: 0.5120 229/500 [============>.................] - ETA: 1:03 - loss: 2.6838 - regression_loss: 2.1721 - classification_loss: 0.5116 230/500 [============>.................] - ETA: 1:02 - loss: 2.6846 - regression_loss: 2.1724 - classification_loss: 0.5122 231/500 [============>.................] - ETA: 1:02 - loss: 2.6845 - regression_loss: 2.1722 - classification_loss: 0.5123 232/500 [============>.................] - ETA: 1:02 - loss: 2.6828 - regression_loss: 2.1711 - classification_loss: 0.5117 233/500 [============>.................] - ETA: 1:02 - loss: 2.6822 - regression_loss: 2.1703 - classification_loss: 0.5118 234/500 [=============>................] - ETA: 1:01 - loss: 2.6828 - regression_loss: 2.1713 - classification_loss: 0.5116 235/500 [=============>................] - ETA: 1:01 - loss: 2.6779 - regression_loss: 2.1678 - classification_loss: 0.5101 236/500 [=============>................] - ETA: 1:01 - loss: 2.6761 - regression_loss: 2.1663 - classification_loss: 0.5098 237/500 [=============>................] - ETA: 1:01 - loss: 2.6744 - regression_loss: 2.1652 - classification_loss: 0.5093 238/500 [=============>................] 
- ETA: 1:00 - loss: 2.6735 - regression_loss: 2.1646 - classification_loss: 0.5089 239/500 [=============>................] - ETA: 1:00 - loss: 2.6708 - regression_loss: 2.1629 - classification_loss: 0.5080 240/500 [=============>................] - ETA: 1:00 - loss: 2.6678 - regression_loss: 2.1603 - classification_loss: 0.5075 241/500 [=============>................] - ETA: 1:00 - loss: 2.6607 - regression_loss: 2.1549 - classification_loss: 0.5058 242/500 [=============>................] - ETA: 1:00 - loss: 2.6614 - regression_loss: 2.1557 - classification_loss: 0.5057 243/500 [=============>................] - ETA: 59s - loss: 2.6609 - regression_loss: 2.1556 - classification_loss: 0.5053  244/500 [=============>................] - ETA: 59s - loss: 2.6612 - regression_loss: 2.1557 - classification_loss: 0.5055 245/500 [=============>................] - ETA: 59s - loss: 2.6637 - regression_loss: 2.1581 - classification_loss: 0.5055 246/500 [=============>................] - ETA: 59s - loss: 2.6639 - regression_loss: 2.1580 - classification_loss: 0.5058 247/500 [=============>................] - ETA: 58s - loss: 2.6593 - regression_loss: 2.1546 - classification_loss: 0.5047 248/500 [=============>................] - ETA: 58s - loss: 2.6593 - regression_loss: 2.1548 - classification_loss: 0.5045 249/500 [=============>................] - ETA: 58s - loss: 2.6612 - regression_loss: 2.1555 - classification_loss: 0.5056 250/500 [==============>...............] - ETA: 58s - loss: 2.6625 - regression_loss: 2.1565 - classification_loss: 0.5060 251/500 [==============>...............] - ETA: 57s - loss: 2.6665 - regression_loss: 2.1600 - classification_loss: 0.5065 252/500 [==============>...............] - ETA: 57s - loss: 2.6651 - regression_loss: 2.1594 - classification_loss: 0.5057 253/500 [==============>...............] - ETA: 57s - loss: 2.6640 - regression_loss: 2.1586 - classification_loss: 0.5054 254/500 [==============>...............] 
- ETA: 57s - loss: 2.6660 - regression_loss: 2.1604 - classification_loss: 0.5056 255/500 [==============>...............] - ETA: 57s - loss: 2.6645 - regression_loss: 2.1592 - classification_loss: 0.5053 256/500 [==============>...............] - ETA: 56s - loss: 2.6651 - regression_loss: 2.1601 - classification_loss: 0.5050 257/500 [==============>...............] - ETA: 56s - loss: 2.6642 - regression_loss: 2.1596 - classification_loss: 0.5046 258/500 [==============>...............] - ETA: 56s - loss: 2.6638 - regression_loss: 2.1597 - classification_loss: 0.5041 259/500 [==============>...............] - ETA: 56s - loss: 2.6621 - regression_loss: 2.1585 - classification_loss: 0.5035 260/500 [==============>...............] - ETA: 55s - loss: 2.6626 - regression_loss: 2.1591 - classification_loss: 0.5035 261/500 [==============>...............] - ETA: 55s - loss: 2.6607 - regression_loss: 2.1578 - classification_loss: 0.5028 262/500 [==============>...............] - ETA: 55s - loss: 2.6609 - regression_loss: 2.1585 - classification_loss: 0.5024 263/500 [==============>...............] - ETA: 55s - loss: 2.6609 - regression_loss: 2.1585 - classification_loss: 0.5024 264/500 [==============>...............] - ETA: 54s - loss: 2.6590 - regression_loss: 2.1572 - classification_loss: 0.5018 265/500 [==============>...............] - ETA: 54s - loss: 2.6624 - regression_loss: 2.1597 - classification_loss: 0.5027 266/500 [==============>...............] - ETA: 54s - loss: 2.6629 - regression_loss: 2.1601 - classification_loss: 0.5028 267/500 [===============>..............] - ETA: 54s - loss: 2.6616 - regression_loss: 2.1593 - classification_loss: 0.5022 268/500 [===============>..............] - ETA: 53s - loss: 2.6597 - regression_loss: 2.1581 - classification_loss: 0.5016 269/500 [===============>..............] - ETA: 53s - loss: 2.6591 - regression_loss: 2.1577 - classification_loss: 0.5013 270/500 [===============>..............] 
- ETA: 53s - loss: 2.6616 - regression_loss: 2.1585 - classification_loss: 0.5031 271/500 [===============>..............] - ETA: 53s - loss: 2.6629 - regression_loss: 2.1599 - classification_loss: 0.5030 272/500 [===============>..............] - ETA: 53s - loss: 2.6611 - regression_loss: 2.1585 - classification_loss: 0.5025 273/500 [===============>..............] - ETA: 52s - loss: 2.6600 - regression_loss: 2.1575 - classification_loss: 0.5025 274/500 [===============>..............] - ETA: 52s - loss: 2.6607 - regression_loss: 2.1578 - classification_loss: 0.5028 275/500 [===============>..............] - ETA: 52s - loss: 2.6602 - regression_loss: 2.1575 - classification_loss: 0.5028 276/500 [===============>..............] - ETA: 52s - loss: 2.6591 - regression_loss: 2.1567 - classification_loss: 0.5024 277/500 [===============>..............] - ETA: 51s - loss: 2.6564 - regression_loss: 2.1548 - classification_loss: 0.5016 278/500 [===============>..............] - ETA: 51s - loss: 2.6572 - regression_loss: 2.1553 - classification_loss: 0.5019 279/500 [===============>..............] - ETA: 51s - loss: 2.6546 - regression_loss: 2.1532 - classification_loss: 0.5014 280/500 [===============>..............] - ETA: 51s - loss: 2.6532 - regression_loss: 2.1521 - classification_loss: 0.5011 281/500 [===============>..............] - ETA: 50s - loss: 2.6537 - regression_loss: 2.1527 - classification_loss: 0.5010 282/500 [===============>..............] - ETA: 50s - loss: 2.6541 - regression_loss: 2.1528 - classification_loss: 0.5013 283/500 [===============>..............] - ETA: 50s - loss: 2.6557 - regression_loss: 2.1543 - classification_loss: 0.5014 284/500 [================>.............] - ETA: 50s - loss: 2.6578 - regression_loss: 2.1565 - classification_loss: 0.5013 285/500 [================>.............] - ETA: 50s - loss: 2.6553 - regression_loss: 2.1549 - classification_loss: 0.5004 286/500 [================>.............] 
- ETA: 49s - loss: 2.6531 - regression_loss: 2.1534 - classification_loss: 0.4996 287/500 [================>.............] - ETA: 49s - loss: 2.6529 - regression_loss: 2.1531 - classification_loss: 0.4998 288/500 [================>.............] - ETA: 49s - loss: 2.6547 - regression_loss: 2.1550 - classification_loss: 0.4997 289/500 [================>.............] - ETA: 49s - loss: 2.6537 - regression_loss: 2.1544 - classification_loss: 0.4993 290/500 [================>.............] - ETA: 48s - loss: 2.6504 - regression_loss: 2.1520 - classification_loss: 0.4984 291/500 [================>.............] - ETA: 48s - loss: 2.6536 - regression_loss: 2.1545 - classification_loss: 0.4991 292/500 [================>.............] - ETA: 48s - loss: 2.6530 - regression_loss: 2.1538 - classification_loss: 0.4992 293/500 [================>.............] - ETA: 48s - loss: 2.6515 - regression_loss: 2.1525 - classification_loss: 0.4990 294/500 [================>.............] - ETA: 47s - loss: 2.6516 - regression_loss: 2.1525 - classification_loss: 0.4991 295/500 [================>.............] - ETA: 47s - loss: 2.6516 - regression_loss: 2.1527 - classification_loss: 0.4989 296/500 [================>.............] - ETA: 47s - loss: 2.6570 - regression_loss: 2.1576 - classification_loss: 0.4994 297/500 [================>.............] - ETA: 47s - loss: 2.6556 - regression_loss: 2.1568 - classification_loss: 0.4989 298/500 [================>.............] - ETA: 47s - loss: 2.6551 - regression_loss: 2.1560 - classification_loss: 0.4991 299/500 [================>.............] - ETA: 46s - loss: 2.6542 - regression_loss: 2.1554 - classification_loss: 0.4988 300/500 [=================>............] - ETA: 46s - loss: 2.6566 - regression_loss: 2.1572 - classification_loss: 0.4994 301/500 [=================>............] - ETA: 46s - loss: 2.6599 - regression_loss: 2.1597 - classification_loss: 0.5002 302/500 [=================>............] 
[... per-batch progress output elided ...]
500/500 [==============================] - 116s 232ms/step - loss: 2.6240 - regression_loss: 2.1330 - classification_loss: 0.4910
326 instances of class plum with average precision: 0.4824
mAP: 0.4824
Epoch 00004: saving model to ./training/snapshots/resnet50_pascal_04.h5
Epoch 5/150
[... per-batch progress output elided ...]
137/500 [=======>......................] 
- ETA: 1:25 - loss: 2.5931 - regression_loss: 2.0623 - classification_loss: 0.5307 138/500 [=======>......................] - ETA: 1:25 - loss: 2.5948 - regression_loss: 2.0643 - classification_loss: 0.5304 139/500 [=======>......................] - ETA: 1:24 - loss: 2.5987 - regression_loss: 2.0672 - classification_loss: 0.5315 140/500 [=======>......................] - ETA: 1:24 - loss: 2.5981 - regression_loss: 2.0660 - classification_loss: 0.5321 141/500 [=======>......................] - ETA: 1:24 - loss: 2.5973 - regression_loss: 2.0653 - classification_loss: 0.5320 142/500 [=======>......................] - ETA: 1:24 - loss: 2.6008 - regression_loss: 2.0681 - classification_loss: 0.5327 143/500 [=======>......................] - ETA: 1:23 - loss: 2.6011 - regression_loss: 2.0685 - classification_loss: 0.5326 144/500 [=======>......................] - ETA: 1:23 - loss: 2.5996 - regression_loss: 2.0685 - classification_loss: 0.5311 145/500 [=======>......................] - ETA: 1:23 - loss: 2.5987 - regression_loss: 2.0688 - classification_loss: 0.5298 146/500 [=======>......................] - ETA: 1:23 - loss: 2.5943 - regression_loss: 2.0661 - classification_loss: 0.5282 147/500 [=======>......................] - ETA: 1:22 - loss: 2.5891 - regression_loss: 2.0620 - classification_loss: 0.5271 148/500 [=======>......................] - ETA: 1:22 - loss: 2.5900 - regression_loss: 2.0625 - classification_loss: 0.5275 149/500 [=======>......................] - ETA: 1:22 - loss: 2.5903 - regression_loss: 2.0629 - classification_loss: 0.5274 150/500 [========>.....................] - ETA: 1:22 - loss: 2.5865 - regression_loss: 2.0607 - classification_loss: 0.5258 151/500 [========>.....................] - ETA: 1:22 - loss: 2.5876 - regression_loss: 2.0636 - classification_loss: 0.5240 152/500 [========>.....................] - ETA: 1:21 - loss: 2.5857 - regression_loss: 2.0628 - classification_loss: 0.5229 153/500 [========>.....................] 
- ETA: 1:21 - loss: 2.5856 - regression_loss: 2.0634 - classification_loss: 0.5221 154/500 [========>.....................] - ETA: 1:21 - loss: 2.5840 - regression_loss: 2.0625 - classification_loss: 0.5215 155/500 [========>.....................] - ETA: 1:21 - loss: 2.5807 - regression_loss: 2.0606 - classification_loss: 0.5202 156/500 [========>.....................] - ETA: 1:20 - loss: 2.5781 - regression_loss: 2.0585 - classification_loss: 0.5195 157/500 [========>.....................] - ETA: 1:20 - loss: 2.5774 - regression_loss: 2.0579 - classification_loss: 0.5194 158/500 [========>.....................] - ETA: 1:20 - loss: 2.5771 - regression_loss: 2.0579 - classification_loss: 0.5192 159/500 [========>.....................] - ETA: 1:20 - loss: 2.5781 - regression_loss: 2.0594 - classification_loss: 0.5187 160/500 [========>.....................] - ETA: 1:19 - loss: 2.5755 - regression_loss: 2.0578 - classification_loss: 0.5178 161/500 [========>.....................] - ETA: 1:19 - loss: 2.5793 - regression_loss: 2.0590 - classification_loss: 0.5204 162/500 [========>.....................] - ETA: 1:19 - loss: 2.5856 - regression_loss: 2.0629 - classification_loss: 0.5228 163/500 [========>.....................] - ETA: 1:19 - loss: 2.5878 - regression_loss: 2.0664 - classification_loss: 0.5214 164/500 [========>.....................] - ETA: 1:18 - loss: 2.5866 - regression_loss: 2.0660 - classification_loss: 0.5206 165/500 [========>.....................] - ETA: 1:18 - loss: 2.5821 - regression_loss: 2.0631 - classification_loss: 0.5190 166/500 [========>.....................] - ETA: 1:18 - loss: 2.5901 - regression_loss: 2.0714 - classification_loss: 0.5187 167/500 [=========>....................] - ETA: 1:18 - loss: 2.5822 - regression_loss: 2.0590 - classification_loss: 0.5232 168/500 [=========>....................] - ETA: 1:18 - loss: 2.5855 - regression_loss: 2.0606 - classification_loss: 0.5248 169/500 [=========>....................] 
- ETA: 1:17 - loss: 2.5811 - regression_loss: 2.0577 - classification_loss: 0.5234 170/500 [=========>....................] - ETA: 1:17 - loss: 2.5818 - regression_loss: 2.0578 - classification_loss: 0.5240 171/500 [=========>....................] - ETA: 1:17 - loss: 2.5831 - regression_loss: 2.0585 - classification_loss: 0.5247 172/500 [=========>....................] - ETA: 1:17 - loss: 2.5796 - regression_loss: 2.0559 - classification_loss: 0.5237 173/500 [=========>....................] - ETA: 1:16 - loss: 2.5801 - regression_loss: 2.0561 - classification_loss: 0.5240 174/500 [=========>....................] - ETA: 1:16 - loss: 2.5799 - regression_loss: 2.0558 - classification_loss: 0.5241 175/500 [=========>....................] - ETA: 1:16 - loss: 2.5828 - regression_loss: 2.0571 - classification_loss: 0.5257 176/500 [=========>....................] - ETA: 1:16 - loss: 2.5860 - regression_loss: 2.0592 - classification_loss: 0.5268 177/500 [=========>....................] - ETA: 1:16 - loss: 2.5868 - regression_loss: 2.0595 - classification_loss: 0.5273 178/500 [=========>....................] - ETA: 1:15 - loss: 2.5915 - regression_loss: 2.0633 - classification_loss: 0.5283 179/500 [=========>....................] - ETA: 1:15 - loss: 2.5914 - regression_loss: 2.0631 - classification_loss: 0.5283 180/500 [=========>....................] - ETA: 1:15 - loss: 2.5914 - regression_loss: 2.0634 - classification_loss: 0.5281 181/500 [=========>....................] - ETA: 1:15 - loss: 2.5912 - regression_loss: 2.0638 - classification_loss: 0.5274 182/500 [=========>....................] - ETA: 1:14 - loss: 2.5857 - regression_loss: 2.0595 - classification_loss: 0.5262 183/500 [=========>....................] - ETA: 1:14 - loss: 2.5841 - regression_loss: 2.0584 - classification_loss: 0.5258 184/500 [==========>...................] - ETA: 1:14 - loss: 2.5864 - regression_loss: 2.0603 - classification_loss: 0.5261 185/500 [==========>...................] 
- ETA: 1:14 - loss: 2.5865 - regression_loss: 2.0600 - classification_loss: 0.5265 186/500 [==========>...................] - ETA: 1:13 - loss: 2.5826 - regression_loss: 2.0573 - classification_loss: 0.5253 187/500 [==========>...................] - ETA: 1:13 - loss: 2.5818 - regression_loss: 2.0567 - classification_loss: 0.5251 188/500 [==========>...................] - ETA: 1:13 - loss: 2.5813 - regression_loss: 2.0568 - classification_loss: 0.5245 189/500 [==========>...................] - ETA: 1:13 - loss: 2.5851 - regression_loss: 2.0606 - classification_loss: 0.5245 190/500 [==========>...................] - ETA: 1:12 - loss: 2.5866 - regression_loss: 2.0623 - classification_loss: 0.5244 191/500 [==========>...................] - ETA: 1:12 - loss: 2.5901 - regression_loss: 2.0650 - classification_loss: 0.5251 192/500 [==========>...................] - ETA: 1:12 - loss: 2.5846 - regression_loss: 2.0610 - classification_loss: 0.5235 193/500 [==========>...................] - ETA: 1:12 - loss: 2.5900 - regression_loss: 2.0655 - classification_loss: 0.5245 194/500 [==========>...................] - ETA: 1:12 - loss: 2.5886 - regression_loss: 2.0644 - classification_loss: 0.5243 195/500 [==========>...................] - ETA: 1:11 - loss: 2.5919 - regression_loss: 2.0663 - classification_loss: 0.5256 196/500 [==========>...................] - ETA: 1:11 - loss: 2.5903 - regression_loss: 2.0651 - classification_loss: 0.5252 197/500 [==========>...................] - ETA: 1:11 - loss: 2.5973 - regression_loss: 2.0706 - classification_loss: 0.5267 198/500 [==========>...................] - ETA: 1:11 - loss: 2.5969 - regression_loss: 2.0707 - classification_loss: 0.5262 199/500 [==========>...................] - ETA: 1:10 - loss: 2.5901 - regression_loss: 2.0655 - classification_loss: 0.5245 200/500 [===========>..................] - ETA: 1:10 - loss: 2.5897 - regression_loss: 2.0656 - classification_loss: 0.5241 201/500 [===========>..................] 
- ETA: 1:10 - loss: 2.5877 - regression_loss: 2.0644 - classification_loss: 0.5233 202/500 [===========>..................] - ETA: 1:10 - loss: 2.5907 - regression_loss: 2.0672 - classification_loss: 0.5235 203/500 [===========>..................] - ETA: 1:09 - loss: 2.5926 - regression_loss: 2.0699 - classification_loss: 0.5226 204/500 [===========>..................] - ETA: 1:09 - loss: 2.5910 - regression_loss: 2.0687 - classification_loss: 0.5223 205/500 [===========>..................] - ETA: 1:09 - loss: 2.5888 - regression_loss: 2.0669 - classification_loss: 0.5218 206/500 [===========>..................] - ETA: 1:09 - loss: 2.5907 - regression_loss: 2.0690 - classification_loss: 0.5217 207/500 [===========>..................] - ETA: 1:08 - loss: 2.5912 - regression_loss: 2.0694 - classification_loss: 0.5219 208/500 [===========>..................] - ETA: 1:08 - loss: 2.5950 - regression_loss: 2.0733 - classification_loss: 0.5217 209/500 [===========>..................] - ETA: 1:08 - loss: 2.5930 - regression_loss: 2.0721 - classification_loss: 0.5208 210/500 [===========>..................] - ETA: 1:08 - loss: 2.5933 - regression_loss: 2.0730 - classification_loss: 0.5203 211/500 [===========>..................] - ETA: 1:08 - loss: 2.5934 - regression_loss: 2.0736 - classification_loss: 0.5198 212/500 [===========>..................] - ETA: 1:07 - loss: 2.5914 - regression_loss: 2.0719 - classification_loss: 0.5195 213/500 [===========>..................] - ETA: 1:07 - loss: 2.5883 - regression_loss: 2.0699 - classification_loss: 0.5184 214/500 [===========>..................] - ETA: 1:07 - loss: 2.5906 - regression_loss: 2.0700 - classification_loss: 0.5206 215/500 [===========>..................] - ETA: 1:07 - loss: 2.5864 - regression_loss: 2.0658 - classification_loss: 0.5207 216/500 [===========>..................] - ETA: 1:06 - loss: 2.5940 - regression_loss: 2.0686 - classification_loss: 0.5253 217/500 [============>.................] 
- ETA: 1:06 - loss: 2.5971 - regression_loss: 2.0706 - classification_loss: 0.5265 218/500 [============>.................] - ETA: 1:06 - loss: 2.6007 - regression_loss: 2.0736 - classification_loss: 0.5271 219/500 [============>.................] - ETA: 1:06 - loss: 2.5993 - regression_loss: 2.0732 - classification_loss: 0.5261 220/500 [============>.................] - ETA: 1:05 - loss: 2.6002 - regression_loss: 2.0746 - classification_loss: 0.5257 221/500 [============>.................] - ETA: 1:05 - loss: 2.5984 - regression_loss: 2.0735 - classification_loss: 0.5249 222/500 [============>.................] - ETA: 1:05 - loss: 2.6004 - regression_loss: 2.0754 - classification_loss: 0.5250 223/500 [============>.................] - ETA: 1:05 - loss: 2.6006 - regression_loss: 2.0755 - classification_loss: 0.5251 224/500 [============>.................] - ETA: 1:04 - loss: 2.6013 - regression_loss: 2.0758 - classification_loss: 0.5255 225/500 [============>.................] - ETA: 1:04 - loss: 2.5996 - regression_loss: 2.0746 - classification_loss: 0.5250 226/500 [============>.................] - ETA: 1:04 - loss: 2.6020 - regression_loss: 2.0759 - classification_loss: 0.5261 227/500 [============>.................] - ETA: 1:04 - loss: 2.6023 - regression_loss: 2.0764 - classification_loss: 0.5259 228/500 [============>.................] - ETA: 1:04 - loss: 2.6012 - regression_loss: 2.0760 - classification_loss: 0.5252 229/500 [============>.................] - ETA: 1:03 - loss: 2.6072 - regression_loss: 2.0810 - classification_loss: 0.5262 230/500 [============>.................] - ETA: 1:03 - loss: 2.6058 - regression_loss: 2.0807 - classification_loss: 0.5251 231/500 [============>.................] - ETA: 1:03 - loss: 2.6058 - regression_loss: 2.0806 - classification_loss: 0.5253 232/500 [============>.................] - ETA: 1:03 - loss: 2.6064 - regression_loss: 2.0810 - classification_loss: 0.5254 233/500 [============>.................] 
- ETA: 1:02 - loss: 2.6057 - regression_loss: 2.0805 - classification_loss: 0.5252 234/500 [=============>................] - ETA: 1:02 - loss: 2.6021 - regression_loss: 2.0780 - classification_loss: 0.5242 235/500 [=============>................] - ETA: 1:02 - loss: 2.6004 - regression_loss: 2.0766 - classification_loss: 0.5238 236/500 [=============>................] - ETA: 1:02 - loss: 2.5988 - regression_loss: 2.0757 - classification_loss: 0.5232 237/500 [=============>................] - ETA: 1:01 - loss: 2.5982 - regression_loss: 2.0755 - classification_loss: 0.5227 238/500 [=============>................] - ETA: 1:01 - loss: 2.5924 - regression_loss: 2.0710 - classification_loss: 0.5214 239/500 [=============>................] - ETA: 1:01 - loss: 2.5910 - regression_loss: 2.0698 - classification_loss: 0.5211 240/500 [=============>................] - ETA: 1:01 - loss: 2.5917 - regression_loss: 2.0702 - classification_loss: 0.5215 241/500 [=============>................] - ETA: 1:01 - loss: 2.5891 - regression_loss: 2.0675 - classification_loss: 0.5216 242/500 [=============>................] - ETA: 1:00 - loss: 2.5882 - regression_loss: 2.0668 - classification_loss: 0.5214 243/500 [=============>................] - ETA: 1:00 - loss: 2.5871 - regression_loss: 2.0660 - classification_loss: 0.5211 244/500 [=============>................] - ETA: 1:00 - loss: 2.5879 - regression_loss: 2.0669 - classification_loss: 0.5210 245/500 [=============>................] - ETA: 1:00 - loss: 2.5896 - regression_loss: 2.0679 - classification_loss: 0.5217 246/500 [=============>................] - ETA: 59s - loss: 2.5911 - regression_loss: 2.0694 - classification_loss: 0.5217  247/500 [=============>................] - ETA: 59s - loss: 2.5921 - regression_loss: 2.0699 - classification_loss: 0.5223 248/500 [=============>................] - ETA: 59s - loss: 2.5895 - regression_loss: 2.0684 - classification_loss: 0.5211 249/500 [=============>................] 
- ETA: 59s - loss: 2.5896 - regression_loss: 2.0687 - classification_loss: 0.5209 250/500 [==============>...............] - ETA: 58s - loss: 2.5877 - regression_loss: 2.0681 - classification_loss: 0.5195 251/500 [==============>...............] - ETA: 58s - loss: 2.5870 - regression_loss: 2.0677 - classification_loss: 0.5193 252/500 [==============>...............] - ETA: 58s - loss: 2.5877 - regression_loss: 2.0678 - classification_loss: 0.5198 253/500 [==============>...............] - ETA: 58s - loss: 2.5883 - regression_loss: 2.0686 - classification_loss: 0.5198 254/500 [==============>...............] - ETA: 57s - loss: 2.5878 - regression_loss: 2.0682 - classification_loss: 0.5195 255/500 [==============>...............] - ETA: 57s - loss: 2.5844 - regression_loss: 2.0657 - classification_loss: 0.5186 256/500 [==============>...............] - ETA: 57s - loss: 2.5839 - regression_loss: 2.0656 - classification_loss: 0.5183 257/500 [==============>...............] - ETA: 57s - loss: 2.5807 - regression_loss: 2.0630 - classification_loss: 0.5177 258/500 [==============>...............] - ETA: 56s - loss: 2.5787 - regression_loss: 2.0618 - classification_loss: 0.5169 259/500 [==============>...............] - ETA: 56s - loss: 2.5788 - regression_loss: 2.0623 - classification_loss: 0.5165 260/500 [==============>...............] - ETA: 56s - loss: 2.5800 - regression_loss: 2.0636 - classification_loss: 0.5163 261/500 [==============>...............] - ETA: 56s - loss: 2.5789 - regression_loss: 2.0631 - classification_loss: 0.5158 262/500 [==============>...............] - ETA: 56s - loss: 2.5813 - regression_loss: 2.0649 - classification_loss: 0.5165 263/500 [==============>...............] - ETA: 55s - loss: 2.5787 - regression_loss: 2.0628 - classification_loss: 0.5159 264/500 [==============>...............] - ETA: 55s - loss: 2.5812 - regression_loss: 2.0649 - classification_loss: 0.5163 265/500 [==============>...............] 
- ETA: 55s - loss: 2.5835 - regression_loss: 2.0666 - classification_loss: 0.5169 266/500 [==============>...............] - ETA: 55s - loss: 2.5829 - regression_loss: 2.0663 - classification_loss: 0.5166 267/500 [===============>..............] - ETA: 54s - loss: 2.5816 - regression_loss: 2.0653 - classification_loss: 0.5163 268/500 [===============>..............] - ETA: 54s - loss: 2.5859 - regression_loss: 2.0691 - classification_loss: 0.5168 269/500 [===============>..............] - ETA: 54s - loss: 2.5836 - regression_loss: 2.0678 - classification_loss: 0.5159 270/500 [===============>..............] - ETA: 54s - loss: 2.5833 - regression_loss: 2.0677 - classification_loss: 0.5157 271/500 [===============>..............] - ETA: 53s - loss: 2.5814 - regression_loss: 2.0664 - classification_loss: 0.5150 272/500 [===============>..............] - ETA: 53s - loss: 2.5835 - regression_loss: 2.0679 - classification_loss: 0.5155 273/500 [===============>..............] - ETA: 53s - loss: 2.5832 - regression_loss: 2.0682 - classification_loss: 0.5150 274/500 [===============>..............] - ETA: 53s - loss: 2.5822 - regression_loss: 2.0677 - classification_loss: 0.5145 275/500 [===============>..............] - ETA: 52s - loss: 2.5815 - regression_loss: 2.0671 - classification_loss: 0.5144 276/500 [===============>..............] - ETA: 52s - loss: 2.5807 - regression_loss: 2.0667 - classification_loss: 0.5140 277/500 [===============>..............] - ETA: 52s - loss: 2.5819 - regression_loss: 2.0668 - classification_loss: 0.5150 278/500 [===============>..............] - ETA: 52s - loss: 2.5814 - regression_loss: 2.0666 - classification_loss: 0.5147 279/500 [===============>..............] - ETA: 52s - loss: 2.5805 - regression_loss: 2.0663 - classification_loss: 0.5142 280/500 [===============>..............] - ETA: 51s - loss: 2.5793 - regression_loss: 2.0655 - classification_loss: 0.5138 281/500 [===============>..............] 
- ETA: 51s - loss: 2.5746 - regression_loss: 2.0618 - classification_loss: 0.5128 282/500 [===============>..............] - ETA: 51s - loss: 2.5755 - regression_loss: 2.0625 - classification_loss: 0.5130 283/500 [===============>..............] - ETA: 51s - loss: 2.5746 - regression_loss: 2.0619 - classification_loss: 0.5126 284/500 [================>.............] - ETA: 50s - loss: 2.5773 - regression_loss: 2.0650 - classification_loss: 0.5123 285/500 [================>.............] - ETA: 50s - loss: 2.5782 - regression_loss: 2.0660 - classification_loss: 0.5123 286/500 [================>.............] - ETA: 50s - loss: 2.5749 - regression_loss: 2.0636 - classification_loss: 0.5113 287/500 [================>.............] - ETA: 50s - loss: 2.5733 - regression_loss: 2.0625 - classification_loss: 0.5107 288/500 [================>.............] - ETA: 49s - loss: 2.5736 - regression_loss: 2.0631 - classification_loss: 0.5105 289/500 [================>.............] - ETA: 49s - loss: 2.5751 - regression_loss: 2.0641 - classification_loss: 0.5110 290/500 [================>.............] - ETA: 49s - loss: 2.5754 - regression_loss: 2.0643 - classification_loss: 0.5112 291/500 [================>.............] - ETA: 49s - loss: 2.5724 - regression_loss: 2.0620 - classification_loss: 0.5104 292/500 [================>.............] - ETA: 48s - loss: 2.5730 - regression_loss: 2.0628 - classification_loss: 0.5103 293/500 [================>.............] - ETA: 48s - loss: 2.5756 - regression_loss: 2.0648 - classification_loss: 0.5108 294/500 [================>.............] - ETA: 48s - loss: 2.5708 - regression_loss: 2.0612 - classification_loss: 0.5096 295/500 [================>.............] - ETA: 48s - loss: 2.5703 - regression_loss: 2.0610 - classification_loss: 0.5093 296/500 [================>.............] - ETA: 48s - loss: 2.5684 - regression_loss: 2.0598 - classification_loss: 0.5086 297/500 [================>.............] 
- ETA: 47s - loss: 2.5683 - regression_loss: 2.0597 - classification_loss: 0.5086 298/500 [================>.............] - ETA: 47s - loss: 2.5686 - regression_loss: 2.0597 - classification_loss: 0.5090 299/500 [================>.............] - ETA: 47s - loss: 2.5685 - regression_loss: 2.0598 - classification_loss: 0.5088 300/500 [=================>............] - ETA: 47s - loss: 2.5668 - regression_loss: 2.0589 - classification_loss: 0.5080 301/500 [=================>............] - ETA: 46s - loss: 2.5679 - regression_loss: 2.0599 - classification_loss: 0.5080 302/500 [=================>............] - ETA: 46s - loss: 2.5709 - regression_loss: 2.0617 - classification_loss: 0.5092 303/500 [=================>............] - ETA: 46s - loss: 2.5716 - regression_loss: 2.0624 - classification_loss: 0.5092 304/500 [=================>............] - ETA: 46s - loss: 2.5725 - regression_loss: 2.0629 - classification_loss: 0.5096 305/500 [=================>............] - ETA: 45s - loss: 2.5728 - regression_loss: 2.0635 - classification_loss: 0.5094 306/500 [=================>............] - ETA: 45s - loss: 2.5735 - regression_loss: 2.0643 - classification_loss: 0.5093 307/500 [=================>............] - ETA: 45s - loss: 2.5723 - regression_loss: 2.0636 - classification_loss: 0.5087 308/500 [=================>............] - ETA: 45s - loss: 2.5743 - regression_loss: 2.0655 - classification_loss: 0.5088 309/500 [=================>............] - ETA: 44s - loss: 2.5733 - regression_loss: 2.0649 - classification_loss: 0.5084 310/500 [=================>............] - ETA: 44s - loss: 2.5792 - regression_loss: 2.0676 - classification_loss: 0.5116 311/500 [=================>............] - ETA: 44s - loss: 2.5796 - regression_loss: 2.0676 - classification_loss: 0.5121 312/500 [=================>............] - ETA: 44s - loss: 2.5809 - regression_loss: 2.0679 - classification_loss: 0.5130 313/500 [=================>............] 
- ETA: 43s - loss: 2.5791 - regression_loss: 2.0661 - classification_loss: 0.5129 314/500 [=================>............] - ETA: 43s - loss: 2.5784 - regression_loss: 2.0654 - classification_loss: 0.5130 315/500 [=================>............] - ETA: 43s - loss: 2.5761 - regression_loss: 2.0638 - classification_loss: 0.5122 316/500 [=================>............] - ETA: 43s - loss: 2.5746 - regression_loss: 2.0628 - classification_loss: 0.5118 317/500 [==================>...........] - ETA: 42s - loss: 2.5765 - regression_loss: 2.0643 - classification_loss: 0.5122 318/500 [==================>...........] - ETA: 42s - loss: 2.5786 - regression_loss: 2.0665 - classification_loss: 0.5121 319/500 [==================>...........] - ETA: 42s - loss: 2.5776 - regression_loss: 2.0659 - classification_loss: 0.5117 320/500 [==================>...........] - ETA: 42s - loss: 2.5752 - regression_loss: 2.0643 - classification_loss: 0.5109 321/500 [==================>...........] - ETA: 42s - loss: 2.5757 - regression_loss: 2.0649 - classification_loss: 0.5109 322/500 [==================>...........] - ETA: 41s - loss: 2.5758 - regression_loss: 2.0650 - classification_loss: 0.5107 323/500 [==================>...........] - ETA: 41s - loss: 2.5739 - regression_loss: 2.0637 - classification_loss: 0.5102 324/500 [==================>...........] - ETA: 41s - loss: 2.5727 - regression_loss: 2.0628 - classification_loss: 0.5099 325/500 [==================>...........] - ETA: 41s - loss: 2.5713 - regression_loss: 2.0619 - classification_loss: 0.5094 326/500 [==================>...........] - ETA: 40s - loss: 2.5697 - regression_loss: 2.0608 - classification_loss: 0.5089 327/500 [==================>...........] - ETA: 40s - loss: 2.5702 - regression_loss: 2.0613 - classification_loss: 0.5090 328/500 [==================>...........] - ETA: 40s - loss: 2.5687 - regression_loss: 2.0604 - classification_loss: 0.5083 329/500 [==================>...........] 
- ETA: 40s - loss: 2.5677 - regression_loss: 2.0599 - classification_loss: 0.5078 330/500 [==================>...........] - ETA: 39s - loss: 2.5685 - regression_loss: 2.0606 - classification_loss: 0.5079 331/500 [==================>...........] - ETA: 39s - loss: 2.5679 - regression_loss: 2.0601 - classification_loss: 0.5077 332/500 [==================>...........] - ETA: 39s - loss: 2.5676 - regression_loss: 2.0599 - classification_loss: 0.5076 333/500 [==================>...........] - ETA: 39s - loss: 2.5669 - regression_loss: 2.0596 - classification_loss: 0.5073 334/500 [===================>..........] - ETA: 38s - loss: 2.5652 - regression_loss: 2.0586 - classification_loss: 0.5066 335/500 [===================>..........] - ETA: 38s - loss: 2.5647 - regression_loss: 2.0583 - classification_loss: 0.5064 336/500 [===================>..........] - ETA: 38s - loss: 2.5624 - regression_loss: 2.0567 - classification_loss: 0.5056 337/500 [===================>..........] - ETA: 38s - loss: 2.5625 - regression_loss: 2.0564 - classification_loss: 0.5061 338/500 [===================>..........] - ETA: 37s - loss: 2.5623 - regression_loss: 2.0567 - classification_loss: 0.5057 339/500 [===================>..........] - ETA: 37s - loss: 2.5648 - regression_loss: 2.0589 - classification_loss: 0.5059 340/500 [===================>..........] - ETA: 37s - loss: 2.5637 - regression_loss: 2.0581 - classification_loss: 0.5056 341/500 [===================>..........] - ETA: 37s - loss: 2.5675 - regression_loss: 2.0611 - classification_loss: 0.5064 342/500 [===================>..........] - ETA: 37s - loss: 2.5654 - regression_loss: 2.0594 - classification_loss: 0.5059 343/500 [===================>..........] - ETA: 36s - loss: 2.5646 - regression_loss: 2.0589 - classification_loss: 0.5057 344/500 [===================>..........] - ETA: 36s - loss: 2.5656 - regression_loss: 2.0596 - classification_loss: 0.5061 345/500 [===================>..........] 
[Epoch 5/150: per-batch progress output elided for batches 346–500; loss decreased gradually from ~2.57 to ~2.55]
500/500 [==============================] - 116s 233ms/step - loss: 2.5476 - regression_loss: 2.0561 - classification_loss: 0.4914
326 instances of class plum with average precision: 0.4973
mAP: 0.4973
Epoch 00005: saving model to ./training/snapshots/resnet50_pascal_05.h5
Epoch 6/150
[Epoch 6/150: per-batch progress output elided through batch 180; loss fluctuated between ~2.39 and ~2.51]
- ETA: 1:14 - loss: 2.4674 - regression_loss: 1.9999 - classification_loss: 0.4675 181/500 [=========>....................] - ETA: 1:14 - loss: 2.4669 - regression_loss: 2.0002 - classification_loss: 0.4668 182/500 [=========>....................] - ETA: 1:14 - loss: 2.4699 - regression_loss: 2.0021 - classification_loss: 0.4678 183/500 [=========>....................] - ETA: 1:14 - loss: 2.4736 - regression_loss: 2.0054 - classification_loss: 0.4682 184/500 [==========>...................] - ETA: 1:13 - loss: 2.4723 - regression_loss: 2.0044 - classification_loss: 0.4679 185/500 [==========>...................] - ETA: 1:13 - loss: 2.4763 - regression_loss: 2.0079 - classification_loss: 0.4684 186/500 [==========>...................] - ETA: 1:13 - loss: 2.4774 - regression_loss: 2.0088 - classification_loss: 0.4687 187/500 [==========>...................] - ETA: 1:13 - loss: 2.4781 - regression_loss: 2.0097 - classification_loss: 0.4684 188/500 [==========>...................] - ETA: 1:13 - loss: 2.4797 - regression_loss: 2.0119 - classification_loss: 0.4678 189/500 [==========>...................] - ETA: 1:12 - loss: 2.4805 - regression_loss: 2.0130 - classification_loss: 0.4676 190/500 [==========>...................] - ETA: 1:12 - loss: 2.4795 - regression_loss: 2.0125 - classification_loss: 0.4670 191/500 [==========>...................] - ETA: 1:12 - loss: 2.4791 - regression_loss: 2.0119 - classification_loss: 0.4671 192/500 [==========>...................] - ETA: 1:12 - loss: 2.4784 - regression_loss: 2.0120 - classification_loss: 0.4664 193/500 [==========>...................] - ETA: 1:11 - loss: 2.4774 - regression_loss: 2.0117 - classification_loss: 0.4657 194/500 [==========>...................] - ETA: 1:11 - loss: 2.4767 - regression_loss: 2.0110 - classification_loss: 0.4657 195/500 [==========>...................] - ETA: 1:11 - loss: 2.4792 - regression_loss: 2.0128 - classification_loss: 0.4664 196/500 [==========>...................] 
- ETA: 1:11 - loss: 2.4791 - regression_loss: 2.0130 - classification_loss: 0.4661 197/500 [==========>...................] - ETA: 1:10 - loss: 2.4791 - regression_loss: 2.0133 - classification_loss: 0.4658 198/500 [==========>...................] - ETA: 1:10 - loss: 2.5276 - regression_loss: 2.0032 - classification_loss: 0.5244 199/500 [==========>...................] - ETA: 1:10 - loss: 2.5291 - regression_loss: 2.0049 - classification_loss: 0.5242 200/500 [===========>..................] - ETA: 1:10 - loss: 2.5324 - regression_loss: 2.0079 - classification_loss: 0.5246 201/500 [===========>..................] - ETA: 1:10 - loss: 2.5280 - regression_loss: 2.0050 - classification_loss: 0.5230 202/500 [===========>..................] - ETA: 1:09 - loss: 2.5285 - regression_loss: 2.0058 - classification_loss: 0.5227 203/500 [===========>..................] - ETA: 1:09 - loss: 2.5328 - regression_loss: 2.0110 - classification_loss: 0.5218 204/500 [===========>..................] - ETA: 1:09 - loss: 2.5312 - regression_loss: 2.0106 - classification_loss: 0.5206 205/500 [===========>..................] - ETA: 1:09 - loss: 2.5279 - regression_loss: 2.0083 - classification_loss: 0.5196 206/500 [===========>..................] - ETA: 1:08 - loss: 2.5233 - regression_loss: 2.0054 - classification_loss: 0.5179 207/500 [===========>..................] - ETA: 1:08 - loss: 2.5217 - regression_loss: 2.0047 - classification_loss: 0.5170 208/500 [===========>..................] - ETA: 1:08 - loss: 2.5180 - regression_loss: 2.0022 - classification_loss: 0.5158 209/500 [===========>..................] - ETA: 1:08 - loss: 2.5188 - regression_loss: 2.0030 - classification_loss: 0.5158 210/500 [===========>..................] - ETA: 1:07 - loss: 2.5191 - regression_loss: 2.0035 - classification_loss: 0.5156 211/500 [===========>..................] - ETA: 1:07 - loss: 2.5179 - regression_loss: 2.0028 - classification_loss: 0.5152 212/500 [===========>..................] 
- ETA: 1:07 - loss: 2.5185 - regression_loss: 2.0037 - classification_loss: 0.5149 213/500 [===========>..................] - ETA: 1:07 - loss: 2.5131 - regression_loss: 1.9995 - classification_loss: 0.5137 214/500 [===========>..................] - ETA: 1:07 - loss: 2.5112 - regression_loss: 1.9985 - classification_loss: 0.5127 215/500 [===========>..................] - ETA: 1:06 - loss: 2.5089 - regression_loss: 1.9972 - classification_loss: 0.5117 216/500 [===========>..................] - ETA: 1:06 - loss: 2.5124 - regression_loss: 1.9997 - classification_loss: 0.5128 217/500 [============>.................] - ETA: 1:06 - loss: 2.5123 - regression_loss: 1.9999 - classification_loss: 0.5124 218/500 [============>.................] - ETA: 1:06 - loss: 2.5084 - regression_loss: 1.9973 - classification_loss: 0.5110 219/500 [============>.................] - ETA: 1:05 - loss: 2.5039 - regression_loss: 1.9939 - classification_loss: 0.5100 220/500 [============>.................] - ETA: 1:05 - loss: 2.4997 - regression_loss: 1.9913 - classification_loss: 0.5085 221/500 [============>.................] - ETA: 1:05 - loss: 2.4989 - regression_loss: 1.9913 - classification_loss: 0.5076 222/500 [============>.................] - ETA: 1:05 - loss: 2.5006 - regression_loss: 1.9929 - classification_loss: 0.5077 223/500 [============>.................] - ETA: 1:04 - loss: 2.5013 - regression_loss: 1.9937 - classification_loss: 0.5077 224/500 [============>.................] - ETA: 1:04 - loss: 2.5028 - regression_loss: 1.9944 - classification_loss: 0.5083 225/500 [============>.................] - ETA: 1:04 - loss: 2.5062 - regression_loss: 1.9979 - classification_loss: 0.5083 226/500 [============>.................] - ETA: 1:04 - loss: 2.5061 - regression_loss: 1.9981 - classification_loss: 0.5080 227/500 [============>.................] - ETA: 1:04 - loss: 2.5078 - regression_loss: 1.9995 - classification_loss: 0.5083 228/500 [============>.................] 
- ETA: 1:03 - loss: 2.5061 - regression_loss: 1.9988 - classification_loss: 0.5073 229/500 [============>.................] - ETA: 1:03 - loss: 2.5045 - regression_loss: 1.9980 - classification_loss: 0.5065 230/500 [============>.................] - ETA: 1:03 - loss: 2.5065 - regression_loss: 1.9998 - classification_loss: 0.5067 231/500 [============>.................] - ETA: 1:03 - loss: 2.5071 - regression_loss: 2.0006 - classification_loss: 0.5065 232/500 [============>.................] - ETA: 1:02 - loss: 2.5074 - regression_loss: 2.0012 - classification_loss: 0.5062 233/500 [============>.................] - ETA: 1:02 - loss: 2.5090 - regression_loss: 2.0020 - classification_loss: 0.5070 234/500 [=============>................] - ETA: 1:02 - loss: 2.5130 - regression_loss: 2.0055 - classification_loss: 0.5075 235/500 [=============>................] - ETA: 1:02 - loss: 2.5152 - regression_loss: 2.0076 - classification_loss: 0.5076 236/500 [=============>................] - ETA: 1:01 - loss: 2.5147 - regression_loss: 2.0077 - classification_loss: 0.5070 237/500 [=============>................] - ETA: 1:01 - loss: 2.5157 - regression_loss: 2.0084 - classification_loss: 0.5072 238/500 [=============>................] - ETA: 1:01 - loss: 2.5132 - regression_loss: 2.0069 - classification_loss: 0.5063 239/500 [=============>................] - ETA: 1:01 - loss: 2.5116 - regression_loss: 2.0059 - classification_loss: 0.5057 240/500 [=============>................] - ETA: 1:00 - loss: 2.5135 - regression_loss: 2.0057 - classification_loss: 0.5077 241/500 [=============>................] - ETA: 1:00 - loss: 2.5131 - regression_loss: 2.0060 - classification_loss: 0.5072 242/500 [=============>................] - ETA: 1:00 - loss: 2.5121 - regression_loss: 2.0052 - classification_loss: 0.5069 243/500 [=============>................] - ETA: 1:00 - loss: 2.5138 - regression_loss: 2.0068 - classification_loss: 0.5070 244/500 [=============>................] 
- ETA: 1:00 - loss: 2.5132 - regression_loss: 2.0066 - classification_loss: 0.5067 245/500 [=============>................] - ETA: 59s - loss: 2.5123 - regression_loss: 2.0062 - classification_loss: 0.5061  246/500 [=============>................] - ETA: 59s - loss: 2.5101 - regression_loss: 2.0050 - classification_loss: 0.5051 247/500 [=============>................] - ETA: 59s - loss: 2.5097 - regression_loss: 2.0045 - classification_loss: 0.5053 248/500 [=============>................] - ETA: 59s - loss: 2.5072 - regression_loss: 2.0027 - classification_loss: 0.5045 249/500 [=============>................] - ETA: 58s - loss: 2.5064 - regression_loss: 2.0022 - classification_loss: 0.5042 250/500 [==============>...............] - ETA: 58s - loss: 2.5059 - regression_loss: 2.0021 - classification_loss: 0.5038 251/500 [==============>...............] - ETA: 58s - loss: 2.5040 - regression_loss: 2.0007 - classification_loss: 0.5033 252/500 [==============>...............] - ETA: 58s - loss: 2.5036 - regression_loss: 2.0010 - classification_loss: 0.5025 253/500 [==============>...............] - ETA: 57s - loss: 2.5062 - regression_loss: 2.0037 - classification_loss: 0.5026 254/500 [==============>...............] - ETA: 57s - loss: 2.5057 - regression_loss: 2.0033 - classification_loss: 0.5023 255/500 [==============>...............] - ETA: 57s - loss: 2.5051 - regression_loss: 2.0029 - classification_loss: 0.5021 256/500 [==============>...............] - ETA: 57s - loss: 2.5064 - regression_loss: 2.0036 - classification_loss: 0.5028 257/500 [==============>...............] - ETA: 56s - loss: 2.5072 - regression_loss: 2.0042 - classification_loss: 0.5030 258/500 [==============>...............] - ETA: 56s - loss: 2.5068 - regression_loss: 2.0042 - classification_loss: 0.5026 259/500 [==============>...............] - ETA: 56s - loss: 2.5050 - regression_loss: 2.0033 - classification_loss: 0.5018 260/500 [==============>...............] 
- ETA: 56s - loss: 2.5072 - regression_loss: 2.0046 - classification_loss: 0.5025 261/500 [==============>...............] - ETA: 56s - loss: 2.5087 - regression_loss: 2.0060 - classification_loss: 0.5027 262/500 [==============>...............] - ETA: 55s - loss: 2.5082 - regression_loss: 2.0057 - classification_loss: 0.5026 263/500 [==============>...............] - ETA: 55s - loss: 2.5104 - regression_loss: 2.0076 - classification_loss: 0.5028 264/500 [==============>...............] - ETA: 55s - loss: 2.5125 - regression_loss: 2.0096 - classification_loss: 0.5028 265/500 [==============>...............] - ETA: 55s - loss: 2.5121 - regression_loss: 2.0097 - classification_loss: 0.5023 266/500 [==============>...............] - ETA: 54s - loss: 2.5159 - regression_loss: 2.0129 - classification_loss: 0.5030 267/500 [===============>..............] - ETA: 54s - loss: 2.5134 - regression_loss: 2.0115 - classification_loss: 0.5019 268/500 [===============>..............] - ETA: 54s - loss: 2.5134 - regression_loss: 2.0115 - classification_loss: 0.5019 269/500 [===============>..............] - ETA: 54s - loss: 2.5136 - regression_loss: 2.0121 - classification_loss: 0.5015 270/500 [===============>..............] - ETA: 53s - loss: 2.5115 - regression_loss: 2.0106 - classification_loss: 0.5009 271/500 [===============>..............] - ETA: 53s - loss: 2.5108 - regression_loss: 2.0109 - classification_loss: 0.4999 272/500 [===============>..............] - ETA: 53s - loss: 2.5085 - regression_loss: 2.0093 - classification_loss: 0.4992 273/500 [===============>..............] - ETA: 53s - loss: 2.5076 - regression_loss: 2.0090 - classification_loss: 0.4986 274/500 [===============>..............] - ETA: 53s - loss: 2.5068 - regression_loss: 2.0084 - classification_loss: 0.4984 275/500 [===============>..............] - ETA: 52s - loss: 2.5070 - regression_loss: 2.0086 - classification_loss: 0.4984 276/500 [===============>..............] 
- ETA: 52s - loss: 2.5076 - regression_loss: 2.0094 - classification_loss: 0.4982 277/500 [===============>..............] - ETA: 52s - loss: 2.5051 - regression_loss: 2.0078 - classification_loss: 0.4973 278/500 [===============>..............] - ETA: 52s - loss: 2.5053 - regression_loss: 2.0078 - classification_loss: 0.4975 279/500 [===============>..............] - ETA: 51s - loss: 2.5086 - regression_loss: 2.0103 - classification_loss: 0.4983 280/500 [===============>..............] - ETA: 51s - loss: 2.5061 - regression_loss: 2.0085 - classification_loss: 0.4976 281/500 [===============>..............] - ETA: 51s - loss: 2.5062 - regression_loss: 2.0088 - classification_loss: 0.4973 282/500 [===============>..............] - ETA: 51s - loss: 2.5077 - regression_loss: 2.0105 - classification_loss: 0.4972 283/500 [===============>..............] - ETA: 50s - loss: 2.5030 - regression_loss: 2.0066 - classification_loss: 0.4964 284/500 [================>.............] - ETA: 50s - loss: 2.5006 - regression_loss: 2.0050 - classification_loss: 0.4956 285/500 [================>.............] - ETA: 50s - loss: 2.5011 - regression_loss: 2.0055 - classification_loss: 0.4957 286/500 [================>.............] - ETA: 50s - loss: 2.5016 - regression_loss: 2.0056 - classification_loss: 0.4959 287/500 [================>.............] - ETA: 49s - loss: 2.5030 - regression_loss: 2.0069 - classification_loss: 0.4961 288/500 [================>.............] - ETA: 49s - loss: 2.5051 - regression_loss: 2.0085 - classification_loss: 0.4966 289/500 [================>.............] - ETA: 49s - loss: 2.5060 - regression_loss: 2.0101 - classification_loss: 0.4959 290/500 [================>.............] - ETA: 49s - loss: 2.5064 - regression_loss: 2.0104 - classification_loss: 0.4960 291/500 [================>.............] - ETA: 49s - loss: 2.5125 - regression_loss: 2.0111 - classification_loss: 0.5014 292/500 [================>.............] 
- ETA: 48s - loss: 2.5115 - regression_loss: 2.0105 - classification_loss: 0.5010 293/500 [================>.............] - ETA: 48s - loss: 2.5102 - regression_loss: 2.0097 - classification_loss: 0.5005 294/500 [================>.............] - ETA: 48s - loss: 2.5090 - regression_loss: 2.0086 - classification_loss: 0.5003 295/500 [================>.............] - ETA: 48s - loss: 2.5076 - regression_loss: 2.0077 - classification_loss: 0.4999 296/500 [================>.............] - ETA: 47s - loss: 2.5062 - regression_loss: 2.0067 - classification_loss: 0.4995 297/500 [================>.............] - ETA: 47s - loss: 2.5098 - regression_loss: 2.0107 - classification_loss: 0.4991 298/500 [================>.............] - ETA: 47s - loss: 2.5101 - regression_loss: 2.0111 - classification_loss: 0.4990 299/500 [================>.............] - ETA: 47s - loss: 2.5114 - regression_loss: 2.0122 - classification_loss: 0.4992 300/500 [=================>............] - ETA: 46s - loss: 2.5124 - regression_loss: 2.0131 - classification_loss: 0.4993 301/500 [=================>............] - ETA: 46s - loss: 2.5099 - regression_loss: 2.0115 - classification_loss: 0.4985 302/500 [=================>............] - ETA: 46s - loss: 2.5080 - regression_loss: 2.0094 - classification_loss: 0.4987 303/500 [=================>............] - ETA: 46s - loss: 2.5031 - regression_loss: 2.0056 - classification_loss: 0.4975 304/500 [=================>............] - ETA: 45s - loss: 2.5048 - regression_loss: 2.0068 - classification_loss: 0.4980 305/500 [=================>............] - ETA: 45s - loss: 2.5053 - regression_loss: 2.0075 - classification_loss: 0.4977 306/500 [=================>............] - ETA: 45s - loss: 2.5045 - regression_loss: 2.0068 - classification_loss: 0.4977 307/500 [=================>............] - ETA: 45s - loss: 2.5016 - regression_loss: 2.0048 - classification_loss: 0.4968 308/500 [=================>............] 
- ETA: 45s - loss: 2.4998 - regression_loss: 2.0035 - classification_loss: 0.4963 309/500 [=================>............] - ETA: 44s - loss: 2.4991 - regression_loss: 2.0031 - classification_loss: 0.4960 310/500 [=================>............] - ETA: 44s - loss: 2.5018 - regression_loss: 2.0050 - classification_loss: 0.4968 311/500 [=================>............] - ETA: 44s - loss: 2.5008 - regression_loss: 2.0040 - classification_loss: 0.4967 312/500 [=================>............] - ETA: 44s - loss: 2.5017 - regression_loss: 2.0050 - classification_loss: 0.4967 313/500 [=================>............] - ETA: 43s - loss: 2.5023 - regression_loss: 2.0057 - classification_loss: 0.4966 314/500 [=================>............] - ETA: 43s - loss: 2.5029 - regression_loss: 2.0061 - classification_loss: 0.4968 315/500 [=================>............] - ETA: 43s - loss: 2.5007 - regression_loss: 2.0045 - classification_loss: 0.4962 316/500 [=================>............] - ETA: 43s - loss: 2.4971 - regression_loss: 2.0018 - classification_loss: 0.4954 317/500 [==================>...........] - ETA: 42s - loss: 2.4995 - regression_loss: 2.0048 - classification_loss: 0.4947 318/500 [==================>...........] - ETA: 42s - loss: 2.4981 - regression_loss: 2.0040 - classification_loss: 0.4942 319/500 [==================>...........] - ETA: 42s - loss: 2.4955 - regression_loss: 2.0022 - classification_loss: 0.4933 320/500 [==================>...........] - ETA: 42s - loss: 2.4959 - regression_loss: 2.0028 - classification_loss: 0.4930 321/500 [==================>...........] - ETA: 42s - loss: 2.4962 - regression_loss: 2.0033 - classification_loss: 0.4929 322/500 [==================>...........] - ETA: 41s - loss: 2.4993 - regression_loss: 2.0055 - classification_loss: 0.4938 323/500 [==================>...........] - ETA: 41s - loss: 2.4947 - regression_loss: 2.0021 - classification_loss: 0.4926 324/500 [==================>...........] 
- ETA: 41s - loss: 2.4970 - regression_loss: 2.0040 - classification_loss: 0.4930 325/500 [==================>...........] - ETA: 41s - loss: 2.4990 - regression_loss: 2.0054 - classification_loss: 0.4935 326/500 [==================>...........] - ETA: 40s - loss: 2.4965 - regression_loss: 2.0039 - classification_loss: 0.4926 327/500 [==================>...........] - ETA: 40s - loss: 2.4938 - regression_loss: 2.0018 - classification_loss: 0.4920 328/500 [==================>...........] - ETA: 40s - loss: 2.4935 - regression_loss: 2.0020 - classification_loss: 0.4914 329/500 [==================>...........] - ETA: 40s - loss: 2.4963 - regression_loss: 2.0048 - classification_loss: 0.4916 330/500 [==================>...........] - ETA: 39s - loss: 2.4976 - regression_loss: 2.0058 - classification_loss: 0.4918 331/500 [==================>...........] - ETA: 39s - loss: 2.4968 - regression_loss: 2.0054 - classification_loss: 0.4914 332/500 [==================>...........] - ETA: 39s - loss: 2.4973 - regression_loss: 2.0057 - classification_loss: 0.4916 333/500 [==================>...........] - ETA: 39s - loss: 2.4948 - regression_loss: 2.0044 - classification_loss: 0.4905 334/500 [===================>..........] - ETA: 38s - loss: 2.4967 - regression_loss: 2.0062 - classification_loss: 0.4905 335/500 [===================>..........] - ETA: 38s - loss: 2.4951 - regression_loss: 2.0053 - classification_loss: 0.4898 336/500 [===================>..........] - ETA: 38s - loss: 2.4946 - regression_loss: 2.0053 - classification_loss: 0.4893 337/500 [===================>..........] - ETA: 38s - loss: 2.4967 - regression_loss: 2.0069 - classification_loss: 0.4897 338/500 [===================>..........] - ETA: 38s - loss: 2.4957 - regression_loss: 2.0067 - classification_loss: 0.4890 339/500 [===================>..........] - ETA: 37s - loss: 2.4963 - regression_loss: 2.0066 - classification_loss: 0.4897 340/500 [===================>..........] 
- ETA: 37s - loss: 2.4984 - regression_loss: 2.0078 - classification_loss: 0.4906 341/500 [===================>..........] - ETA: 37s - loss: 2.4982 - regression_loss: 2.0079 - classification_loss: 0.4903 342/500 [===================>..........] - ETA: 37s - loss: 2.4995 - regression_loss: 2.0090 - classification_loss: 0.4904 343/500 [===================>..........] - ETA: 36s - loss: 2.4996 - regression_loss: 2.0092 - classification_loss: 0.4904 344/500 [===================>..........] - ETA: 36s - loss: 2.5012 - regression_loss: 2.0102 - classification_loss: 0.4910 345/500 [===================>..........] - ETA: 36s - loss: 2.5003 - regression_loss: 2.0097 - classification_loss: 0.4906 346/500 [===================>..........] - ETA: 36s - loss: 2.4999 - regression_loss: 2.0096 - classification_loss: 0.4903 347/500 [===================>..........] - ETA: 35s - loss: 2.4994 - regression_loss: 2.0094 - classification_loss: 0.4900 348/500 [===================>..........] - ETA: 35s - loss: 2.4991 - regression_loss: 2.0089 - classification_loss: 0.4902 349/500 [===================>..........] - ETA: 35s - loss: 2.4982 - regression_loss: 2.0084 - classification_loss: 0.4898 350/500 [====================>.........] - ETA: 35s - loss: 2.4978 - regression_loss: 2.0081 - classification_loss: 0.4897 351/500 [====================>.........] - ETA: 34s - loss: 2.4987 - regression_loss: 2.0090 - classification_loss: 0.4896 352/500 [====================>.........] - ETA: 34s - loss: 2.4988 - regression_loss: 2.0094 - classification_loss: 0.4895 353/500 [====================>.........] - ETA: 34s - loss: 2.5016 - regression_loss: 2.0118 - classification_loss: 0.4897 354/500 [====================>.........] - ETA: 34s - loss: 2.5015 - regression_loss: 2.0123 - classification_loss: 0.4892 355/500 [====================>.........] - ETA: 34s - loss: 2.5050 - regression_loss: 2.0135 - classification_loss: 0.4915 356/500 [====================>.........] 
- ETA: 33s - loss: 2.5050 - regression_loss: 2.0137 - classification_loss: 0.4913 357/500 [====================>.........] - ETA: 33s - loss: 2.5064 - regression_loss: 2.0152 - classification_loss: 0.4912 358/500 [====================>.........] - ETA: 33s - loss: 2.5082 - regression_loss: 2.0168 - classification_loss: 0.4914 359/500 [====================>.........] - ETA: 33s - loss: 2.5052 - regression_loss: 2.0144 - classification_loss: 0.4908 360/500 [====================>.........] - ETA: 32s - loss: 2.5050 - regression_loss: 2.0143 - classification_loss: 0.4908 361/500 [====================>.........] - ETA: 32s - loss: 2.5057 - regression_loss: 2.0151 - classification_loss: 0.4906 362/500 [====================>.........] - ETA: 32s - loss: 2.5066 - regression_loss: 2.0160 - classification_loss: 0.4906 363/500 [====================>.........] - ETA: 32s - loss: 2.5111 - regression_loss: 2.0196 - classification_loss: 0.4915 364/500 [====================>.........] - ETA: 31s - loss: 2.5125 - regression_loss: 2.0208 - classification_loss: 0.4917 365/500 [====================>.........] - ETA: 31s - loss: 2.5128 - regression_loss: 2.0210 - classification_loss: 0.4918 366/500 [====================>.........] - ETA: 31s - loss: 2.5129 - regression_loss: 2.0215 - classification_loss: 0.4914 367/500 [=====================>........] - ETA: 31s - loss: 2.5126 - regression_loss: 2.0217 - classification_loss: 0.4909 368/500 [=====================>........] - ETA: 30s - loss: 2.5095 - regression_loss: 2.0195 - classification_loss: 0.4900 369/500 [=====================>........] - ETA: 30s - loss: 2.5099 - regression_loss: 2.0198 - classification_loss: 0.4902 370/500 [=====================>........] - ETA: 30s - loss: 2.5103 - regression_loss: 2.0204 - classification_loss: 0.4899 371/500 [=====================>........] - ETA: 30s - loss: 2.5088 - regression_loss: 2.0193 - classification_loss: 0.4895 372/500 [=====================>........] 
- ETA: 30s - loss: 2.5094 - regression_loss: 2.0201 - classification_loss: 0.4894 373/500 [=====================>........] - ETA: 29s - loss: 2.5092 - regression_loss: 2.0198 - classification_loss: 0.4894 374/500 [=====================>........] - ETA: 29s - loss: 2.5090 - regression_loss: 2.0197 - classification_loss: 0.4893 375/500 [=====================>........] - ETA: 29s - loss: 2.5088 - regression_loss: 2.0194 - classification_loss: 0.4895 376/500 [=====================>........] - ETA: 29s - loss: 2.5079 - regression_loss: 2.0183 - classification_loss: 0.4896 377/500 [=====================>........] - ETA: 28s - loss: 2.5083 - regression_loss: 2.0188 - classification_loss: 0.4895 378/500 [=====================>........] - ETA: 28s - loss: 2.5064 - regression_loss: 2.0176 - classification_loss: 0.4889 379/500 [=====================>........] - ETA: 28s - loss: 2.5077 - regression_loss: 2.0192 - classification_loss: 0.4885 380/500 [=====================>........] - ETA: 28s - loss: 2.5060 - regression_loss: 2.0182 - classification_loss: 0.4878 381/500 [=====================>........] - ETA: 27s - loss: 2.5052 - regression_loss: 2.0179 - classification_loss: 0.4873 382/500 [=====================>........] - ETA: 27s - loss: 2.5033 - regression_loss: 2.0165 - classification_loss: 0.4869 383/500 [=====================>........] - ETA: 27s - loss: 2.5040 - regression_loss: 2.0167 - classification_loss: 0.4873 384/500 [======================>.......] - ETA: 27s - loss: 2.5014 - regression_loss: 2.0146 - classification_loss: 0.4868 385/500 [======================>.......] - ETA: 27s - loss: 2.5011 - regression_loss: 2.0147 - classification_loss: 0.4863 386/500 [======================>.......] - ETA: 26s - loss: 2.5012 - regression_loss: 2.0151 - classification_loss: 0.4861 387/500 [======================>.......] - ETA: 26s - loss: 2.5010 - regression_loss: 2.0152 - classification_loss: 0.4859 388/500 [======================>.......] 
- ETA: 26s - loss: 2.5004 - regression_loss: 2.0147 - classification_loss: 0.4857
500/500 [==============================] - 118s 235ms/step - loss: 2.4879 - regression_loss: 2.0016 - classification_loss: 0.4862
326 instances of class plum with average precision: 0.5184
mAP: 0.5184
Epoch 00006: saving model to ./training/snapshots/resnet50_pascal_06.h5
Epoch 7/150
  1/500 [..............................] - ETA: 1:56 - loss: 2.3354 - regression_loss: 1.9876 - classification_loss: 0.3478
 14/500 [..............................] 
- ETA: 1:55 - loss: 2.5693 - regression_loss: 2.1006 - classification_loss: 0.4687
222/500 [============>.................] 
- ETA: 1:04 - loss: 2.4217 - regression_loss: 1.9769 - classification_loss: 0.4448 223/500 [============>.................] - ETA: 1:04 - loss: 2.4264 - regression_loss: 1.9811 - classification_loss: 0.4453 224/500 [============>.................] - ETA: 1:04 - loss: 2.4280 - regression_loss: 1.9818 - classification_loss: 0.4462 225/500 [============>.................] - ETA: 1:03 - loss: 2.4294 - regression_loss: 1.9827 - classification_loss: 0.4466 226/500 [============>.................] - ETA: 1:03 - loss: 2.4296 - regression_loss: 1.9832 - classification_loss: 0.4464 227/500 [============>.................] - ETA: 1:03 - loss: 2.4300 - regression_loss: 1.9834 - classification_loss: 0.4465 228/500 [============>.................] - ETA: 1:03 - loss: 2.4251 - regression_loss: 1.9788 - classification_loss: 0.4462 229/500 [============>.................] - ETA: 1:02 - loss: 2.4252 - regression_loss: 1.9791 - classification_loss: 0.4462 230/500 [============>.................] - ETA: 1:02 - loss: 2.4284 - regression_loss: 1.9813 - classification_loss: 0.4470 231/500 [============>.................] - ETA: 1:02 - loss: 2.4317 - regression_loss: 1.9835 - classification_loss: 0.4482 232/500 [============>.................] - ETA: 1:02 - loss: 2.4312 - regression_loss: 1.9832 - classification_loss: 0.4480 233/500 [============>.................] - ETA: 1:02 - loss: 2.4269 - regression_loss: 1.9798 - classification_loss: 0.4471 234/500 [=============>................] - ETA: 1:01 - loss: 2.4267 - regression_loss: 1.9797 - classification_loss: 0.4471 235/500 [=============>................] - ETA: 1:01 - loss: 2.4289 - regression_loss: 1.9818 - classification_loss: 0.4470 236/500 [=============>................] - ETA: 1:01 - loss: 2.4284 - regression_loss: 1.9819 - classification_loss: 0.4464 237/500 [=============>................] - ETA: 1:01 - loss: 2.4261 - regression_loss: 1.9802 - classification_loss: 0.4459 238/500 [=============>................] 
- ETA: 1:00 - loss: 2.4286 - regression_loss: 1.9810 - classification_loss: 0.4476 239/500 [=============>................] - ETA: 1:00 - loss: 2.4264 - regression_loss: 1.9792 - classification_loss: 0.4472 240/500 [=============>................] - ETA: 1:00 - loss: 2.4268 - regression_loss: 1.9796 - classification_loss: 0.4472 241/500 [=============>................] - ETA: 1:00 - loss: 2.4285 - regression_loss: 1.9805 - classification_loss: 0.4480 242/500 [=============>................] - ETA: 59s - loss: 2.4236 - regression_loss: 1.9768 - classification_loss: 0.4468  243/500 [=============>................] - ETA: 59s - loss: 2.4272 - regression_loss: 1.9797 - classification_loss: 0.4475 244/500 [=============>................] - ETA: 59s - loss: 2.4262 - regression_loss: 1.9789 - classification_loss: 0.4473 245/500 [=============>................] - ETA: 59s - loss: 2.4312 - regression_loss: 1.9830 - classification_loss: 0.4482 246/500 [=============>................] - ETA: 59s - loss: 2.4321 - regression_loss: 1.9832 - classification_loss: 0.4489 247/500 [=============>................] - ETA: 58s - loss: 2.4314 - regression_loss: 1.9825 - classification_loss: 0.4489 248/500 [=============>................] - ETA: 58s - loss: 2.4353 - regression_loss: 1.9863 - classification_loss: 0.4490 249/500 [=============>................] - ETA: 58s - loss: 2.4338 - regression_loss: 1.9851 - classification_loss: 0.4487 250/500 [==============>...............] - ETA: 58s - loss: 2.4354 - regression_loss: 1.9863 - classification_loss: 0.4491 251/500 [==============>...............] - ETA: 57s - loss: 2.4392 - regression_loss: 1.9902 - classification_loss: 0.4491 252/500 [==============>...............] - ETA: 57s - loss: 2.4421 - regression_loss: 1.9925 - classification_loss: 0.4496 253/500 [==============>...............] - ETA: 57s - loss: 2.4413 - regression_loss: 1.9920 - classification_loss: 0.4493 254/500 [==============>...............] 
- ETA: 57s - loss: 2.4403 - regression_loss: 1.9914 - classification_loss: 0.4489 255/500 [==============>...............] - ETA: 56s - loss: 2.4389 - regression_loss: 1.9906 - classification_loss: 0.4482 256/500 [==============>...............] - ETA: 56s - loss: 2.4411 - regression_loss: 1.9922 - classification_loss: 0.4489 257/500 [==============>...............] - ETA: 56s - loss: 2.4381 - regression_loss: 1.9898 - classification_loss: 0.4483 258/500 [==============>...............] - ETA: 56s - loss: 2.4367 - regression_loss: 1.9888 - classification_loss: 0.4480 259/500 [==============>...............] - ETA: 56s - loss: 2.4363 - regression_loss: 1.9886 - classification_loss: 0.4478 260/500 [==============>...............] - ETA: 55s - loss: 2.4343 - regression_loss: 1.9871 - classification_loss: 0.4472 261/500 [==============>...............] - ETA: 55s - loss: 2.4352 - regression_loss: 1.9880 - classification_loss: 0.4472 262/500 [==============>...............] - ETA: 55s - loss: 2.4348 - regression_loss: 1.9877 - classification_loss: 0.4471 263/500 [==============>...............] - ETA: 55s - loss: 2.4335 - regression_loss: 1.9870 - classification_loss: 0.4465 264/500 [==============>...............] - ETA: 54s - loss: 2.4296 - regression_loss: 1.9841 - classification_loss: 0.4455 265/500 [==============>...............] - ETA: 54s - loss: 2.4307 - regression_loss: 1.9852 - classification_loss: 0.4455 266/500 [==============>...............] - ETA: 54s - loss: 2.4306 - regression_loss: 1.9851 - classification_loss: 0.4456 267/500 [===============>..............] - ETA: 54s - loss: 2.4307 - regression_loss: 1.9854 - classification_loss: 0.4453 268/500 [===============>..............] - ETA: 54s - loss: 2.4301 - regression_loss: 1.9850 - classification_loss: 0.4451 269/500 [===============>..............] - ETA: 53s - loss: 2.4332 - regression_loss: 1.9886 - classification_loss: 0.4446 270/500 [===============>..............] 
- ETA: 53s - loss: 2.4306 - regression_loss: 1.9868 - classification_loss: 0.4438 271/500 [===============>..............] - ETA: 53s - loss: 2.4309 - regression_loss: 1.9871 - classification_loss: 0.4438 272/500 [===============>..............] - ETA: 53s - loss: 2.4317 - regression_loss: 1.9877 - classification_loss: 0.4440 273/500 [===============>..............] - ETA: 52s - loss: 2.4283 - regression_loss: 1.9853 - classification_loss: 0.4430 274/500 [===============>..............] - ETA: 52s - loss: 2.4270 - regression_loss: 1.9845 - classification_loss: 0.4425 275/500 [===============>..............] - ETA: 52s - loss: 2.4254 - regression_loss: 1.9835 - classification_loss: 0.4419 276/500 [===============>..............] - ETA: 52s - loss: 2.4252 - regression_loss: 1.9837 - classification_loss: 0.4415 277/500 [===============>..............] - ETA: 51s - loss: 2.4296 - regression_loss: 1.9868 - classification_loss: 0.4428 278/500 [===============>..............] - ETA: 51s - loss: 2.4277 - regression_loss: 1.9856 - classification_loss: 0.4421 279/500 [===============>..............] - ETA: 51s - loss: 2.4279 - regression_loss: 1.9855 - classification_loss: 0.4424 280/500 [===============>..............] - ETA: 51s - loss: 2.4262 - regression_loss: 1.9843 - classification_loss: 0.4419 281/500 [===============>..............] - ETA: 50s - loss: 2.4264 - regression_loss: 1.9845 - classification_loss: 0.4418 282/500 [===============>..............] - ETA: 50s - loss: 2.4268 - regression_loss: 1.9850 - classification_loss: 0.4417 283/500 [===============>..............] - ETA: 50s - loss: 2.4274 - regression_loss: 1.9853 - classification_loss: 0.4421 284/500 [================>.............] - ETA: 50s - loss: 2.4262 - regression_loss: 1.9848 - classification_loss: 0.4414 285/500 [================>.............] - ETA: 50s - loss: 2.4259 - regression_loss: 1.9847 - classification_loss: 0.4411 286/500 [================>.............] 
- ETA: 49s - loss: 2.4264 - regression_loss: 1.9856 - classification_loss: 0.4408 287/500 [================>.............] - ETA: 49s - loss: 2.4257 - regression_loss: 1.9851 - classification_loss: 0.4406 288/500 [================>.............] - ETA: 49s - loss: 2.4244 - regression_loss: 1.9844 - classification_loss: 0.4400 289/500 [================>.............] - ETA: 49s - loss: 2.4277 - regression_loss: 1.9855 - classification_loss: 0.4422 290/500 [================>.............] - ETA: 48s - loss: 2.4266 - regression_loss: 1.9848 - classification_loss: 0.4418 291/500 [================>.............] - ETA: 48s - loss: 2.4269 - regression_loss: 1.9832 - classification_loss: 0.4437 292/500 [================>.............] - ETA: 48s - loss: 2.4291 - regression_loss: 1.9842 - classification_loss: 0.4449 293/500 [================>.............] - ETA: 48s - loss: 2.4251 - regression_loss: 1.9809 - classification_loss: 0.4443 294/500 [================>.............] - ETA: 48s - loss: 2.4272 - regression_loss: 1.9825 - classification_loss: 0.4447 295/500 [================>.............] - ETA: 47s - loss: 2.4265 - regression_loss: 1.9820 - classification_loss: 0.4446 296/500 [================>.............] - ETA: 47s - loss: 2.4292 - regression_loss: 1.9842 - classification_loss: 0.4450 297/500 [================>.............] - ETA: 47s - loss: 2.4320 - regression_loss: 1.9866 - classification_loss: 0.4454 298/500 [================>.............] - ETA: 47s - loss: 2.4306 - regression_loss: 1.9855 - classification_loss: 0.4451 299/500 [================>.............] - ETA: 46s - loss: 2.4307 - regression_loss: 1.9860 - classification_loss: 0.4448 300/500 [=================>............] - ETA: 46s - loss: 2.4322 - regression_loss: 1.9871 - classification_loss: 0.4450 301/500 [=================>............] - ETA: 46s - loss: 2.4307 - regression_loss: 1.9861 - classification_loss: 0.4447 302/500 [=================>............] 
- ETA: 46s - loss: 2.4292 - regression_loss: 1.9848 - classification_loss: 0.4443 303/500 [=================>............] - ETA: 45s - loss: 2.4286 - regression_loss: 1.9840 - classification_loss: 0.4446 304/500 [=================>............] - ETA: 45s - loss: 2.4277 - regression_loss: 1.9833 - classification_loss: 0.4443 305/500 [=================>............] - ETA: 45s - loss: 2.4284 - regression_loss: 1.9835 - classification_loss: 0.4449 306/500 [=================>............] - ETA: 45s - loss: 2.4274 - regression_loss: 1.9830 - classification_loss: 0.4445 307/500 [=================>............] - ETA: 44s - loss: 2.4272 - regression_loss: 1.9827 - classification_loss: 0.4444 308/500 [=================>............] - ETA: 44s - loss: 2.4275 - regression_loss: 1.9828 - classification_loss: 0.4446 309/500 [=================>............] - ETA: 44s - loss: 2.4293 - regression_loss: 1.9841 - classification_loss: 0.4452 310/500 [=================>............] - ETA: 44s - loss: 2.4299 - regression_loss: 1.9847 - classification_loss: 0.4452 311/500 [=================>............] - ETA: 44s - loss: 2.4258 - regression_loss: 1.9816 - classification_loss: 0.4442 312/500 [=================>............] - ETA: 43s - loss: 2.4263 - regression_loss: 1.9818 - classification_loss: 0.4445 313/500 [=================>............] - ETA: 43s - loss: 2.4279 - regression_loss: 1.9833 - classification_loss: 0.4446 314/500 [=================>............] - ETA: 43s - loss: 2.4274 - regression_loss: 1.9829 - classification_loss: 0.4444 315/500 [=================>............] - ETA: 43s - loss: 2.4277 - regression_loss: 1.9827 - classification_loss: 0.4450 316/500 [=================>............] - ETA: 42s - loss: 2.4242 - regression_loss: 1.9800 - classification_loss: 0.4442 317/500 [==================>...........] - ETA: 42s - loss: 2.4287 - regression_loss: 1.9836 - classification_loss: 0.4451 318/500 [==================>...........] 
- ETA: 42s - loss: 2.4284 - regression_loss: 1.9833 - classification_loss: 0.4451 319/500 [==================>...........] - ETA: 42s - loss: 2.4293 - regression_loss: 1.9836 - classification_loss: 0.4457 320/500 [==================>...........] - ETA: 41s - loss: 2.4282 - regression_loss: 1.9828 - classification_loss: 0.4454 321/500 [==================>...........] - ETA: 41s - loss: 2.4285 - regression_loss: 1.9834 - classification_loss: 0.4451 322/500 [==================>...........] - ETA: 41s - loss: 2.4298 - regression_loss: 1.9838 - classification_loss: 0.4460 323/500 [==================>...........] - ETA: 41s - loss: 2.4312 - regression_loss: 1.9846 - classification_loss: 0.4466 324/500 [==================>...........] - ETA: 41s - loss: 2.4334 - regression_loss: 1.9864 - classification_loss: 0.4469 325/500 [==================>...........] - ETA: 40s - loss: 2.4335 - regression_loss: 1.9862 - classification_loss: 0.4473 326/500 [==================>...........] - ETA: 40s - loss: 2.4338 - regression_loss: 1.9863 - classification_loss: 0.4475 327/500 [==================>...........] - ETA: 40s - loss: 2.4317 - regression_loss: 1.9848 - classification_loss: 0.4469 328/500 [==================>...........] - ETA: 40s - loss: 2.4310 - regression_loss: 1.9842 - classification_loss: 0.4469 329/500 [==================>...........] - ETA: 39s - loss: 2.4327 - regression_loss: 1.9857 - classification_loss: 0.4469 330/500 [==================>...........] - ETA: 39s - loss: 2.4347 - regression_loss: 1.9877 - classification_loss: 0.4470 331/500 [==================>...........] - ETA: 39s - loss: 2.4351 - regression_loss: 1.9879 - classification_loss: 0.4472 332/500 [==================>...........] - ETA: 39s - loss: 2.4344 - regression_loss: 1.9875 - classification_loss: 0.4469 333/500 [==================>...........] - ETA: 38s - loss: 2.4358 - regression_loss: 1.9884 - classification_loss: 0.4474 334/500 [===================>..........] 
- ETA: 38s - loss: 2.4358 - regression_loss: 1.9884 - classification_loss: 0.4474 335/500 [===================>..........] - ETA: 38s - loss: 2.4384 - regression_loss: 1.9898 - classification_loss: 0.4486 336/500 [===================>..........] - ETA: 38s - loss: 2.4404 - regression_loss: 1.9912 - classification_loss: 0.4491 337/500 [===================>..........] - ETA: 38s - loss: 2.4421 - regression_loss: 1.9925 - classification_loss: 0.4496 338/500 [===================>..........] - ETA: 37s - loss: 2.4413 - regression_loss: 1.9916 - classification_loss: 0.4497 339/500 [===================>..........] - ETA: 37s - loss: 2.4404 - regression_loss: 1.9911 - classification_loss: 0.4493 340/500 [===================>..........] - ETA: 37s - loss: 2.4422 - regression_loss: 1.9925 - classification_loss: 0.4497 341/500 [===================>..........] - ETA: 37s - loss: 2.4412 - regression_loss: 1.9919 - classification_loss: 0.4493 342/500 [===================>..........] - ETA: 36s - loss: 2.4409 - regression_loss: 1.9922 - classification_loss: 0.4488 343/500 [===================>..........] - ETA: 36s - loss: 2.4397 - regression_loss: 1.9915 - classification_loss: 0.4482 344/500 [===================>..........] - ETA: 36s - loss: 2.4391 - regression_loss: 1.9914 - classification_loss: 0.4477 345/500 [===================>..........] - ETA: 36s - loss: 2.4384 - regression_loss: 1.9909 - classification_loss: 0.4475 346/500 [===================>..........] - ETA: 35s - loss: 2.4377 - regression_loss: 1.9906 - classification_loss: 0.4471 347/500 [===================>..........] - ETA: 35s - loss: 2.4360 - regression_loss: 1.9894 - classification_loss: 0.4466 348/500 [===================>..........] - ETA: 35s - loss: 2.4369 - regression_loss: 1.9904 - classification_loss: 0.4466 349/500 [===================>..........] - ETA: 35s - loss: 2.4367 - regression_loss: 1.9902 - classification_loss: 0.4466 350/500 [====================>.........] 
- ETA: 35s - loss: 2.4382 - regression_loss: 1.9902 - classification_loss: 0.4480 351/500 [====================>.........] - ETA: 34s - loss: 2.4370 - regression_loss: 1.9891 - classification_loss: 0.4479 352/500 [====================>.........] - ETA: 34s - loss: 2.4354 - regression_loss: 1.9880 - classification_loss: 0.4474 353/500 [====================>.........] - ETA: 34s - loss: 2.4356 - regression_loss: 1.9885 - classification_loss: 0.4471 354/500 [====================>.........] - ETA: 34s - loss: 2.4341 - regression_loss: 1.9874 - classification_loss: 0.4467 355/500 [====================>.........] - ETA: 33s - loss: 2.4337 - regression_loss: 1.9871 - classification_loss: 0.4466 356/500 [====================>.........] - ETA: 33s - loss: 2.4293 - regression_loss: 1.9836 - classification_loss: 0.4457 357/500 [====================>.........] - ETA: 33s - loss: 2.4296 - regression_loss: 1.9834 - classification_loss: 0.4462 358/500 [====================>.........] - ETA: 33s - loss: 2.4296 - regression_loss: 1.9835 - classification_loss: 0.4462 359/500 [====================>.........] - ETA: 32s - loss: 2.4282 - regression_loss: 1.9823 - classification_loss: 0.4459 360/500 [====================>.........] - ETA: 32s - loss: 2.4287 - regression_loss: 1.9828 - classification_loss: 0.4460 361/500 [====================>.........] - ETA: 32s - loss: 2.4286 - regression_loss: 1.9826 - classification_loss: 0.4460 362/500 [====================>.........] - ETA: 32s - loss: 2.4271 - regression_loss: 1.9814 - classification_loss: 0.4457 363/500 [====================>.........] - ETA: 31s - loss: 2.4271 - regression_loss: 1.9814 - classification_loss: 0.4457 364/500 [====================>.........] - ETA: 31s - loss: 2.4265 - regression_loss: 1.9807 - classification_loss: 0.4458 365/500 [====================>.........] - ETA: 31s - loss: 2.4270 - regression_loss: 1.9811 - classification_loss: 0.4459 366/500 [====================>.........] 
- ETA: 31s - loss: 2.4271 - regression_loss: 1.9812 - classification_loss: 0.4459 367/500 [=====================>........] - ETA: 31s - loss: 2.4268 - regression_loss: 1.9809 - classification_loss: 0.4459 368/500 [=====================>........] - ETA: 30s - loss: 2.4297 - regression_loss: 1.9830 - classification_loss: 0.4467 369/500 [=====================>........] - ETA: 30s - loss: 2.4297 - regression_loss: 1.9833 - classification_loss: 0.4464 370/500 [=====================>........] - ETA: 30s - loss: 2.4301 - regression_loss: 1.9836 - classification_loss: 0.4465 371/500 [=====================>........] - ETA: 30s - loss: 2.4283 - regression_loss: 1.9824 - classification_loss: 0.4459 372/500 [=====================>........] - ETA: 29s - loss: 2.4250 - regression_loss: 1.9798 - classification_loss: 0.4452 373/500 [=====================>........] - ETA: 29s - loss: 2.4245 - regression_loss: 1.9795 - classification_loss: 0.4449 374/500 [=====================>........] - ETA: 29s - loss: 2.4237 - regression_loss: 1.9788 - classification_loss: 0.4449 375/500 [=====================>........] - ETA: 29s - loss: 2.4239 - regression_loss: 1.9793 - classification_loss: 0.4446 376/500 [=====================>........] - ETA: 28s - loss: 2.4248 - regression_loss: 1.9798 - classification_loss: 0.4450 377/500 [=====================>........] - ETA: 28s - loss: 2.4243 - regression_loss: 1.9794 - classification_loss: 0.4449 378/500 [=====================>........] - ETA: 28s - loss: 2.4254 - regression_loss: 1.9806 - classification_loss: 0.4448 379/500 [=====================>........] - ETA: 28s - loss: 2.4251 - regression_loss: 1.9805 - classification_loss: 0.4446 380/500 [=====================>........] - ETA: 28s - loss: 2.4269 - regression_loss: 1.9819 - classification_loss: 0.4450 381/500 [=====================>........] - ETA: 27s - loss: 2.4265 - regression_loss: 1.9819 - classification_loss: 0.4446 382/500 [=====================>........] 
- ETA: 27s - loss: 2.4267 - regression_loss: 1.9821 - classification_loss: 0.4446 383/500 [=====================>........] - ETA: 27s - loss: 2.4273 - regression_loss: 1.9826 - classification_loss: 0.4448 384/500 [======================>.......] - ETA: 27s - loss: 2.4261 - regression_loss: 1.9818 - classification_loss: 0.4442 385/500 [======================>.......] - ETA: 26s - loss: 2.4224 - regression_loss: 1.9788 - classification_loss: 0.4436 386/500 [======================>.......] - ETA: 26s - loss: 2.4222 - regression_loss: 1.9786 - classification_loss: 0.4437 387/500 [======================>.......] - ETA: 26s - loss: 2.4222 - regression_loss: 1.9786 - classification_loss: 0.4436 388/500 [======================>.......] - ETA: 26s - loss: 2.4218 - regression_loss: 1.9785 - classification_loss: 0.4433 389/500 [======================>.......] - ETA: 25s - loss: 2.4209 - regression_loss: 1.9778 - classification_loss: 0.4431 390/500 [======================>.......] - ETA: 25s - loss: 2.4208 - regression_loss: 1.9780 - classification_loss: 0.4428 391/500 [======================>.......] - ETA: 25s - loss: 2.4201 - regression_loss: 1.9775 - classification_loss: 0.4426 392/500 [======================>.......] - ETA: 25s - loss: 2.4215 - regression_loss: 1.9789 - classification_loss: 0.4426 393/500 [======================>.......] - ETA: 24s - loss: 2.4175 - regression_loss: 1.9756 - classification_loss: 0.4418 394/500 [======================>.......] - ETA: 24s - loss: 2.4137 - regression_loss: 1.9726 - classification_loss: 0.4411 395/500 [======================>.......] - ETA: 24s - loss: 2.4139 - regression_loss: 1.9729 - classification_loss: 0.4411 396/500 [======================>.......] - ETA: 24s - loss: 2.4143 - regression_loss: 1.9731 - classification_loss: 0.4412 397/500 [======================>.......] - ETA: 24s - loss: 2.4131 - regression_loss: 1.9722 - classification_loss: 0.4408 398/500 [======================>.......] 
- ETA: 23s - loss: 2.4151 - regression_loss: 1.9735 - classification_loss: 0.4416 399/500 [======================>.......] - ETA: 23s - loss: 2.4133 - regression_loss: 1.9722 - classification_loss: 0.4410 400/500 [=======================>......] - ETA: 23s - loss: 2.4164 - regression_loss: 1.9751 - classification_loss: 0.4413 401/500 [=======================>......] - ETA: 23s - loss: 2.4171 - regression_loss: 1.9757 - classification_loss: 0.4414 402/500 [=======================>......] - ETA: 22s - loss: 2.4196 - regression_loss: 1.9783 - classification_loss: 0.4413 403/500 [=======================>......] - ETA: 22s - loss: 2.4201 - regression_loss: 1.9786 - classification_loss: 0.4416 404/500 [=======================>......] - ETA: 22s - loss: 2.4191 - regression_loss: 1.9778 - classification_loss: 0.4413 405/500 [=======================>......] - ETA: 22s - loss: 2.4189 - regression_loss: 1.9778 - classification_loss: 0.4410 406/500 [=======================>......] - ETA: 21s - loss: 2.4201 - regression_loss: 1.9787 - classification_loss: 0.4413 407/500 [=======================>......] - ETA: 21s - loss: 2.4213 - regression_loss: 1.9796 - classification_loss: 0.4417 408/500 [=======================>......] - ETA: 21s - loss: 2.4219 - regression_loss: 1.9801 - classification_loss: 0.4418 409/500 [=======================>......] - ETA: 21s - loss: 2.4228 - regression_loss: 1.9810 - classification_loss: 0.4418 410/500 [=======================>......] - ETA: 21s - loss: 2.4241 - regression_loss: 1.9817 - classification_loss: 0.4424 411/500 [=======================>......] - ETA: 20s - loss: 2.4253 - regression_loss: 1.9825 - classification_loss: 0.4429 412/500 [=======================>......] - ETA: 20s - loss: 2.4250 - regression_loss: 1.9824 - classification_loss: 0.4426 413/500 [=======================>......] - ETA: 20s - loss: 2.4244 - regression_loss: 1.9822 - classification_loss: 0.4422 414/500 [=======================>......] 
- ETA: 20s - loss: 2.4249 - regression_loss: 1.9825 - classification_loss: 0.4424 415/500 [=======================>......] - ETA: 19s - loss: 2.4250 - regression_loss: 1.9828 - classification_loss: 0.4422 416/500 [=======================>......] - ETA: 19s - loss: 2.4250 - regression_loss: 1.9827 - classification_loss: 0.4423 417/500 [========================>.....] - ETA: 19s - loss: 2.4247 - regression_loss: 1.9826 - classification_loss: 0.4422 418/500 [========================>.....] - ETA: 19s - loss: 2.4253 - regression_loss: 1.9828 - classification_loss: 0.4425 419/500 [========================>.....] - ETA: 18s - loss: 2.4250 - regression_loss: 1.9827 - classification_loss: 0.4423 420/500 [========================>.....] - ETA: 18s - loss: 2.4247 - regression_loss: 1.9826 - classification_loss: 0.4421 421/500 [========================>.....] - ETA: 18s - loss: 2.4267 - regression_loss: 1.9842 - classification_loss: 0.4425 422/500 [========================>.....] - ETA: 18s - loss: 2.4272 - regression_loss: 1.9848 - classification_loss: 0.4425 423/500 [========================>.....] - ETA: 17s - loss: 2.4282 - regression_loss: 1.9857 - classification_loss: 0.4425 424/500 [========================>.....] - ETA: 17s - loss: 2.4260 - regression_loss: 1.9840 - classification_loss: 0.4420 425/500 [========================>.....] - ETA: 17s - loss: 2.4268 - regression_loss: 1.9847 - classification_loss: 0.4421 426/500 [========================>.....] - ETA: 17s - loss: 2.4274 - regression_loss: 1.9853 - classification_loss: 0.4421 427/500 [========================>.....] - ETA: 17s - loss: 2.4299 - regression_loss: 1.9877 - classification_loss: 0.4423 428/500 [========================>.....] - ETA: 16s - loss: 2.4296 - regression_loss: 1.9874 - classification_loss: 0.4422 429/500 [========================>.....] - ETA: 16s - loss: 2.4300 - regression_loss: 1.9877 - classification_loss: 0.4422 430/500 [========================>.....] 
431/500 [========================>.....] - ETA: 16s - loss: 2.4294 - regression_loss: 1.9873 - classification_loss: 0.4421
[... per-batch progress updates for batches 432-499 of epoch 7 omitted; loss hovered around 2.42 throughout ...]
500/500 [==============================] - 117s 233ms/step - loss: 2.4210 - regression_loss: 1.9804 - classification_loss: 0.4407
326 instances of class plum with average precision: 0.5491
mAP: 0.5491
Epoch 00007: saving model to ./training/snapshots/resnet50_pascal_07.h5
Epoch 8/150
1/500 [..............................] - ETA: 2:01 - loss: 1.4004 - regression_loss: 1.2061 - classification_loss: 0.1944
[... per-batch progress updates for batches 2-264 of epoch 8 omitted; running loss settled near 2.37 (regression ~1.93, classification ~0.44); log truncated mid-epoch at batch 265/500 ...]
- ETA: 54s - loss: 2.3678 - regression_loss: 1.9260 - classification_loss: 0.4418 266/500 [==============>...............] - ETA: 54s - loss: 2.3689 - regression_loss: 1.9270 - classification_loss: 0.4419 267/500 [===============>..............] - ETA: 54s - loss: 2.3706 - regression_loss: 1.9283 - classification_loss: 0.4423 268/500 [===============>..............] - ETA: 54s - loss: 2.3712 - regression_loss: 1.9286 - classification_loss: 0.4426 269/500 [===============>..............] - ETA: 53s - loss: 2.3686 - regression_loss: 1.9263 - classification_loss: 0.4423 270/500 [===============>..............] - ETA: 53s - loss: 2.3708 - regression_loss: 1.9280 - classification_loss: 0.4428 271/500 [===============>..............] - ETA: 53s - loss: 2.3700 - regression_loss: 1.9270 - classification_loss: 0.4430 272/500 [===============>..............] - ETA: 53s - loss: 2.3683 - regression_loss: 1.9255 - classification_loss: 0.4428 273/500 [===============>..............] - ETA: 53s - loss: 2.3672 - regression_loss: 1.9248 - classification_loss: 0.4424 274/500 [===============>..............] - ETA: 52s - loss: 2.3708 - regression_loss: 1.9276 - classification_loss: 0.4433 275/500 [===============>..............] - ETA: 52s - loss: 2.3702 - regression_loss: 1.9272 - classification_loss: 0.4430 276/500 [===============>..............] - ETA: 52s - loss: 2.3722 - regression_loss: 1.9285 - classification_loss: 0.4437 277/500 [===============>..............] - ETA: 52s - loss: 2.3748 - regression_loss: 1.9302 - classification_loss: 0.4446 278/500 [===============>..............] - ETA: 51s - loss: 2.3751 - regression_loss: 1.9304 - classification_loss: 0.4447 279/500 [===============>..............] - ETA: 51s - loss: 2.3763 - regression_loss: 1.9310 - classification_loss: 0.4453 280/500 [===============>..............] - ETA: 51s - loss: 2.3751 - regression_loss: 1.9304 - classification_loss: 0.4447 281/500 [===============>..............] 
- ETA: 51s - loss: 2.3748 - regression_loss: 1.9303 - classification_loss: 0.4445 282/500 [===============>..............] - ETA: 50s - loss: 2.3711 - regression_loss: 1.9272 - classification_loss: 0.4439 283/500 [===============>..............] - ETA: 50s - loss: 2.3759 - regression_loss: 1.9311 - classification_loss: 0.4448 284/500 [================>.............] - ETA: 50s - loss: 2.3768 - regression_loss: 1.9319 - classification_loss: 0.4449 285/500 [================>.............] - ETA: 50s - loss: 2.3779 - regression_loss: 1.9328 - classification_loss: 0.4451 286/500 [================>.............] - ETA: 50s - loss: 2.3804 - regression_loss: 1.9345 - classification_loss: 0.4460 287/500 [================>.............] - ETA: 49s - loss: 2.3796 - regression_loss: 1.9339 - classification_loss: 0.4457 288/500 [================>.............] - ETA: 49s - loss: 2.3804 - regression_loss: 1.9347 - classification_loss: 0.4456 289/500 [================>.............] - ETA: 49s - loss: 2.3799 - regression_loss: 1.9344 - classification_loss: 0.4454 290/500 [================>.............] - ETA: 49s - loss: 2.3798 - regression_loss: 1.9345 - classification_loss: 0.4453 291/500 [================>.............] - ETA: 48s - loss: 2.3784 - regression_loss: 1.9333 - classification_loss: 0.4451 292/500 [================>.............] - ETA: 48s - loss: 2.3761 - regression_loss: 1.9318 - classification_loss: 0.4444 293/500 [================>.............] - ETA: 48s - loss: 2.3756 - regression_loss: 1.9313 - classification_loss: 0.4442 294/500 [================>.............] - ETA: 48s - loss: 2.3783 - regression_loss: 1.9328 - classification_loss: 0.4455 295/500 [================>.............] - ETA: 47s - loss: 2.3791 - regression_loss: 1.9338 - classification_loss: 0.4452 296/500 [================>.............] - ETA: 47s - loss: 2.3783 - regression_loss: 1.9334 - classification_loss: 0.4449 297/500 [================>.............] 
- ETA: 47s - loss: 2.3798 - regression_loss: 1.9346 - classification_loss: 0.4452 298/500 [================>.............] - ETA: 47s - loss: 2.3800 - regression_loss: 1.9344 - classification_loss: 0.4456 299/500 [================>.............] - ETA: 46s - loss: 2.3769 - regression_loss: 1.9323 - classification_loss: 0.4446 300/500 [=================>............] - ETA: 46s - loss: 2.3775 - regression_loss: 1.9328 - classification_loss: 0.4447 301/500 [=================>............] - ETA: 46s - loss: 2.3779 - regression_loss: 1.9333 - classification_loss: 0.4446 302/500 [=================>............] - ETA: 46s - loss: 2.3776 - regression_loss: 1.9331 - classification_loss: 0.4446 303/500 [=================>............] - ETA: 46s - loss: 2.3757 - regression_loss: 1.9317 - classification_loss: 0.4441 304/500 [=================>............] - ETA: 45s - loss: 2.3746 - regression_loss: 1.9310 - classification_loss: 0.4436 305/500 [=================>............] - ETA: 45s - loss: 2.3740 - regression_loss: 1.9307 - classification_loss: 0.4433 306/500 [=================>............] - ETA: 45s - loss: 2.3759 - regression_loss: 1.9319 - classification_loss: 0.4440 307/500 [=================>............] - ETA: 45s - loss: 2.3772 - regression_loss: 1.9328 - classification_loss: 0.4444 308/500 [=================>............] - ETA: 44s - loss: 2.3798 - regression_loss: 1.9350 - classification_loss: 0.4448 309/500 [=================>............] - ETA: 44s - loss: 2.3801 - regression_loss: 1.9353 - classification_loss: 0.4447 310/500 [=================>............] - ETA: 44s - loss: 2.3847 - regression_loss: 1.9390 - classification_loss: 0.4457 311/500 [=================>............] - ETA: 44s - loss: 2.3847 - regression_loss: 1.9388 - classification_loss: 0.4459 312/500 [=================>............] - ETA: 43s - loss: 2.3832 - regression_loss: 1.9377 - classification_loss: 0.4455 313/500 [=================>............] 
- ETA: 43s - loss: 2.3842 - regression_loss: 1.9382 - classification_loss: 0.4460 314/500 [=================>............] - ETA: 43s - loss: 2.3839 - regression_loss: 1.9384 - classification_loss: 0.4456 315/500 [=================>............] - ETA: 43s - loss: 2.3849 - regression_loss: 1.9396 - classification_loss: 0.4453 316/500 [=================>............] - ETA: 43s - loss: 2.3840 - regression_loss: 1.9390 - classification_loss: 0.4449 317/500 [==================>...........] - ETA: 42s - loss: 2.3834 - regression_loss: 1.9388 - classification_loss: 0.4445 318/500 [==================>...........] - ETA: 42s - loss: 2.3882 - regression_loss: 1.9421 - classification_loss: 0.4461 319/500 [==================>...........] - ETA: 42s - loss: 2.3864 - regression_loss: 1.9405 - classification_loss: 0.4459 320/500 [==================>...........] - ETA: 42s - loss: 2.3901 - regression_loss: 1.9433 - classification_loss: 0.4468 321/500 [==================>...........] - ETA: 41s - loss: 2.3906 - regression_loss: 1.9439 - classification_loss: 0.4467 322/500 [==================>...........] - ETA: 41s - loss: 2.3908 - regression_loss: 1.9441 - classification_loss: 0.4467 323/500 [==================>...........] - ETA: 41s - loss: 2.3909 - regression_loss: 1.9443 - classification_loss: 0.4466 324/500 [==================>...........] - ETA: 41s - loss: 2.3906 - regression_loss: 1.9441 - classification_loss: 0.4465 325/500 [==================>...........] - ETA: 40s - loss: 2.3904 - regression_loss: 1.9442 - classification_loss: 0.4461 326/500 [==================>...........] - ETA: 40s - loss: 2.3872 - regression_loss: 1.9418 - classification_loss: 0.4454 327/500 [==================>...........] - ETA: 40s - loss: 2.3864 - regression_loss: 1.9414 - classification_loss: 0.4450 328/500 [==================>...........] - ETA: 40s - loss: 2.3904 - regression_loss: 1.9448 - classification_loss: 0.4456 329/500 [==================>...........] 
- ETA: 40s - loss: 2.3923 - regression_loss: 1.9463 - classification_loss: 0.4460 330/500 [==================>...........] - ETA: 39s - loss: 2.3958 - regression_loss: 1.9493 - classification_loss: 0.4464 331/500 [==================>...........] - ETA: 39s - loss: 2.3958 - regression_loss: 1.9498 - classification_loss: 0.4460 332/500 [==================>...........] - ETA: 39s - loss: 2.3931 - regression_loss: 1.9479 - classification_loss: 0.4451 333/500 [==================>...........] - ETA: 39s - loss: 2.3966 - regression_loss: 1.9510 - classification_loss: 0.4456 334/500 [===================>..........] - ETA: 38s - loss: 2.3958 - regression_loss: 1.9504 - classification_loss: 0.4454 335/500 [===================>..........] - ETA: 38s - loss: 2.3954 - regression_loss: 1.9505 - classification_loss: 0.4450 336/500 [===================>..........] - ETA: 38s - loss: 2.3959 - regression_loss: 1.9513 - classification_loss: 0.4447 337/500 [===================>..........] - ETA: 38s - loss: 2.3954 - regression_loss: 1.9511 - classification_loss: 0.4443 338/500 [===================>..........] - ETA: 37s - loss: 2.3938 - regression_loss: 1.9496 - classification_loss: 0.4441 339/500 [===================>..........] - ETA: 37s - loss: 2.3938 - regression_loss: 1.9497 - classification_loss: 0.4441 340/500 [===================>..........] - ETA: 37s - loss: 2.3953 - regression_loss: 1.9513 - classification_loss: 0.4439 341/500 [===================>..........] - ETA: 37s - loss: 2.3971 - regression_loss: 1.9530 - classification_loss: 0.4441 342/500 [===================>..........] - ETA: 37s - loss: 2.3975 - regression_loss: 1.9538 - classification_loss: 0.4437 343/500 [===================>..........] - ETA: 36s - loss: 2.3965 - regression_loss: 1.9532 - classification_loss: 0.4433 344/500 [===================>..........] - ETA: 36s - loss: 2.3982 - regression_loss: 1.9541 - classification_loss: 0.4440 345/500 [===================>..........] 
- ETA: 36s - loss: 2.3968 - regression_loss: 1.9533 - classification_loss: 0.4435 346/500 [===================>..........] - ETA: 36s - loss: 2.3936 - regression_loss: 1.9507 - classification_loss: 0.4428 347/500 [===================>..........] - ETA: 35s - loss: 2.3940 - regression_loss: 1.9515 - classification_loss: 0.4426 348/500 [===================>..........] - ETA: 35s - loss: 2.3935 - regression_loss: 1.9512 - classification_loss: 0.4423 349/500 [===================>..........] - ETA: 35s - loss: 2.3952 - regression_loss: 1.9524 - classification_loss: 0.4428 350/500 [====================>.........] - ETA: 35s - loss: 2.3957 - regression_loss: 1.9529 - classification_loss: 0.4428 351/500 [====================>.........] - ETA: 34s - loss: 2.3946 - regression_loss: 1.9521 - classification_loss: 0.4425 352/500 [====================>.........] - ETA: 34s - loss: 2.3931 - regression_loss: 1.9511 - classification_loss: 0.4421 353/500 [====================>.........] - ETA: 34s - loss: 2.3922 - regression_loss: 1.9505 - classification_loss: 0.4418 354/500 [====================>.........] - ETA: 34s - loss: 2.3925 - regression_loss: 1.9508 - classification_loss: 0.4416 355/500 [====================>.........] - ETA: 33s - loss: 2.3919 - regression_loss: 1.9507 - classification_loss: 0.4413 356/500 [====================>.........] - ETA: 33s - loss: 2.3915 - regression_loss: 1.9501 - classification_loss: 0.4413 357/500 [====================>.........] - ETA: 33s - loss: 2.3903 - regression_loss: 1.9493 - classification_loss: 0.4410 358/500 [====================>.........] - ETA: 33s - loss: 2.3900 - regression_loss: 1.9490 - classification_loss: 0.4410 359/500 [====================>.........] - ETA: 33s - loss: 2.3898 - regression_loss: 1.9491 - classification_loss: 0.4407 360/500 [====================>.........] - ETA: 32s - loss: 2.3909 - regression_loss: 1.9501 - classification_loss: 0.4408 361/500 [====================>.........] 
- ETA: 32s - loss: 2.3910 - regression_loss: 1.9502 - classification_loss: 0.4408 362/500 [====================>.........] - ETA: 32s - loss: 2.3904 - regression_loss: 1.9496 - classification_loss: 0.4408 363/500 [====================>.........] - ETA: 32s - loss: 2.3886 - regression_loss: 1.9484 - classification_loss: 0.4402 364/500 [====================>.........] - ETA: 31s - loss: 2.3888 - regression_loss: 1.9486 - classification_loss: 0.4403 365/500 [====================>.........] - ETA: 31s - loss: 2.3879 - regression_loss: 1.9479 - classification_loss: 0.4400 366/500 [====================>.........] - ETA: 31s - loss: 2.3877 - regression_loss: 1.9480 - classification_loss: 0.4397 367/500 [=====================>........] - ETA: 31s - loss: 2.3873 - regression_loss: 1.9476 - classification_loss: 0.4398 368/500 [=====================>........] - ETA: 30s - loss: 2.3854 - regression_loss: 1.9458 - classification_loss: 0.4397 369/500 [=====================>........] - ETA: 30s - loss: 2.3855 - regression_loss: 1.9460 - classification_loss: 0.4395 370/500 [=====================>........] - ETA: 30s - loss: 2.3851 - regression_loss: 1.9456 - classification_loss: 0.4396 371/500 [=====================>........] - ETA: 30s - loss: 2.3832 - regression_loss: 1.9443 - classification_loss: 0.4389 372/500 [=====================>........] - ETA: 30s - loss: 2.3828 - regression_loss: 1.9441 - classification_loss: 0.4387 373/500 [=====================>........] - ETA: 29s - loss: 2.3840 - regression_loss: 1.9448 - classification_loss: 0.4392 374/500 [=====================>........] - ETA: 29s - loss: 2.3834 - regression_loss: 1.9444 - classification_loss: 0.4390 375/500 [=====================>........] - ETA: 29s - loss: 2.3814 - regression_loss: 1.9429 - classification_loss: 0.4385 376/500 [=====================>........] - ETA: 29s - loss: 2.3813 - regression_loss: 1.9428 - classification_loss: 0.4385 377/500 [=====================>........] 
- ETA: 28s - loss: 2.3813 - regression_loss: 1.9429 - classification_loss: 0.4385 378/500 [=====================>........] - ETA: 28s - loss: 2.3838 - regression_loss: 1.9446 - classification_loss: 0.4392 379/500 [=====================>........] - ETA: 28s - loss: 2.3838 - regression_loss: 1.9446 - classification_loss: 0.4392 380/500 [=====================>........] - ETA: 28s - loss: 2.3842 - regression_loss: 1.9447 - classification_loss: 0.4395 381/500 [=====================>........] - ETA: 27s - loss: 2.3850 - regression_loss: 1.9452 - classification_loss: 0.4397 382/500 [=====================>........] - ETA: 27s - loss: 2.3836 - regression_loss: 1.9442 - classification_loss: 0.4394 383/500 [=====================>........] - ETA: 27s - loss: 2.3834 - regression_loss: 1.9442 - classification_loss: 0.4391 384/500 [======================>.......] - ETA: 27s - loss: 2.3824 - regression_loss: 1.9436 - classification_loss: 0.4388 385/500 [======================>.......] - ETA: 26s - loss: 2.3817 - regression_loss: 1.9433 - classification_loss: 0.4383 386/500 [======================>.......] - ETA: 26s - loss: 2.3814 - regression_loss: 1.9432 - classification_loss: 0.4382 387/500 [======================>.......] - ETA: 26s - loss: 2.3811 - regression_loss: 1.9433 - classification_loss: 0.4378 388/500 [======================>.......] - ETA: 26s - loss: 2.3810 - regression_loss: 1.9433 - classification_loss: 0.4376 389/500 [======================>.......] - ETA: 26s - loss: 2.3819 - regression_loss: 1.9443 - classification_loss: 0.4375 390/500 [======================>.......] - ETA: 25s - loss: 2.3815 - regression_loss: 1.9438 - classification_loss: 0.4376 391/500 [======================>.......] - ETA: 25s - loss: 2.3814 - regression_loss: 1.9438 - classification_loss: 0.4375 392/500 [======================>.......] - ETA: 25s - loss: 2.3811 - regression_loss: 1.9437 - classification_loss: 0.4374 393/500 [======================>.......] 
- ETA: 25s - loss: 2.3819 - regression_loss: 1.9442 - classification_loss: 0.4377 394/500 [======================>.......] - ETA: 24s - loss: 2.3819 - regression_loss: 1.9445 - classification_loss: 0.4374 395/500 [======================>.......] - ETA: 24s - loss: 2.3805 - regression_loss: 1.9435 - classification_loss: 0.4370 396/500 [======================>.......] - ETA: 24s - loss: 2.3817 - regression_loss: 1.9443 - classification_loss: 0.4374 397/500 [======================>.......] - ETA: 24s - loss: 2.3804 - regression_loss: 1.9433 - classification_loss: 0.4371 398/500 [======================>.......] - ETA: 23s - loss: 2.3793 - regression_loss: 1.9427 - classification_loss: 0.4366 399/500 [======================>.......] - ETA: 23s - loss: 2.3790 - regression_loss: 1.9426 - classification_loss: 0.4363 400/500 [=======================>......] - ETA: 23s - loss: 2.3789 - regression_loss: 1.9430 - classification_loss: 0.4360 401/500 [=======================>......] - ETA: 23s - loss: 2.3779 - regression_loss: 1.9418 - classification_loss: 0.4361 402/500 [=======================>......] - ETA: 23s - loss: 2.3764 - regression_loss: 1.9408 - classification_loss: 0.4356 403/500 [=======================>......] - ETA: 22s - loss: 2.3742 - regression_loss: 1.9389 - classification_loss: 0.4353 404/500 [=======================>......] - ETA: 22s - loss: 2.3738 - regression_loss: 1.9387 - classification_loss: 0.4351 405/500 [=======================>......] - ETA: 22s - loss: 2.3752 - regression_loss: 1.9396 - classification_loss: 0.4356 406/500 [=======================>......] - ETA: 22s - loss: 2.3752 - regression_loss: 1.9396 - classification_loss: 0.4357 407/500 [=======================>......] - ETA: 21s - loss: 2.3761 - regression_loss: 1.9407 - classification_loss: 0.4354 408/500 [=======================>......] - ETA: 21s - loss: 2.3782 - regression_loss: 1.9424 - classification_loss: 0.4358 409/500 [=======================>......] 
- ETA: 21s - loss: 2.3775 - regression_loss: 1.9420 - classification_loss: 0.4355 410/500 [=======================>......] - ETA: 21s - loss: 2.3756 - regression_loss: 1.9407 - classification_loss: 0.4349 411/500 [=======================>......] - ETA: 20s - loss: 2.3789 - regression_loss: 1.9436 - classification_loss: 0.4353 412/500 [=======================>......] - ETA: 20s - loss: 2.3782 - regression_loss: 1.9431 - classification_loss: 0.4351 413/500 [=======================>......] - ETA: 20s - loss: 2.3801 - regression_loss: 1.9446 - classification_loss: 0.4355 414/500 [=======================>......] - ETA: 20s - loss: 2.3820 - regression_loss: 1.9458 - classification_loss: 0.4362 415/500 [=======================>......] - ETA: 19s - loss: 2.3828 - regression_loss: 1.9463 - classification_loss: 0.4365 416/500 [=======================>......] - ETA: 19s - loss: 2.3800 - regression_loss: 1.9442 - classification_loss: 0.4357 417/500 [========================>.....] - ETA: 19s - loss: 2.3811 - regression_loss: 1.9453 - classification_loss: 0.4358 418/500 [========================>.....] - ETA: 19s - loss: 2.3817 - regression_loss: 1.9462 - classification_loss: 0.4355 419/500 [========================>.....] - ETA: 19s - loss: 2.3834 - regression_loss: 1.9474 - classification_loss: 0.4360 420/500 [========================>.....] - ETA: 18s - loss: 2.3833 - regression_loss: 1.9476 - classification_loss: 0.4358 421/500 [========================>.....] - ETA: 18s - loss: 2.3831 - regression_loss: 1.9475 - classification_loss: 0.4356 422/500 [========================>.....] - ETA: 18s - loss: 2.3836 - regression_loss: 1.9481 - classification_loss: 0.4354 423/500 [========================>.....] - ETA: 18s - loss: 2.3832 - regression_loss: 1.9480 - classification_loss: 0.4352 424/500 [========================>.....] - ETA: 17s - loss: 2.3830 - regression_loss: 1.9480 - classification_loss: 0.4350 425/500 [========================>.....] 
- ETA: 17s - loss: 2.3829 - regression_loss: 1.9481 - classification_loss: 0.4348 426/500 [========================>.....] - ETA: 17s - loss: 2.3821 - regression_loss: 1.9472 - classification_loss: 0.4349 427/500 [========================>.....] - ETA: 17s - loss: 2.3809 - regression_loss: 1.9465 - classification_loss: 0.4344 428/500 [========================>.....] - ETA: 16s - loss: 2.3805 - regression_loss: 1.9463 - classification_loss: 0.4342 429/500 [========================>.....] - ETA: 16s - loss: 2.3793 - regression_loss: 1.9449 - classification_loss: 0.4344 430/500 [========================>.....] - ETA: 16s - loss: 2.3784 - regression_loss: 1.9445 - classification_loss: 0.4339 431/500 [========================>.....] - ETA: 16s - loss: 2.3774 - regression_loss: 1.9433 - classification_loss: 0.4341 432/500 [========================>.....] - ETA: 15s - loss: 2.3773 - regression_loss: 1.9433 - classification_loss: 0.4341 433/500 [========================>.....] - ETA: 15s - loss: 2.3767 - regression_loss: 1.9427 - classification_loss: 0.4340 434/500 [=========================>....] - ETA: 15s - loss: 2.3765 - regression_loss: 1.9427 - classification_loss: 0.4338 435/500 [=========================>....] - ETA: 15s - loss: 2.3769 - regression_loss: 1.9429 - classification_loss: 0.4340 436/500 [=========================>....] - ETA: 15s - loss: 2.3752 - regression_loss: 1.9417 - classification_loss: 0.4335 437/500 [=========================>....] - ETA: 14s - loss: 2.3759 - regression_loss: 1.9421 - classification_loss: 0.4338 438/500 [=========================>....] - ETA: 14s - loss: 2.3752 - regression_loss: 1.9417 - classification_loss: 0.4336 439/500 [=========================>....] - ETA: 14s - loss: 2.3742 - regression_loss: 1.9410 - classification_loss: 0.4332 440/500 [=========================>....] - ETA: 14s - loss: 2.3748 - regression_loss: 1.9415 - classification_loss: 0.4332 441/500 [=========================>....] 
- ETA: 13s - loss: 2.3759 - regression_loss: 1.9425 - classification_loss: 0.4334 442/500 [=========================>....] - ETA: 13s - loss: 2.3767 - regression_loss: 1.9432 - classification_loss: 0.4336 443/500 [=========================>....] - ETA: 13s - loss: 2.3757 - regression_loss: 1.9424 - classification_loss: 0.4333 444/500 [=========================>....] - ETA: 13s - loss: 2.3761 - regression_loss: 1.9428 - classification_loss: 0.4333 445/500 [=========================>....] - ETA: 12s - loss: 2.3754 - regression_loss: 1.9423 - classification_loss: 0.4331 446/500 [=========================>....] - ETA: 12s - loss: 2.3729 - regression_loss: 1.9402 - classification_loss: 0.4327 447/500 [=========================>....] - ETA: 12s - loss: 2.3728 - regression_loss: 1.9402 - classification_loss: 0.4325 448/500 [=========================>....] - ETA: 12s - loss: 2.3729 - regression_loss: 1.9403 - classification_loss: 0.4326 449/500 [=========================>....] - ETA: 11s - loss: 2.3723 - regression_loss: 1.9400 - classification_loss: 0.4323 450/500 [==========================>...] - ETA: 11s - loss: 2.3731 - regression_loss: 1.9406 - classification_loss: 0.4325 451/500 [==========================>...] - ETA: 11s - loss: 2.3722 - regression_loss: 1.9400 - classification_loss: 0.4323 452/500 [==========================>...] - ETA: 11s - loss: 2.3708 - regression_loss: 1.9390 - classification_loss: 0.4318 453/500 [==========================>...] - ETA: 11s - loss: 2.3719 - regression_loss: 1.9400 - classification_loss: 0.4319 454/500 [==========================>...] - ETA: 10s - loss: 2.3705 - regression_loss: 1.9389 - classification_loss: 0.4316 455/500 [==========================>...] - ETA: 10s - loss: 2.3706 - regression_loss: 1.9391 - classification_loss: 0.4315 456/500 [==========================>...] - ETA: 10s - loss: 2.3712 - regression_loss: 1.9395 - classification_loss: 0.4317 457/500 [==========================>...] 
- ETA: 10s - loss: 2.3716 - regression_loss: 1.9399 - classification_loss: 0.4317 458/500 [==========================>...] - ETA: 9s - loss: 2.3754 - regression_loss: 1.9416 - classification_loss: 0.4338  459/500 [==========================>...] - ETA: 9s - loss: 2.3753 - regression_loss: 1.9418 - classification_loss: 0.4334 460/500 [==========================>...] - ETA: 9s - loss: 2.3751 - regression_loss: 1.9417 - classification_loss: 0.4334 461/500 [==========================>...] - ETA: 9s - loss: 2.3742 - regression_loss: 1.9413 - classification_loss: 0.4329 462/500 [==========================>...] - ETA: 8s - loss: 2.3747 - regression_loss: 1.9415 - classification_loss: 0.4332 463/500 [==========================>...] - ETA: 8s - loss: 2.3743 - regression_loss: 1.9411 - classification_loss: 0.4332 464/500 [==========================>...] - ETA: 8s - loss: 2.3747 - regression_loss: 1.9413 - classification_loss: 0.4333 465/500 [==========================>...] - ETA: 8s - loss: 2.3754 - regression_loss: 1.9419 - classification_loss: 0.4335 466/500 [==========================>...] - ETA: 7s - loss: 2.3727 - regression_loss: 1.9397 - classification_loss: 0.4330 467/500 [===========================>..] - ETA: 7s - loss: 2.3710 - regression_loss: 1.9383 - classification_loss: 0.4327 468/500 [===========================>..] - ETA: 7s - loss: 2.3708 - regression_loss: 1.9383 - classification_loss: 0.4325 469/500 [===========================>..] - ETA: 7s - loss: 2.3714 - regression_loss: 1.9387 - classification_loss: 0.4327 470/500 [===========================>..] - ETA: 7s - loss: 2.3733 - regression_loss: 1.9400 - classification_loss: 0.4333 471/500 [===========================>..] - ETA: 6s - loss: 2.3730 - regression_loss: 1.9400 - classification_loss: 0.4330 472/500 [===========================>..] - ETA: 6s - loss: 2.3750 - regression_loss: 1.9417 - classification_loss: 0.4333 473/500 [===========================>..] 
- ETA: 6s - loss: 2.3735 - regression_loss: 1.9405 - classification_loss: 0.4330 474/500 [===========================>..] - ETA: 6s - loss: 2.3725 - regression_loss: 1.9397 - classification_loss: 0.4327 475/500 [===========================>..] - ETA: 5s - loss: 2.3705 - regression_loss: 1.9383 - classification_loss: 0.4322 476/500 [===========================>..] - ETA: 5s - loss: 2.3705 - regression_loss: 1.9383 - classification_loss: 0.4322 477/500 [===========================>..] - ETA: 5s - loss: 2.3693 - regression_loss: 1.9375 - classification_loss: 0.4318 478/500 [===========================>..] - ETA: 5s - loss: 2.3697 - regression_loss: 1.9375 - classification_loss: 0.4322 479/500 [===========================>..] - ETA: 4s - loss: 2.3677 - regression_loss: 1.9360 - classification_loss: 0.4316 480/500 [===========================>..] - ETA: 4s - loss: 2.3675 - regression_loss: 1.9359 - classification_loss: 0.4317 481/500 [===========================>..] - ETA: 4s - loss: 2.3668 - regression_loss: 1.9354 - classification_loss: 0.4314 482/500 [===========================>..] - ETA: 4s - loss: 2.3675 - regression_loss: 1.9361 - classification_loss: 0.4313 483/500 [===========================>..] - ETA: 3s - loss: 2.3681 - regression_loss: 1.9367 - classification_loss: 0.4314 484/500 [============================>.] - ETA: 3s - loss: 2.3673 - regression_loss: 1.9361 - classification_loss: 0.4312 485/500 [============================>.] - ETA: 3s - loss: 2.3681 - regression_loss: 1.9363 - classification_loss: 0.4317 486/500 [============================>.] - ETA: 3s - loss: 2.3681 - regression_loss: 1.9366 - classification_loss: 0.4315 487/500 [============================>.] - ETA: 3s - loss: 2.3677 - regression_loss: 1.9363 - classification_loss: 0.4314 488/500 [============================>.] - ETA: 2s - loss: 2.3669 - regression_loss: 1.9357 - classification_loss: 0.4312 489/500 [============================>.] 
500/500 [==============================] - 118s 235ms/step - loss: 2.3659 - regression_loss: 1.9352 - classification_loss: 0.4307
326 instances of class plum with average precision: 0.6309
mAP: 0.6309
Epoch 00008: saving model to ./training/snapshots/resnet50_pascal_08.h5
Epoch 9/150
- ETA: 1:57 - loss: 2.3124 - regression_loss: 1.8838 - classification_loss: 0.4287 5/500 [..............................] - ETA: 1:57 - loss: 2.4737 - regression_loss: 2.0258 - classification_loss: 0.4480 6/500 [..............................] - ETA: 1:57 - loss: 2.5330 - regression_loss: 2.0949 - classification_loss: 0.4382 7/500 [..............................] - ETA: 1:57 - loss: 2.5038 - regression_loss: 2.0704 - classification_loss: 0.4334 8/500 [..............................] - ETA: 1:56 - loss: 2.3742 - regression_loss: 1.9719 - classification_loss: 0.4023 9/500 [..............................] - ETA: 1:56 - loss: 2.3003 - regression_loss: 1.9197 - classification_loss: 0.3806 10/500 [..............................] - ETA: 1:56 - loss: 2.2925 - regression_loss: 1.9139 - classification_loss: 0.3786 11/500 [..............................] - ETA: 1:56 - loss: 2.2897 - regression_loss: 1.9121 - classification_loss: 0.3776 12/500 [..............................] - ETA: 1:56 - loss: 2.3913 - regression_loss: 1.9969 - classification_loss: 0.3944 13/500 [..............................] - ETA: 1:56 - loss: 2.4586 - regression_loss: 2.0464 - classification_loss: 0.4122 14/500 [..............................] - ETA: 1:55 - loss: 2.4510 - regression_loss: 2.0310 - classification_loss: 0.4200 15/500 [..............................] - ETA: 1:55 - loss: 2.4523 - regression_loss: 2.0366 - classification_loss: 0.4157 16/500 [..............................] - ETA: 1:54 - loss: 2.4224 - regression_loss: 2.0158 - classification_loss: 0.4066 17/500 [>.............................] - ETA: 1:54 - loss: 2.4276 - regression_loss: 2.0144 - classification_loss: 0.4132 18/500 [>.............................] - ETA: 1:54 - loss: 2.4493 - regression_loss: 2.0326 - classification_loss: 0.4166 19/500 [>.............................] - ETA: 1:54 - loss: 2.4305 - regression_loss: 2.0167 - classification_loss: 0.4138 20/500 [>.............................] 
- ETA: 1:53 - loss: 2.4329 - regression_loss: 2.0159 - classification_loss: 0.4170 21/500 [>.............................] - ETA: 1:53 - loss: 2.3780 - regression_loss: 1.9741 - classification_loss: 0.4040 22/500 [>.............................] - ETA: 1:53 - loss: 2.4108 - regression_loss: 2.0078 - classification_loss: 0.4030 23/500 [>.............................] - ETA: 1:52 - loss: 2.4071 - regression_loss: 2.0032 - classification_loss: 0.4040 24/500 [>.............................] - ETA: 1:52 - loss: 2.4125 - regression_loss: 2.0060 - classification_loss: 0.4065 25/500 [>.............................] - ETA: 1:52 - loss: 2.3713 - regression_loss: 1.9755 - classification_loss: 0.3958 26/500 [>.............................] - ETA: 1:51 - loss: 2.3729 - regression_loss: 1.9737 - classification_loss: 0.3992 27/500 [>.............................] - ETA: 1:51 - loss: 2.3425 - regression_loss: 1.9475 - classification_loss: 0.3951 28/500 [>.............................] - ETA: 1:51 - loss: 2.3537 - regression_loss: 1.9571 - classification_loss: 0.3966 29/500 [>.............................] - ETA: 1:51 - loss: 2.3138 - regression_loss: 1.9233 - classification_loss: 0.3905 30/500 [>.............................] - ETA: 1:51 - loss: 2.2942 - regression_loss: 1.9081 - classification_loss: 0.3861 31/500 [>.............................] - ETA: 1:50 - loss: 2.2581 - regression_loss: 1.8779 - classification_loss: 0.3802 32/500 [>.............................] - ETA: 1:50 - loss: 2.2720 - regression_loss: 1.8895 - classification_loss: 0.3824 33/500 [>.............................] - ETA: 1:50 - loss: 2.2753 - regression_loss: 1.8944 - classification_loss: 0.3809 34/500 [=>............................] - ETA: 1:50 - loss: 2.2943 - regression_loss: 1.8971 - classification_loss: 0.3972 35/500 [=>............................] - ETA: 1:49 - loss: 2.3222 - regression_loss: 1.9164 - classification_loss: 0.4058 36/500 [=>............................] 
- ETA: 1:49 - loss: 2.3149 - regression_loss: 1.9105 - classification_loss: 0.4044 37/500 [=>............................] - ETA: 1:49 - loss: 2.3023 - regression_loss: 1.9021 - classification_loss: 0.4002 38/500 [=>............................] - ETA: 1:49 - loss: 2.3081 - regression_loss: 1.9090 - classification_loss: 0.3991 39/500 [=>............................] - ETA: 1:48 - loss: 2.3155 - regression_loss: 1.9116 - classification_loss: 0.4039 40/500 [=>............................] - ETA: 1:48 - loss: 2.3286 - regression_loss: 1.9242 - classification_loss: 0.4044 41/500 [=>............................] - ETA: 1:48 - loss: 2.3127 - regression_loss: 1.9131 - classification_loss: 0.3995 42/500 [=>............................] - ETA: 1:48 - loss: 2.3251 - regression_loss: 1.9240 - classification_loss: 0.4011 43/500 [=>............................] - ETA: 1:48 - loss: 2.3301 - regression_loss: 1.9283 - classification_loss: 0.4018 44/500 [=>............................] - ETA: 1:47 - loss: 2.3525 - regression_loss: 1.9453 - classification_loss: 0.4072 45/500 [=>............................] - ETA: 1:47 - loss: 2.3593 - regression_loss: 1.9488 - classification_loss: 0.4105 46/500 [=>............................] - ETA: 1:47 - loss: 2.3764 - regression_loss: 1.9658 - classification_loss: 0.4107 47/500 [=>............................] - ETA: 1:46 - loss: 2.3798 - regression_loss: 1.9667 - classification_loss: 0.4131 48/500 [=>............................] - ETA: 1:46 - loss: 2.3861 - regression_loss: 1.9766 - classification_loss: 0.4095 49/500 [=>............................] - ETA: 1:46 - loss: 2.3852 - regression_loss: 1.9757 - classification_loss: 0.4094 50/500 [==>...........................] - ETA: 1:46 - loss: 2.3860 - regression_loss: 1.9753 - classification_loss: 0.4107 51/500 [==>...........................] - ETA: 1:45 - loss: 2.3869 - regression_loss: 1.9802 - classification_loss: 0.4066 52/500 [==>...........................] 
- ETA: 1:45 - loss: 2.3913 - regression_loss: 1.9827 - classification_loss: 0.4086 53/500 [==>...........................] - ETA: 1:45 - loss: 2.3987 - regression_loss: 1.9870 - classification_loss: 0.4117 54/500 [==>...........................] - ETA: 1:45 - loss: 2.3936 - regression_loss: 1.9836 - classification_loss: 0.4100 55/500 [==>...........................] - ETA: 1:44 - loss: 2.3825 - regression_loss: 1.9748 - classification_loss: 0.4077 56/500 [==>...........................] - ETA: 1:44 - loss: 2.3780 - regression_loss: 1.9698 - classification_loss: 0.4081 57/500 [==>...........................] - ETA: 1:44 - loss: 2.3547 - regression_loss: 1.9512 - classification_loss: 0.4035 58/500 [==>...........................] - ETA: 1:44 - loss: 2.3324 - regression_loss: 1.9316 - classification_loss: 0.4007 59/500 [==>...........................] - ETA: 1:44 - loss: 2.3264 - regression_loss: 1.9272 - classification_loss: 0.3992 60/500 [==>...........................] - ETA: 1:43 - loss: 2.3283 - regression_loss: 1.9285 - classification_loss: 0.3998 61/500 [==>...........................] - ETA: 1:43 - loss: 2.3292 - regression_loss: 1.9286 - classification_loss: 0.4006 62/500 [==>...........................] - ETA: 1:43 - loss: 2.3393 - regression_loss: 1.9380 - classification_loss: 0.4013 63/500 [==>...........................] - ETA: 1:43 - loss: 2.3482 - regression_loss: 1.9447 - classification_loss: 0.4035 64/500 [==>...........................] - ETA: 1:43 - loss: 2.3297 - regression_loss: 1.9296 - classification_loss: 0.4001 65/500 [==>...........................] - ETA: 1:42 - loss: 2.3366 - regression_loss: 1.9332 - classification_loss: 0.4034 66/500 [==>...........................] - ETA: 1:42 - loss: 2.3289 - regression_loss: 1.9277 - classification_loss: 0.4012 67/500 [===>..........................] - ETA: 1:42 - loss: 2.3281 - regression_loss: 1.9274 - classification_loss: 0.4007 68/500 [===>..........................] 
- ETA: 1:42 - loss: 2.3343 - regression_loss: 1.9336 - classification_loss: 0.4007 69/500 [===>..........................] - ETA: 1:42 - loss: 2.3399 - regression_loss: 1.9377 - classification_loss: 0.4023 70/500 [===>..........................] - ETA: 1:41 - loss: 2.3365 - regression_loss: 1.9353 - classification_loss: 0.4012 71/500 [===>..........................] - ETA: 1:41 - loss: 2.3337 - regression_loss: 1.9337 - classification_loss: 0.4001 72/500 [===>..........................] - ETA: 1:41 - loss: 2.3353 - regression_loss: 1.9344 - classification_loss: 0.4009 73/500 [===>..........................] - ETA: 1:40 - loss: 2.3347 - regression_loss: 1.9330 - classification_loss: 0.4016 74/500 [===>..........................] - ETA: 1:40 - loss: 2.3273 - regression_loss: 1.9266 - classification_loss: 0.4007 75/500 [===>..........................] - ETA: 1:40 - loss: 2.3254 - regression_loss: 1.9250 - classification_loss: 0.4003 76/500 [===>..........................] - ETA: 1:40 - loss: 2.3332 - regression_loss: 1.9317 - classification_loss: 0.4015 77/500 [===>..........................] - ETA: 1:39 - loss: 2.3378 - regression_loss: 1.9346 - classification_loss: 0.4032 78/500 [===>..........................] - ETA: 1:39 - loss: 2.3306 - regression_loss: 1.9293 - classification_loss: 0.4014 79/500 [===>..........................] - ETA: 1:39 - loss: 2.3283 - regression_loss: 1.9293 - classification_loss: 0.3990 80/500 [===>..........................] - ETA: 1:39 - loss: 2.3298 - regression_loss: 1.9310 - classification_loss: 0.3987 81/500 [===>..........................] - ETA: 1:39 - loss: 2.3369 - regression_loss: 1.9362 - classification_loss: 0.4006 82/500 [===>..........................] - ETA: 1:38 - loss: 2.3310 - regression_loss: 1.9314 - classification_loss: 0.3996 83/500 [===>..........................] - ETA: 1:38 - loss: 2.3273 - regression_loss: 1.9286 - classification_loss: 0.3987 84/500 [====>.........................] 
- ETA: 1:38 - loss: 2.3223 - regression_loss: 1.9246 - classification_loss: 0.3977 85/500 [====>.........................] - ETA: 1:38 - loss: 2.3218 - regression_loss: 1.9245 - classification_loss: 0.3973 86/500 [====>.........................] - ETA: 1:37 - loss: 2.3088 - regression_loss: 1.9135 - classification_loss: 0.3953 87/500 [====>.........................] - ETA: 1:37 - loss: 2.3087 - regression_loss: 1.9139 - classification_loss: 0.3948 88/500 [====>.........................] - ETA: 1:37 - loss: 2.2993 - regression_loss: 1.9060 - classification_loss: 0.3933 89/500 [====>.........................] - ETA: 1:37 - loss: 2.2965 - regression_loss: 1.9041 - classification_loss: 0.3924 90/500 [====>.........................] - ETA: 1:36 - loss: 2.3097 - regression_loss: 1.9151 - classification_loss: 0.3946 91/500 [====>.........................] - ETA: 1:36 - loss: 2.3107 - regression_loss: 1.9159 - classification_loss: 0.3947 92/500 [====>.........................] - ETA: 1:36 - loss: 2.3096 - regression_loss: 1.9157 - classification_loss: 0.3939 93/500 [====>.........................] - ETA: 1:36 - loss: 2.3152 - regression_loss: 1.9203 - classification_loss: 0.3948 94/500 [====>.........................] - ETA: 1:36 - loss: 2.3249 - regression_loss: 1.9283 - classification_loss: 0.3966 95/500 [====>.........................] - ETA: 1:35 - loss: 2.3258 - regression_loss: 1.9299 - classification_loss: 0.3959 96/500 [====>.........................] - ETA: 1:35 - loss: 2.3268 - regression_loss: 1.9303 - classification_loss: 0.3965 97/500 [====>.........................] - ETA: 1:35 - loss: 2.3323 - regression_loss: 1.9324 - classification_loss: 0.3999 98/500 [====>.........................] - ETA: 1:35 - loss: 2.3312 - regression_loss: 1.9326 - classification_loss: 0.3986 99/500 [====>.........................] - ETA: 1:34 - loss: 2.3347 - regression_loss: 1.9346 - classification_loss: 0.4001 100/500 [=====>........................] 
- ETA: 1:34 - loss: 2.3347 - regression_loss: 1.9338 - classification_loss: 0.4010 101/500 [=====>........................] - ETA: 1:34 - loss: 2.3413 - regression_loss: 1.9375 - classification_loss: 0.4037 102/500 [=====>........................] - ETA: 1:34 - loss: 2.3492 - regression_loss: 1.9442 - classification_loss: 0.4051 103/500 [=====>........................] - ETA: 1:33 - loss: 2.3391 - regression_loss: 1.9360 - classification_loss: 0.4030 104/500 [=====>........................] - ETA: 1:33 - loss: 2.3477 - regression_loss: 1.9441 - classification_loss: 0.4036 105/500 [=====>........................] - ETA: 1:33 - loss: 2.3486 - regression_loss: 1.9440 - classification_loss: 0.4046 106/500 [=====>........................] - ETA: 1:33 - loss: 2.3500 - regression_loss: 1.9435 - classification_loss: 0.4066 107/500 [=====>........................] - ETA: 1:32 - loss: 2.3490 - regression_loss: 1.9428 - classification_loss: 0.4062 108/500 [=====>........................] - ETA: 1:32 - loss: 2.3531 - regression_loss: 1.9443 - classification_loss: 0.4088 109/500 [=====>........................] - ETA: 1:32 - loss: 2.3574 - regression_loss: 1.9473 - classification_loss: 0.4101 110/500 [=====>........................] - ETA: 1:32 - loss: 2.3587 - regression_loss: 1.9494 - classification_loss: 0.4093 111/500 [=====>........................] - ETA: 1:31 - loss: 2.3588 - regression_loss: 1.9492 - classification_loss: 0.4096 112/500 [=====>........................] - ETA: 1:31 - loss: 2.3494 - regression_loss: 1.9414 - classification_loss: 0.4080 113/500 [=====>........................] - ETA: 1:31 - loss: 2.3398 - regression_loss: 1.9343 - classification_loss: 0.4055 114/500 [=====>........................] - ETA: 1:31 - loss: 2.3383 - regression_loss: 1.9320 - classification_loss: 0.4063 115/500 [=====>........................] - ETA: 1:31 - loss: 2.3427 - regression_loss: 1.9366 - classification_loss: 0.4061 116/500 [=====>........................] 
- ETA: 1:30 - loss: 2.3427 - regression_loss: 1.9365 - classification_loss: 0.4062 117/500 [======>.......................] - ETA: 1:30 - loss: 2.3401 - regression_loss: 1.9343 - classification_loss: 0.4058 118/500 [======>.......................] - ETA: 1:30 - loss: 2.3320 - regression_loss: 1.9280 - classification_loss: 0.4040 119/500 [======>.......................] - ETA: 1:30 - loss: 2.3353 - regression_loss: 1.9297 - classification_loss: 0.4056 120/500 [======>.......................] - ETA: 1:29 - loss: 2.3370 - regression_loss: 1.9307 - classification_loss: 0.4063 121/500 [======>.......................] - ETA: 1:29 - loss: 2.3378 - regression_loss: 1.9313 - classification_loss: 0.4065 122/500 [======>.......................] - ETA: 1:29 - loss: 2.3370 - regression_loss: 1.9304 - classification_loss: 0.4065 123/500 [======>.......................] - ETA: 1:29 - loss: 2.3399 - regression_loss: 1.9335 - classification_loss: 0.4064 124/500 [======>.......................] - ETA: 1:28 - loss: 2.3403 - regression_loss: 1.9334 - classification_loss: 0.4069 125/500 [======>.......................] - ETA: 1:28 - loss: 2.3472 - regression_loss: 1.9397 - classification_loss: 0.4075 126/500 [======>.......................] - ETA: 1:28 - loss: 2.3478 - regression_loss: 1.9400 - classification_loss: 0.4079 127/500 [======>.......................] - ETA: 1:28 - loss: 2.3488 - regression_loss: 1.9401 - classification_loss: 0.4087 128/500 [======>.......................] - ETA: 1:27 - loss: 2.3526 - regression_loss: 1.9424 - classification_loss: 0.4102 129/500 [======>.......................] - ETA: 1:27 - loss: 2.3509 - regression_loss: 1.9415 - classification_loss: 0.4094 130/500 [======>.......................] - ETA: 1:27 - loss: 2.3525 - regression_loss: 1.9423 - classification_loss: 0.4103 131/500 [======>.......................] - ETA: 1:27 - loss: 2.3567 - regression_loss: 1.9419 - classification_loss: 0.4148 132/500 [======>.......................] 
- ETA: 1:27 - loss: 2.3519 - regression_loss: 1.9383 - classification_loss: 0.4136 133/500 [======>.......................] - ETA: 1:26 - loss: 2.3555 - regression_loss: 1.9401 - classification_loss: 0.4153 134/500 [=======>......................] - ETA: 1:26 - loss: 2.3577 - regression_loss: 1.9413 - classification_loss: 0.4164 135/500 [=======>......................] - ETA: 1:26 - loss: 2.3581 - regression_loss: 1.9409 - classification_loss: 0.4171 136/500 [=======>......................] - ETA: 1:26 - loss: 2.3574 - regression_loss: 1.9409 - classification_loss: 0.4165 137/500 [=======>......................] - ETA: 1:26 - loss: 2.3603 - regression_loss: 1.9428 - classification_loss: 0.4175 138/500 [=======>......................] - ETA: 1:25 - loss: 2.3556 - regression_loss: 1.9398 - classification_loss: 0.4158 139/500 [=======>......................] - ETA: 1:25 - loss: 2.3513 - regression_loss: 1.9361 - classification_loss: 0.4151 140/500 [=======>......................] - ETA: 1:25 - loss: 2.3501 - regression_loss: 1.9355 - classification_loss: 0.4146 141/500 [=======>......................] - ETA: 1:25 - loss: 2.3489 - regression_loss: 1.9353 - classification_loss: 0.4137 142/500 [=======>......................] - ETA: 1:24 - loss: 2.3518 - regression_loss: 1.9383 - classification_loss: 0.4135 143/500 [=======>......................] - ETA: 1:24 - loss: 2.3501 - regression_loss: 1.9372 - classification_loss: 0.4129 144/500 [=======>......................] - ETA: 1:24 - loss: 2.3478 - regression_loss: 1.9359 - classification_loss: 0.4119 145/500 [=======>......................] - ETA: 1:24 - loss: 2.3532 - regression_loss: 1.9403 - classification_loss: 0.4130 146/500 [=======>......................] - ETA: 1:23 - loss: 2.3530 - regression_loss: 1.9394 - classification_loss: 0.4135 147/500 [=======>......................] - ETA: 1:23 - loss: 2.3596 - regression_loss: 1.9448 - classification_loss: 0.4147 148/500 [=======>......................] 
- ETA: 1:23 - loss: 2.3587 - regression_loss: 1.9442 - classification_loss: 0.4145 149/500 [=======>......................] - ETA: 1:23 - loss: 2.3538 - regression_loss: 1.9405 - classification_loss: 0.4133 150/500 [========>.....................] - ETA: 1:22 - loss: 2.3502 - regression_loss: 1.9380 - classification_loss: 0.4122 151/500 [========>.....................] - ETA: 1:22 - loss: 2.3489 - regression_loss: 1.9374 - classification_loss: 0.4114 152/500 [========>.....................] - ETA: 1:22 - loss: 2.3457 - regression_loss: 1.9350 - classification_loss: 0.4107 153/500 [========>.....................] - ETA: 1:22 - loss: 2.3400 - regression_loss: 1.9308 - classification_loss: 0.4092 154/500 [========>.....................] - ETA: 1:21 - loss: 2.3385 - regression_loss: 1.9288 - classification_loss: 0.4097 155/500 [========>.....................] - ETA: 1:21 - loss: 2.3346 - regression_loss: 1.9251 - classification_loss: 0.4095 156/500 [========>.....................] - ETA: 1:21 - loss: 2.3364 - regression_loss: 1.9263 - classification_loss: 0.4101 157/500 [========>.....................] - ETA: 1:21 - loss: 2.3387 - regression_loss: 1.9283 - classification_loss: 0.4104 158/500 [========>.....................] - ETA: 1:21 - loss: 2.3423 - regression_loss: 1.9305 - classification_loss: 0.4118 159/500 [========>.....................] - ETA: 1:20 - loss: 2.3416 - regression_loss: 1.9297 - classification_loss: 0.4119 160/500 [========>.....................] - ETA: 1:20 - loss: 2.3428 - regression_loss: 1.9299 - classification_loss: 0.4129 161/500 [========>.....................] - ETA: 1:20 - loss: 2.3371 - regression_loss: 1.9253 - classification_loss: 0.4118 162/500 [========>.....................] - ETA: 1:20 - loss: 2.3411 - regression_loss: 1.9281 - classification_loss: 0.4130 163/500 [========>.....................] - ETA: 1:19 - loss: 2.3472 - regression_loss: 1.9341 - classification_loss: 0.4131 164/500 [========>.....................] 
- ETA: 1:19 - loss: 2.3496 - regression_loss: 1.9358 - classification_loss: 0.4138 165/500 [========>.....................] - ETA: 1:19 - loss: 2.3455 - regression_loss: 1.9326 - classification_loss: 0.4130 166/500 [========>.....................] - ETA: 1:19 - loss: 2.3465 - regression_loss: 1.9336 - classification_loss: 0.4129 167/500 [=========>....................] - ETA: 1:18 - loss: 2.3489 - regression_loss: 1.9342 - classification_loss: 0.4147 168/500 [=========>....................] - ETA: 1:18 - loss: 2.3509 - regression_loss: 1.9367 - classification_loss: 0.4142 169/500 [=========>....................] - ETA: 1:18 - loss: 2.3532 - regression_loss: 1.9384 - classification_loss: 0.4147 170/500 [=========>....................] - ETA: 1:18 - loss: 2.3535 - regression_loss: 1.9390 - classification_loss: 0.4145 171/500 [=========>....................] - ETA: 1:17 - loss: 2.3518 - regression_loss: 1.9374 - classification_loss: 0.4144 172/500 [=========>....................] - ETA: 1:17 - loss: 2.3500 - regression_loss: 1.9358 - classification_loss: 0.4142 173/500 [=========>....................] - ETA: 1:17 - loss: 2.3535 - regression_loss: 1.9390 - classification_loss: 0.4145 174/500 [=========>....................] - ETA: 1:17 - loss: 2.3519 - regression_loss: 1.9381 - classification_loss: 0.4138 175/500 [=========>....................] - ETA: 1:16 - loss: 2.3521 - regression_loss: 1.9389 - classification_loss: 0.4132 176/500 [=========>....................] - ETA: 1:16 - loss: 2.3508 - regression_loss: 1.9374 - classification_loss: 0.4133 177/500 [=========>....................] - ETA: 1:16 - loss: 2.3525 - regression_loss: 1.9381 - classification_loss: 0.4144 178/500 [=========>....................] - ETA: 1:16 - loss: 2.3535 - regression_loss: 1.9405 - classification_loss: 0.4130 179/500 [=========>....................] - ETA: 1:16 - loss: 2.3491 - regression_loss: 1.9370 - classification_loss: 0.4121 180/500 [=========>....................] 
- ETA: 1:15 - loss: 2.3457 - regression_loss: 1.9347 - classification_loss: 0.4110 181/500 [=========>....................] - ETA: 1:15 - loss: 2.3476 - regression_loss: 1.9361 - classification_loss: 0.4115 182/500 [=========>....................] - ETA: 1:15 - loss: 2.3484 - regression_loss: 1.9365 - classification_loss: 0.4120 183/500 [=========>....................] - ETA: 1:15 - loss: 2.3500 - regression_loss: 1.9375 - classification_loss: 0.4125 184/500 [==========>...................] - ETA: 1:14 - loss: 2.3477 - regression_loss: 1.9362 - classification_loss: 0.4114 185/500 [==========>...................] - ETA: 1:14 - loss: 2.3455 - regression_loss: 1.9349 - classification_loss: 0.4106 186/500 [==========>...................] - ETA: 1:14 - loss: 2.3410 - regression_loss: 1.9314 - classification_loss: 0.4096 187/500 [==========>...................] - ETA: 1:14 - loss: 2.3432 - regression_loss: 1.9333 - classification_loss: 0.4099 188/500 [==========>...................] - ETA: 1:13 - loss: 2.3445 - regression_loss: 1.9334 - classification_loss: 0.4111 189/500 [==========>...................] - ETA: 1:13 - loss: 2.3435 - regression_loss: 1.9330 - classification_loss: 0.4105 190/500 [==========>...................] - ETA: 1:13 - loss: 2.3487 - regression_loss: 1.9363 - classification_loss: 0.4124 191/500 [==========>...................] - ETA: 1:13 - loss: 2.3452 - regression_loss: 1.9335 - classification_loss: 0.4117 192/500 [==========>...................] - ETA: 1:13 - loss: 2.3464 - regression_loss: 1.9338 - classification_loss: 0.4126 193/500 [==========>...................] - ETA: 1:12 - loss: 2.3477 - regression_loss: 1.9346 - classification_loss: 0.4131 194/500 [==========>...................] - ETA: 1:12 - loss: 2.3553 - regression_loss: 1.9408 - classification_loss: 0.4145 195/500 [==========>...................] - ETA: 1:12 - loss: 2.3560 - regression_loss: 1.9417 - classification_loss: 0.4143 196/500 [==========>...................] 
- ETA: 1:12 - loss: 2.3577 - regression_loss: 1.9428 - classification_loss: 0.4150 197/500 [==========>...................] - ETA: 1:11 - loss: 2.3586 - regression_loss: 1.9432 - classification_loss: 0.4153 198/500 [==========>...................] - ETA: 1:11 - loss: 2.3570 - regression_loss: 1.9422 - classification_loss: 0.4148 199/500 [==========>...................] - ETA: 1:11 - loss: 2.3566 - regression_loss: 1.9417 - classification_loss: 0.4149 200/500 [===========>..................] - ETA: 1:11 - loss: 2.3547 - regression_loss: 1.9408 - classification_loss: 0.4139 201/500 [===========>..................] - ETA: 1:11 - loss: 2.3616 - regression_loss: 1.9467 - classification_loss: 0.4149 202/500 [===========>..................] - ETA: 1:10 - loss: 2.3607 - regression_loss: 1.9456 - classification_loss: 0.4150 203/500 [===========>..................] - ETA: 1:10 - loss: 2.3641 - regression_loss: 1.9493 - classification_loss: 0.4148 204/500 [===========>..................] - ETA: 1:10 - loss: 2.3674 - regression_loss: 1.9513 - classification_loss: 0.4161 205/500 [===========>..................] - ETA: 1:10 - loss: 2.3673 - regression_loss: 1.9515 - classification_loss: 0.4158 206/500 [===========>..................] - ETA: 1:09 - loss: 2.3662 - regression_loss: 1.9510 - classification_loss: 0.4152 207/500 [===========>..................] - ETA: 1:09 - loss: 2.3646 - regression_loss: 1.9493 - classification_loss: 0.4154 208/500 [===========>..................] - ETA: 1:09 - loss: 2.3629 - regression_loss: 1.9483 - classification_loss: 0.4146 209/500 [===========>..................] - ETA: 1:09 - loss: 2.3631 - regression_loss: 1.9478 - classification_loss: 0.4153 210/500 [===========>..................] - ETA: 1:09 - loss: 2.3661 - regression_loss: 1.9504 - classification_loss: 0.4156 211/500 [===========>..................] - ETA: 1:08 - loss: 2.3642 - regression_loss: 1.9488 - classification_loss: 0.4154 212/500 [===========>..................] 
- ETA: 1:08 - loss: 2.3683 - regression_loss: 1.9511 - classification_loss: 0.4173 213/500 [===========>..................] - ETA: 1:08 - loss: 2.3652 - regression_loss: 1.9486 - classification_loss: 0.4166 214/500 [===========>..................] - ETA: 1:08 - loss: 2.3674 - regression_loss: 1.9504 - classification_loss: 0.4170 215/500 [===========>..................] - ETA: 1:07 - loss: 2.3667 - regression_loss: 1.9500 - classification_loss: 0.4168 216/500 [===========>..................] - ETA: 1:07 - loss: 2.3665 - regression_loss: 1.9499 - classification_loss: 0.4166 217/500 [============>.................] - ETA: 1:07 - loss: 2.3740 - regression_loss: 1.9553 - classification_loss: 0.4187 218/500 [============>.................] - ETA: 1:07 - loss: 2.3715 - regression_loss: 1.9531 - classification_loss: 0.4184 219/500 [============>.................] - ETA: 1:06 - loss: 2.3698 - regression_loss: 1.9516 - classification_loss: 0.4182 220/500 [============>.................] - ETA: 1:06 - loss: 2.3703 - regression_loss: 1.9514 - classification_loss: 0.4189 221/500 [============>.................] - ETA: 1:06 - loss: 2.3732 - regression_loss: 1.9535 - classification_loss: 0.4198 222/500 [============>.................] - ETA: 1:06 - loss: 2.3759 - regression_loss: 1.9555 - classification_loss: 0.4204 223/500 [============>.................] - ETA: 1:05 - loss: 2.3693 - regression_loss: 1.9502 - classification_loss: 0.4191 224/500 [============>.................] - ETA: 1:05 - loss: 2.3735 - regression_loss: 1.9537 - classification_loss: 0.4198 225/500 [============>.................] - ETA: 1:05 - loss: 2.3706 - regression_loss: 1.9516 - classification_loss: 0.4190 226/500 [============>.................] - ETA: 1:05 - loss: 2.3719 - regression_loss: 1.9527 - classification_loss: 0.4192 227/500 [============>.................] - ETA: 1:04 - loss: 2.3727 - regression_loss: 1.9527 - classification_loss: 0.4200 228/500 [============>.................] 
- ETA: 1:04 - loss: 2.3742 - regression_loss: 1.9538 - classification_loss: 0.4203 229/500 [============>.................] - ETA: 1:04 - loss: 2.3732 - regression_loss: 1.9533 - classification_loss: 0.4199 230/500 [============>.................] - ETA: 1:04 - loss: 2.3683 - regression_loss: 1.9496 - classification_loss: 0.4187 231/500 [============>.................] - ETA: 1:03 - loss: 2.3656 - regression_loss: 1.9475 - classification_loss: 0.4182 232/500 [============>.................] - ETA: 1:03 - loss: 2.3647 - regression_loss: 1.9466 - classification_loss: 0.4181 233/500 [============>.................] - ETA: 1:03 - loss: 2.3644 - regression_loss: 1.9463 - classification_loss: 0.4181 234/500 [=============>................] - ETA: 1:03 - loss: 2.3620 - regression_loss: 1.9444 - classification_loss: 0.4176 235/500 [=============>................] - ETA: 1:03 - loss: 2.3601 - regression_loss: 1.9427 - classification_loss: 0.4174 236/500 [=============>................] - ETA: 1:02 - loss: 2.3608 - regression_loss: 1.9428 - classification_loss: 0.4180 237/500 [=============>................] - ETA: 1:02 - loss: 2.3634 - regression_loss: 1.9448 - classification_loss: 0.4186 238/500 [=============>................] - ETA: 1:02 - loss: 2.3701 - regression_loss: 1.9482 - classification_loss: 0.4218 239/500 [=============>................] - ETA: 1:02 - loss: 2.3676 - regression_loss: 1.9466 - classification_loss: 0.4210 240/500 [=============>................] - ETA: 1:01 - loss: 2.3719 - regression_loss: 1.9385 - classification_loss: 0.4334 241/500 [=============>................] - ETA: 1:01 - loss: 2.3676 - regression_loss: 1.9346 - classification_loss: 0.4330 242/500 [=============>................] - ETA: 1:01 - loss: 2.3661 - regression_loss: 1.9327 - classification_loss: 0.4334 243/500 [=============>................] - ETA: 1:01 - loss: 2.3700 - regression_loss: 1.9356 - classification_loss: 0.4344 244/500 [=============>................] 
[... per-batch progress updates omitted (steps 245-499 of epoch 9; loss hovered around 2.35-2.37) ...]
500/500 [==============================] - 120s 239ms/step - loss: 2.3552 - regression_loss: 1.9257 - classification_loss: 0.4295
326 instances of class plum with average precision: 0.5397
mAP: 0.5397
Epoch 00009: saving model to ./training/snapshots/resnet50_pascal_09.h5
Epoch 10/150
[... per-batch progress updates omitted (steps 1-78 of epoch 10) ...]
- ETA: 1:40 - loss: 2.3072 - regression_loss: 1.8980 - classification_loss: 0.4092 79/500 [===>..........................] - ETA: 1:39 - loss: 2.3345 - regression_loss: 1.9090 - classification_loss: 0.4255 80/500 [===>..........................] - ETA: 1:39 - loss: 2.3360 - regression_loss: 1.9104 - classification_loss: 0.4256 81/500 [===>..........................] - ETA: 1:39 - loss: 2.3299 - regression_loss: 1.9034 - classification_loss: 0.4265 82/500 [===>..........................] - ETA: 1:39 - loss: 2.3335 - regression_loss: 1.9054 - classification_loss: 0.4281 83/500 [===>..........................] - ETA: 1:38 - loss: 2.3303 - regression_loss: 1.9040 - classification_loss: 0.4263 84/500 [====>.........................] - ETA: 1:38 - loss: 2.3239 - regression_loss: 1.8997 - classification_loss: 0.4242 85/500 [====>.........................] - ETA: 1:38 - loss: 2.3270 - regression_loss: 1.9023 - classification_loss: 0.4248 86/500 [====>.........................] - ETA: 1:38 - loss: 2.3103 - regression_loss: 1.8865 - classification_loss: 0.4237 87/500 [====>.........................] - ETA: 1:37 - loss: 2.3082 - regression_loss: 1.8865 - classification_loss: 0.4217 88/500 [====>.........................] - ETA: 1:37 - loss: 2.3135 - regression_loss: 1.8894 - classification_loss: 0.4241 89/500 [====>.........................] - ETA: 1:37 - loss: 2.3102 - regression_loss: 1.8843 - classification_loss: 0.4258 90/500 [====>.........................] - ETA: 1:37 - loss: 2.3094 - regression_loss: 1.8847 - classification_loss: 0.4247 91/500 [====>.........................] - ETA: 1:36 - loss: 2.3134 - regression_loss: 1.8874 - classification_loss: 0.4260 92/500 [====>.........................] - ETA: 1:36 - loss: 2.3144 - regression_loss: 1.8879 - classification_loss: 0.4265 93/500 [====>.........................] - ETA: 1:36 - loss: 2.3189 - regression_loss: 1.8912 - classification_loss: 0.4277 94/500 [====>.........................] 
- ETA: 1:36 - loss: 2.3205 - regression_loss: 1.8928 - classification_loss: 0.4277 95/500 [====>.........................] - ETA: 1:35 - loss: 2.3209 - regression_loss: 1.8934 - classification_loss: 0.4275 96/500 [====>.........................] - ETA: 1:35 - loss: 2.3258 - regression_loss: 1.8979 - classification_loss: 0.4280 97/500 [====>.........................] - ETA: 1:35 - loss: 2.3259 - regression_loss: 1.8983 - classification_loss: 0.4275 98/500 [====>.........................] - ETA: 1:35 - loss: 2.3254 - regression_loss: 1.8987 - classification_loss: 0.4268 99/500 [====>.........................] - ETA: 1:34 - loss: 2.3263 - regression_loss: 1.8988 - classification_loss: 0.4275 100/500 [=====>........................] - ETA: 1:34 - loss: 2.3254 - regression_loss: 1.8983 - classification_loss: 0.4271 101/500 [=====>........................] - ETA: 1:34 - loss: 2.3312 - regression_loss: 1.9036 - classification_loss: 0.4275 102/500 [=====>........................] - ETA: 1:34 - loss: 2.3346 - regression_loss: 1.9063 - classification_loss: 0.4283 103/500 [=====>........................] - ETA: 1:33 - loss: 2.3389 - regression_loss: 1.9111 - classification_loss: 0.4277 104/500 [=====>........................] - ETA: 1:33 - loss: 2.3373 - regression_loss: 1.9100 - classification_loss: 0.4273 105/500 [=====>........................] - ETA: 1:33 - loss: 2.3368 - regression_loss: 1.9108 - classification_loss: 0.4260 106/500 [=====>........................] - ETA: 1:33 - loss: 2.3337 - regression_loss: 1.9069 - classification_loss: 0.4269 107/500 [=====>........................] - ETA: 1:33 - loss: 2.3359 - regression_loss: 1.9097 - classification_loss: 0.4262 108/500 [=====>........................] - ETA: 1:32 - loss: 2.3351 - regression_loss: 1.9093 - classification_loss: 0.4258 109/500 [=====>........................] - ETA: 1:32 - loss: 2.3383 - regression_loss: 1.9121 - classification_loss: 0.4262 110/500 [=====>........................] 
- ETA: 1:32 - loss: 2.3308 - regression_loss: 1.9069 - classification_loss: 0.4240 111/500 [=====>........................] - ETA: 1:32 - loss: 2.3350 - regression_loss: 1.9096 - classification_loss: 0.4254 112/500 [=====>........................] - ETA: 1:32 - loss: 2.3349 - regression_loss: 1.9091 - classification_loss: 0.4258 113/500 [=====>........................] - ETA: 1:31 - loss: 2.3348 - regression_loss: 1.9090 - classification_loss: 0.4259 114/500 [=====>........................] - ETA: 1:31 - loss: 2.3337 - regression_loss: 1.9081 - classification_loss: 0.4256 115/500 [=====>........................] - ETA: 1:31 - loss: 2.3301 - regression_loss: 1.9057 - classification_loss: 0.4244 116/500 [=====>........................] - ETA: 1:31 - loss: 2.3289 - regression_loss: 1.9053 - classification_loss: 0.4235 117/500 [======>.......................] - ETA: 1:31 - loss: 2.3316 - regression_loss: 1.9074 - classification_loss: 0.4242 118/500 [======>.......................] - ETA: 1:31 - loss: 2.3279 - regression_loss: 1.9047 - classification_loss: 0.4233 119/500 [======>.......................] - ETA: 1:30 - loss: 2.3167 - regression_loss: 1.8954 - classification_loss: 0.4213 120/500 [======>.......................] - ETA: 1:30 - loss: 2.3143 - regression_loss: 1.8934 - classification_loss: 0.4209 121/500 [======>.......................] - ETA: 1:30 - loss: 2.3062 - regression_loss: 1.8874 - classification_loss: 0.4188 122/500 [======>.......................] - ETA: 1:30 - loss: 2.3226 - regression_loss: 1.8896 - classification_loss: 0.4330 123/500 [======>.......................] - ETA: 1:29 - loss: 2.3243 - regression_loss: 1.8906 - classification_loss: 0.4337 124/500 [======>.......................] - ETA: 1:29 - loss: 2.3217 - regression_loss: 1.8880 - classification_loss: 0.4337 125/500 [======>.......................] - ETA: 1:29 - loss: 2.3241 - regression_loss: 1.8917 - classification_loss: 0.4325 126/500 [======>.......................] 
- ETA: 1:29 - loss: 2.3360 - regression_loss: 1.9017 - classification_loss: 0.4343 127/500 [======>.......................] - ETA: 1:29 - loss: 2.3356 - regression_loss: 1.9015 - classification_loss: 0.4341 128/500 [======>.......................] - ETA: 1:28 - loss: 2.3357 - regression_loss: 1.9028 - classification_loss: 0.4329 129/500 [======>.......................] - ETA: 1:28 - loss: 2.3316 - regression_loss: 1.9001 - classification_loss: 0.4315 130/500 [======>.......................] - ETA: 1:28 - loss: 2.3263 - regression_loss: 1.8965 - classification_loss: 0.4298 131/500 [======>.......................] - ETA: 1:28 - loss: 2.3297 - regression_loss: 1.8981 - classification_loss: 0.4316 132/500 [======>.......................] - ETA: 1:28 - loss: 2.3394 - regression_loss: 1.9079 - classification_loss: 0.4315 133/500 [======>.......................] - ETA: 1:27 - loss: 2.3388 - regression_loss: 1.9067 - classification_loss: 0.4320 134/500 [=======>......................] - ETA: 1:27 - loss: 2.3375 - regression_loss: 1.9066 - classification_loss: 0.4309 135/500 [=======>......................] - ETA: 1:27 - loss: 2.3373 - regression_loss: 1.9057 - classification_loss: 0.4315 136/500 [=======>......................] - ETA: 1:27 - loss: 2.3416 - regression_loss: 1.9094 - classification_loss: 0.4321 137/500 [=======>......................] - ETA: 1:27 - loss: 2.3418 - regression_loss: 1.9089 - classification_loss: 0.4329 138/500 [=======>......................] - ETA: 1:26 - loss: 2.3422 - regression_loss: 1.9088 - classification_loss: 0.4334 139/500 [=======>......................] - ETA: 1:26 - loss: 2.3462 - regression_loss: 1.9119 - classification_loss: 0.4344 140/500 [=======>......................] - ETA: 1:26 - loss: 2.3473 - regression_loss: 1.9130 - classification_loss: 0.4343 141/500 [=======>......................] - ETA: 1:26 - loss: 2.3460 - regression_loss: 1.9118 - classification_loss: 0.4342 142/500 [=======>......................] 
- ETA: 1:25 - loss: 2.3444 - regression_loss: 1.9107 - classification_loss: 0.4336 143/500 [=======>......................] - ETA: 1:25 - loss: 2.3447 - regression_loss: 1.9115 - classification_loss: 0.4332 144/500 [=======>......................] - ETA: 1:25 - loss: 2.3478 - regression_loss: 1.9141 - classification_loss: 0.4337 145/500 [=======>......................] - ETA: 1:25 - loss: 2.3423 - regression_loss: 1.9103 - classification_loss: 0.4320 146/500 [=======>......................] - ETA: 1:24 - loss: 2.3418 - regression_loss: 1.9097 - classification_loss: 0.4321 147/500 [=======>......................] - ETA: 1:24 - loss: 2.3401 - regression_loss: 1.9084 - classification_loss: 0.4317 148/500 [=======>......................] - ETA: 1:24 - loss: 2.3339 - regression_loss: 1.9038 - classification_loss: 0.4301 149/500 [=======>......................] - ETA: 1:24 - loss: 2.3399 - regression_loss: 1.9095 - classification_loss: 0.4305 150/500 [========>.....................] - ETA: 1:23 - loss: 2.3296 - regression_loss: 1.9011 - classification_loss: 0.4285 151/500 [========>.....................] - ETA: 1:23 - loss: 2.3330 - regression_loss: 1.9038 - classification_loss: 0.4292 152/500 [========>.....................] - ETA: 1:23 - loss: 2.3323 - regression_loss: 1.9033 - classification_loss: 0.4290 153/500 [========>.....................] - ETA: 1:23 - loss: 2.3319 - regression_loss: 1.9031 - classification_loss: 0.4287 154/500 [========>.....................] - ETA: 1:23 - loss: 2.3309 - regression_loss: 1.9027 - classification_loss: 0.4282 155/500 [========>.....................] - ETA: 1:22 - loss: 2.3237 - regression_loss: 1.8970 - classification_loss: 0.4267 156/500 [========>.....................] - ETA: 1:22 - loss: 2.3285 - regression_loss: 1.9016 - classification_loss: 0.4270 157/500 [========>.....................] - ETA: 1:22 - loss: 2.3369 - regression_loss: 1.9079 - classification_loss: 0.4290 158/500 [========>.....................] 
- ETA: 1:22 - loss: 2.3355 - regression_loss: 1.9077 - classification_loss: 0.4278 159/500 [========>.....................] - ETA: 1:22 - loss: 2.3381 - regression_loss: 1.9096 - classification_loss: 0.4285 160/500 [========>.....................] - ETA: 1:21 - loss: 2.3359 - regression_loss: 1.9080 - classification_loss: 0.4279 161/500 [========>.....................] - ETA: 1:21 - loss: 2.3340 - regression_loss: 1.9069 - classification_loss: 0.4271 162/500 [========>.....................] - ETA: 1:21 - loss: 2.3357 - regression_loss: 1.9082 - classification_loss: 0.4276 163/500 [========>.....................] - ETA: 1:21 - loss: 2.3357 - regression_loss: 1.9079 - classification_loss: 0.4279 164/500 [========>.....................] - ETA: 1:20 - loss: 2.3351 - regression_loss: 1.9076 - classification_loss: 0.4275 165/500 [========>.....................] - ETA: 1:20 - loss: 2.3334 - regression_loss: 1.9064 - classification_loss: 0.4270 166/500 [========>.....................] - ETA: 1:20 - loss: 2.3341 - regression_loss: 1.9069 - classification_loss: 0.4273 167/500 [=========>....................] - ETA: 1:20 - loss: 2.3349 - regression_loss: 1.9080 - classification_loss: 0.4269 168/500 [=========>....................] - ETA: 1:19 - loss: 2.3321 - regression_loss: 1.9059 - classification_loss: 0.4261 169/500 [=========>....................] - ETA: 1:19 - loss: 2.3439 - regression_loss: 1.9130 - classification_loss: 0.4309 170/500 [=========>....................] - ETA: 1:19 - loss: 2.3499 - regression_loss: 1.9176 - classification_loss: 0.4323 171/500 [=========>....................] - ETA: 1:19 - loss: 2.3465 - regression_loss: 1.9146 - classification_loss: 0.4319 172/500 [=========>....................] - ETA: 1:19 - loss: 2.3497 - regression_loss: 1.9174 - classification_loss: 0.4323 173/500 [=========>....................] - ETA: 1:18 - loss: 2.3499 - regression_loss: 1.9175 - classification_loss: 0.4324 174/500 [=========>....................] 
- ETA: 1:18 - loss: 2.3469 - regression_loss: 1.9155 - classification_loss: 0.4314 175/500 [=========>....................] - ETA: 1:18 - loss: 2.3420 - regression_loss: 1.9117 - classification_loss: 0.4303 176/500 [=========>....................] - ETA: 1:18 - loss: 2.3403 - regression_loss: 1.9105 - classification_loss: 0.4299 177/500 [=========>....................] - ETA: 1:17 - loss: 2.3379 - regression_loss: 1.9082 - classification_loss: 0.4297 178/500 [=========>....................] - ETA: 1:17 - loss: 2.3368 - regression_loss: 1.9081 - classification_loss: 0.4287 179/500 [=========>....................] - ETA: 1:17 - loss: 2.3395 - regression_loss: 1.9096 - classification_loss: 0.4299 180/500 [=========>....................] - ETA: 1:17 - loss: 2.3390 - regression_loss: 1.9087 - classification_loss: 0.4303 181/500 [=========>....................] - ETA: 1:16 - loss: 2.3368 - regression_loss: 1.9073 - classification_loss: 0.4295 182/500 [=========>....................] - ETA: 1:16 - loss: 2.3353 - regression_loss: 1.9066 - classification_loss: 0.4287 183/500 [=========>....................] - ETA: 1:16 - loss: 2.3378 - regression_loss: 1.9092 - classification_loss: 0.4286 184/500 [==========>...................] - ETA: 1:16 - loss: 2.3365 - regression_loss: 1.9083 - classification_loss: 0.4283 185/500 [==========>...................] - ETA: 1:16 - loss: 2.3379 - regression_loss: 1.9097 - classification_loss: 0.4283 186/500 [==========>...................] - ETA: 1:15 - loss: 2.3357 - regression_loss: 1.9080 - classification_loss: 0.4276 187/500 [==========>...................] - ETA: 1:15 - loss: 2.3399 - regression_loss: 1.9117 - classification_loss: 0.4282 188/500 [==========>...................] - ETA: 1:15 - loss: 2.3394 - regression_loss: 1.9111 - classification_loss: 0.4283 189/500 [==========>...................] - ETA: 1:15 - loss: 2.3382 - regression_loss: 1.9101 - classification_loss: 0.4281 190/500 [==========>...................] 
- ETA: 1:14 - loss: 2.3385 - regression_loss: 1.9100 - classification_loss: 0.4285 191/500 [==========>...................] - ETA: 1:14 - loss: 2.3424 - regression_loss: 1.9134 - classification_loss: 0.4290 192/500 [==========>...................] - ETA: 1:14 - loss: 2.3411 - regression_loss: 1.9126 - classification_loss: 0.4285 193/500 [==========>...................] - ETA: 1:14 - loss: 2.3463 - regression_loss: 1.9162 - classification_loss: 0.4301 194/500 [==========>...................] - ETA: 1:14 - loss: 2.3490 - regression_loss: 1.9185 - classification_loss: 0.4305 195/500 [==========>...................] - ETA: 1:13 - loss: 2.3476 - regression_loss: 1.9174 - classification_loss: 0.4302 196/500 [==========>...................] - ETA: 1:13 - loss: 2.3452 - regression_loss: 1.9160 - classification_loss: 0.4292 197/500 [==========>...................] - ETA: 1:13 - loss: 2.3493 - regression_loss: 1.9198 - classification_loss: 0.4295 198/500 [==========>...................] - ETA: 1:13 - loss: 2.3504 - regression_loss: 1.9210 - classification_loss: 0.4293 199/500 [==========>...................] - ETA: 1:12 - loss: 2.3515 - regression_loss: 1.9218 - classification_loss: 0.4297 200/500 [===========>..................] - ETA: 1:12 - loss: 2.3511 - regression_loss: 1.9216 - classification_loss: 0.4295 201/500 [===========>..................] - ETA: 1:12 - loss: 2.3504 - regression_loss: 1.9210 - classification_loss: 0.4294 202/500 [===========>..................] - ETA: 1:12 - loss: 2.3505 - regression_loss: 1.9212 - classification_loss: 0.4294 203/500 [===========>..................] - ETA: 1:11 - loss: 2.3489 - regression_loss: 1.9198 - classification_loss: 0.4291 204/500 [===========>..................] - ETA: 1:11 - loss: 2.3438 - regression_loss: 1.9159 - classification_loss: 0.4279 205/500 [===========>..................] - ETA: 1:11 - loss: 2.3438 - regression_loss: 1.9164 - classification_loss: 0.4274 206/500 [===========>..................] 
- ETA: 1:11 - loss: 2.3428 - regression_loss: 1.9153 - classification_loss: 0.4275 207/500 [===========>..................] - ETA: 1:11 - loss: 2.3407 - regression_loss: 1.9137 - classification_loss: 0.4270 208/500 [===========>..................] - ETA: 1:10 - loss: 2.3337 - regression_loss: 1.9076 - classification_loss: 0.4262 209/500 [===========>..................] - ETA: 1:10 - loss: 2.3333 - regression_loss: 1.9074 - classification_loss: 0.4259 210/500 [===========>..................] - ETA: 1:10 - loss: 2.3356 - regression_loss: 1.9092 - classification_loss: 0.4264 211/500 [===========>..................] - ETA: 1:10 - loss: 2.3352 - regression_loss: 1.9091 - classification_loss: 0.4261 212/500 [===========>..................] - ETA: 1:09 - loss: 2.3336 - regression_loss: 1.9077 - classification_loss: 0.4259 213/500 [===========>..................] - ETA: 1:09 - loss: 2.3316 - regression_loss: 1.9063 - classification_loss: 0.4252 214/500 [===========>..................] - ETA: 1:09 - loss: 2.3304 - regression_loss: 1.9053 - classification_loss: 0.4251 215/500 [===========>..................] - ETA: 1:09 - loss: 2.3306 - regression_loss: 1.9056 - classification_loss: 0.4251 216/500 [===========>..................] - ETA: 1:08 - loss: 2.3326 - regression_loss: 1.9074 - classification_loss: 0.4252 217/500 [============>.................] - ETA: 1:08 - loss: 2.3342 - regression_loss: 1.9088 - classification_loss: 0.4255 218/500 [============>.................] - ETA: 1:08 - loss: 2.3363 - regression_loss: 1.9102 - classification_loss: 0.4261 219/500 [============>.................] - ETA: 1:08 - loss: 2.3395 - regression_loss: 1.9129 - classification_loss: 0.4266 220/500 [============>.................] - ETA: 1:07 - loss: 2.3365 - regression_loss: 1.9108 - classification_loss: 0.4256 221/500 [============>.................] - ETA: 1:07 - loss: 2.3386 - regression_loss: 1.9126 - classification_loss: 0.4260 222/500 [============>.................] 
- ETA: 1:07 - loss: 2.3468 - regression_loss: 1.9133 - classification_loss: 0.4335 223/500 [============>.................] - ETA: 1:07 - loss: 2.3623 - regression_loss: 1.9047 - classification_loss: 0.4576 224/500 [============>.................] - ETA: 1:07 - loss: 2.3622 - regression_loss: 1.9046 - classification_loss: 0.4575 225/500 [============>.................] - ETA: 1:06 - loss: 2.3583 - regression_loss: 1.9019 - classification_loss: 0.4564 226/500 [============>.................] - ETA: 1:06 - loss: 2.3545 - regression_loss: 1.8992 - classification_loss: 0.4553 227/500 [============>.................] - ETA: 1:06 - loss: 2.3583 - regression_loss: 1.9024 - classification_loss: 0.4559 228/500 [============>.................] - ETA: 1:06 - loss: 2.3601 - regression_loss: 1.9050 - classification_loss: 0.4552 229/500 [============>.................] - ETA: 1:05 - loss: 2.3574 - regression_loss: 1.9032 - classification_loss: 0.4542 230/500 [============>.................] - ETA: 1:05 - loss: 2.3558 - regression_loss: 1.9019 - classification_loss: 0.4539 231/500 [============>.................] - ETA: 1:05 - loss: 2.3554 - regression_loss: 1.9018 - classification_loss: 0.4536 232/500 [============>.................] - ETA: 1:05 - loss: 2.3574 - regression_loss: 1.9034 - classification_loss: 0.4540 233/500 [============>.................] - ETA: 1:04 - loss: 2.3556 - regression_loss: 1.9019 - classification_loss: 0.4537 234/500 [=============>................] - ETA: 1:04 - loss: 2.3561 - regression_loss: 1.9020 - classification_loss: 0.4541 235/500 [=============>................] - ETA: 1:04 - loss: 2.3584 - regression_loss: 1.9043 - classification_loss: 0.4541 236/500 [=============>................] - ETA: 1:04 - loss: 2.3570 - regression_loss: 1.9036 - classification_loss: 0.4533 237/500 [=============>................] - ETA: 1:04 - loss: 2.3564 - regression_loss: 1.9037 - classification_loss: 0.4527 238/500 [=============>................] 
- ETA: 1:03 - loss: 2.3562 - regression_loss: 1.9037 - classification_loss: 0.4525 239/500 [=============>................] - ETA: 1:03 - loss: 2.3526 - regression_loss: 1.9009 - classification_loss: 0.4518 240/500 [=============>................] - ETA: 1:03 - loss: 2.3525 - regression_loss: 1.9011 - classification_loss: 0.4514 241/500 [=============>................] - ETA: 1:03 - loss: 2.3502 - regression_loss: 1.8995 - classification_loss: 0.4507 242/500 [=============>................] - ETA: 1:02 - loss: 2.3496 - regression_loss: 1.8992 - classification_loss: 0.4504 243/500 [=============>................] - ETA: 1:02 - loss: 2.3518 - regression_loss: 1.9013 - classification_loss: 0.4505 244/500 [=============>................] - ETA: 1:02 - loss: 2.3534 - regression_loss: 1.9025 - classification_loss: 0.4510 245/500 [=============>................] - ETA: 1:02 - loss: 2.3534 - regression_loss: 1.9027 - classification_loss: 0.4506 246/500 [=============>................] - ETA: 1:01 - loss: 2.3520 - regression_loss: 1.9020 - classification_loss: 0.4500 247/500 [=============>................] - ETA: 1:01 - loss: 2.3519 - regression_loss: 1.9015 - classification_loss: 0.4505 248/500 [=============>................] - ETA: 1:01 - loss: 2.3521 - regression_loss: 1.9023 - classification_loss: 0.4498 249/500 [=============>................] - ETA: 1:01 - loss: 2.3511 - regression_loss: 1.9016 - classification_loss: 0.4495 250/500 [==============>...............] - ETA: 1:00 - loss: 2.3493 - regression_loss: 1.9000 - classification_loss: 0.4493 251/500 [==============>...............] - ETA: 1:00 - loss: 2.3488 - regression_loss: 1.8993 - classification_loss: 0.4495 252/500 [==============>...............] - ETA: 1:00 - loss: 2.3488 - regression_loss: 1.8988 - classification_loss: 0.4500 253/500 [==============>...............] - ETA: 1:00 - loss: 2.3467 - regression_loss: 1.8974 - classification_loss: 0.4493 254/500 [==============>...............] 
- ETA: 59s - loss: 2.3458 - regression_loss: 1.8967 - classification_loss: 0.4491  255/500 [==============>...............] - ETA: 59s - loss: 2.3479 - regression_loss: 1.8983 - classification_loss: 0.4496 256/500 [==============>...............] - ETA: 59s - loss: 2.3482 - regression_loss: 1.8993 - classification_loss: 0.4489 257/500 [==============>...............] - ETA: 59s - loss: 2.3464 - regression_loss: 1.8984 - classification_loss: 0.4480 258/500 [==============>...............] - ETA: 58s - loss: 2.3454 - regression_loss: 1.8975 - classification_loss: 0.4479 259/500 [==============>...............] - ETA: 58s - loss: 2.3443 - regression_loss: 1.8965 - classification_loss: 0.4479 260/500 [==============>...............] - ETA: 58s - loss: 2.3447 - regression_loss: 1.8962 - classification_loss: 0.4485 261/500 [==============>...............] - ETA: 58s - loss: 2.3411 - regression_loss: 1.8934 - classification_loss: 0.4477 262/500 [==============>...............] - ETA: 58s - loss: 2.3410 - regression_loss: 1.8936 - classification_loss: 0.4474 263/500 [==============>...............] - ETA: 57s - loss: 2.3425 - regression_loss: 1.8952 - classification_loss: 0.4473 264/500 [==============>...............] - ETA: 57s - loss: 2.3439 - regression_loss: 1.8963 - classification_loss: 0.4476 265/500 [==============>...............] - ETA: 57s - loss: 2.3413 - regression_loss: 1.8943 - classification_loss: 0.4469 266/500 [==============>...............] - ETA: 57s - loss: 2.3408 - regression_loss: 1.8942 - classification_loss: 0.4466 267/500 [===============>..............] - ETA: 56s - loss: 2.3411 - regression_loss: 1.8948 - classification_loss: 0.4463 268/500 [===============>..............] - ETA: 56s - loss: 2.3412 - regression_loss: 1.8949 - classification_loss: 0.4464 269/500 [===============>..............] - ETA: 56s - loss: 2.3405 - regression_loss: 1.8944 - classification_loss: 0.4461 270/500 [===============>..............] 
- ETA: 56s - loss: 2.3405 - regression_loss: 1.8945 - classification_loss: 0.4460 271/500 [===============>..............] - ETA: 55s - loss: 2.3438 - regression_loss: 1.8974 - classification_loss: 0.4464 272/500 [===============>..............] - ETA: 55s - loss: 2.3451 - regression_loss: 1.8982 - classification_loss: 0.4469 273/500 [===============>..............] - ETA: 55s - loss: 2.3441 - regression_loss: 1.8978 - classification_loss: 0.4463 274/500 [===============>..............] - ETA: 55s - loss: 2.3453 - regression_loss: 1.8987 - classification_loss: 0.4466 275/500 [===============>..............] - ETA: 54s - loss: 2.3466 - regression_loss: 1.8998 - classification_loss: 0.4468 276/500 [===============>..............] - ETA: 54s - loss: 2.3427 - regression_loss: 1.8968 - classification_loss: 0.4459 277/500 [===============>..............] - ETA: 54s - loss: 2.3408 - regression_loss: 1.8957 - classification_loss: 0.4451 278/500 [===============>..............] - ETA: 54s - loss: 2.3460 - regression_loss: 1.9003 - classification_loss: 0.4457 279/500 [===============>..............] - ETA: 54s - loss: 2.3457 - regression_loss: 1.9004 - classification_loss: 0.4453 280/500 [===============>..............] - ETA: 53s - loss: 2.3421 - regression_loss: 1.8978 - classification_loss: 0.4443 281/500 [===============>..............] - ETA: 53s - loss: 2.3403 - regression_loss: 1.8966 - classification_loss: 0.4438 282/500 [===============>..............] - ETA: 53s - loss: 2.3395 - regression_loss: 1.8956 - classification_loss: 0.4438 283/500 [===============>..............] - ETA: 53s - loss: 2.3408 - regression_loss: 1.8964 - classification_loss: 0.4444 284/500 [================>.............] - ETA: 52s - loss: 2.3412 - regression_loss: 1.8963 - classification_loss: 0.4449 285/500 [================>.............] - ETA: 52s - loss: 2.3386 - regression_loss: 1.8943 - classification_loss: 0.4442 286/500 [================>.............] 
[... per-batch progress updates for epoch 10 (batches 287-494) elided; running loss drifted from ~2.34 down to ~2.32, with regression_loss ~1.90 and classification_loss ~0.43 ...]
[... final per-batch updates for epoch 10 (batches 495-499) elided ...]
500/500 [==============================] - 122s 245ms/step - loss: 2.3244 - regression_loss: 1.8976 - classification_loss: 0.4268
326 instances of class plum with average precision: 0.5720
mAP: 0.5720
Epoch 00010: saving model to ./training/snapshots/resnet50_pascal_10.h5
Epoch 11/150
[... per-batch progress updates for epoch 11 (batches 1-9) elided; running loss ~2.24-2.46 over the first few batches ...]
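Note on reading these lines: the `loss` column reported each step is the sum of the two sub-losses that follow it (keras-retinanet trains a two-output model, a box-regression head and a classification head, with default loss weights of 1). A minimal sanity check against the epoch-10 summary above:

```python
# Values copied from the "500/500" summary line for epoch 10 above.
regression_loss = 1.8976
classification_loss = 0.4268

# Total loss is the plain sum of the per-head losses.
total = regression_loss + classification_loss

# Allow a small tolerance for the rounding in the printed values.
assert abs(total - 2.3244) < 1e-3
print(f"total loss = {total:.4f}")
```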
[... per-batch progress updates for epoch 11 (batches 10-121) elided; log truncated mid-epoch with running loss ~2.29 ...]
- ETA: 1:34 - loss: 2.2937 - regression_loss: 1.8822 - classification_loss: 0.4115 122/500 [======>.......................] - ETA: 1:34 - loss: 2.2918 - regression_loss: 1.8809 - classification_loss: 0.4108 123/500 [======>.......................] - ETA: 1:33 - loss: 2.2910 - regression_loss: 1.8809 - classification_loss: 0.4101 124/500 [======>.......................] - ETA: 1:33 - loss: 2.2797 - regression_loss: 1.8657 - classification_loss: 0.4140 125/500 [======>.......................] - ETA: 1:33 - loss: 2.2795 - regression_loss: 1.8652 - classification_loss: 0.4143 126/500 [======>.......................] - ETA: 1:33 - loss: 2.2874 - regression_loss: 1.8727 - classification_loss: 0.4146 127/500 [======>.......................] - ETA: 1:32 - loss: 2.2875 - regression_loss: 1.8732 - classification_loss: 0.4143 128/500 [======>.......................] - ETA: 1:32 - loss: 2.2857 - regression_loss: 1.8718 - classification_loss: 0.4139 129/500 [======>.......................] - ETA: 1:32 - loss: 2.2858 - regression_loss: 1.8721 - classification_loss: 0.4137 130/500 [======>.......................] - ETA: 1:32 - loss: 2.2882 - regression_loss: 1.8737 - classification_loss: 0.4145 131/500 [======>.......................] - ETA: 1:31 - loss: 2.2895 - regression_loss: 1.8745 - classification_loss: 0.4150 132/500 [======>.......................] - ETA: 1:31 - loss: 2.2873 - regression_loss: 1.8722 - classification_loss: 0.4151 133/500 [======>.......................] - ETA: 1:31 - loss: 2.2798 - regression_loss: 1.8664 - classification_loss: 0.4133 134/500 [=======>......................] - ETA: 1:31 - loss: 2.2706 - regression_loss: 1.8591 - classification_loss: 0.4114 135/500 [=======>......................] - ETA: 1:30 - loss: 2.2640 - regression_loss: 1.8541 - classification_loss: 0.4099 136/500 [=======>......................] - ETA: 1:30 - loss: 2.2593 - regression_loss: 1.8511 - classification_loss: 0.4082 137/500 [=======>......................] 
- ETA: 1:30 - loss: 2.2495 - regression_loss: 1.8376 - classification_loss: 0.4120 138/500 [=======>......................] - ETA: 1:30 - loss: 2.2506 - regression_loss: 1.8388 - classification_loss: 0.4118 139/500 [=======>......................] - ETA: 1:29 - loss: 2.2557 - regression_loss: 1.8431 - classification_loss: 0.4126 140/500 [=======>......................] - ETA: 1:29 - loss: 2.2588 - regression_loss: 1.8449 - classification_loss: 0.4140 141/500 [=======>......................] - ETA: 1:29 - loss: 2.2566 - regression_loss: 1.8433 - classification_loss: 0.4133 142/500 [=======>......................] - ETA: 1:29 - loss: 2.2571 - regression_loss: 1.8438 - classification_loss: 0.4133 143/500 [=======>......................] - ETA: 1:29 - loss: 2.2578 - regression_loss: 1.8442 - classification_loss: 0.4136 144/500 [=======>......................] - ETA: 1:28 - loss: 2.2610 - regression_loss: 1.8469 - classification_loss: 0.4141 145/500 [=======>......................] - ETA: 1:28 - loss: 2.2649 - regression_loss: 1.8497 - classification_loss: 0.4151 146/500 [=======>......................] - ETA: 1:28 - loss: 2.2639 - regression_loss: 1.8485 - classification_loss: 0.4153 147/500 [=======>......................] - ETA: 1:28 - loss: 2.2651 - regression_loss: 1.8502 - classification_loss: 0.4148 148/500 [=======>......................] - ETA: 1:27 - loss: 2.2703 - regression_loss: 1.8554 - classification_loss: 0.4149 149/500 [=======>......................] - ETA: 1:27 - loss: 2.2668 - regression_loss: 1.8530 - classification_loss: 0.4138 150/500 [========>.....................] - ETA: 1:27 - loss: 2.2612 - regression_loss: 1.8491 - classification_loss: 0.4122 151/500 [========>.....................] - ETA: 1:27 - loss: 2.2637 - regression_loss: 1.8510 - classification_loss: 0.4127 152/500 [========>.....................] - ETA: 1:26 - loss: 2.2663 - regression_loss: 1.8527 - classification_loss: 0.4136 153/500 [========>.....................] 
- ETA: 1:26 - loss: 2.2639 - regression_loss: 1.8513 - classification_loss: 0.4126 154/500 [========>.....................] - ETA: 1:26 - loss: 2.2592 - regression_loss: 1.8472 - classification_loss: 0.4119 155/500 [========>.....................] - ETA: 1:26 - loss: 2.2555 - regression_loss: 1.8448 - classification_loss: 0.4107 156/500 [========>.....................] - ETA: 1:25 - loss: 2.2583 - regression_loss: 1.8472 - classification_loss: 0.4111 157/500 [========>.....................] - ETA: 1:25 - loss: 2.2591 - regression_loss: 1.8479 - classification_loss: 0.4112 158/500 [========>.....................] - ETA: 1:25 - loss: 2.2526 - regression_loss: 1.8427 - classification_loss: 0.4099 159/500 [========>.....................] - ETA: 1:25 - loss: 2.2520 - regression_loss: 1.8427 - classification_loss: 0.4094 160/500 [========>.....................] - ETA: 1:24 - loss: 2.2526 - regression_loss: 1.8430 - classification_loss: 0.4096 161/500 [========>.....................] - ETA: 1:24 - loss: 2.2550 - regression_loss: 1.8442 - classification_loss: 0.4108 162/500 [========>.....................] - ETA: 1:24 - loss: 2.2573 - regression_loss: 1.8456 - classification_loss: 0.4117 163/500 [========>.....................] - ETA: 1:24 - loss: 2.2587 - regression_loss: 1.8477 - classification_loss: 0.4110 164/500 [========>.....................] - ETA: 1:23 - loss: 2.2555 - regression_loss: 1.8450 - classification_loss: 0.4105 165/500 [========>.....................] - ETA: 1:23 - loss: 2.2568 - regression_loss: 1.8456 - classification_loss: 0.4112 166/500 [========>.....................] - ETA: 1:23 - loss: 2.2632 - regression_loss: 1.8501 - classification_loss: 0.4131 167/500 [=========>....................] - ETA: 1:23 - loss: 2.2680 - regression_loss: 1.8536 - classification_loss: 0.4144 168/500 [=========>....................] - ETA: 1:22 - loss: 2.2664 - regression_loss: 1.8524 - classification_loss: 0.4140 169/500 [=========>....................] 
- ETA: 1:22 - loss: 2.2654 - regression_loss: 1.8515 - classification_loss: 0.4139 170/500 [=========>....................] - ETA: 1:22 - loss: 2.2680 - regression_loss: 1.8538 - classification_loss: 0.4141 171/500 [=========>....................] - ETA: 1:22 - loss: 2.2661 - regression_loss: 1.8523 - classification_loss: 0.4138 172/500 [=========>....................] - ETA: 1:21 - loss: 2.2655 - regression_loss: 1.8523 - classification_loss: 0.4133 173/500 [=========>....................] - ETA: 1:21 - loss: 2.2645 - regression_loss: 1.8516 - classification_loss: 0.4130 174/500 [=========>....................] - ETA: 1:21 - loss: 2.2663 - regression_loss: 1.8527 - classification_loss: 0.4136 175/500 [=========>....................] - ETA: 1:21 - loss: 2.2709 - regression_loss: 1.8566 - classification_loss: 0.4143 176/500 [=========>....................] - ETA: 1:20 - loss: 2.2718 - regression_loss: 1.8573 - classification_loss: 0.4145 177/500 [=========>....................] - ETA: 1:20 - loss: 2.2718 - regression_loss: 1.8577 - classification_loss: 0.4141 178/500 [=========>....................] - ETA: 1:20 - loss: 2.2720 - regression_loss: 1.8577 - classification_loss: 0.4143 179/500 [=========>....................] - ETA: 1:20 - loss: 2.2734 - regression_loss: 1.8588 - classification_loss: 0.4146 180/500 [=========>....................] - ETA: 1:19 - loss: 2.2690 - regression_loss: 1.8552 - classification_loss: 0.4138 181/500 [=========>....................] - ETA: 1:19 - loss: 2.2698 - regression_loss: 1.8553 - classification_loss: 0.4146 182/500 [=========>....................] - ETA: 1:19 - loss: 2.2712 - regression_loss: 1.8556 - classification_loss: 0.4156 183/500 [=========>....................] - ETA: 1:19 - loss: 2.2744 - regression_loss: 1.8579 - classification_loss: 0.4165 184/500 [==========>...................] - ETA: 1:18 - loss: 2.2749 - regression_loss: 1.8585 - classification_loss: 0.4164 185/500 [==========>...................] 
- ETA: 1:18 - loss: 2.2748 - regression_loss: 1.8583 - classification_loss: 0.4165 186/500 [==========>...................] - ETA: 1:18 - loss: 2.2730 - regression_loss: 1.8573 - classification_loss: 0.4158 187/500 [==========>...................] - ETA: 1:18 - loss: 2.2689 - regression_loss: 1.8543 - classification_loss: 0.4146 188/500 [==========>...................] - ETA: 1:17 - loss: 2.2733 - regression_loss: 1.8577 - classification_loss: 0.4155 189/500 [==========>...................] - ETA: 1:17 - loss: 2.2712 - regression_loss: 1.8566 - classification_loss: 0.4146 190/500 [==========>...................] - ETA: 1:17 - loss: 2.2742 - regression_loss: 1.8586 - classification_loss: 0.4156 191/500 [==========>...................] - ETA: 1:17 - loss: 2.2759 - regression_loss: 1.8603 - classification_loss: 0.4156 192/500 [==========>...................] - ETA: 1:16 - loss: 2.2776 - regression_loss: 1.8605 - classification_loss: 0.4171 193/500 [==========>...................] - ETA: 1:16 - loss: 2.2747 - regression_loss: 1.8587 - classification_loss: 0.4160 194/500 [==========>...................] - ETA: 1:16 - loss: 2.2736 - regression_loss: 1.8579 - classification_loss: 0.4157 195/500 [==========>...................] - ETA: 1:16 - loss: 2.2760 - regression_loss: 1.8591 - classification_loss: 0.4169 196/500 [==========>...................] - ETA: 1:15 - loss: 2.2778 - regression_loss: 1.8606 - classification_loss: 0.4172 197/500 [==========>...................] - ETA: 1:15 - loss: 2.2815 - regression_loss: 1.8636 - classification_loss: 0.4179 198/500 [==========>...................] - ETA: 1:15 - loss: 2.2762 - regression_loss: 1.8542 - classification_loss: 0.4220 199/500 [==========>...................] - ETA: 1:15 - loss: 2.2756 - regression_loss: 1.8538 - classification_loss: 0.4218 200/500 [===========>..................] - ETA: 1:14 - loss: 2.2753 - regression_loss: 1.8537 - classification_loss: 0.4216 201/500 [===========>..................] 
- ETA: 1:14 - loss: 2.2748 - regression_loss: 1.8534 - classification_loss: 0.4214 202/500 [===========>..................] - ETA: 1:14 - loss: 2.2740 - regression_loss: 1.8530 - classification_loss: 0.4210 203/500 [===========>..................] - ETA: 1:14 - loss: 2.2712 - regression_loss: 1.8513 - classification_loss: 0.4199 204/500 [===========>..................] - ETA: 1:13 - loss: 2.2713 - regression_loss: 1.8517 - classification_loss: 0.4196 205/500 [===========>..................] - ETA: 1:13 - loss: 2.2723 - regression_loss: 1.8527 - classification_loss: 0.4196 206/500 [===========>..................] - ETA: 1:13 - loss: 2.2708 - regression_loss: 1.8519 - classification_loss: 0.4189 207/500 [===========>..................] - ETA: 1:13 - loss: 2.2697 - regression_loss: 1.8511 - classification_loss: 0.4185 208/500 [===========>..................] - ETA: 1:12 - loss: 2.2690 - regression_loss: 1.8505 - classification_loss: 0.4186 209/500 [===========>..................] - ETA: 1:12 - loss: 2.2685 - regression_loss: 1.8497 - classification_loss: 0.4188 210/500 [===========>..................] - ETA: 1:12 - loss: 2.2714 - regression_loss: 1.8518 - classification_loss: 0.4196 211/500 [===========>..................] - ETA: 1:12 - loss: 2.2702 - regression_loss: 1.8511 - classification_loss: 0.4192 212/500 [===========>..................] - ETA: 1:11 - loss: 2.2695 - regression_loss: 1.8500 - classification_loss: 0.4195 213/500 [===========>..................] - ETA: 1:11 - loss: 2.2685 - regression_loss: 1.8499 - classification_loss: 0.4186 214/500 [===========>..................] - ETA: 1:11 - loss: 2.2636 - regression_loss: 1.8454 - classification_loss: 0.4182 215/500 [===========>..................] - ETA: 1:11 - loss: 2.2668 - regression_loss: 1.8480 - classification_loss: 0.4187 216/500 [===========>..................] - ETA: 1:10 - loss: 2.2648 - regression_loss: 1.8460 - classification_loss: 0.4188 217/500 [============>.................] 
- ETA: 1:10 - loss: 2.2592 - regression_loss: 1.8417 - classification_loss: 0.4175 218/500 [============>.................] - ETA: 1:10 - loss: 2.2608 - regression_loss: 1.8434 - classification_loss: 0.4174 219/500 [============>.................] - ETA: 1:10 - loss: 2.2611 - regression_loss: 1.8442 - classification_loss: 0.4168 220/500 [============>.................] - ETA: 1:09 - loss: 2.2595 - regression_loss: 1.8437 - classification_loss: 0.4158 221/500 [============>.................] - ETA: 1:09 - loss: 2.2586 - regression_loss: 1.8428 - classification_loss: 0.4158 222/500 [============>.................] - ETA: 1:09 - loss: 2.2599 - regression_loss: 1.8437 - classification_loss: 0.4162 223/500 [============>.................] - ETA: 1:09 - loss: 2.2600 - regression_loss: 1.8441 - classification_loss: 0.4159 224/500 [============>.................] - ETA: 1:08 - loss: 2.2599 - regression_loss: 1.8444 - classification_loss: 0.4155 225/500 [============>.................] - ETA: 1:08 - loss: 2.2592 - regression_loss: 1.8436 - classification_loss: 0.4156 226/500 [============>.................] - ETA: 1:08 - loss: 2.2578 - regression_loss: 1.8424 - classification_loss: 0.4153 227/500 [============>.................] - ETA: 1:08 - loss: 2.2536 - regression_loss: 1.8393 - classification_loss: 0.4142 228/500 [============>.................] - ETA: 1:07 - loss: 2.2562 - regression_loss: 1.8410 - classification_loss: 0.4152 229/500 [============>.................] - ETA: 1:07 - loss: 2.2530 - regression_loss: 1.8385 - classification_loss: 0.4145 230/500 [============>.................] - ETA: 1:07 - loss: 2.2562 - regression_loss: 1.8411 - classification_loss: 0.4151 231/500 [============>.................] - ETA: 1:07 - loss: 2.2558 - regression_loss: 1.8413 - classification_loss: 0.4146 232/500 [============>.................] - ETA: 1:06 - loss: 2.2579 - regression_loss: 1.8432 - classification_loss: 0.4146 233/500 [============>.................] 
- ETA: 1:06 - loss: 2.2576 - regression_loss: 1.8439 - classification_loss: 0.4137 234/500 [=============>................] - ETA: 1:06 - loss: 2.2593 - regression_loss: 1.8455 - classification_loss: 0.4138 235/500 [=============>................] - ETA: 1:06 - loss: 2.2587 - regression_loss: 1.8451 - classification_loss: 0.4136 236/500 [=============>................] - ETA: 1:05 - loss: 2.2576 - regression_loss: 1.8447 - classification_loss: 0.4129 237/500 [=============>................] - ETA: 1:05 - loss: 2.2587 - regression_loss: 1.8452 - classification_loss: 0.4135 238/500 [=============>................] - ETA: 1:05 - loss: 2.2627 - regression_loss: 1.8480 - classification_loss: 0.4148 239/500 [=============>................] - ETA: 1:05 - loss: 2.2628 - regression_loss: 1.8478 - classification_loss: 0.4149 240/500 [=============>................] - ETA: 1:04 - loss: 2.2629 - regression_loss: 1.8483 - classification_loss: 0.4146 241/500 [=============>................] - ETA: 1:04 - loss: 2.2607 - regression_loss: 1.8467 - classification_loss: 0.4140 242/500 [=============>................] - ETA: 1:04 - loss: 2.2664 - regression_loss: 1.8509 - classification_loss: 0.4155 243/500 [=============>................] - ETA: 1:04 - loss: 2.2666 - regression_loss: 1.8512 - classification_loss: 0.4154 244/500 [=============>................] - ETA: 1:03 - loss: 2.2649 - regression_loss: 1.8498 - classification_loss: 0.4151 245/500 [=============>................] - ETA: 1:03 - loss: 2.2647 - regression_loss: 1.8500 - classification_loss: 0.4146 246/500 [=============>................] - ETA: 1:03 - loss: 2.2650 - regression_loss: 1.8500 - classification_loss: 0.4150 247/500 [=============>................] - ETA: 1:03 - loss: 2.2667 - regression_loss: 1.8507 - classification_loss: 0.4159 248/500 [=============>................] - ETA: 1:02 - loss: 2.2683 - regression_loss: 1.8524 - classification_loss: 0.4159 249/500 [=============>................] 
- ETA: 1:02 - loss: 2.2686 - regression_loss: 1.8527 - classification_loss: 0.4159 250/500 [==============>...............] - ETA: 1:02 - loss: 2.2736 - regression_loss: 1.8569 - classification_loss: 0.4168 251/500 [==============>...............] - ETA: 1:02 - loss: 2.2720 - regression_loss: 1.8555 - classification_loss: 0.4165 252/500 [==============>...............] - ETA: 1:01 - loss: 2.2720 - regression_loss: 1.8556 - classification_loss: 0.4164 253/500 [==============>...............] - ETA: 1:01 - loss: 2.2661 - regression_loss: 1.8508 - classification_loss: 0.4153 254/500 [==============>...............] - ETA: 1:01 - loss: 2.2672 - regression_loss: 1.8517 - classification_loss: 0.4155 255/500 [==============>...............] - ETA: 1:01 - loss: 2.2678 - regression_loss: 1.8523 - classification_loss: 0.4155 256/500 [==============>...............] - ETA: 1:00 - loss: 2.2677 - regression_loss: 1.8525 - classification_loss: 0.4152 257/500 [==============>...............] - ETA: 1:00 - loss: 2.2705 - regression_loss: 1.8453 - classification_loss: 0.4252 258/500 [==============>...............] - ETA: 1:00 - loss: 2.2675 - regression_loss: 1.8431 - classification_loss: 0.4245 259/500 [==============>...............] - ETA: 1:00 - loss: 2.2675 - regression_loss: 1.8430 - classification_loss: 0.4246 260/500 [==============>...............] - ETA: 59s - loss: 2.2664 - regression_loss: 1.8425 - classification_loss: 0.4239  261/500 [==============>...............] - ETA: 59s - loss: 2.2645 - regression_loss: 1.8410 - classification_loss: 0.4235 262/500 [==============>...............] - ETA: 59s - loss: 2.2664 - regression_loss: 1.8427 - classification_loss: 0.4236 263/500 [==============>...............] - ETA: 59s - loss: 2.2666 - regression_loss: 1.8428 - classification_loss: 0.4238 264/500 [==============>...............] - ETA: 58s - loss: 2.2655 - regression_loss: 1.8418 - classification_loss: 0.4237 265/500 [==============>...............] 
- ETA: 58s - loss: 2.2663 - regression_loss: 1.8422 - classification_loss: 0.4241 266/500 [==============>...............] - ETA: 58s - loss: 2.2646 - regression_loss: 1.8409 - classification_loss: 0.4237 267/500 [===============>..............] - ETA: 58s - loss: 2.2623 - regression_loss: 1.8391 - classification_loss: 0.4232 268/500 [===============>..............] - ETA: 57s - loss: 2.2646 - regression_loss: 1.8412 - classification_loss: 0.4234 269/500 [===============>..............] - ETA: 57s - loss: 2.2681 - regression_loss: 1.8446 - classification_loss: 0.4236 270/500 [===============>..............] - ETA: 57s - loss: 2.2693 - regression_loss: 1.8454 - classification_loss: 0.4239 271/500 [===============>..............] - ETA: 57s - loss: 2.2698 - regression_loss: 1.8459 - classification_loss: 0.4239 272/500 [===============>..............] - ETA: 56s - loss: 2.2677 - regression_loss: 1.8444 - classification_loss: 0.4233 273/500 [===============>..............] - ETA: 56s - loss: 2.2629 - regression_loss: 1.8407 - classification_loss: 0.4222 274/500 [===============>..............] - ETA: 56s - loss: 2.2642 - regression_loss: 1.8410 - classification_loss: 0.4232 275/500 [===============>..............] - ETA: 56s - loss: 2.2641 - regression_loss: 1.8412 - classification_loss: 0.4229 276/500 [===============>..............] - ETA: 55s - loss: 2.2655 - regression_loss: 1.8426 - classification_loss: 0.4228 277/500 [===============>..............] - ETA: 55s - loss: 2.2651 - regression_loss: 1.8426 - classification_loss: 0.4225 278/500 [===============>..............] - ETA: 55s - loss: 2.2641 - regression_loss: 1.8417 - classification_loss: 0.4223 279/500 [===============>..............] - ETA: 55s - loss: 2.2604 - regression_loss: 1.8389 - classification_loss: 0.4214 280/500 [===============>..............] - ETA: 54s - loss: 2.2581 - regression_loss: 1.8374 - classification_loss: 0.4207 281/500 [===============>..............] 
- ETA: 54s - loss: 2.2590 - regression_loss: 1.8381 - classification_loss: 0.4208 282/500 [===============>..............] - ETA: 54s - loss: 2.2579 - regression_loss: 1.8372 - classification_loss: 0.4207 283/500 [===============>..............] - ETA: 54s - loss: 2.2590 - regression_loss: 1.8383 - classification_loss: 0.4207 284/500 [================>.............] - ETA: 53s - loss: 2.2586 - regression_loss: 1.8381 - classification_loss: 0.4205 285/500 [================>.............] - ETA: 53s - loss: 2.2580 - regression_loss: 1.8381 - classification_loss: 0.4199 286/500 [================>.............] - ETA: 53s - loss: 2.2594 - regression_loss: 1.8391 - classification_loss: 0.4203 287/500 [================>.............] - ETA: 53s - loss: 2.2601 - regression_loss: 1.8402 - classification_loss: 0.4199 288/500 [================>.............] - ETA: 52s - loss: 2.2593 - regression_loss: 1.8397 - classification_loss: 0.4195 289/500 [================>.............] - ETA: 52s - loss: 2.2567 - regression_loss: 1.8380 - classification_loss: 0.4187 290/500 [================>.............] - ETA: 52s - loss: 2.2626 - regression_loss: 1.8434 - classification_loss: 0.4192 291/500 [================>.............] - ETA: 52s - loss: 2.2625 - regression_loss: 1.8435 - classification_loss: 0.4191 292/500 [================>.............] - ETA: 51s - loss: 2.2644 - regression_loss: 1.8449 - classification_loss: 0.4194 293/500 [================>.............] - ETA: 51s - loss: 2.2632 - regression_loss: 1.8443 - classification_loss: 0.4189 294/500 [================>.............] - ETA: 51s - loss: 2.2635 - regression_loss: 1.8446 - classification_loss: 0.4189 295/500 [================>.............] - ETA: 51s - loss: 2.2626 - regression_loss: 1.8441 - classification_loss: 0.4186 296/500 [================>.............] - ETA: 50s - loss: 2.2634 - regression_loss: 1.8447 - classification_loss: 0.4187 297/500 [================>.............] 
- ETA: 50s - loss: 2.2649 - regression_loss: 1.8459 - classification_loss: 0.4190 298/500 [================>.............] - ETA: 50s - loss: 2.2653 - regression_loss: 1.8461 - classification_loss: 0.4191 299/500 [================>.............] - ETA: 50s - loss: 2.2657 - regression_loss: 1.8468 - classification_loss: 0.4189 300/500 [=================>............] - ETA: 49s - loss: 2.2642 - regression_loss: 1.8460 - classification_loss: 0.4182 301/500 [=================>............] - ETA: 49s - loss: 2.2637 - regression_loss: 1.8456 - classification_loss: 0.4180 302/500 [=================>............] - ETA: 49s - loss: 2.2620 - regression_loss: 1.8447 - classification_loss: 0.4174 303/500 [=================>............] - ETA: 49s - loss: 2.2613 - regression_loss: 1.8443 - classification_loss: 0.4170 304/500 [=================>............] - ETA: 48s - loss: 2.2629 - regression_loss: 1.8450 - classification_loss: 0.4178 305/500 [=================>............] - ETA: 48s - loss: 2.2641 - regression_loss: 1.8463 - classification_loss: 0.4177 306/500 [=================>............] - ETA: 48s - loss: 2.2598 - regression_loss: 1.8431 - classification_loss: 0.4167 307/500 [=================>............] - ETA: 48s - loss: 2.2600 - regression_loss: 1.8434 - classification_loss: 0.4166 308/500 [=================>............] - ETA: 47s - loss: 2.2593 - regression_loss: 1.8432 - classification_loss: 0.4161 309/500 [=================>............] - ETA: 47s - loss: 2.2573 - regression_loss: 1.8419 - classification_loss: 0.4155 310/500 [=================>............] - ETA: 47s - loss: 2.2582 - regression_loss: 1.8426 - classification_loss: 0.4156 311/500 [=================>............] - ETA: 47s - loss: 2.2567 - regression_loss: 1.8415 - classification_loss: 0.4152 312/500 [=================>............] - ETA: 46s - loss: 2.2545 - regression_loss: 1.8401 - classification_loss: 0.4144 313/500 [=================>............] 
- ETA: 46s - loss: 2.2530 - regression_loss: 1.8392 - classification_loss: 0.4137 314/500 [=================>............] - ETA: 46s - loss: 2.2557 - regression_loss: 1.8410 - classification_loss: 0.4147 315/500 [=================>............] - ETA: 46s - loss: 2.2552 - regression_loss: 1.8407 - classification_loss: 0.4145 316/500 [=================>............] - ETA: 45s - loss: 2.2548 - regression_loss: 1.8403 - classification_loss: 0.4145 317/500 [==================>...........] - ETA: 45s - loss: 2.2572 - regression_loss: 1.8426 - classification_loss: 0.4146 318/500 [==================>...........] - ETA: 45s - loss: 2.2586 - regression_loss: 1.8435 - classification_loss: 0.4151 319/500 [==================>...........] - ETA: 45s - loss: 2.2566 - regression_loss: 1.8423 - classification_loss: 0.4144 320/500 [==================>...........] - ETA: 44s - loss: 2.2565 - regression_loss: 1.8424 - classification_loss: 0.4141 321/500 [==================>...........] - ETA: 44s - loss: 2.2527 - regression_loss: 1.8394 - classification_loss: 0.4133 322/500 [==================>...........] - ETA: 44s - loss: 2.2536 - regression_loss: 1.8403 - classification_loss: 0.4133 323/500 [==================>...........] - ETA: 44s - loss: 2.2550 - regression_loss: 1.8413 - classification_loss: 0.4137 324/500 [==================>...........] - ETA: 43s - loss: 2.2524 - regression_loss: 1.8393 - classification_loss: 0.4131 325/500 [==================>...........] - ETA: 43s - loss: 2.2505 - regression_loss: 1.8380 - classification_loss: 0.4125 326/500 [==================>...........] - ETA: 43s - loss: 2.2511 - regression_loss: 1.8386 - classification_loss: 0.4125 327/500 [==================>...........] - ETA: 43s - loss: 2.2497 - regression_loss: 1.8378 - classification_loss: 0.4120 328/500 [==================>...........] - ETA: 42s - loss: 2.2484 - regression_loss: 1.8369 - classification_loss: 0.4115 329/500 [==================>...........] 
[... per-step progress-bar output for epoch 11, steps 330-499, trimmed ...]
500/500 [==============================] - 125s 250ms/step - loss: 2.2843 - regression_loss: 1.8661 - classification_loss: 0.4182
326 instances of class plum with average precision: 0.6319
mAP: 0.6319
Epoch 00011: saving model to ./training/snapshots/resnet50_pascal_11.h5
Epoch 12/150
[... per-step progress-bar output for epoch 12, steps 1-164, trimmed ...]
- ETA: 1:23 - loss: 2.1854 - regression_loss: 1.8097 - classification_loss: 0.3757 165/500 [========>.....................] - ETA: 1:23 - loss: 2.1872 - regression_loss: 1.8111 - classification_loss: 0.3761 166/500 [========>.....................] - ETA: 1:22 - loss: 2.1893 - regression_loss: 1.8122 - classification_loss: 0.3771 167/500 [=========>....................] - ETA: 1:22 - loss: 2.1947 - regression_loss: 1.8157 - classification_loss: 0.3790 168/500 [=========>....................] - ETA: 1:22 - loss: 2.1936 - regression_loss: 1.8151 - classification_loss: 0.3786 169/500 [=========>....................] - ETA: 1:22 - loss: 2.1951 - regression_loss: 1.8160 - classification_loss: 0.3791 170/500 [=========>....................] - ETA: 1:21 - loss: 2.1992 - regression_loss: 1.8189 - classification_loss: 0.3803 171/500 [=========>....................] - ETA: 1:21 - loss: 2.2040 - regression_loss: 1.8223 - classification_loss: 0.3818 172/500 [=========>....................] - ETA: 1:21 - loss: 2.2043 - regression_loss: 1.8228 - classification_loss: 0.3814 173/500 [=========>....................] - ETA: 1:21 - loss: 2.1988 - regression_loss: 1.8171 - classification_loss: 0.3817 174/500 [=========>....................] - ETA: 1:20 - loss: 2.1961 - regression_loss: 1.8153 - classification_loss: 0.3808 175/500 [=========>....................] - ETA: 1:20 - loss: 2.1962 - regression_loss: 1.8151 - classification_loss: 0.3811 176/500 [=========>....................] - ETA: 1:20 - loss: 2.1973 - regression_loss: 1.8162 - classification_loss: 0.3811 177/500 [=========>....................] - ETA: 1:20 - loss: 2.1970 - regression_loss: 1.8164 - classification_loss: 0.3806 178/500 [=========>....................] - ETA: 1:19 - loss: 2.1959 - regression_loss: 1.8150 - classification_loss: 0.3809 179/500 [=========>....................] - ETA: 1:19 - loss: 2.1998 - regression_loss: 1.8163 - classification_loss: 0.3835 180/500 [=========>....................] 
- ETA: 1:19 - loss: 2.1984 - regression_loss: 1.8154 - classification_loss: 0.3829 181/500 [=========>....................] - ETA: 1:19 - loss: 2.1989 - regression_loss: 1.8148 - classification_loss: 0.3840 182/500 [=========>....................] - ETA: 1:18 - loss: 2.2014 - regression_loss: 1.8164 - classification_loss: 0.3849 183/500 [=========>....................] - ETA: 1:18 - loss: 2.2026 - regression_loss: 1.8172 - classification_loss: 0.3854 184/500 [==========>...................] - ETA: 1:18 - loss: 2.2035 - regression_loss: 1.8179 - classification_loss: 0.3856 185/500 [==========>...................] - ETA: 1:18 - loss: 2.2044 - regression_loss: 1.8182 - classification_loss: 0.3862 186/500 [==========>...................] - ETA: 1:17 - loss: 2.2068 - regression_loss: 1.8211 - classification_loss: 0.3857 187/500 [==========>...................] - ETA: 1:17 - loss: 2.2057 - regression_loss: 1.8203 - classification_loss: 0.3854 188/500 [==========>...................] - ETA: 1:17 - loss: 2.2049 - regression_loss: 1.8199 - classification_loss: 0.3851 189/500 [==========>...................] - ETA: 1:17 - loss: 2.2143 - regression_loss: 1.8213 - classification_loss: 0.3930 190/500 [==========>...................] - ETA: 1:16 - loss: 2.2142 - regression_loss: 1.8215 - classification_loss: 0.3927 191/500 [==========>...................] - ETA: 1:16 - loss: 2.2177 - regression_loss: 1.8249 - classification_loss: 0.3928 192/500 [==========>...................] - ETA: 1:16 - loss: 2.2168 - regression_loss: 1.8238 - classification_loss: 0.3930 193/500 [==========>...................] - ETA: 1:16 - loss: 2.2162 - regression_loss: 1.8238 - classification_loss: 0.3923 194/500 [==========>...................] - ETA: 1:15 - loss: 2.2146 - regression_loss: 1.8230 - classification_loss: 0.3916 195/500 [==========>...................] - ETA: 1:15 - loss: 2.2173 - regression_loss: 1.8249 - classification_loss: 0.3924 196/500 [==========>...................] 
- ETA: 1:15 - loss: 2.2184 - regression_loss: 1.8265 - classification_loss: 0.3918 197/500 [==========>...................] - ETA: 1:15 - loss: 2.2174 - regression_loss: 1.8260 - classification_loss: 0.3914 198/500 [==========>...................] - ETA: 1:14 - loss: 2.2176 - regression_loss: 1.8257 - classification_loss: 0.3919 199/500 [==========>...................] - ETA: 1:14 - loss: 2.2179 - regression_loss: 1.8261 - classification_loss: 0.3918 200/500 [===========>..................] - ETA: 1:14 - loss: 2.2157 - regression_loss: 1.8244 - classification_loss: 0.3912 201/500 [===========>..................] - ETA: 1:14 - loss: 2.2118 - regression_loss: 1.8215 - classification_loss: 0.3903 202/500 [===========>..................] - ETA: 1:13 - loss: 2.2148 - regression_loss: 1.8231 - classification_loss: 0.3917 203/500 [===========>..................] - ETA: 1:13 - loss: 2.2130 - regression_loss: 1.8220 - classification_loss: 0.3910 204/500 [===========>..................] - ETA: 1:13 - loss: 2.2147 - regression_loss: 1.8233 - classification_loss: 0.3914 205/500 [===========>..................] - ETA: 1:13 - loss: 2.2126 - regression_loss: 1.8220 - classification_loss: 0.3906 206/500 [===========>..................] - ETA: 1:12 - loss: 2.2136 - regression_loss: 1.8222 - classification_loss: 0.3915 207/500 [===========>..................] - ETA: 1:12 - loss: 2.2157 - regression_loss: 1.8235 - classification_loss: 0.3922 208/500 [===========>..................] - ETA: 1:12 - loss: 2.2124 - regression_loss: 1.8211 - classification_loss: 0.3913 209/500 [===========>..................] - ETA: 1:12 - loss: 2.2184 - regression_loss: 1.8249 - classification_loss: 0.3935 210/500 [===========>..................] - ETA: 1:11 - loss: 2.2226 - regression_loss: 1.8266 - classification_loss: 0.3961 211/500 [===========>..................] - ETA: 1:11 - loss: 2.2216 - regression_loss: 1.8261 - classification_loss: 0.3956 212/500 [===========>..................] 
- ETA: 1:11 - loss: 2.2256 - regression_loss: 1.8290 - classification_loss: 0.3967 213/500 [===========>..................] - ETA: 1:11 - loss: 2.2273 - regression_loss: 1.8302 - classification_loss: 0.3971 214/500 [===========>..................] - ETA: 1:11 - loss: 2.2302 - regression_loss: 1.8323 - classification_loss: 0.3980 215/500 [===========>..................] - ETA: 1:10 - loss: 2.2296 - regression_loss: 1.8319 - classification_loss: 0.3977 216/500 [===========>..................] - ETA: 1:10 - loss: 2.2297 - regression_loss: 1.8320 - classification_loss: 0.3977 217/500 [============>.................] - ETA: 1:10 - loss: 2.2316 - regression_loss: 1.8331 - classification_loss: 0.3985 218/500 [============>.................] - ETA: 1:10 - loss: 2.2330 - regression_loss: 1.8343 - classification_loss: 0.3988 219/500 [============>.................] - ETA: 1:09 - loss: 2.2301 - regression_loss: 1.8322 - classification_loss: 0.3978 220/500 [============>.................] - ETA: 1:09 - loss: 2.2311 - regression_loss: 1.8335 - classification_loss: 0.3976 221/500 [============>.................] - ETA: 1:09 - loss: 2.2313 - regression_loss: 1.8334 - classification_loss: 0.3979 222/500 [============>.................] - ETA: 1:09 - loss: 2.2277 - regression_loss: 1.8303 - classification_loss: 0.3974 223/500 [============>.................] - ETA: 1:08 - loss: 2.2258 - regression_loss: 1.8288 - classification_loss: 0.3970 224/500 [============>.................] - ETA: 1:08 - loss: 2.2244 - regression_loss: 1.8278 - classification_loss: 0.3966 225/500 [============>.................] - ETA: 1:08 - loss: 2.2220 - regression_loss: 1.8259 - classification_loss: 0.3961 226/500 [============>.................] - ETA: 1:08 - loss: 2.2196 - regression_loss: 1.8241 - classification_loss: 0.3954 227/500 [============>.................] - ETA: 1:07 - loss: 2.2194 - regression_loss: 1.8244 - classification_loss: 0.3950 228/500 [============>.................] 
- ETA: 1:07 - loss: 2.2235 - regression_loss: 1.8277 - classification_loss: 0.3958 229/500 [============>.................] - ETA: 1:07 - loss: 2.2224 - regression_loss: 1.8268 - classification_loss: 0.3956 230/500 [============>.................] - ETA: 1:07 - loss: 2.2214 - regression_loss: 1.8262 - classification_loss: 0.3953 231/500 [============>.................] - ETA: 1:06 - loss: 2.2214 - regression_loss: 1.8260 - classification_loss: 0.3954 232/500 [============>.................] - ETA: 1:06 - loss: 2.2218 - regression_loss: 1.8265 - classification_loss: 0.3954 233/500 [============>.................] - ETA: 1:06 - loss: 2.2201 - regression_loss: 1.8250 - classification_loss: 0.3951 234/500 [=============>................] - ETA: 1:06 - loss: 2.2198 - regression_loss: 1.8247 - classification_loss: 0.3951 235/500 [=============>................] - ETA: 1:05 - loss: 2.2178 - regression_loss: 1.8232 - classification_loss: 0.3946 236/500 [=============>................] - ETA: 1:05 - loss: 2.2197 - regression_loss: 1.8246 - classification_loss: 0.3951 237/500 [=============>................] - ETA: 1:05 - loss: 2.2188 - regression_loss: 1.8239 - classification_loss: 0.3949 238/500 [=============>................] - ETA: 1:05 - loss: 2.2166 - regression_loss: 1.8222 - classification_loss: 0.3944 239/500 [=============>................] - ETA: 1:04 - loss: 2.2191 - regression_loss: 1.8242 - classification_loss: 0.3948 240/500 [=============>................] - ETA: 1:04 - loss: 2.2199 - regression_loss: 1.8247 - classification_loss: 0.3952 241/500 [=============>................] - ETA: 1:04 - loss: 2.2202 - regression_loss: 1.8251 - classification_loss: 0.3951 242/500 [=============>................] - ETA: 1:04 - loss: 2.2199 - regression_loss: 1.8251 - classification_loss: 0.3948 243/500 [=============>................] - ETA: 1:03 - loss: 2.2239 - regression_loss: 1.8289 - classification_loss: 0.3950 244/500 [=============>................] 
- ETA: 1:03 - loss: 2.2244 - regression_loss: 1.8290 - classification_loss: 0.3954 245/500 [=============>................] - ETA: 1:03 - loss: 2.2261 - regression_loss: 1.8302 - classification_loss: 0.3958 246/500 [=============>................] - ETA: 1:03 - loss: 2.2266 - regression_loss: 1.8309 - classification_loss: 0.3957 247/500 [=============>................] - ETA: 1:02 - loss: 2.2261 - regression_loss: 1.8310 - classification_loss: 0.3951 248/500 [=============>................] - ETA: 1:02 - loss: 2.2278 - regression_loss: 1.8323 - classification_loss: 0.3955 249/500 [=============>................] - ETA: 1:02 - loss: 2.2267 - regression_loss: 1.8318 - classification_loss: 0.3949 250/500 [==============>...............] - ETA: 1:02 - loss: 2.2334 - regression_loss: 1.8369 - classification_loss: 0.3965 251/500 [==============>...............] - ETA: 1:01 - loss: 2.2303 - regression_loss: 1.8345 - classification_loss: 0.3958 252/500 [==============>...............] - ETA: 1:01 - loss: 2.2295 - regression_loss: 1.8340 - classification_loss: 0.3955 253/500 [==============>...............] - ETA: 1:01 - loss: 2.2315 - regression_loss: 1.8355 - classification_loss: 0.3961 254/500 [==============>...............] - ETA: 1:01 - loss: 2.2322 - regression_loss: 1.8361 - classification_loss: 0.3962 255/500 [==============>...............] - ETA: 1:00 - loss: 2.2279 - regression_loss: 1.8327 - classification_loss: 0.3952 256/500 [==============>...............] - ETA: 1:00 - loss: 2.2265 - regression_loss: 1.8317 - classification_loss: 0.3948 257/500 [==============>...............] - ETA: 1:00 - loss: 2.2299 - regression_loss: 1.8345 - classification_loss: 0.3954 258/500 [==============>...............] - ETA: 1:00 - loss: 2.2315 - regression_loss: 1.8355 - classification_loss: 0.3960 259/500 [==============>...............] - ETA: 59s - loss: 2.2296 - regression_loss: 1.8343 - classification_loss: 0.3954  260/500 [==============>...............] 
- ETA: 59s - loss: 2.2299 - regression_loss: 1.8345 - classification_loss: 0.3955 261/500 [==============>...............] - ETA: 59s - loss: 2.2308 - regression_loss: 1.8349 - classification_loss: 0.3959 262/500 [==============>...............] - ETA: 59s - loss: 2.2309 - regression_loss: 1.8349 - classification_loss: 0.3959 263/500 [==============>...............] - ETA: 58s - loss: 2.2325 - regression_loss: 1.8363 - classification_loss: 0.3961 264/500 [==============>...............] - ETA: 58s - loss: 2.2358 - regression_loss: 1.8385 - classification_loss: 0.3973 265/500 [==============>...............] - ETA: 58s - loss: 2.2357 - regression_loss: 1.8385 - classification_loss: 0.3971 266/500 [==============>...............] - ETA: 58s - loss: 2.2351 - regression_loss: 1.8381 - classification_loss: 0.3971 267/500 [===============>..............] - ETA: 57s - loss: 2.2362 - regression_loss: 1.8391 - classification_loss: 0.3971 268/500 [===============>..............] - ETA: 57s - loss: 2.2371 - regression_loss: 1.8388 - classification_loss: 0.3983 269/500 [===============>..............] - ETA: 57s - loss: 2.2350 - regression_loss: 1.8369 - classification_loss: 0.3982 270/500 [===============>..............] - ETA: 57s - loss: 2.2355 - regression_loss: 1.8370 - classification_loss: 0.3985 271/500 [===============>..............] - ETA: 56s - loss: 2.2336 - regression_loss: 1.8358 - classification_loss: 0.3979 272/500 [===============>..............] - ETA: 56s - loss: 2.2324 - regression_loss: 1.8351 - classification_loss: 0.3973 273/500 [===============>..............] - ETA: 56s - loss: 2.2308 - regression_loss: 1.8339 - classification_loss: 0.3969 274/500 [===============>..............] - ETA: 56s - loss: 2.2305 - regression_loss: 1.8340 - classification_loss: 0.3965 275/500 [===============>..............] - ETA: 55s - loss: 2.2311 - regression_loss: 1.8345 - classification_loss: 0.3966 276/500 [===============>..............] 
- ETA: 55s - loss: 2.2315 - regression_loss: 1.8344 - classification_loss: 0.3971 277/500 [===============>..............] - ETA: 55s - loss: 2.2321 - regression_loss: 1.8347 - classification_loss: 0.3974 278/500 [===============>..............] - ETA: 55s - loss: 2.2335 - regression_loss: 1.8355 - classification_loss: 0.3979 279/500 [===============>..............] - ETA: 54s - loss: 2.2293 - regression_loss: 1.8324 - classification_loss: 0.3969 280/500 [===============>..............] - ETA: 54s - loss: 2.2288 - regression_loss: 1.8322 - classification_loss: 0.3966 281/500 [===============>..............] - ETA: 54s - loss: 2.2278 - regression_loss: 1.8315 - classification_loss: 0.3963 282/500 [===============>..............] - ETA: 54s - loss: 2.2299 - regression_loss: 1.8330 - classification_loss: 0.3969 283/500 [===============>..............] - ETA: 53s - loss: 2.2314 - regression_loss: 1.8342 - classification_loss: 0.3971 284/500 [================>.............] - ETA: 53s - loss: 2.2284 - regression_loss: 1.8320 - classification_loss: 0.3964 285/500 [================>.............] - ETA: 53s - loss: 2.2297 - regression_loss: 1.8328 - classification_loss: 0.3969 286/500 [================>.............] - ETA: 53s - loss: 2.2311 - regression_loss: 1.8337 - classification_loss: 0.3974 287/500 [================>.............] - ETA: 53s - loss: 2.2311 - regression_loss: 1.8339 - classification_loss: 0.3972 288/500 [================>.............] - ETA: 52s - loss: 2.2324 - regression_loss: 1.8351 - classification_loss: 0.3973 289/500 [================>.............] - ETA: 52s - loss: 2.2381 - regression_loss: 1.8399 - classification_loss: 0.3981 290/500 [================>.............] - ETA: 52s - loss: 2.2376 - regression_loss: 1.8396 - classification_loss: 0.3979 291/500 [================>.............] - ETA: 52s - loss: 2.2378 - regression_loss: 1.8394 - classification_loss: 0.3984 292/500 [================>.............] 
- ETA: 51s - loss: 2.2366 - regression_loss: 1.8385 - classification_loss: 0.3981 293/500 [================>.............] - ETA: 51s - loss: 2.2364 - regression_loss: 1.8384 - classification_loss: 0.3980 294/500 [================>.............] - ETA: 51s - loss: 2.2366 - regression_loss: 1.8380 - classification_loss: 0.3986 295/500 [================>.............] - ETA: 51s - loss: 2.2533 - regression_loss: 1.8318 - classification_loss: 0.4215 296/500 [================>.............] - ETA: 50s - loss: 2.2546 - regression_loss: 1.8324 - classification_loss: 0.4221 297/500 [================>.............] - ETA: 50s - loss: 2.2551 - regression_loss: 1.8330 - classification_loss: 0.4221 298/500 [================>.............] - ETA: 50s - loss: 2.2540 - regression_loss: 1.8323 - classification_loss: 0.4216 299/500 [================>.............] - ETA: 50s - loss: 2.2545 - regression_loss: 1.8325 - classification_loss: 0.4220 300/500 [=================>............] - ETA: 49s - loss: 2.2538 - regression_loss: 1.8320 - classification_loss: 0.4218 301/500 [=================>............] - ETA: 49s - loss: 2.2533 - regression_loss: 1.8316 - classification_loss: 0.4217 302/500 [=================>............] - ETA: 49s - loss: 2.2532 - regression_loss: 1.8317 - classification_loss: 0.4215 303/500 [=================>............] - ETA: 49s - loss: 2.2564 - regression_loss: 1.8340 - classification_loss: 0.4224 304/500 [=================>............] - ETA: 48s - loss: 2.2551 - regression_loss: 1.8332 - classification_loss: 0.4219 305/500 [=================>............] - ETA: 48s - loss: 2.2555 - regression_loss: 1.8338 - classification_loss: 0.4218 306/500 [=================>............] - ETA: 48s - loss: 2.2544 - regression_loss: 1.8329 - classification_loss: 0.4215 307/500 [=================>............] - ETA: 48s - loss: 2.2523 - regression_loss: 1.8310 - classification_loss: 0.4213 308/500 [=================>............] 
- ETA: 47s - loss: 2.2540 - regression_loss: 1.8326 - classification_loss: 0.4214 309/500 [=================>............] - ETA: 47s - loss: 2.2578 - regression_loss: 1.8349 - classification_loss: 0.4229 310/500 [=================>............] - ETA: 47s - loss: 2.2566 - regression_loss: 1.8339 - classification_loss: 0.4227 311/500 [=================>............] - ETA: 46s - loss: 2.2575 - regression_loss: 1.8346 - classification_loss: 0.4229 312/500 [=================>............] - ETA: 46s - loss: 2.2564 - regression_loss: 1.8340 - classification_loss: 0.4224 313/500 [=================>............] - ETA: 46s - loss: 2.2552 - regression_loss: 1.8333 - classification_loss: 0.4219 314/500 [=================>............] - ETA: 46s - loss: 2.2552 - regression_loss: 1.8334 - classification_loss: 0.4218 315/500 [=================>............] - ETA: 45s - loss: 2.2550 - regression_loss: 1.8334 - classification_loss: 0.4216 316/500 [=================>............] - ETA: 45s - loss: 2.2537 - regression_loss: 1.8327 - classification_loss: 0.4210 317/500 [==================>...........] - ETA: 45s - loss: 2.2521 - regression_loss: 1.8316 - classification_loss: 0.4206 318/500 [==================>...........] - ETA: 45s - loss: 2.2558 - regression_loss: 1.8346 - classification_loss: 0.4212 319/500 [==================>...........] - ETA: 44s - loss: 2.2511 - regression_loss: 1.8310 - classification_loss: 0.4202 320/500 [==================>...........] - ETA: 44s - loss: 2.2487 - regression_loss: 1.8291 - classification_loss: 0.4196 321/500 [==================>...........] - ETA: 44s - loss: 2.2479 - regression_loss: 1.8285 - classification_loss: 0.4194 322/500 [==================>...........] - ETA: 44s - loss: 2.2453 - regression_loss: 1.8266 - classification_loss: 0.4186 323/500 [==================>...........] - ETA: 43s - loss: 2.2430 - regression_loss: 1.8251 - classification_loss: 0.4179 324/500 [==================>...........] 
- ETA: 43s - loss: 2.2417 - regression_loss: 1.8242 - classification_loss: 0.4175 325/500 [==================>...........] - ETA: 43s - loss: 2.2469 - regression_loss: 1.8240 - classification_loss: 0.4229 326/500 [==================>...........] - ETA: 43s - loss: 2.2499 - regression_loss: 1.8265 - classification_loss: 0.4233 327/500 [==================>...........] - ETA: 43s - loss: 2.2505 - regression_loss: 1.8272 - classification_loss: 0.4233 328/500 [==================>...........] - ETA: 42s - loss: 2.2491 - regression_loss: 1.8264 - classification_loss: 0.4228 329/500 [==================>...........] - ETA: 42s - loss: 2.2484 - regression_loss: 1.8259 - classification_loss: 0.4225 330/500 [==================>...........] - ETA: 42s - loss: 2.2447 - regression_loss: 1.8231 - classification_loss: 0.4216 331/500 [==================>...........] - ETA: 42s - loss: 2.2413 - regression_loss: 1.8205 - classification_loss: 0.4208 332/500 [==================>...........] - ETA: 41s - loss: 2.2395 - regression_loss: 1.8192 - classification_loss: 0.4203 333/500 [==================>...........] - ETA: 41s - loss: 2.2396 - regression_loss: 1.8191 - classification_loss: 0.4205 334/500 [===================>..........] - ETA: 41s - loss: 2.2383 - regression_loss: 1.8183 - classification_loss: 0.4200 335/500 [===================>..........] - ETA: 41s - loss: 2.2375 - regression_loss: 1.8180 - classification_loss: 0.4195 336/500 [===================>..........] - ETA: 40s - loss: 2.2355 - regression_loss: 1.8167 - classification_loss: 0.4188 337/500 [===================>..........] - ETA: 40s - loss: 2.2352 - regression_loss: 1.8167 - classification_loss: 0.4185 338/500 [===================>..........] - ETA: 40s - loss: 2.2318 - regression_loss: 1.8139 - classification_loss: 0.4179 339/500 [===================>..........] - ETA: 40s - loss: 2.2326 - regression_loss: 1.8144 - classification_loss: 0.4182 340/500 [===================>..........] 
- ETA: 39s - loss: 2.2353 - regression_loss: 1.8165 - classification_loss: 0.4188 341/500 [===================>..........] - ETA: 39s - loss: 2.2372 - regression_loss: 1.8180 - classification_loss: 0.4192 342/500 [===================>..........] - ETA: 39s - loss: 2.2358 - regression_loss: 1.8171 - classification_loss: 0.4187 343/500 [===================>..........] - ETA: 39s - loss: 2.2369 - regression_loss: 1.8178 - classification_loss: 0.4191 344/500 [===================>..........] - ETA: 38s - loss: 2.2375 - regression_loss: 1.8187 - classification_loss: 0.4188 345/500 [===================>..........] - ETA: 38s - loss: 2.2371 - regression_loss: 1.8186 - classification_loss: 0.4185 346/500 [===================>..........] - ETA: 38s - loss: 2.2334 - regression_loss: 1.8157 - classification_loss: 0.4177 347/500 [===================>..........] - ETA: 38s - loss: 2.2325 - regression_loss: 1.8151 - classification_loss: 0.4174 348/500 [===================>..........] - ETA: 37s - loss: 2.2322 - regression_loss: 1.8154 - classification_loss: 0.4168 349/500 [===================>..........] - ETA: 37s - loss: 2.2327 - regression_loss: 1.8158 - classification_loss: 0.4169 350/500 [====================>.........] - ETA: 37s - loss: 2.2329 - regression_loss: 1.8161 - classification_loss: 0.4167 351/500 [====================>.........] - ETA: 37s - loss: 2.2329 - regression_loss: 1.8158 - classification_loss: 0.4171 352/500 [====================>.........] - ETA: 36s - loss: 2.2317 - regression_loss: 1.8149 - classification_loss: 0.4168 353/500 [====================>.........] - ETA: 36s - loss: 2.2265 - regression_loss: 1.8097 - classification_loss: 0.4168 354/500 [====================>.........] - ETA: 36s - loss: 2.2279 - regression_loss: 1.8110 - classification_loss: 0.4169 355/500 [====================>.........] - ETA: 36s - loss: 2.2260 - regression_loss: 1.8095 - classification_loss: 0.4166 356/500 [====================>.........] 
- ETA: 35s - loss: 2.2263 - regression_loss: 1.8097 - classification_loss: 0.4166 357/500 [====================>.........] - ETA: 35s - loss: 2.2268 - regression_loss: 1.8099 - classification_loss: 0.4169 358/500 [====================>.........] - ETA: 35s - loss: 2.2295 - regression_loss: 1.8120 - classification_loss: 0.4175 359/500 [====================>.........] - ETA: 35s - loss: 2.2288 - regression_loss: 1.8120 - classification_loss: 0.4168 360/500 [====================>.........] - ETA: 34s - loss: 2.2288 - regression_loss: 1.8117 - classification_loss: 0.4171 361/500 [====================>.........] - ETA: 34s - loss: 2.2327 - regression_loss: 1.8154 - classification_loss: 0.4172 362/500 [====================>.........] - ETA: 34s - loss: 2.2320 - regression_loss: 1.8151 - classification_loss: 0.4169 363/500 [====================>.........] - ETA: 34s - loss: 2.2296 - regression_loss: 1.8135 - classification_loss: 0.4161 364/500 [====================>.........] - ETA: 33s - loss: 2.2295 - regression_loss: 1.8137 - classification_loss: 0.4158 365/500 [====================>.........] - ETA: 33s - loss: 2.2301 - regression_loss: 1.8144 - classification_loss: 0.4157 366/500 [====================>.........] - ETA: 33s - loss: 2.2312 - regression_loss: 1.8156 - classification_loss: 0.4156 367/500 [=====================>........] - ETA: 33s - loss: 2.2308 - regression_loss: 1.8151 - classification_loss: 0.4157 368/500 [=====================>........] - ETA: 32s - loss: 2.2298 - regression_loss: 1.8144 - classification_loss: 0.4154 369/500 [=====================>........] - ETA: 32s - loss: 2.2291 - regression_loss: 1.8141 - classification_loss: 0.4150 370/500 [=====================>........] - ETA: 32s - loss: 2.2297 - regression_loss: 1.8149 - classification_loss: 0.4148 371/500 [=====================>........] - ETA: 32s - loss: 2.2264 - regression_loss: 1.8122 - classification_loss: 0.4142 372/500 [=====================>........] 
[per-step progress for steps 373–499 of epoch 12 elided; loss fluctuated between ~2.23 and ~2.25 (regression_loss ~1.81–1.84, classification_loss ~0.41)]
500/500 [==============================] - 125s 249ms/step - loss: 2.2508 - regression_loss: 1.8368 - classification_loss: 0.4139
326 instances of class plum with average precision: 0.6046
mAP: 0.6046
Epoch 00012: saving model to ./training/snapshots/resnet50_pascal_12.h5
Epoch 13/150
[per-step progress for steps 1–14 elided; loss settled near 2.18 after early fluctuation]
[per-step progress for steps 15–206 of epoch 13 elided; loss ranged between ~2.15 and ~2.28 (regression_loss ~1.77–1.86, classification_loss ~0.38–0.43), ETA counting down from 2:03 to 1:14]
- ETA: 1:14 - loss: 2.2561 - regression_loss: 1.8406 - classification_loss: 0.4155 207/500 [===========>..................] - ETA: 1:13 - loss: 2.2565 - regression_loss: 1.8412 - classification_loss: 0.4153 208/500 [===========>..................] - ETA: 1:13 - loss: 2.2608 - regression_loss: 1.8452 - classification_loss: 0.4156 209/500 [===========>..................] - ETA: 1:13 - loss: 2.2616 - regression_loss: 1.8456 - classification_loss: 0.4160 210/500 [===========>..................] - ETA: 1:13 - loss: 2.2637 - regression_loss: 1.8474 - classification_loss: 0.4163 211/500 [===========>..................] - ETA: 1:12 - loss: 2.2642 - regression_loss: 1.8481 - classification_loss: 0.4161 212/500 [===========>..................] - ETA: 1:12 - loss: 2.2647 - regression_loss: 1.8489 - classification_loss: 0.4158 213/500 [===========>..................] - ETA: 1:12 - loss: 2.2637 - regression_loss: 1.8478 - classification_loss: 0.4160 214/500 [===========>..................] - ETA: 1:12 - loss: 2.2634 - regression_loss: 1.8477 - classification_loss: 0.4158 215/500 [===========>..................] - ETA: 1:11 - loss: 2.2638 - regression_loss: 1.8476 - classification_loss: 0.4162 216/500 [===========>..................] - ETA: 1:11 - loss: 2.2638 - regression_loss: 1.8480 - classification_loss: 0.4158 217/500 [============>.................] - ETA: 1:11 - loss: 2.2598 - regression_loss: 1.8447 - classification_loss: 0.4151 218/500 [============>.................] - ETA: 1:11 - loss: 2.2617 - regression_loss: 1.8463 - classification_loss: 0.4154 219/500 [============>.................] - ETA: 1:10 - loss: 2.2590 - regression_loss: 1.8445 - classification_loss: 0.4144 220/500 [============>.................] - ETA: 1:10 - loss: 2.2606 - regression_loss: 1.8459 - classification_loss: 0.4147 221/500 [============>.................] - ETA: 1:10 - loss: 2.2638 - regression_loss: 1.8486 - classification_loss: 0.4152 222/500 [============>.................] 
- ETA: 1:10 - loss: 2.2637 - regression_loss: 1.8482 - classification_loss: 0.4154 223/500 [============>.................] - ETA: 1:09 - loss: 2.2631 - regression_loss: 1.8481 - classification_loss: 0.4149 224/500 [============>.................] - ETA: 1:09 - loss: 2.2628 - regression_loss: 1.8480 - classification_loss: 0.4149 225/500 [============>.................] - ETA: 1:09 - loss: 2.2610 - regression_loss: 1.8468 - classification_loss: 0.4142 226/500 [============>.................] - ETA: 1:09 - loss: 2.2611 - regression_loss: 1.8472 - classification_loss: 0.4139 227/500 [============>.................] - ETA: 1:08 - loss: 2.2619 - regression_loss: 1.8479 - classification_loss: 0.4140 228/500 [============>.................] - ETA: 1:08 - loss: 2.2611 - regression_loss: 1.8470 - classification_loss: 0.4141 229/500 [============>.................] - ETA: 1:08 - loss: 2.2570 - regression_loss: 1.8440 - classification_loss: 0.4131 230/500 [============>.................] - ETA: 1:08 - loss: 2.2578 - regression_loss: 1.8450 - classification_loss: 0.4128 231/500 [============>.................] - ETA: 1:07 - loss: 2.2602 - regression_loss: 1.8468 - classification_loss: 0.4134 232/500 [============>.................] - ETA: 1:07 - loss: 2.2609 - regression_loss: 1.8476 - classification_loss: 0.4133 233/500 [============>.................] - ETA: 1:07 - loss: 2.2622 - regression_loss: 1.8488 - classification_loss: 0.4134 234/500 [=============>................] - ETA: 1:07 - loss: 2.2610 - regression_loss: 1.8481 - classification_loss: 0.4129 235/500 [=============>................] - ETA: 1:06 - loss: 2.2608 - regression_loss: 1.8482 - classification_loss: 0.4126 236/500 [=============>................] - ETA: 1:06 - loss: 2.2616 - regression_loss: 1.8487 - classification_loss: 0.4129 237/500 [=============>................] - ETA: 1:06 - loss: 2.2632 - regression_loss: 1.8496 - classification_loss: 0.4136 238/500 [=============>................] 
- ETA: 1:06 - loss: 2.2656 - regression_loss: 1.8516 - classification_loss: 0.4140 239/500 [=============>................] - ETA: 1:05 - loss: 2.2650 - regression_loss: 1.8509 - classification_loss: 0.4141 240/500 [=============>................] - ETA: 1:05 - loss: 2.2689 - regression_loss: 1.8533 - classification_loss: 0.4156 241/500 [=============>................] - ETA: 1:05 - loss: 2.2673 - regression_loss: 1.8522 - classification_loss: 0.4151 242/500 [=============>................] - ETA: 1:05 - loss: 2.2677 - regression_loss: 1.8529 - classification_loss: 0.4148 243/500 [=============>................] - ETA: 1:04 - loss: 2.2697 - regression_loss: 1.8549 - classification_loss: 0.4148 244/500 [=============>................] - ETA: 1:04 - loss: 2.2684 - regression_loss: 1.8539 - classification_loss: 0.4145 245/500 [=============>................] - ETA: 1:04 - loss: 2.2667 - regression_loss: 1.8527 - classification_loss: 0.4141 246/500 [=============>................] - ETA: 1:04 - loss: 2.2651 - regression_loss: 1.8511 - classification_loss: 0.4140 247/500 [=============>................] - ETA: 1:03 - loss: 2.2644 - regression_loss: 1.8506 - classification_loss: 0.4139 248/500 [=============>................] - ETA: 1:03 - loss: 2.2603 - regression_loss: 1.8475 - classification_loss: 0.4128 249/500 [=============>................] - ETA: 1:03 - loss: 2.2567 - regression_loss: 1.8448 - classification_loss: 0.4119 250/500 [==============>...............] - ETA: 1:03 - loss: 2.2544 - regression_loss: 1.8434 - classification_loss: 0.4110 251/500 [==============>...............] - ETA: 1:02 - loss: 2.2573 - regression_loss: 1.8454 - classification_loss: 0.4119 252/500 [==============>...............] - ETA: 1:02 - loss: 2.2609 - regression_loss: 1.8488 - classification_loss: 0.4121 253/500 [==============>...............] - ETA: 1:02 - loss: 2.2649 - regression_loss: 1.8523 - classification_loss: 0.4125 254/500 [==============>...............] 
- ETA: 1:02 - loss: 2.2651 - regression_loss: 1.8525 - classification_loss: 0.4126 255/500 [==============>...............] - ETA: 1:01 - loss: 2.2650 - regression_loss: 1.8523 - classification_loss: 0.4126 256/500 [==============>...............] - ETA: 1:01 - loss: 2.2643 - regression_loss: 1.8524 - classification_loss: 0.4119 257/500 [==============>...............] - ETA: 1:01 - loss: 2.2672 - regression_loss: 1.8545 - classification_loss: 0.4127 258/500 [==============>...............] - ETA: 1:01 - loss: 2.2663 - regression_loss: 1.8541 - classification_loss: 0.4123 259/500 [==============>...............] - ETA: 1:00 - loss: 2.2657 - regression_loss: 1.8537 - classification_loss: 0.4121 260/500 [==============>...............] - ETA: 1:00 - loss: 2.2689 - regression_loss: 1.8568 - classification_loss: 0.4121 261/500 [==============>...............] - ETA: 1:00 - loss: 2.2731 - regression_loss: 1.8601 - classification_loss: 0.4130 262/500 [==============>...............] - ETA: 1:00 - loss: 2.2696 - regression_loss: 1.8573 - classification_loss: 0.4122 263/500 [==============>...............] - ETA: 59s - loss: 2.2708 - regression_loss: 1.8591 - classification_loss: 0.4117  264/500 [==============>...............] - ETA: 59s - loss: 2.2697 - regression_loss: 1.8582 - classification_loss: 0.4115 265/500 [==============>...............] - ETA: 59s - loss: 2.2732 - regression_loss: 1.8613 - classification_loss: 0.4119 266/500 [==============>...............] - ETA: 59s - loss: 2.2733 - regression_loss: 1.8616 - classification_loss: 0.4117 267/500 [===============>..............] - ETA: 58s - loss: 2.2711 - regression_loss: 1.8600 - classification_loss: 0.4111 268/500 [===============>..............] - ETA: 58s - loss: 2.2736 - regression_loss: 1.8620 - classification_loss: 0.4115 269/500 [===============>..............] - ETA: 58s - loss: 2.2751 - regression_loss: 1.8642 - classification_loss: 0.4110 270/500 [===============>..............] 
- ETA: 58s - loss: 2.2762 - regression_loss: 1.8649 - classification_loss: 0.4113 271/500 [===============>..............] - ETA: 57s - loss: 2.2751 - regression_loss: 1.8641 - classification_loss: 0.4110 272/500 [===============>..............] - ETA: 57s - loss: 2.2768 - regression_loss: 1.8657 - classification_loss: 0.4111 273/500 [===============>..............] - ETA: 57s - loss: 2.2754 - regression_loss: 1.8645 - classification_loss: 0.4110 274/500 [===============>..............] - ETA: 56s - loss: 2.2801 - regression_loss: 1.8683 - classification_loss: 0.4118 275/500 [===============>..............] - ETA: 56s - loss: 2.2794 - regression_loss: 1.8682 - classification_loss: 0.4113 276/500 [===============>..............] - ETA: 56s - loss: 2.2790 - regression_loss: 1.8680 - classification_loss: 0.4109 277/500 [===============>..............] - ETA: 56s - loss: 2.2794 - regression_loss: 1.8684 - classification_loss: 0.4110 278/500 [===============>..............] - ETA: 55s - loss: 2.2801 - regression_loss: 1.8693 - classification_loss: 0.4109 279/500 [===============>..............] - ETA: 55s - loss: 2.2806 - regression_loss: 1.8696 - classification_loss: 0.4110 280/500 [===============>..............] - ETA: 55s - loss: 2.2816 - regression_loss: 1.8704 - classification_loss: 0.4113 281/500 [===============>..............] - ETA: 55s - loss: 2.2808 - regression_loss: 1.8697 - classification_loss: 0.4110 282/500 [===============>..............] - ETA: 54s - loss: 2.2816 - regression_loss: 1.8705 - classification_loss: 0.4111 283/500 [===============>..............] - ETA: 54s - loss: 2.2817 - regression_loss: 1.8706 - classification_loss: 0.4111 284/500 [================>.............] - ETA: 54s - loss: 2.2822 - regression_loss: 1.8709 - classification_loss: 0.4113 285/500 [================>.............] - ETA: 54s - loss: 2.2812 - regression_loss: 1.8702 - classification_loss: 0.4111 286/500 [================>.............] 
- ETA: 53s - loss: 2.2816 - regression_loss: 1.8706 - classification_loss: 0.4110 287/500 [================>.............] - ETA: 53s - loss: 2.2819 - regression_loss: 1.8707 - classification_loss: 0.4112 288/500 [================>.............] - ETA: 53s - loss: 2.2788 - regression_loss: 1.8685 - classification_loss: 0.4103 289/500 [================>.............] - ETA: 53s - loss: 2.2797 - regression_loss: 1.8690 - classification_loss: 0.4107 290/500 [================>.............] - ETA: 52s - loss: 2.2796 - regression_loss: 1.8691 - classification_loss: 0.4104 291/500 [================>.............] - ETA: 52s - loss: 2.2801 - regression_loss: 1.8694 - classification_loss: 0.4107 292/500 [================>.............] - ETA: 52s - loss: 2.2797 - regression_loss: 1.8693 - classification_loss: 0.4103 293/500 [================>.............] - ETA: 52s - loss: 2.2802 - regression_loss: 1.8697 - classification_loss: 0.4105 294/500 [================>.............] - ETA: 51s - loss: 2.2813 - regression_loss: 1.8704 - classification_loss: 0.4109 295/500 [================>.............] - ETA: 51s - loss: 2.2818 - regression_loss: 1.8708 - classification_loss: 0.4110 296/500 [================>.............] - ETA: 51s - loss: 2.2835 - regression_loss: 1.8727 - classification_loss: 0.4109 297/500 [================>.............] - ETA: 51s - loss: 2.2816 - regression_loss: 1.8713 - classification_loss: 0.4103 298/500 [================>.............] - ETA: 50s - loss: 2.2823 - regression_loss: 1.8718 - classification_loss: 0.4105 299/500 [================>.............] - ETA: 50s - loss: 2.2807 - regression_loss: 1.8706 - classification_loss: 0.4101 300/500 [=================>............] - ETA: 50s - loss: 2.2783 - regression_loss: 1.8689 - classification_loss: 0.4094 301/500 [=================>............] - ETA: 50s - loss: 2.2783 - regression_loss: 1.8691 - classification_loss: 0.4092 302/500 [=================>............] 
- ETA: 49s - loss: 2.2791 - regression_loss: 1.8698 - classification_loss: 0.4093 303/500 [=================>............] - ETA: 49s - loss: 2.2788 - regression_loss: 1.8698 - classification_loss: 0.4091 304/500 [=================>............] - ETA: 49s - loss: 2.2775 - regression_loss: 1.8688 - classification_loss: 0.4087 305/500 [=================>............] - ETA: 49s - loss: 2.2764 - regression_loss: 1.8680 - classification_loss: 0.4084 306/500 [=================>............] - ETA: 48s - loss: 2.2757 - regression_loss: 1.8679 - classification_loss: 0.4077 307/500 [=================>............] - ETA: 48s - loss: 2.2770 - regression_loss: 1.8688 - classification_loss: 0.4082 308/500 [=================>............] - ETA: 48s - loss: 2.2756 - regression_loss: 1.8677 - classification_loss: 0.4079 309/500 [=================>............] - ETA: 48s - loss: 2.2749 - regression_loss: 1.8669 - classification_loss: 0.4080 310/500 [=================>............] - ETA: 47s - loss: 2.2726 - regression_loss: 1.8650 - classification_loss: 0.4076 311/500 [=================>............] - ETA: 47s - loss: 2.2728 - regression_loss: 1.8650 - classification_loss: 0.4078 312/500 [=================>............] - ETA: 47s - loss: 2.2724 - regression_loss: 1.8649 - classification_loss: 0.4075 313/500 [=================>............] - ETA: 46s - loss: 2.2720 - regression_loss: 1.8646 - classification_loss: 0.4074 314/500 [=================>............] - ETA: 46s - loss: 2.2712 - regression_loss: 1.8642 - classification_loss: 0.4070 315/500 [=================>............] - ETA: 46s - loss: 2.2704 - regression_loss: 1.8636 - classification_loss: 0.4068 316/500 [=================>............] - ETA: 46s - loss: 2.2733 - regression_loss: 1.8652 - classification_loss: 0.4081 317/500 [==================>...........] - ETA: 45s - loss: 2.2727 - regression_loss: 1.8650 - classification_loss: 0.4077 318/500 [==================>...........] 
- ETA: 45s - loss: 2.2734 - regression_loss: 1.8658 - classification_loss: 0.4076 319/500 [==================>...........] - ETA: 45s - loss: 2.2708 - regression_loss: 1.8640 - classification_loss: 0.4069 320/500 [==================>...........] - ETA: 45s - loss: 2.2703 - regression_loss: 1.8639 - classification_loss: 0.4064 321/500 [==================>...........] - ETA: 44s - loss: 2.2678 - regression_loss: 1.8619 - classification_loss: 0.4060 322/500 [==================>...........] - ETA: 44s - loss: 2.2698 - regression_loss: 1.8632 - classification_loss: 0.4067 323/500 [==================>...........] - ETA: 44s - loss: 2.2698 - regression_loss: 1.8638 - classification_loss: 0.4060 324/500 [==================>...........] - ETA: 44s - loss: 2.2665 - regression_loss: 1.8612 - classification_loss: 0.4053 325/500 [==================>...........] - ETA: 43s - loss: 2.2649 - regression_loss: 1.8602 - classification_loss: 0.4046 326/500 [==================>...........] - ETA: 43s - loss: 2.2691 - regression_loss: 1.8635 - classification_loss: 0.4056 327/500 [==================>...........] - ETA: 43s - loss: 2.2697 - regression_loss: 1.8637 - classification_loss: 0.4060 328/500 [==================>...........] - ETA: 43s - loss: 2.2673 - regression_loss: 1.8619 - classification_loss: 0.4054 329/500 [==================>...........] - ETA: 42s - loss: 2.2636 - regression_loss: 1.8590 - classification_loss: 0.4046 330/500 [==================>...........] - ETA: 42s - loss: 2.2653 - regression_loss: 1.8600 - classification_loss: 0.4053 331/500 [==================>...........] - ETA: 42s - loss: 2.2627 - regression_loss: 1.8580 - classification_loss: 0.4047 332/500 [==================>...........] - ETA: 42s - loss: 2.2622 - regression_loss: 1.8577 - classification_loss: 0.4044 333/500 [==================>...........] - ETA: 41s - loss: 2.2633 - regression_loss: 1.8585 - classification_loss: 0.4048 334/500 [===================>..........] 
- ETA: 41s - loss: 2.2607 - regression_loss: 1.8565 - classification_loss: 0.4042 335/500 [===================>..........] - ETA: 41s - loss: 2.2581 - regression_loss: 1.8546 - classification_loss: 0.4035 336/500 [===================>..........] - ETA: 41s - loss: 2.2547 - regression_loss: 1.8520 - classification_loss: 0.4027 337/500 [===================>..........] - ETA: 40s - loss: 2.2554 - regression_loss: 1.8528 - classification_loss: 0.4026 338/500 [===================>..........] - ETA: 40s - loss: 2.2560 - regression_loss: 1.8532 - classification_loss: 0.4029 339/500 [===================>..........] - ETA: 40s - loss: 2.2533 - regression_loss: 1.8512 - classification_loss: 0.4022 340/500 [===================>..........] - ETA: 39s - loss: 2.2548 - regression_loss: 1.8522 - classification_loss: 0.4026 341/500 [===================>..........] - ETA: 39s - loss: 2.2538 - regression_loss: 1.8514 - classification_loss: 0.4024 342/500 [===================>..........] - ETA: 39s - loss: 2.2535 - regression_loss: 1.8513 - classification_loss: 0.4022 343/500 [===================>..........] - ETA: 39s - loss: 2.2537 - regression_loss: 1.8515 - classification_loss: 0.4022 344/500 [===================>..........] - ETA: 38s - loss: 2.2543 - regression_loss: 1.8521 - classification_loss: 0.4022 345/500 [===================>..........] - ETA: 38s - loss: 2.2534 - regression_loss: 1.8511 - classification_loss: 0.4023 346/500 [===================>..........] - ETA: 38s - loss: 2.2503 - regression_loss: 1.8489 - classification_loss: 0.4015 347/500 [===================>..........] - ETA: 38s - loss: 2.2522 - regression_loss: 1.8506 - classification_loss: 0.4016 348/500 [===================>..........] - ETA: 37s - loss: 2.2527 - regression_loss: 1.8515 - classification_loss: 0.4012 349/500 [===================>..........] - ETA: 37s - loss: 2.2528 - regression_loss: 1.8521 - classification_loss: 0.4008 350/500 [====================>.........] 
- ETA: 37s - loss: 2.2515 - regression_loss: 1.8511 - classification_loss: 0.4004 351/500 [====================>.........] - ETA: 37s - loss: 2.2512 - regression_loss: 1.8512 - classification_loss: 0.4000 352/500 [====================>.........] - ETA: 36s - loss: 2.2498 - regression_loss: 1.8502 - classification_loss: 0.3996 353/500 [====================>.........] - ETA: 36s - loss: 2.2519 - regression_loss: 1.8517 - classification_loss: 0.4001 354/500 [====================>.........] - ETA: 36s - loss: 2.2475 - regression_loss: 1.8481 - classification_loss: 0.3994 355/500 [====================>.........] - ETA: 36s - loss: 2.2484 - regression_loss: 1.8488 - classification_loss: 0.3996 356/500 [====================>.........] - ETA: 35s - loss: 2.2495 - regression_loss: 1.8493 - classification_loss: 0.4002 357/500 [====================>.........] - ETA: 35s - loss: 2.2505 - regression_loss: 1.8499 - classification_loss: 0.4006 358/500 [====================>.........] - ETA: 35s - loss: 2.2472 - regression_loss: 1.8473 - classification_loss: 0.3999 359/500 [====================>.........] - ETA: 35s - loss: 2.2469 - regression_loss: 1.8471 - classification_loss: 0.3998 360/500 [====================>.........] - ETA: 34s - loss: 2.2453 - regression_loss: 1.8460 - classification_loss: 0.3993 361/500 [====================>.........] - ETA: 34s - loss: 2.2469 - regression_loss: 1.8477 - classification_loss: 0.3992 362/500 [====================>.........] - ETA: 34s - loss: 2.2463 - regression_loss: 1.8474 - classification_loss: 0.3989 363/500 [====================>.........] - ETA: 34s - loss: 2.2458 - regression_loss: 1.8469 - classification_loss: 0.3989 364/500 [====================>.........] - ETA: 33s - loss: 2.2483 - regression_loss: 1.8491 - classification_loss: 0.3992 365/500 [====================>.........] - ETA: 33s - loss: 2.2480 - regression_loss: 1.8489 - classification_loss: 0.3991 366/500 [====================>.........] 
- ETA: 33s - loss: 2.2466 - regression_loss: 1.8480 - classification_loss: 0.3986 367/500 [=====================>........] - ETA: 33s - loss: 2.2476 - regression_loss: 1.8485 - classification_loss: 0.3992 368/500 [=====================>........] - ETA: 32s - loss: 2.2469 - regression_loss: 1.8480 - classification_loss: 0.3989 369/500 [=====================>........] - ETA: 32s - loss: 2.2473 - regression_loss: 1.8482 - classification_loss: 0.3991 370/500 [=====================>........] - ETA: 32s - loss: 2.2487 - regression_loss: 1.8491 - classification_loss: 0.3996 371/500 [=====================>........] - ETA: 32s - loss: 2.2487 - regression_loss: 1.8489 - classification_loss: 0.3997 372/500 [=====================>........] - ETA: 31s - loss: 2.2498 - regression_loss: 1.8503 - classification_loss: 0.3996 373/500 [=====================>........] - ETA: 31s - loss: 2.2490 - regression_loss: 1.8496 - classification_loss: 0.3994 374/500 [=====================>........] - ETA: 31s - loss: 2.2494 - regression_loss: 1.8498 - classification_loss: 0.3996 375/500 [=====================>........] - ETA: 31s - loss: 2.2497 - regression_loss: 1.8504 - classification_loss: 0.3993 376/500 [=====================>........] - ETA: 30s - loss: 2.2496 - regression_loss: 1.8503 - classification_loss: 0.3993 377/500 [=====================>........] - ETA: 30s - loss: 2.2510 - regression_loss: 1.8516 - classification_loss: 0.3994 378/500 [=====================>........] - ETA: 30s - loss: 2.2495 - regression_loss: 1.8503 - classification_loss: 0.3992 379/500 [=====================>........] - ETA: 30s - loss: 2.2489 - regression_loss: 1.8498 - classification_loss: 0.3991 380/500 [=====================>........] - ETA: 29s - loss: 2.2496 - regression_loss: 1.8499 - classification_loss: 0.3997 381/500 [=====================>........] - ETA: 29s - loss: 2.2494 - regression_loss: 1.8497 - classification_loss: 0.3997 382/500 [=====================>........] 
- ETA: 29s - loss: 2.2492 - regression_loss: 1.8495 - classification_loss: 0.3997 383/500 [=====================>........] - ETA: 29s - loss: 2.2481 - regression_loss: 1.8488 - classification_loss: 0.3993 384/500 [======================>.......] - ETA: 28s - loss: 2.2481 - regression_loss: 1.8489 - classification_loss: 0.3992 385/500 [======================>.......] - ETA: 28s - loss: 2.2482 - regression_loss: 1.8489 - classification_loss: 0.3993 386/500 [======================>.......] - ETA: 28s - loss: 2.2472 - regression_loss: 1.8478 - classification_loss: 0.3994 387/500 [======================>.......] - ETA: 28s - loss: 2.2500 - regression_loss: 1.8500 - classification_loss: 0.4000 388/500 [======================>.......] - ETA: 27s - loss: 2.2488 - regression_loss: 1.8491 - classification_loss: 0.3997 389/500 [======================>.......] - ETA: 27s - loss: 2.2526 - regression_loss: 1.8443 - classification_loss: 0.4082 390/500 [======================>.......] - ETA: 27s - loss: 2.2541 - regression_loss: 1.8455 - classification_loss: 0.4086 391/500 [======================>.......] - ETA: 27s - loss: 2.2536 - regression_loss: 1.8451 - classification_loss: 0.4084 392/500 [======================>.......] - ETA: 26s - loss: 2.2555 - regression_loss: 1.8469 - classification_loss: 0.4086 393/500 [======================>.......] - ETA: 26s - loss: 2.2578 - regression_loss: 1.8482 - classification_loss: 0.4095 394/500 [======================>.......] - ETA: 26s - loss: 2.2574 - regression_loss: 1.8480 - classification_loss: 0.4094 395/500 [======================>.......] - ETA: 26s - loss: 2.2588 - regression_loss: 1.8492 - classification_loss: 0.4096 396/500 [======================>.......] - ETA: 25s - loss: 2.2596 - regression_loss: 1.8497 - classification_loss: 0.4099 397/500 [======================>.......] - ETA: 25s - loss: 2.2595 - regression_loss: 1.8499 - classification_loss: 0.4096 398/500 [======================>.......] 
- ETA: 25s - loss: 2.2615 - regression_loss: 1.8513 - classification_loss: 0.4102 399/500 [======================>.......] - ETA: 25s - loss: 2.2615 - regression_loss: 1.8513 - classification_loss: 0.4102 400/500 [=======================>......] - ETA: 24s - loss: 2.2619 - regression_loss: 1.8515 - classification_loss: 0.4104 401/500 [=======================>......] - ETA: 24s - loss: 2.2613 - regression_loss: 1.8510 - classification_loss: 0.4103 402/500 [=======================>......] - ETA: 24s - loss: 2.2607 - regression_loss: 1.8507 - classification_loss: 0.4101 403/500 [=======================>......] - ETA: 24s - loss: 2.2602 - regression_loss: 1.8504 - classification_loss: 0.4098 404/500 [=======================>......] - ETA: 23s - loss: 2.2626 - regression_loss: 1.8528 - classification_loss: 0.4098 405/500 [=======================>......] - ETA: 23s - loss: 2.2618 - regression_loss: 1.8523 - classification_loss: 0.4095 406/500 [=======================>......] - ETA: 23s - loss: 2.2615 - regression_loss: 1.8519 - classification_loss: 0.4096 407/500 [=======================>......] - ETA: 23s - loss: 2.2597 - regression_loss: 1.8505 - classification_loss: 0.4091 408/500 [=======================>......] - ETA: 22s - loss: 2.2576 - regression_loss: 1.8490 - classification_loss: 0.4086 409/500 [=======================>......] - ETA: 22s - loss: 2.2571 - regression_loss: 1.8487 - classification_loss: 0.4084 410/500 [=======================>......] - ETA: 22s - loss: 2.2570 - regression_loss: 1.8486 - classification_loss: 0.4085 411/500 [=======================>......] - ETA: 22s - loss: 2.2563 - regression_loss: 1.8481 - classification_loss: 0.4082 412/500 [=======================>......] - ETA: 21s - loss: 2.2570 - regression_loss: 1.8486 - classification_loss: 0.4084 413/500 [=======================>......] - ETA: 21s - loss: 2.2557 - regression_loss: 1.8477 - classification_loss: 0.4079 414/500 [=======================>......] 
- ETA: 21s - loss: 2.2544 - regression_loss: 1.8466 - classification_loss: 0.4078
450/500 [==========================>...] - ETA: 12s - loss: 2.2543 - regression_loss: 1.8479 - classification_loss: 0.4065
500/500 [==============================] - 125s 250ms/step - loss: 2.2454 - regression_loss: 1.8419 - classification_loss: 0.4035
326 instances of class plum with average precision: 0.6681
mAP: 0.6681
Epoch 00013: saving model to ./training/snapshots/resnet50_pascal_13.h5
Epoch 14/150
  1/500 [..............................] - ETA: 1:54 - loss: 2.3737 - regression_loss: 2.0598 - classification_loss: 0.3139
 50/500 [==>...........................] - ETA: 1:48 - loss: 2.1303 - regression_loss: 1.7678 - classification_loss: 0.3626
100/500 [=====>........................] - ETA: 1:38 - loss: 2.2309 - regression_loss: 1.8076 - classification_loss: 0.4233
150/500 [========>.....................] - ETA: 1:26 - loss: 2.1880 - regression_loss: 1.7676 - classification_loss: 0.4204
200/500 [===========>..................] - ETA: 1:14 - loss: 2.1890 - regression_loss: 1.7797 - classification_loss: 0.4093
249/500 [=============>................]
- ETA: 1:02 - loss: 2.1816 - regression_loss: 1.7792 - classification_loss: 0.4024 250/500 [==============>...............] - ETA: 1:02 - loss: 2.1822 - regression_loss: 1.7797 - classification_loss: 0.4025 251/500 [==============>...............] - ETA: 1:01 - loss: 2.1883 - regression_loss: 1.7854 - classification_loss: 0.4029 252/500 [==============>...............] - ETA: 1:01 - loss: 2.1917 - regression_loss: 1.7876 - classification_loss: 0.4040 253/500 [==============>...............] - ETA: 1:01 - loss: 2.1928 - regression_loss: 1.7877 - classification_loss: 0.4050 254/500 [==============>...............] - ETA: 1:01 - loss: 2.1934 - regression_loss: 1.7874 - classification_loss: 0.4061 255/500 [==============>...............] - ETA: 1:00 - loss: 2.1920 - regression_loss: 1.7864 - classification_loss: 0.4055 256/500 [==============>...............] - ETA: 1:00 - loss: 2.1905 - regression_loss: 1.7855 - classification_loss: 0.4050 257/500 [==============>...............] - ETA: 1:00 - loss: 2.1895 - regression_loss: 1.7849 - classification_loss: 0.4046 258/500 [==============>...............] - ETA: 1:00 - loss: 2.1893 - regression_loss: 1.7847 - classification_loss: 0.4046 259/500 [==============>...............] - ETA: 59s - loss: 2.1876 - regression_loss: 1.7834 - classification_loss: 0.4041  260/500 [==============>...............] - ETA: 59s - loss: 2.1875 - regression_loss: 1.7835 - classification_loss: 0.4040 261/500 [==============>...............] - ETA: 59s - loss: 2.1890 - regression_loss: 1.7853 - classification_loss: 0.4037 262/500 [==============>...............] - ETA: 59s - loss: 2.1890 - regression_loss: 1.7858 - classification_loss: 0.4032 263/500 [==============>...............] - ETA: 58s - loss: 2.1908 - regression_loss: 1.7872 - classification_loss: 0.4036 264/500 [==============>...............] - ETA: 58s - loss: 2.1884 - regression_loss: 1.7854 - classification_loss: 0.4030 265/500 [==============>...............] 
- ETA: 58s - loss: 2.1829 - regression_loss: 1.7810 - classification_loss: 0.4018 266/500 [==============>...............] - ETA: 58s - loss: 2.1840 - regression_loss: 1.7822 - classification_loss: 0.4018 267/500 [===============>..............] - ETA: 57s - loss: 2.1826 - regression_loss: 1.7814 - classification_loss: 0.4012 268/500 [===============>..............] - ETA: 57s - loss: 2.1827 - regression_loss: 1.7816 - classification_loss: 0.4011 269/500 [===============>..............] - ETA: 57s - loss: 2.1863 - regression_loss: 1.7845 - classification_loss: 0.4018 270/500 [===============>..............] - ETA: 57s - loss: 2.1877 - regression_loss: 1.7854 - classification_loss: 0.4022 271/500 [===============>..............] - ETA: 56s - loss: 2.1862 - regression_loss: 1.7846 - classification_loss: 0.4016 272/500 [===============>..............] - ETA: 56s - loss: 2.1813 - regression_loss: 1.7807 - classification_loss: 0.4006 273/500 [===============>..............] - ETA: 56s - loss: 2.1830 - regression_loss: 1.7825 - classification_loss: 0.4005 274/500 [===============>..............] - ETA: 56s - loss: 2.1836 - regression_loss: 1.7830 - classification_loss: 0.4006 275/500 [===============>..............] - ETA: 55s - loss: 2.1844 - regression_loss: 1.7835 - classification_loss: 0.4008 276/500 [===============>..............] - ETA: 55s - loss: 2.1868 - regression_loss: 1.7852 - classification_loss: 0.4016 277/500 [===============>..............] - ETA: 55s - loss: 2.1897 - regression_loss: 1.7875 - classification_loss: 0.4022 278/500 [===============>..............] - ETA: 55s - loss: 2.1880 - regression_loss: 1.7866 - classification_loss: 0.4014 279/500 [===============>..............] - ETA: 54s - loss: 2.1871 - regression_loss: 1.7863 - classification_loss: 0.4008 280/500 [===============>..............] - ETA: 54s - loss: 2.1835 - regression_loss: 1.7835 - classification_loss: 0.4000 281/500 [===============>..............] 
- ETA: 54s - loss: 2.1821 - regression_loss: 1.7818 - classification_loss: 0.4003 282/500 [===============>..............] - ETA: 54s - loss: 2.1822 - regression_loss: 1.7819 - classification_loss: 0.4003 283/500 [===============>..............] - ETA: 53s - loss: 2.1838 - regression_loss: 1.7831 - classification_loss: 0.4006 284/500 [================>.............] - ETA: 53s - loss: 2.1840 - regression_loss: 1.7831 - classification_loss: 0.4009 285/500 [================>.............] - ETA: 53s - loss: 2.1861 - regression_loss: 1.7849 - classification_loss: 0.4013 286/500 [================>.............] - ETA: 53s - loss: 2.1861 - regression_loss: 1.7850 - classification_loss: 0.4011 287/500 [================>.............] - ETA: 52s - loss: 2.1841 - regression_loss: 1.7836 - classification_loss: 0.4005 288/500 [================>.............] - ETA: 52s - loss: 2.1840 - regression_loss: 1.7840 - classification_loss: 0.4000 289/500 [================>.............] - ETA: 52s - loss: 2.1842 - regression_loss: 1.7842 - classification_loss: 0.4000 290/500 [================>.............] - ETA: 52s - loss: 2.1799 - regression_loss: 1.7810 - classification_loss: 0.3989 291/500 [================>.............] - ETA: 51s - loss: 2.1819 - regression_loss: 1.7826 - classification_loss: 0.3993 292/500 [================>.............] - ETA: 51s - loss: 2.1816 - regression_loss: 1.7822 - classification_loss: 0.3994 293/500 [================>.............] - ETA: 51s - loss: 2.1824 - regression_loss: 1.7829 - classification_loss: 0.3995 294/500 [================>.............] - ETA: 51s - loss: 2.1806 - regression_loss: 1.7816 - classification_loss: 0.3990 295/500 [================>.............] - ETA: 50s - loss: 2.1810 - regression_loss: 1.7819 - classification_loss: 0.3991 296/500 [================>.............] - ETA: 50s - loss: 2.1818 - regression_loss: 1.7828 - classification_loss: 0.3991 297/500 [================>.............] 
- ETA: 50s - loss: 2.1823 - regression_loss: 1.7832 - classification_loss: 0.3992 298/500 [================>.............] - ETA: 50s - loss: 2.1818 - regression_loss: 1.7828 - classification_loss: 0.3990 299/500 [================>.............] - ETA: 49s - loss: 2.1817 - regression_loss: 1.7828 - classification_loss: 0.3989 300/500 [=================>............] - ETA: 49s - loss: 2.1818 - regression_loss: 1.7830 - classification_loss: 0.3987 301/500 [=================>............] - ETA: 49s - loss: 2.1827 - regression_loss: 1.7836 - classification_loss: 0.3991 302/500 [=================>............] - ETA: 49s - loss: 2.1823 - regression_loss: 1.7832 - classification_loss: 0.3991 303/500 [=================>............] - ETA: 48s - loss: 2.1807 - regression_loss: 1.7820 - classification_loss: 0.3987 304/500 [=================>............] - ETA: 48s - loss: 2.1813 - regression_loss: 1.7829 - classification_loss: 0.3984 305/500 [=================>............] - ETA: 48s - loss: 2.1813 - regression_loss: 1.7822 - classification_loss: 0.3992 306/500 [=================>............] - ETA: 48s - loss: 2.1818 - regression_loss: 1.7828 - classification_loss: 0.3990 307/500 [=================>............] - ETA: 47s - loss: 2.1824 - regression_loss: 1.7833 - classification_loss: 0.3992 308/500 [=================>............] - ETA: 47s - loss: 2.1806 - regression_loss: 1.7819 - classification_loss: 0.3986 309/500 [=================>............] - ETA: 47s - loss: 2.1837 - regression_loss: 1.7844 - classification_loss: 0.3994 310/500 [=================>............] - ETA: 47s - loss: 2.1830 - regression_loss: 1.7839 - classification_loss: 0.3991 311/500 [=================>............] - ETA: 46s - loss: 2.1834 - regression_loss: 1.7844 - classification_loss: 0.3991 312/500 [=================>............] - ETA: 46s - loss: 2.1844 - regression_loss: 1.7852 - classification_loss: 0.3992 313/500 [=================>............] 
- ETA: 46s - loss: 2.1812 - regression_loss: 1.7827 - classification_loss: 0.3985 314/500 [=================>............] - ETA: 46s - loss: 2.1784 - regression_loss: 1.7804 - classification_loss: 0.3980 315/500 [=================>............] - ETA: 45s - loss: 2.1802 - regression_loss: 1.7819 - classification_loss: 0.3984 316/500 [=================>............] - ETA: 45s - loss: 2.1813 - regression_loss: 1.7831 - classification_loss: 0.3982 317/500 [==================>...........] - ETA: 45s - loss: 2.1830 - regression_loss: 1.7841 - classification_loss: 0.3989 318/500 [==================>...........] - ETA: 45s - loss: 2.1814 - regression_loss: 1.7830 - classification_loss: 0.3984 319/500 [==================>...........] - ETA: 44s - loss: 2.1828 - regression_loss: 1.7841 - classification_loss: 0.3987 320/500 [==================>...........] - ETA: 44s - loss: 2.1829 - regression_loss: 1.7845 - classification_loss: 0.3984 321/500 [==================>...........] - ETA: 44s - loss: 2.1810 - regression_loss: 1.7832 - classification_loss: 0.3978 322/500 [==================>...........] - ETA: 44s - loss: 2.1802 - regression_loss: 1.7827 - classification_loss: 0.3975 323/500 [==================>...........] - ETA: 43s - loss: 2.1813 - regression_loss: 1.7839 - classification_loss: 0.3974 324/500 [==================>...........] - ETA: 43s - loss: 2.1795 - regression_loss: 1.7824 - classification_loss: 0.3971 325/500 [==================>...........] - ETA: 43s - loss: 2.1753 - regression_loss: 1.7791 - classification_loss: 0.3962 326/500 [==================>...........] - ETA: 43s - loss: 2.1774 - regression_loss: 1.7810 - classification_loss: 0.3964 327/500 [==================>...........] - ETA: 42s - loss: 2.1788 - regression_loss: 1.7821 - classification_loss: 0.3967 328/500 [==================>...........] - ETA: 42s - loss: 2.1791 - regression_loss: 1.7824 - classification_loss: 0.3967 329/500 [==================>...........] 
- ETA: 42s - loss: 2.1741 - regression_loss: 1.7782 - classification_loss: 0.3959 330/500 [==================>...........] - ETA: 42s - loss: 2.1736 - regression_loss: 1.7781 - classification_loss: 0.3955 331/500 [==================>...........] - ETA: 42s - loss: 2.1766 - regression_loss: 1.7807 - classification_loss: 0.3959 332/500 [==================>...........] - ETA: 41s - loss: 2.1788 - regression_loss: 1.7825 - classification_loss: 0.3963 333/500 [==================>...........] - ETA: 41s - loss: 2.1779 - regression_loss: 1.7820 - classification_loss: 0.3959 334/500 [===================>..........] - ETA: 41s - loss: 2.1771 - regression_loss: 1.7814 - classification_loss: 0.3957 335/500 [===================>..........] - ETA: 41s - loss: 2.1776 - regression_loss: 1.7821 - classification_loss: 0.3955 336/500 [===================>..........] - ETA: 40s - loss: 2.1800 - regression_loss: 1.7843 - classification_loss: 0.3957 337/500 [===================>..........] - ETA: 40s - loss: 2.1799 - regression_loss: 1.7843 - classification_loss: 0.3956 338/500 [===================>..........] - ETA: 40s - loss: 2.1834 - regression_loss: 1.7875 - classification_loss: 0.3959 339/500 [===================>..........] - ETA: 40s - loss: 2.1835 - regression_loss: 1.7877 - classification_loss: 0.3957 340/500 [===================>..........] - ETA: 39s - loss: 2.1855 - regression_loss: 1.7901 - classification_loss: 0.3954 341/500 [===================>..........] - ETA: 39s - loss: 2.1813 - regression_loss: 1.7866 - classification_loss: 0.3947 342/500 [===================>..........] - ETA: 39s - loss: 2.1801 - regression_loss: 1.7857 - classification_loss: 0.3944 343/500 [===================>..........] - ETA: 39s - loss: 2.1811 - regression_loss: 1.7863 - classification_loss: 0.3948 344/500 [===================>..........] - ETA: 38s - loss: 2.1822 - regression_loss: 1.7871 - classification_loss: 0.3951 345/500 [===================>..........] 
- ETA: 38s - loss: 2.1799 - regression_loss: 1.7853 - classification_loss: 0.3945 346/500 [===================>..........] - ETA: 38s - loss: 2.1816 - regression_loss: 1.7870 - classification_loss: 0.3947 347/500 [===================>..........] - ETA: 38s - loss: 2.1829 - regression_loss: 1.7878 - classification_loss: 0.3951 348/500 [===================>..........] - ETA: 37s - loss: 2.1816 - regression_loss: 1.7869 - classification_loss: 0.3947 349/500 [===================>..........] - ETA: 37s - loss: 2.1791 - regression_loss: 1.7851 - classification_loss: 0.3941 350/500 [====================>.........] - ETA: 37s - loss: 2.1761 - regression_loss: 1.7825 - classification_loss: 0.3935 351/500 [====================>.........] - ETA: 37s - loss: 2.1798 - regression_loss: 1.7854 - classification_loss: 0.3944 352/500 [====================>.........] - ETA: 36s - loss: 2.1838 - regression_loss: 1.7886 - classification_loss: 0.3951 353/500 [====================>.........] - ETA: 36s - loss: 2.1835 - regression_loss: 1.7886 - classification_loss: 0.3949 354/500 [====================>.........] - ETA: 36s - loss: 2.1838 - regression_loss: 1.7888 - classification_loss: 0.3950 355/500 [====================>.........] - ETA: 36s - loss: 2.1849 - regression_loss: 1.7897 - classification_loss: 0.3952 356/500 [====================>.........] - ETA: 35s - loss: 2.1863 - regression_loss: 1.7906 - classification_loss: 0.3957 357/500 [====================>.........] - ETA: 35s - loss: 2.1857 - regression_loss: 1.7903 - classification_loss: 0.3954 358/500 [====================>.........] - ETA: 35s - loss: 2.1852 - regression_loss: 1.7899 - classification_loss: 0.3953 359/500 [====================>.........] - ETA: 35s - loss: 2.1860 - regression_loss: 1.7907 - classification_loss: 0.3954 360/500 [====================>.........] - ETA: 34s - loss: 2.1857 - regression_loss: 1.7906 - classification_loss: 0.3950 361/500 [====================>.........] 
- ETA: 34s - loss: 2.1860 - regression_loss: 1.7908 - classification_loss: 0.3952 362/500 [====================>.........] - ETA: 34s - loss: 2.1856 - regression_loss: 1.7905 - classification_loss: 0.3952 363/500 [====================>.........] - ETA: 34s - loss: 2.1880 - regression_loss: 1.7925 - classification_loss: 0.3955 364/500 [====================>.........] - ETA: 33s - loss: 2.1906 - regression_loss: 1.7945 - classification_loss: 0.3961 365/500 [====================>.........] - ETA: 33s - loss: 2.1892 - regression_loss: 1.7937 - classification_loss: 0.3955 366/500 [====================>.........] - ETA: 33s - loss: 2.1898 - regression_loss: 1.7943 - classification_loss: 0.3956 367/500 [=====================>........] - ETA: 33s - loss: 2.1927 - regression_loss: 1.7967 - classification_loss: 0.3960 368/500 [=====================>........] - ETA: 32s - loss: 2.1926 - regression_loss: 1.7967 - classification_loss: 0.3958 369/500 [=====================>........] - ETA: 32s - loss: 2.1938 - regression_loss: 1.7974 - classification_loss: 0.3964 370/500 [=====================>........] - ETA: 32s - loss: 2.1921 - regression_loss: 1.7959 - classification_loss: 0.3961 371/500 [=====================>........] - ETA: 32s - loss: 2.1898 - regression_loss: 1.7942 - classification_loss: 0.3956 372/500 [=====================>........] - ETA: 31s - loss: 2.1891 - regression_loss: 1.7938 - classification_loss: 0.3953 373/500 [=====================>........] - ETA: 31s - loss: 2.1865 - regression_loss: 1.7917 - classification_loss: 0.3947 374/500 [=====================>........] - ETA: 31s - loss: 2.1841 - regression_loss: 1.7900 - classification_loss: 0.3941 375/500 [=====================>........] - ETA: 31s - loss: 2.1838 - regression_loss: 1.7902 - classification_loss: 0.3936 376/500 [=====================>........] - ETA: 30s - loss: 2.1841 - regression_loss: 1.7904 - classification_loss: 0.3936 377/500 [=====================>........] 
- ETA: 30s - loss: 2.1825 - regression_loss: 1.7893 - classification_loss: 0.3932 378/500 [=====================>........] - ETA: 30s - loss: 2.1813 - regression_loss: 1.7883 - classification_loss: 0.3930 379/500 [=====================>........] - ETA: 30s - loss: 2.1814 - regression_loss: 1.7883 - classification_loss: 0.3931 380/500 [=====================>........] - ETA: 29s - loss: 2.1812 - regression_loss: 1.7882 - classification_loss: 0.3930 381/500 [=====================>........] - ETA: 29s - loss: 2.1821 - regression_loss: 1.7889 - classification_loss: 0.3932 382/500 [=====================>........] - ETA: 29s - loss: 2.1821 - regression_loss: 1.7888 - classification_loss: 0.3933 383/500 [=====================>........] - ETA: 29s - loss: 2.1823 - regression_loss: 1.7888 - classification_loss: 0.3935 384/500 [======================>.......] - ETA: 28s - loss: 2.1816 - regression_loss: 1.7884 - classification_loss: 0.3932 385/500 [======================>.......] - ETA: 28s - loss: 2.1816 - regression_loss: 1.7883 - classification_loss: 0.3933 386/500 [======================>.......] - ETA: 28s - loss: 2.1805 - regression_loss: 1.7874 - classification_loss: 0.3931 387/500 [======================>.......] - ETA: 28s - loss: 2.1770 - regression_loss: 1.7845 - classification_loss: 0.3925 388/500 [======================>.......] - ETA: 27s - loss: 2.1756 - regression_loss: 1.7835 - classification_loss: 0.3922 389/500 [======================>.......] - ETA: 27s - loss: 2.1745 - regression_loss: 1.7825 - classification_loss: 0.3920 390/500 [======================>.......] - ETA: 27s - loss: 2.1745 - regression_loss: 1.7827 - classification_loss: 0.3918 391/500 [======================>.......] - ETA: 27s - loss: 2.1746 - regression_loss: 1.7828 - classification_loss: 0.3918 392/500 [======================>.......] - ETA: 26s - loss: 2.1751 - regression_loss: 1.7831 - classification_loss: 0.3920 393/500 [======================>.......] 
- ETA: 26s - loss: 2.1786 - regression_loss: 1.7858 - classification_loss: 0.3929 394/500 [======================>.......] - ETA: 26s - loss: 2.1802 - regression_loss: 1.7872 - classification_loss: 0.3930 395/500 [======================>.......] - ETA: 26s - loss: 2.1805 - regression_loss: 1.7873 - classification_loss: 0.3932 396/500 [======================>.......] - ETA: 25s - loss: 2.1812 - regression_loss: 1.7879 - classification_loss: 0.3933 397/500 [======================>.......] - ETA: 25s - loss: 2.1790 - regression_loss: 1.7862 - classification_loss: 0.3927 398/500 [======================>.......] - ETA: 25s - loss: 2.1810 - regression_loss: 1.7874 - classification_loss: 0.3935 399/500 [======================>.......] - ETA: 25s - loss: 2.1834 - regression_loss: 1.7896 - classification_loss: 0.3938 400/500 [=======================>......] - ETA: 24s - loss: 2.1849 - regression_loss: 1.7906 - classification_loss: 0.3943 401/500 [=======================>......] - ETA: 24s - loss: 2.1853 - regression_loss: 1.7910 - classification_loss: 0.3943 402/500 [=======================>......] - ETA: 24s - loss: 2.1847 - regression_loss: 1.7907 - classification_loss: 0.3941 403/500 [=======================>......] - ETA: 24s - loss: 2.1856 - regression_loss: 1.7914 - classification_loss: 0.3942 404/500 [=======================>......] - ETA: 23s - loss: 2.1853 - regression_loss: 1.7916 - classification_loss: 0.3937 405/500 [=======================>......] - ETA: 23s - loss: 2.1862 - regression_loss: 1.7923 - classification_loss: 0.3939 406/500 [=======================>......] - ETA: 23s - loss: 2.1867 - regression_loss: 1.7929 - classification_loss: 0.3938 407/500 [=======================>......] - ETA: 23s - loss: 2.1874 - regression_loss: 1.7933 - classification_loss: 0.3940 408/500 [=======================>......] - ETA: 22s - loss: 2.1880 - regression_loss: 1.7938 - classification_loss: 0.3942 409/500 [=======================>......] 
- ETA: 22s - loss: 2.1871 - regression_loss: 1.7929 - classification_loss: 0.3942 410/500 [=======================>......] - ETA: 22s - loss: 2.1868 - regression_loss: 1.7928 - classification_loss: 0.3940 411/500 [=======================>......] - ETA: 22s - loss: 2.1905 - regression_loss: 1.7959 - classification_loss: 0.3946 412/500 [=======================>......] - ETA: 21s - loss: 2.1912 - regression_loss: 1.7966 - classification_loss: 0.3946 413/500 [=======================>......] - ETA: 21s - loss: 2.1925 - regression_loss: 1.7975 - classification_loss: 0.3950 414/500 [=======================>......] - ETA: 21s - loss: 2.1913 - regression_loss: 1.7967 - classification_loss: 0.3946 415/500 [=======================>......] - ETA: 21s - loss: 2.1908 - regression_loss: 1.7965 - classification_loss: 0.3943 416/500 [=======================>......] - ETA: 20s - loss: 2.1891 - regression_loss: 1.7953 - classification_loss: 0.3938 417/500 [========================>.....] - ETA: 20s - loss: 2.1899 - regression_loss: 1.7958 - classification_loss: 0.3941 418/500 [========================>.....] - ETA: 20s - loss: 2.1894 - regression_loss: 1.7955 - classification_loss: 0.3939 419/500 [========================>.....] - ETA: 20s - loss: 2.1902 - regression_loss: 1.7961 - classification_loss: 0.3940 420/500 [========================>.....] - ETA: 19s - loss: 2.1896 - regression_loss: 1.7958 - classification_loss: 0.3938 421/500 [========================>.....] - ETA: 19s - loss: 2.1931 - regression_loss: 1.7985 - classification_loss: 0.3945 422/500 [========================>.....] - ETA: 19s - loss: 2.1944 - regression_loss: 1.7995 - classification_loss: 0.3949 423/500 [========================>.....] - ETA: 19s - loss: 2.1958 - regression_loss: 1.8008 - classification_loss: 0.3950 424/500 [========================>.....] - ETA: 18s - loss: 2.1970 - regression_loss: 1.8015 - classification_loss: 0.3956 425/500 [========================>.....] 
- ETA: 18s - loss: 2.1976 - regression_loss: 1.8019 - classification_loss: 0.3957 426/500 [========================>.....] - ETA: 18s - loss: 2.1953 - regression_loss: 1.8002 - classification_loss: 0.3951 427/500 [========================>.....] - ETA: 18s - loss: 2.1963 - regression_loss: 1.8008 - classification_loss: 0.3955 428/500 [========================>.....] - ETA: 17s - loss: 2.1957 - regression_loss: 1.8005 - classification_loss: 0.3952 429/500 [========================>.....] - ETA: 17s - loss: 2.1956 - regression_loss: 1.8006 - classification_loss: 0.3951 430/500 [========================>.....] - ETA: 17s - loss: 2.1953 - regression_loss: 1.8005 - classification_loss: 0.3948 431/500 [========================>.....] - ETA: 17s - loss: 2.1983 - regression_loss: 1.8031 - classification_loss: 0.3952 432/500 [========================>.....] - ETA: 16s - loss: 2.1969 - regression_loss: 1.8020 - classification_loss: 0.3948 433/500 [========================>.....] - ETA: 16s - loss: 2.1959 - regression_loss: 1.8011 - classification_loss: 0.3948 434/500 [=========================>....] - ETA: 16s - loss: 2.1959 - regression_loss: 1.8011 - classification_loss: 0.3948 435/500 [=========================>....] - ETA: 16s - loss: 2.1962 - regression_loss: 1.8013 - classification_loss: 0.3950 436/500 [=========================>....] - ETA: 15s - loss: 2.1956 - regression_loss: 1.8008 - classification_loss: 0.3949 437/500 [=========================>....] - ETA: 15s - loss: 2.1965 - regression_loss: 1.8014 - classification_loss: 0.3951 438/500 [=========================>....] - ETA: 15s - loss: 2.1966 - regression_loss: 1.8016 - classification_loss: 0.3950 439/500 [=========================>....] - ETA: 15s - loss: 2.1972 - regression_loss: 1.8021 - classification_loss: 0.3950 440/500 [=========================>....] - ETA: 14s - loss: 2.1958 - regression_loss: 1.8010 - classification_loss: 0.3947 441/500 [=========================>....] 
- ETA: 14s - loss: 2.1978 - regression_loss: 1.8026 - classification_loss: 0.3952 442/500 [=========================>....] - ETA: 14s - loss: 2.1997 - regression_loss: 1.8042 - classification_loss: 0.3955 443/500 [=========================>....] - ETA: 14s - loss: 2.1994 - regression_loss: 1.8043 - classification_loss: 0.3951 444/500 [=========================>....] - ETA: 13s - loss: 2.2014 - regression_loss: 1.8060 - classification_loss: 0.3954 445/500 [=========================>....] - ETA: 13s - loss: 2.2011 - regression_loss: 1.8058 - classification_loss: 0.3953 446/500 [=========================>....] - ETA: 13s - loss: 2.2017 - regression_loss: 1.8063 - classification_loss: 0.3954 447/500 [=========================>....] - ETA: 13s - loss: 2.2014 - regression_loss: 1.8062 - classification_loss: 0.3952 448/500 [=========================>....] - ETA: 12s - loss: 2.2017 - regression_loss: 1.8066 - classification_loss: 0.3952 449/500 [=========================>....] - ETA: 12s - loss: 2.2029 - regression_loss: 1.8075 - classification_loss: 0.3953 450/500 [==========================>...] - ETA: 12s - loss: 2.2019 - regression_loss: 1.8069 - classification_loss: 0.3951 451/500 [==========================>...] - ETA: 12s - loss: 2.2027 - regression_loss: 1.8077 - classification_loss: 0.3950 452/500 [==========================>...] - ETA: 11s - loss: 2.2037 - regression_loss: 1.8085 - classification_loss: 0.3952 453/500 [==========================>...] - ETA: 11s - loss: 2.2042 - regression_loss: 1.8090 - classification_loss: 0.3951 454/500 [==========================>...] - ETA: 11s - loss: 2.2034 - regression_loss: 1.8086 - classification_loss: 0.3948 455/500 [==========================>...] - ETA: 11s - loss: 2.2010 - regression_loss: 1.8068 - classification_loss: 0.3942 456/500 [==========================>...] - ETA: 10s - loss: 2.2014 - regression_loss: 1.8072 - classification_loss: 0.3942 457/500 [==========================>...] 
[Epoch 14/150, steps 458-500: per-step progress lines elided; loss ~2.20, regression_loss ~1.80, classification_loss ~0.39 throughout]
500/500 [==============================] - 125s 249ms/step - loss: 2.1939 - regression_loss: 1.8023 - classification_loss: 0.3917
326 instances of class plum with average precision: 0.6082
mAP: 0.6082
Epoch 00014: saving model to ./training/snapshots/resnet50_pascal_14.h5
Epoch 15/150
[steps 1-292: per-step progress lines elided; loss settling near 2.19, with regression_loss ~1.80 and classification_loss ~0.39, by step 292/500]
- ETA: 52s - loss: 2.1944 - regression_loss: 1.8076 - classification_loss: 0.3868 293/500 [================>.............] - ETA: 51s - loss: 2.1956 - regression_loss: 1.8084 - classification_loss: 0.3872 294/500 [================>.............] - ETA: 51s - loss: 2.1920 - regression_loss: 1.8050 - classification_loss: 0.3870 295/500 [================>.............] - ETA: 51s - loss: 2.1918 - regression_loss: 1.8050 - classification_loss: 0.3868 296/500 [================>.............] - ETA: 51s - loss: 2.1888 - regression_loss: 1.8028 - classification_loss: 0.3860 297/500 [================>.............] - ETA: 50s - loss: 2.1863 - regression_loss: 1.8010 - classification_loss: 0.3853 298/500 [================>.............] - ETA: 50s - loss: 2.1880 - regression_loss: 1.8019 - classification_loss: 0.3861 299/500 [================>.............] - ETA: 50s - loss: 2.1852 - regression_loss: 1.7999 - classification_loss: 0.3853 300/500 [=================>............] - ETA: 50s - loss: 2.1856 - regression_loss: 1.8002 - classification_loss: 0.3855 301/500 [=================>............] - ETA: 49s - loss: 2.1848 - regression_loss: 1.7994 - classification_loss: 0.3854 302/500 [=================>............] - ETA: 49s - loss: 2.1886 - regression_loss: 1.8028 - classification_loss: 0.3858 303/500 [=================>............] - ETA: 49s - loss: 2.1894 - regression_loss: 1.8032 - classification_loss: 0.3862 304/500 [=================>............] - ETA: 49s - loss: 2.1900 - regression_loss: 1.8036 - classification_loss: 0.3864 305/500 [=================>............] - ETA: 48s - loss: 2.1907 - regression_loss: 1.8042 - classification_loss: 0.3865 306/500 [=================>............] - ETA: 48s - loss: 2.1885 - regression_loss: 1.8027 - classification_loss: 0.3858 307/500 [=================>............] - ETA: 48s - loss: 2.1876 - regression_loss: 1.8020 - classification_loss: 0.3856 308/500 [=================>............] 
- ETA: 48s - loss: 2.1866 - regression_loss: 1.8013 - classification_loss: 0.3853 309/500 [=================>............] - ETA: 47s - loss: 2.1850 - regression_loss: 1.8001 - classification_loss: 0.3849 310/500 [=================>............] - ETA: 47s - loss: 2.1831 - regression_loss: 1.7988 - classification_loss: 0.3843 311/500 [=================>............] - ETA: 47s - loss: 2.1831 - regression_loss: 1.7991 - classification_loss: 0.3840 312/500 [=================>............] - ETA: 47s - loss: 2.1835 - regression_loss: 1.7992 - classification_loss: 0.3843 313/500 [=================>............] - ETA: 46s - loss: 2.1806 - regression_loss: 1.7967 - classification_loss: 0.3838 314/500 [=================>............] - ETA: 46s - loss: 2.1800 - regression_loss: 1.7963 - classification_loss: 0.3836 315/500 [=================>............] - ETA: 46s - loss: 2.1805 - regression_loss: 1.7967 - classification_loss: 0.3838 316/500 [=================>............] - ETA: 46s - loss: 2.1797 - regression_loss: 1.7963 - classification_loss: 0.3835 317/500 [==================>...........] - ETA: 45s - loss: 2.1804 - regression_loss: 1.7968 - classification_loss: 0.3836 318/500 [==================>...........] - ETA: 45s - loss: 2.1811 - regression_loss: 1.7975 - classification_loss: 0.3835 319/500 [==================>...........] - ETA: 45s - loss: 2.1829 - regression_loss: 1.7991 - classification_loss: 0.3838 320/500 [==================>...........] - ETA: 45s - loss: 2.1823 - regression_loss: 1.7987 - classification_loss: 0.3835 321/500 [==================>...........] - ETA: 44s - loss: 2.1830 - regression_loss: 1.7996 - classification_loss: 0.3834 322/500 [==================>...........] - ETA: 44s - loss: 2.1830 - regression_loss: 1.7996 - classification_loss: 0.3834 323/500 [==================>...........] - ETA: 44s - loss: 2.1834 - regression_loss: 1.7999 - classification_loss: 0.3835 324/500 [==================>...........] 
- ETA: 44s - loss: 2.1835 - regression_loss: 1.8001 - classification_loss: 0.3834 325/500 [==================>...........] - ETA: 43s - loss: 2.1875 - regression_loss: 1.8031 - classification_loss: 0.3843 326/500 [==================>...........] - ETA: 43s - loss: 2.1857 - regression_loss: 1.8020 - classification_loss: 0.3837 327/500 [==================>...........] - ETA: 43s - loss: 2.1859 - regression_loss: 1.8019 - classification_loss: 0.3839 328/500 [==================>...........] - ETA: 43s - loss: 2.1865 - regression_loss: 1.8023 - classification_loss: 0.3842 329/500 [==================>...........] - ETA: 42s - loss: 2.1828 - regression_loss: 1.7993 - classification_loss: 0.3834 330/500 [==================>...........] - ETA: 42s - loss: 2.1821 - regression_loss: 1.7989 - classification_loss: 0.3832 331/500 [==================>...........] - ETA: 42s - loss: 2.1832 - regression_loss: 1.7999 - classification_loss: 0.3833 332/500 [==================>...........] - ETA: 42s - loss: 2.1844 - regression_loss: 1.8007 - classification_loss: 0.3837 333/500 [==================>...........] - ETA: 41s - loss: 2.1858 - regression_loss: 1.8021 - classification_loss: 0.3836 334/500 [===================>..........] - ETA: 41s - loss: 2.1865 - regression_loss: 1.8025 - classification_loss: 0.3841 335/500 [===================>..........] - ETA: 41s - loss: 2.1868 - regression_loss: 1.8029 - classification_loss: 0.3839 336/500 [===================>..........] - ETA: 41s - loss: 2.1867 - regression_loss: 1.8026 - classification_loss: 0.3841 337/500 [===================>..........] - ETA: 40s - loss: 2.1863 - regression_loss: 1.8023 - classification_loss: 0.3840 338/500 [===================>..........] - ETA: 40s - loss: 2.1880 - regression_loss: 1.8043 - classification_loss: 0.3837 339/500 [===================>..........] - ETA: 40s - loss: 2.1879 - regression_loss: 1.8045 - classification_loss: 0.3834 340/500 [===================>..........] 
- ETA: 40s - loss: 2.1857 - regression_loss: 1.8025 - classification_loss: 0.3832 341/500 [===================>..........] - ETA: 39s - loss: 2.1848 - regression_loss: 1.8017 - classification_loss: 0.3831 342/500 [===================>..........] - ETA: 39s - loss: 2.1871 - regression_loss: 1.8030 - classification_loss: 0.3841 343/500 [===================>..........] - ETA: 39s - loss: 2.1880 - regression_loss: 1.8041 - classification_loss: 0.3839 344/500 [===================>..........] - ETA: 39s - loss: 2.1878 - regression_loss: 1.8039 - classification_loss: 0.3839 345/500 [===================>..........] - ETA: 38s - loss: 2.1925 - regression_loss: 1.8080 - classification_loss: 0.3845 346/500 [===================>..........] - ETA: 38s - loss: 2.1927 - regression_loss: 1.8083 - classification_loss: 0.3844 347/500 [===================>..........] - ETA: 38s - loss: 2.1912 - regression_loss: 1.8072 - classification_loss: 0.3840 348/500 [===================>..........] - ETA: 38s - loss: 2.1929 - regression_loss: 1.8085 - classification_loss: 0.3844 349/500 [===================>..........] - ETA: 37s - loss: 2.1925 - regression_loss: 1.8083 - classification_loss: 0.3841 350/500 [====================>.........] - ETA: 37s - loss: 2.1927 - regression_loss: 1.8085 - classification_loss: 0.3842 351/500 [====================>.........] - ETA: 37s - loss: 2.1894 - regression_loss: 1.8058 - classification_loss: 0.3835 352/500 [====================>.........] - ETA: 37s - loss: 2.1908 - regression_loss: 1.8068 - classification_loss: 0.3839 353/500 [====================>.........] - ETA: 36s - loss: 2.1905 - regression_loss: 1.8067 - classification_loss: 0.3838 354/500 [====================>.........] - ETA: 36s - loss: 2.1904 - regression_loss: 1.8066 - classification_loss: 0.3838 355/500 [====================>.........] - ETA: 36s - loss: 2.1902 - regression_loss: 1.8064 - classification_loss: 0.3838 356/500 [====================>.........] 
- ETA: 36s - loss: 2.1860 - regression_loss: 1.8030 - classification_loss: 0.3830 357/500 [====================>.........] - ETA: 35s - loss: 2.1863 - regression_loss: 1.8033 - classification_loss: 0.3829 358/500 [====================>.........] - ETA: 35s - loss: 2.1859 - regression_loss: 1.8027 - classification_loss: 0.3832 359/500 [====================>.........] - ETA: 35s - loss: 2.1847 - regression_loss: 1.8020 - classification_loss: 0.3828 360/500 [====================>.........] - ETA: 35s - loss: 2.1857 - regression_loss: 1.8030 - classification_loss: 0.3827 361/500 [====================>.........] - ETA: 34s - loss: 2.1842 - regression_loss: 1.8018 - classification_loss: 0.3824 362/500 [====================>.........] - ETA: 34s - loss: 2.1836 - regression_loss: 1.8015 - classification_loss: 0.3821 363/500 [====================>.........] - ETA: 34s - loss: 2.1834 - regression_loss: 1.8015 - classification_loss: 0.3819 364/500 [====================>.........] - ETA: 34s - loss: 2.1837 - regression_loss: 1.8021 - classification_loss: 0.3816 365/500 [====================>.........] - ETA: 33s - loss: 2.1831 - regression_loss: 1.8015 - classification_loss: 0.3816 366/500 [====================>.........] - ETA: 33s - loss: 2.1808 - regression_loss: 1.7998 - classification_loss: 0.3810 367/500 [=====================>........] - ETA: 33s - loss: 2.1815 - regression_loss: 1.8002 - classification_loss: 0.3813 368/500 [=====================>........] - ETA: 33s - loss: 2.1807 - regression_loss: 1.7997 - classification_loss: 0.3810 369/500 [=====================>........] - ETA: 32s - loss: 2.1822 - regression_loss: 1.8010 - classification_loss: 0.3813 370/500 [=====================>........] - ETA: 32s - loss: 2.1826 - regression_loss: 1.8013 - classification_loss: 0.3813 371/500 [=====================>........] - ETA: 32s - loss: 2.1822 - regression_loss: 1.8011 - classification_loss: 0.3811 372/500 [=====================>........] 
- ETA: 32s - loss: 2.1838 - regression_loss: 1.8025 - classification_loss: 0.3813 373/500 [=====================>........] - ETA: 31s - loss: 2.1821 - regression_loss: 1.8013 - classification_loss: 0.3808 374/500 [=====================>........] - ETA: 31s - loss: 2.1816 - regression_loss: 1.8009 - classification_loss: 0.3808 375/500 [=====================>........] - ETA: 31s - loss: 2.1829 - regression_loss: 1.8017 - classification_loss: 0.3812 376/500 [=====================>........] - ETA: 31s - loss: 2.1824 - regression_loss: 1.8014 - classification_loss: 0.3810 377/500 [=====================>........] - ETA: 30s - loss: 2.1824 - regression_loss: 1.8014 - classification_loss: 0.3810 378/500 [=====================>........] - ETA: 30s - loss: 2.1832 - regression_loss: 1.8019 - classification_loss: 0.3814 379/500 [=====================>........] - ETA: 30s - loss: 2.1825 - regression_loss: 1.8018 - classification_loss: 0.3807 380/500 [=====================>........] - ETA: 30s - loss: 2.1855 - regression_loss: 1.8045 - classification_loss: 0.3811 381/500 [=====================>........] - ETA: 29s - loss: 2.1866 - regression_loss: 1.8055 - classification_loss: 0.3811 382/500 [=====================>........] - ETA: 29s - loss: 2.1875 - regression_loss: 1.8063 - classification_loss: 0.3812 383/500 [=====================>........] - ETA: 29s - loss: 2.1900 - regression_loss: 1.8084 - classification_loss: 0.3817 384/500 [======================>.......] - ETA: 29s - loss: 2.1893 - regression_loss: 1.8081 - classification_loss: 0.3813 385/500 [======================>.......] - ETA: 28s - loss: 2.1915 - regression_loss: 1.8093 - classification_loss: 0.3823 386/500 [======================>.......] - ETA: 28s - loss: 2.1914 - regression_loss: 1.8094 - classification_loss: 0.3819 387/500 [======================>.......] - ETA: 28s - loss: 2.1930 - regression_loss: 1.8097 - classification_loss: 0.3833 388/500 [======================>.......] 
- ETA: 27s - loss: 2.1948 - regression_loss: 1.8114 - classification_loss: 0.3834 389/500 [======================>.......] - ETA: 27s - loss: 2.1957 - regression_loss: 1.8123 - classification_loss: 0.3833 390/500 [======================>.......] - ETA: 27s - loss: 2.1954 - regression_loss: 1.8121 - classification_loss: 0.3833 391/500 [======================>.......] - ETA: 27s - loss: 2.1975 - regression_loss: 1.8139 - classification_loss: 0.3836 392/500 [======================>.......] - ETA: 26s - loss: 2.1965 - regression_loss: 1.8130 - classification_loss: 0.3835 393/500 [======================>.......] - ETA: 26s - loss: 2.1954 - regression_loss: 1.8123 - classification_loss: 0.3831 394/500 [======================>.......] - ETA: 26s - loss: 2.1945 - regression_loss: 1.8115 - classification_loss: 0.3830 395/500 [======================>.......] - ETA: 26s - loss: 2.1951 - regression_loss: 1.8120 - classification_loss: 0.3831 396/500 [======================>.......] - ETA: 25s - loss: 2.1958 - regression_loss: 1.8126 - classification_loss: 0.3832 397/500 [======================>.......] - ETA: 25s - loss: 2.1937 - regression_loss: 1.8109 - classification_loss: 0.3827 398/500 [======================>.......] - ETA: 25s - loss: 2.1934 - regression_loss: 1.8106 - classification_loss: 0.3828 399/500 [======================>.......] - ETA: 25s - loss: 2.1918 - regression_loss: 1.8095 - classification_loss: 0.3822 400/500 [=======================>......] - ETA: 24s - loss: 2.1918 - regression_loss: 1.8098 - classification_loss: 0.3819 401/500 [=======================>......] - ETA: 24s - loss: 2.1924 - regression_loss: 1.8103 - classification_loss: 0.3821 402/500 [=======================>......] - ETA: 24s - loss: 2.1919 - regression_loss: 1.8101 - classification_loss: 0.3818 403/500 [=======================>......] - ETA: 24s - loss: 2.1917 - regression_loss: 1.8100 - classification_loss: 0.3817 404/500 [=======================>......] 
- ETA: 23s - loss: 2.1919 - regression_loss: 1.8103 - classification_loss: 0.3816 405/500 [=======================>......] - ETA: 23s - loss: 2.1911 - regression_loss: 1.8097 - classification_loss: 0.3814 406/500 [=======================>......] - ETA: 23s - loss: 2.1906 - regression_loss: 1.8095 - classification_loss: 0.3811 407/500 [=======================>......] - ETA: 23s - loss: 2.1870 - regression_loss: 1.8066 - classification_loss: 0.3804 408/500 [=======================>......] - ETA: 22s - loss: 2.1854 - regression_loss: 1.8055 - classification_loss: 0.3800 409/500 [=======================>......] - ETA: 22s - loss: 2.1839 - regression_loss: 1.8043 - classification_loss: 0.3795 410/500 [=======================>......] - ETA: 22s - loss: 2.1848 - regression_loss: 1.8050 - classification_loss: 0.3797 411/500 [=======================>......] - ETA: 22s - loss: 2.1875 - regression_loss: 1.8073 - classification_loss: 0.3802 412/500 [=======================>......] - ETA: 21s - loss: 2.1867 - regression_loss: 1.8067 - classification_loss: 0.3800 413/500 [=======================>......] - ETA: 21s - loss: 2.1870 - regression_loss: 1.8067 - classification_loss: 0.3803 414/500 [=======================>......] - ETA: 21s - loss: 2.1871 - regression_loss: 1.8068 - classification_loss: 0.3803 415/500 [=======================>......] - ETA: 21s - loss: 2.1878 - regression_loss: 1.8074 - classification_loss: 0.3804 416/500 [=======================>......] - ETA: 20s - loss: 2.1868 - regression_loss: 1.8067 - classification_loss: 0.3800 417/500 [========================>.....] - ETA: 20s - loss: 2.1871 - regression_loss: 1.8073 - classification_loss: 0.3798 418/500 [========================>.....] - ETA: 20s - loss: 2.1850 - regression_loss: 1.8058 - classification_loss: 0.3792 419/500 [========================>.....] - ETA: 20s - loss: 2.1858 - regression_loss: 1.8064 - classification_loss: 0.3794 420/500 [========================>.....] 
- ETA: 19s - loss: 2.1852 - regression_loss: 1.8058 - classification_loss: 0.3795 421/500 [========================>.....] - ETA: 19s - loss: 2.1824 - regression_loss: 1.8035 - classification_loss: 0.3788 422/500 [========================>.....] - ETA: 19s - loss: 2.1794 - regression_loss: 1.8012 - classification_loss: 0.3782 423/500 [========================>.....] - ETA: 19s - loss: 2.1794 - regression_loss: 1.8013 - classification_loss: 0.3781 424/500 [========================>.....] - ETA: 18s - loss: 2.1844 - regression_loss: 1.8042 - classification_loss: 0.3802 425/500 [========================>.....] - ETA: 18s - loss: 2.1846 - regression_loss: 1.8043 - classification_loss: 0.3803 426/500 [========================>.....] - ETA: 18s - loss: 2.1874 - regression_loss: 1.8074 - classification_loss: 0.3801 427/500 [========================>.....] - ETA: 18s - loss: 2.1881 - regression_loss: 1.8076 - classification_loss: 0.3805 428/500 [========================>.....] - ETA: 17s - loss: 2.1865 - regression_loss: 1.8063 - classification_loss: 0.3803 429/500 [========================>.....] - ETA: 17s - loss: 2.1856 - regression_loss: 1.8054 - classification_loss: 0.3802 430/500 [========================>.....] - ETA: 17s - loss: 2.1847 - regression_loss: 1.8046 - classification_loss: 0.3800 431/500 [========================>.....] - ETA: 17s - loss: 2.1844 - regression_loss: 1.8045 - classification_loss: 0.3799 432/500 [========================>.....] - ETA: 16s - loss: 2.1838 - regression_loss: 1.8043 - classification_loss: 0.3794 433/500 [========================>.....] - ETA: 16s - loss: 2.1844 - regression_loss: 1.8046 - classification_loss: 0.3798 434/500 [=========================>....] - ETA: 16s - loss: 2.1835 - regression_loss: 1.8040 - classification_loss: 0.3795 435/500 [=========================>....] - ETA: 16s - loss: 2.1846 - regression_loss: 1.8050 - classification_loss: 0.3796 436/500 [=========================>....] 
- ETA: 15s - loss: 2.1851 - regression_loss: 1.8052 - classification_loss: 0.3799 437/500 [=========================>....] - ETA: 15s - loss: 2.1844 - regression_loss: 1.8048 - classification_loss: 0.3797 438/500 [=========================>....] - ETA: 15s - loss: 2.1849 - regression_loss: 1.8052 - classification_loss: 0.3797 439/500 [=========================>....] - ETA: 15s - loss: 2.1854 - regression_loss: 1.8058 - classification_loss: 0.3796 440/500 [=========================>....] - ETA: 14s - loss: 2.1882 - regression_loss: 1.8080 - classification_loss: 0.3802 441/500 [=========================>....] - ETA: 14s - loss: 2.1874 - regression_loss: 1.8076 - classification_loss: 0.3798 442/500 [=========================>....] - ETA: 14s - loss: 2.1866 - regression_loss: 1.8069 - classification_loss: 0.3797 443/500 [=========================>....] - ETA: 14s - loss: 2.1871 - regression_loss: 1.8073 - classification_loss: 0.3798 444/500 [=========================>....] - ETA: 14s - loss: 2.1875 - regression_loss: 1.8079 - classification_loss: 0.3796 445/500 [=========================>....] - ETA: 13s - loss: 2.1876 - regression_loss: 1.8080 - classification_loss: 0.3795 446/500 [=========================>....] - ETA: 13s - loss: 2.1868 - regression_loss: 1.8076 - classification_loss: 0.3792 447/500 [=========================>....] - ETA: 13s - loss: 2.1863 - regression_loss: 1.8075 - classification_loss: 0.3788 448/500 [=========================>....] - ETA: 13s - loss: 2.1865 - regression_loss: 1.8076 - classification_loss: 0.3789 449/500 [=========================>....] - ETA: 12s - loss: 2.1870 - regression_loss: 1.8079 - classification_loss: 0.3792 450/500 [==========================>...] - ETA: 12s - loss: 2.1873 - regression_loss: 1.8081 - classification_loss: 0.3792 451/500 [==========================>...] - ETA: 12s - loss: 2.1872 - regression_loss: 1.8082 - classification_loss: 0.3791 452/500 [==========================>...] 
- ETA: 12s - loss: 2.1881 - regression_loss: 1.8086 - classification_loss: 0.3795 453/500 [==========================>...] - ETA: 11s - loss: 2.1883 - regression_loss: 1.8086 - classification_loss: 0.3797 454/500 [==========================>...] - ETA: 11s - loss: 2.1896 - regression_loss: 1.8098 - classification_loss: 0.3798 455/500 [==========================>...] - ETA: 11s - loss: 2.1890 - regression_loss: 1.8093 - classification_loss: 0.3796 456/500 [==========================>...] - ETA: 11s - loss: 2.1885 - regression_loss: 1.8090 - classification_loss: 0.3795 457/500 [==========================>...] - ETA: 10s - loss: 2.1882 - regression_loss: 1.8089 - classification_loss: 0.3793 458/500 [==========================>...] - ETA: 10s - loss: 2.1866 - regression_loss: 1.8075 - classification_loss: 0.3791 459/500 [==========================>...] - ETA: 10s - loss: 2.1864 - regression_loss: 1.8074 - classification_loss: 0.3790 460/500 [==========================>...] - ETA: 10s - loss: 2.1849 - regression_loss: 1.8063 - classification_loss: 0.3786 461/500 [==========================>...] - ETA: 9s - loss: 2.1839 - regression_loss: 1.8056 - classification_loss: 0.3783  462/500 [==========================>...] - ETA: 9s - loss: 2.1848 - regression_loss: 1.8060 - classification_loss: 0.3787 463/500 [==========================>...] - ETA: 9s - loss: 2.1844 - regression_loss: 1.8058 - classification_loss: 0.3786 464/500 [==========================>...] - ETA: 9s - loss: 2.1845 - regression_loss: 1.8057 - classification_loss: 0.3788 465/500 [==========================>...] - ETA: 8s - loss: 2.1845 - regression_loss: 1.8055 - classification_loss: 0.3790 466/500 [==========================>...] - ETA: 8s - loss: 2.1850 - regression_loss: 1.8060 - classification_loss: 0.3790 467/500 [===========================>..] - ETA: 8s - loss: 2.1841 - regression_loss: 1.8054 - classification_loss: 0.3787 468/500 [===========================>..] 
- ETA: 8s - loss: 2.1845 - regression_loss: 1.8058 - classification_loss: 0.3787 469/500 [===========================>..] - ETA: 7s - loss: 2.1822 - regression_loss: 1.8039 - classification_loss: 0.3783 470/500 [===========================>..] - ETA: 7s - loss: 2.1804 - regression_loss: 1.8025 - classification_loss: 0.3779 471/500 [===========================>..] - ETA: 7s - loss: 2.1800 - regression_loss: 1.8021 - classification_loss: 0.3779 472/500 [===========================>..] - ETA: 7s - loss: 2.1770 - regression_loss: 1.7996 - classification_loss: 0.3774 473/500 [===========================>..] - ETA: 6s - loss: 2.1787 - regression_loss: 1.8009 - classification_loss: 0.3778 474/500 [===========================>..] - ETA: 6s - loss: 2.1778 - regression_loss: 1.8002 - classification_loss: 0.3776 475/500 [===========================>..] - ETA: 6s - loss: 2.1784 - regression_loss: 1.8007 - classification_loss: 0.3777 476/500 [===========================>..] - ETA: 6s - loss: 2.1824 - regression_loss: 1.8041 - classification_loss: 0.3783 477/500 [===========================>..] - ETA: 5s - loss: 2.1824 - regression_loss: 1.8041 - classification_loss: 0.3783 478/500 [===========================>..] - ETA: 5s - loss: 2.1833 - regression_loss: 1.8051 - classification_loss: 0.3782 479/500 [===========================>..] - ETA: 5s - loss: 2.1844 - regression_loss: 1.8059 - classification_loss: 0.3786 480/500 [===========================>..] - ETA: 5s - loss: 2.1844 - regression_loss: 1.8058 - classification_loss: 0.3786 481/500 [===========================>..] - ETA: 4s - loss: 2.1837 - regression_loss: 1.8053 - classification_loss: 0.3784 482/500 [===========================>..] - ETA: 4s - loss: 2.1869 - regression_loss: 1.8075 - classification_loss: 0.3794 483/500 [===========================>..] - ETA: 4s - loss: 2.1879 - regression_loss: 1.8085 - classification_loss: 0.3794 484/500 [============================>.] 
- ETA: 4s - loss: 2.1872 - regression_loss: 1.8080 - classification_loss: 0.3792 485/500 [============================>.] - ETA: 3s - loss: 2.1876 - regression_loss: 1.8084 - classification_loss: 0.3792 486/500 [============================>.] - ETA: 3s - loss: 2.1870 - regression_loss: 1.8078 - classification_loss: 0.3791 487/500 [============================>.] - ETA: 3s - loss: 2.1857 - regression_loss: 1.8068 - classification_loss: 0.3789 488/500 [============================>.] - ETA: 3s - loss: 2.1853 - regression_loss: 1.8064 - classification_loss: 0.3789 489/500 [============================>.] - ETA: 2s - loss: 2.1842 - regression_loss: 1.8056 - classification_loss: 0.3786 490/500 [============================>.] - ETA: 2s - loss: 2.1841 - regression_loss: 1.8057 - classification_loss: 0.3784 491/500 [============================>.] - ETA: 2s - loss: 2.1844 - regression_loss: 1.8059 - classification_loss: 0.3784 492/500 [============================>.] - ETA: 2s - loss: 2.1850 - regression_loss: 1.8062 - classification_loss: 0.3789 493/500 [============================>.] - ETA: 1s - loss: 2.1877 - regression_loss: 1.8080 - classification_loss: 0.3797 494/500 [============================>.] - ETA: 1s - loss: 2.1880 - regression_loss: 1.8083 - classification_loss: 0.3798 495/500 [============================>.] - ETA: 1s - loss: 2.1865 - regression_loss: 1.8068 - classification_loss: 0.3797 496/500 [============================>.] - ETA: 1s - loss: 2.1853 - regression_loss: 1.8059 - classification_loss: 0.3795 497/500 [============================>.] - ETA: 0s - loss: 2.1866 - regression_loss: 1.8066 - classification_loss: 0.3800 498/500 [============================>.] - ETA: 0s - loss: 2.1879 - regression_loss: 1.8075 - classification_loss: 0.3804 499/500 [============================>.] 
500/500 [==============================] - 125s 250ms/step - loss: 2.1874 - regression_loss: 1.8076 - classification_loss: 0.3799
326 instances of class plum with average precision: 0.6849
mAP: 0.6849
Epoch 00015: saving model to ./training/snapshots/resnet50_pascal_15.h5
Epoch 16/150
[per-batch progress output truncated: steps 1-334 of 500; running loss fluctuated between roughly 1.65 and 2.28, settling near 2.17 (regression_loss ~1.79, classification_loss ~0.38) by step 334]
- ETA: 41s - loss: 2.1765 - regression_loss: 1.7954 - classification_loss: 0.3811 335/500 [===================>..........] - ETA: 40s - loss: 2.1808 - regression_loss: 1.7987 - classification_loss: 0.3821 336/500 [===================>..........] - ETA: 40s - loss: 2.1801 - regression_loss: 1.7982 - classification_loss: 0.3819 337/500 [===================>..........] - ETA: 40s - loss: 2.1810 - regression_loss: 1.7989 - classification_loss: 0.3821 338/500 [===================>..........] - ETA: 40s - loss: 2.1799 - regression_loss: 1.7982 - classification_loss: 0.3817 339/500 [===================>..........] - ETA: 39s - loss: 2.1791 - regression_loss: 1.7975 - classification_loss: 0.3816 340/500 [===================>..........] - ETA: 39s - loss: 2.1808 - regression_loss: 1.7988 - classification_loss: 0.3821 341/500 [===================>..........] - ETA: 39s - loss: 2.1821 - regression_loss: 1.7997 - classification_loss: 0.3824 342/500 [===================>..........] - ETA: 39s - loss: 2.1819 - regression_loss: 1.7995 - classification_loss: 0.3824 343/500 [===================>..........] - ETA: 38s - loss: 2.1819 - regression_loss: 1.7997 - classification_loss: 0.3823 344/500 [===================>..........] - ETA: 38s - loss: 2.1842 - regression_loss: 1.8017 - classification_loss: 0.3825 345/500 [===================>..........] - ETA: 38s - loss: 2.1828 - regression_loss: 1.8007 - classification_loss: 0.3821 346/500 [===================>..........] - ETA: 38s - loss: 2.1831 - regression_loss: 1.8008 - classification_loss: 0.3822 347/500 [===================>..........] - ETA: 37s - loss: 2.1830 - regression_loss: 1.8005 - classification_loss: 0.3825 348/500 [===================>..........] - ETA: 37s - loss: 2.1836 - regression_loss: 1.8010 - classification_loss: 0.3826 349/500 [===================>..........] - ETA: 37s - loss: 2.1838 - regression_loss: 1.8012 - classification_loss: 0.3826 350/500 [====================>.........] 
- ETA: 37s - loss: 2.1826 - regression_loss: 1.8002 - classification_loss: 0.3824 351/500 [====================>.........] - ETA: 36s - loss: 2.1820 - regression_loss: 1.7998 - classification_loss: 0.3821 352/500 [====================>.........] - ETA: 36s - loss: 2.1813 - regression_loss: 1.7993 - classification_loss: 0.3820 353/500 [====================>.........] - ETA: 36s - loss: 2.1811 - regression_loss: 1.7992 - classification_loss: 0.3819 354/500 [====================>.........] - ETA: 36s - loss: 2.1826 - regression_loss: 1.8007 - classification_loss: 0.3819 355/500 [====================>.........] - ETA: 36s - loss: 2.1828 - regression_loss: 1.8011 - classification_loss: 0.3817 356/500 [====================>.........] - ETA: 35s - loss: 2.1828 - regression_loss: 1.8013 - classification_loss: 0.3815 357/500 [====================>.........] - ETA: 35s - loss: 2.1791 - regression_loss: 1.7982 - classification_loss: 0.3809 358/500 [====================>.........] - ETA: 35s - loss: 2.1780 - regression_loss: 1.7974 - classification_loss: 0.3806 359/500 [====================>.........] - ETA: 35s - loss: 2.1782 - regression_loss: 1.7976 - classification_loss: 0.3807 360/500 [====================>.........] - ETA: 34s - loss: 2.1781 - regression_loss: 1.7978 - classification_loss: 0.3804 361/500 [====================>.........] - ETA: 34s - loss: 2.1758 - regression_loss: 1.7959 - classification_loss: 0.3799 362/500 [====================>.........] - ETA: 34s - loss: 2.1739 - regression_loss: 1.7945 - classification_loss: 0.3794 363/500 [====================>.........] - ETA: 34s - loss: 2.1730 - regression_loss: 1.7937 - classification_loss: 0.3793 364/500 [====================>.........] - ETA: 33s - loss: 2.1727 - regression_loss: 1.7937 - classification_loss: 0.3790 365/500 [====================>.........] - ETA: 33s - loss: 2.1703 - regression_loss: 1.7918 - classification_loss: 0.3785 366/500 [====================>.........] 
- ETA: 33s - loss: 2.1694 - regression_loss: 1.7911 - classification_loss: 0.3783 367/500 [=====================>........] - ETA: 33s - loss: 2.1700 - regression_loss: 1.7917 - classification_loss: 0.3783 368/500 [=====================>........] - ETA: 32s - loss: 2.1691 - regression_loss: 1.7911 - classification_loss: 0.3780 369/500 [=====================>........] - ETA: 32s - loss: 2.1691 - regression_loss: 1.7910 - classification_loss: 0.3781 370/500 [=====================>........] - ETA: 32s - loss: 2.1686 - regression_loss: 1.7907 - classification_loss: 0.3779 371/500 [=====================>........] - ETA: 32s - loss: 2.1712 - regression_loss: 1.7927 - classification_loss: 0.3786 372/500 [=====================>........] - ETA: 31s - loss: 2.1708 - regression_loss: 1.7924 - classification_loss: 0.3784 373/500 [=====================>........] - ETA: 31s - loss: 2.1709 - regression_loss: 1.7924 - classification_loss: 0.3786 374/500 [=====================>........] - ETA: 31s - loss: 2.1686 - regression_loss: 1.7907 - classification_loss: 0.3779 375/500 [=====================>........] - ETA: 31s - loss: 2.1710 - regression_loss: 1.7907 - classification_loss: 0.3803 376/500 [=====================>........] - ETA: 30s - loss: 2.1712 - regression_loss: 1.7914 - classification_loss: 0.3799 377/500 [=====================>........] - ETA: 30s - loss: 2.1716 - regression_loss: 1.7916 - classification_loss: 0.3800 378/500 [=====================>........] - ETA: 30s - loss: 2.1678 - regression_loss: 1.7885 - classification_loss: 0.3793 379/500 [=====================>........] - ETA: 30s - loss: 2.1684 - regression_loss: 1.7892 - classification_loss: 0.3792 380/500 [=====================>........] - ETA: 29s - loss: 2.1709 - regression_loss: 1.7912 - classification_loss: 0.3798 381/500 [=====================>........] - ETA: 29s - loss: 2.1694 - regression_loss: 1.7901 - classification_loss: 0.3793 382/500 [=====================>........] 
- ETA: 29s - loss: 2.1701 - regression_loss: 1.7907 - classification_loss: 0.3794 383/500 [=====================>........] - ETA: 29s - loss: 2.1698 - regression_loss: 1.7903 - classification_loss: 0.3795 384/500 [======================>.......] - ETA: 28s - loss: 2.1699 - regression_loss: 1.7904 - classification_loss: 0.3794 385/500 [======================>.......] - ETA: 28s - loss: 2.1694 - regression_loss: 1.7902 - classification_loss: 0.3792 386/500 [======================>.......] - ETA: 28s - loss: 2.1686 - regression_loss: 1.7896 - classification_loss: 0.3790 387/500 [======================>.......] - ETA: 28s - loss: 2.1718 - regression_loss: 1.7918 - classification_loss: 0.3800 388/500 [======================>.......] - ETA: 27s - loss: 2.1701 - regression_loss: 1.7906 - classification_loss: 0.3795 389/500 [======================>.......] - ETA: 27s - loss: 2.1701 - regression_loss: 1.7908 - classification_loss: 0.3793 390/500 [======================>.......] - ETA: 27s - loss: 2.1697 - regression_loss: 1.7900 - classification_loss: 0.3797 391/500 [======================>.......] - ETA: 27s - loss: 2.1673 - regression_loss: 1.7881 - classification_loss: 0.3792 392/500 [======================>.......] - ETA: 26s - loss: 2.1688 - regression_loss: 1.7888 - classification_loss: 0.3800 393/500 [======================>.......] - ETA: 26s - loss: 2.1687 - regression_loss: 1.7888 - classification_loss: 0.3799 394/500 [======================>.......] - ETA: 26s - loss: 2.1685 - regression_loss: 1.7887 - classification_loss: 0.3798 395/500 [======================>.......] - ETA: 26s - loss: 2.1672 - regression_loss: 1.7878 - classification_loss: 0.3795 396/500 [======================>.......] - ETA: 25s - loss: 2.1669 - regression_loss: 1.7877 - classification_loss: 0.3792 397/500 [======================>.......] - ETA: 25s - loss: 2.1661 - regression_loss: 1.7871 - classification_loss: 0.3790 398/500 [======================>.......] 
- ETA: 25s - loss: 2.1661 - regression_loss: 1.7873 - classification_loss: 0.3788 399/500 [======================>.......] - ETA: 25s - loss: 2.1648 - regression_loss: 1.7864 - classification_loss: 0.3784 400/500 [=======================>......] - ETA: 24s - loss: 2.1635 - regression_loss: 1.7855 - classification_loss: 0.3780 401/500 [=======================>......] - ETA: 24s - loss: 2.1645 - regression_loss: 1.7865 - classification_loss: 0.3780 402/500 [=======================>......] - ETA: 24s - loss: 2.1665 - regression_loss: 1.7882 - classification_loss: 0.3782 403/500 [=======================>......] - ETA: 24s - loss: 2.1656 - regression_loss: 1.7873 - classification_loss: 0.3784 404/500 [=======================>......] - ETA: 23s - loss: 2.1663 - regression_loss: 1.7880 - classification_loss: 0.3784 405/500 [=======================>......] - ETA: 23s - loss: 2.1672 - regression_loss: 1.7887 - classification_loss: 0.3786 406/500 [=======================>......] - ETA: 23s - loss: 2.1680 - regression_loss: 1.7890 - classification_loss: 0.3789 407/500 [=======================>......] - ETA: 23s - loss: 2.1669 - regression_loss: 1.7880 - classification_loss: 0.3789 408/500 [=======================>......] - ETA: 22s - loss: 2.1652 - regression_loss: 1.7868 - classification_loss: 0.3784 409/500 [=======================>......] - ETA: 22s - loss: 2.1671 - regression_loss: 1.7884 - classification_loss: 0.3787 410/500 [=======================>......] - ETA: 22s - loss: 2.1684 - regression_loss: 1.7896 - classification_loss: 0.3788 411/500 [=======================>......] - ETA: 22s - loss: 2.1682 - regression_loss: 1.7893 - classification_loss: 0.3789 412/500 [=======================>......] - ETA: 21s - loss: 2.1676 - regression_loss: 1.7891 - classification_loss: 0.3786 413/500 [=======================>......] - ETA: 21s - loss: 2.1672 - regression_loss: 1.7887 - classification_loss: 0.3785 414/500 [=======================>......] 
- ETA: 21s - loss: 2.1671 - regression_loss: 1.7886 - classification_loss: 0.3785 415/500 [=======================>......] - ETA: 21s - loss: 2.1665 - regression_loss: 1.7881 - classification_loss: 0.3783 416/500 [=======================>......] - ETA: 20s - loss: 2.1688 - regression_loss: 1.7899 - classification_loss: 0.3789 417/500 [========================>.....] - ETA: 20s - loss: 2.1684 - regression_loss: 1.7897 - classification_loss: 0.3788 418/500 [========================>.....] - ETA: 20s - loss: 2.1695 - regression_loss: 1.7904 - classification_loss: 0.3791 419/500 [========================>.....] - ETA: 20s - loss: 2.1696 - regression_loss: 1.7903 - classification_loss: 0.3793 420/500 [========================>.....] - ETA: 19s - loss: 2.1715 - regression_loss: 1.7916 - classification_loss: 0.3799 421/500 [========================>.....] - ETA: 19s - loss: 2.1700 - regression_loss: 1.7905 - classification_loss: 0.3794 422/500 [========================>.....] - ETA: 19s - loss: 2.1753 - regression_loss: 1.7953 - classification_loss: 0.3800 423/500 [========================>.....] - ETA: 19s - loss: 2.1753 - regression_loss: 1.7954 - classification_loss: 0.3799 424/500 [========================>.....] - ETA: 18s - loss: 2.1746 - regression_loss: 1.7947 - classification_loss: 0.3798 425/500 [========================>.....] - ETA: 18s - loss: 2.1749 - regression_loss: 1.7947 - classification_loss: 0.3802 426/500 [========================>.....] - ETA: 18s - loss: 2.1754 - regression_loss: 1.7953 - classification_loss: 0.3801 427/500 [========================>.....] - ETA: 18s - loss: 2.1758 - regression_loss: 1.7956 - classification_loss: 0.3802 428/500 [========================>.....] - ETA: 17s - loss: 2.1764 - regression_loss: 1.7958 - classification_loss: 0.3805 429/500 [========================>.....] - ETA: 17s - loss: 2.1792 - regression_loss: 1.7980 - classification_loss: 0.3812 430/500 [========================>.....] 
- ETA: 17s - loss: 2.1813 - regression_loss: 1.7996 - classification_loss: 0.3817 431/500 [========================>.....] - ETA: 17s - loss: 2.1809 - regression_loss: 1.7996 - classification_loss: 0.3813 432/500 [========================>.....] - ETA: 16s - loss: 2.1803 - regression_loss: 1.7991 - classification_loss: 0.3813 433/500 [========================>.....] - ETA: 16s - loss: 2.1843 - regression_loss: 1.8004 - classification_loss: 0.3839 434/500 [=========================>....] - ETA: 16s - loss: 2.1849 - regression_loss: 1.8010 - classification_loss: 0.3839 435/500 [=========================>....] - ETA: 16s - loss: 2.1831 - regression_loss: 1.7995 - classification_loss: 0.3836 436/500 [=========================>....] - ETA: 15s - loss: 2.1819 - regression_loss: 1.7987 - classification_loss: 0.3832 437/500 [=========================>....] - ETA: 15s - loss: 2.1825 - regression_loss: 1.7993 - classification_loss: 0.3832 438/500 [=========================>....] - ETA: 15s - loss: 2.1832 - regression_loss: 1.7997 - classification_loss: 0.3835 439/500 [=========================>....] - ETA: 15s - loss: 2.1834 - regression_loss: 1.7997 - classification_loss: 0.3837 440/500 [=========================>....] - ETA: 14s - loss: 2.1811 - regression_loss: 1.7980 - classification_loss: 0.3832 441/500 [=========================>....] - ETA: 14s - loss: 2.1810 - regression_loss: 1.7979 - classification_loss: 0.3831 442/500 [=========================>....] - ETA: 14s - loss: 2.1827 - regression_loss: 1.7997 - classification_loss: 0.3830 443/500 [=========================>....] - ETA: 14s - loss: 2.1828 - regression_loss: 1.7998 - classification_loss: 0.3830 444/500 [=========================>....] - ETA: 13s - loss: 2.1823 - regression_loss: 1.7994 - classification_loss: 0.3829 445/500 [=========================>....] - ETA: 13s - loss: 2.1812 - regression_loss: 1.7986 - classification_loss: 0.3826 446/500 [=========================>....] 
- ETA: 13s - loss: 2.1826 - regression_loss: 1.7998 - classification_loss: 0.3828 447/500 [=========================>....] - ETA: 13s - loss: 2.1825 - regression_loss: 1.7999 - classification_loss: 0.3826 448/500 [=========================>....] - ETA: 12s - loss: 2.1818 - regression_loss: 1.7995 - classification_loss: 0.3823 449/500 [=========================>....] - ETA: 12s - loss: 2.1786 - regression_loss: 1.7955 - classification_loss: 0.3831 450/500 [==========================>...] - ETA: 12s - loss: 2.1771 - regression_loss: 1.7943 - classification_loss: 0.3828 451/500 [==========================>...] - ETA: 12s - loss: 2.1754 - regression_loss: 1.7930 - classification_loss: 0.3824 452/500 [==========================>...] - ETA: 11s - loss: 2.1741 - regression_loss: 1.7918 - classification_loss: 0.3823 453/500 [==========================>...] - ETA: 11s - loss: 2.1739 - regression_loss: 1.7917 - classification_loss: 0.3822 454/500 [==========================>...] - ETA: 11s - loss: 2.1734 - regression_loss: 1.7915 - classification_loss: 0.3818 455/500 [==========================>...] - ETA: 11s - loss: 2.1734 - regression_loss: 1.7916 - classification_loss: 0.3818 456/500 [==========================>...] - ETA: 10s - loss: 2.1717 - regression_loss: 1.7903 - classification_loss: 0.3814 457/500 [==========================>...] - ETA: 10s - loss: 2.1709 - regression_loss: 1.7899 - classification_loss: 0.3810 458/500 [==========================>...] - ETA: 10s - loss: 2.1692 - regression_loss: 1.7887 - classification_loss: 0.3805 459/500 [==========================>...] - ETA: 10s - loss: 2.1687 - regression_loss: 1.7885 - classification_loss: 0.3803 460/500 [==========================>...] - ETA: 9s - loss: 2.1680 - regression_loss: 1.7879 - classification_loss: 0.3802  461/500 [==========================>...] - ETA: 9s - loss: 2.1678 - regression_loss: 1.7879 - classification_loss: 0.3799 462/500 [==========================>...] 
- ETA: 9s - loss: 2.1669 - regression_loss: 1.7873 - classification_loss: 0.3796 463/500 [==========================>...] - ETA: 9s - loss: 2.1679 - regression_loss: 1.7879 - classification_loss: 0.3799 464/500 [==========================>...] - ETA: 8s - loss: 2.1672 - regression_loss: 1.7874 - classification_loss: 0.3798 465/500 [==========================>...] - ETA: 8s - loss: 2.1663 - regression_loss: 1.7868 - classification_loss: 0.3795 466/500 [==========================>...] - ETA: 8s - loss: 2.1657 - regression_loss: 1.7861 - classification_loss: 0.3796 467/500 [===========================>..] - ETA: 8s - loss: 2.1652 - regression_loss: 1.7858 - classification_loss: 0.3794 468/500 [===========================>..] - ETA: 7s - loss: 2.1639 - regression_loss: 1.7848 - classification_loss: 0.3790 469/500 [===========================>..] - ETA: 7s - loss: 2.1641 - regression_loss: 1.7849 - classification_loss: 0.3792 470/500 [===========================>..] - ETA: 7s - loss: 2.1630 - regression_loss: 1.7841 - classification_loss: 0.3789 471/500 [===========================>..] - ETA: 7s - loss: 2.1625 - regression_loss: 1.7839 - classification_loss: 0.3787 472/500 [===========================>..] - ETA: 6s - loss: 2.1630 - regression_loss: 1.7841 - classification_loss: 0.3789 473/500 [===========================>..] - ETA: 6s - loss: 2.1645 - regression_loss: 1.7852 - classification_loss: 0.3793 474/500 [===========================>..] - ETA: 6s - loss: 2.1636 - regression_loss: 1.7845 - classification_loss: 0.3790 475/500 [===========================>..] - ETA: 6s - loss: 2.1634 - regression_loss: 1.7844 - classification_loss: 0.3790 476/500 [===========================>..] - ETA: 5s - loss: 2.1648 - regression_loss: 1.7856 - classification_loss: 0.3792 477/500 [===========================>..] - ETA: 5s - loss: 2.1644 - regression_loss: 1.7853 - classification_loss: 0.3791 478/500 [===========================>..] 
- ETA: 5s - loss: 2.1662 - regression_loss: 1.7869 - classification_loss: 0.3793 479/500 [===========================>..] - ETA: 5s - loss: 2.1641 - regression_loss: 1.7852 - classification_loss: 0.3789 480/500 [===========================>..] - ETA: 4s - loss: 2.1644 - regression_loss: 1.7855 - classification_loss: 0.3789 481/500 [===========================>..] - ETA: 4s - loss: 2.1647 - regression_loss: 1.7857 - classification_loss: 0.3790 482/500 [===========================>..] - ETA: 4s - loss: 2.1665 - regression_loss: 1.7873 - classification_loss: 0.3793 483/500 [===========================>..] - ETA: 4s - loss: 2.1664 - regression_loss: 1.7872 - classification_loss: 0.3792 484/500 [============================>.] - ETA: 3s - loss: 2.1675 - regression_loss: 1.7882 - classification_loss: 0.3793 485/500 [============================>.] - ETA: 3s - loss: 2.1677 - regression_loss: 1.7882 - classification_loss: 0.3795 486/500 [============================>.] - ETA: 3s - loss: 2.1669 - regression_loss: 1.7876 - classification_loss: 0.3792 487/500 [============================>.] - ETA: 3s - loss: 2.1656 - regression_loss: 1.7867 - classification_loss: 0.3789 488/500 [============================>.] - ETA: 2s - loss: 2.1651 - regression_loss: 1.7864 - classification_loss: 0.3787 489/500 [============================>.] - ETA: 2s - loss: 2.1658 - regression_loss: 1.7868 - classification_loss: 0.3789 490/500 [============================>.] - ETA: 2s - loss: 2.1650 - regression_loss: 1.7863 - classification_loss: 0.3787 491/500 [============================>.] - ETA: 2s - loss: 2.1653 - regression_loss: 1.7866 - classification_loss: 0.3787 492/500 [============================>.] - ETA: 1s - loss: 2.1651 - regression_loss: 1.7863 - classification_loss: 0.3788 493/500 [============================>.] - ETA: 1s - loss: 2.1646 - regression_loss: 1.7860 - classification_loss: 0.3786 494/500 [============================>.] 
- ETA: 1s - loss: 2.1652 - regression_loss: 1.7867 - classification_loss: 0.3785 495/500 [============================>.] - ETA: 1s - loss: 2.1636 - regression_loss: 1.7855 - classification_loss: 0.3781 496/500 [============================>.] - ETA: 0s - loss: 2.1650 - regression_loss: 1.7865 - classification_loss: 0.3785 497/500 [============================>.] - ETA: 0s - loss: 2.1650 - regression_loss: 1.7866 - classification_loss: 0.3784 498/500 [============================>.] - ETA: 0s - loss: 2.1676 - regression_loss: 1.7883 - classification_loss: 0.3793 499/500 [============================>.] - ETA: 0s - loss: 2.1682 - regression_loss: 1.7888 - classification_loss: 0.3793 500/500 [==============================] - 124s 249ms/step - loss: 2.1693 - regression_loss: 1.7896 - classification_loss: 0.3797 326 instances of class plum with average precision: 0.6753 mAP: 0.6753 Epoch 00016: saving model to ./training/snapshots/resnet50_pascal_16.h5 Epoch 17/150 1/500 [..............................] - ETA: 2:08 - loss: 1.7250 - regression_loss: 1.4805 - classification_loss: 0.2445 2/500 [..............................] - ETA: 2:05 - loss: 1.7359 - regression_loss: 1.4524 - classification_loss: 0.2835 3/500 [..............................] - ETA: 2:05 - loss: 2.0222 - regression_loss: 1.6349 - classification_loss: 0.3873 4/500 [..............................] - ETA: 2:04 - loss: 2.0069 - regression_loss: 1.6337 - classification_loss: 0.3732 5/500 [..............................] - ETA: 2:05 - loss: 2.0735 - regression_loss: 1.6837 - classification_loss: 0.3897 6/500 [..............................] - ETA: 2:04 - loss: 2.0793 - regression_loss: 1.6915 - classification_loss: 0.3878 7/500 [..............................] - ETA: 2:05 - loss: 2.2218 - regression_loss: 1.7954 - classification_loss: 0.4263 8/500 [..............................] - ETA: 2:04 - loss: 2.2387 - regression_loss: 1.8116 - classification_loss: 0.4271 9/500 [..............................] 
- ETA: 2:03 - loss: 2.2224 - regression_loss: 1.8050 - classification_loss: 0.4175 10/500 [..............................] - ETA: 2:02 - loss: 2.2259 - regression_loss: 1.8122 - classification_loss: 0.4137 11/500 [..............................] - ETA: 2:01 - loss: 2.2215 - regression_loss: 1.8129 - classification_loss: 0.4086 12/500 [..............................] - ETA: 2:00 - loss: 2.2107 - regression_loss: 1.8077 - classification_loss: 0.4031 13/500 [..............................] - ETA: 2:00 - loss: 2.1724 - regression_loss: 1.7775 - classification_loss: 0.3949 14/500 [..............................] - ETA: 2:00 - loss: 2.2443 - regression_loss: 1.8396 - classification_loss: 0.4047 15/500 [..............................] - ETA: 2:00 - loss: 2.2796 - regression_loss: 1.8670 - classification_loss: 0.4126 16/500 [..............................] - ETA: 1:59 - loss: 2.2921 - regression_loss: 1.8720 - classification_loss: 0.4201 17/500 [>.............................] - ETA: 1:58 - loss: 2.2985 - regression_loss: 1.8730 - classification_loss: 0.4255 18/500 [>.............................] - ETA: 1:58 - loss: 2.2593 - regression_loss: 1.8429 - classification_loss: 0.4164 19/500 [>.............................] - ETA: 1:58 - loss: 2.2525 - regression_loss: 1.8400 - classification_loss: 0.4125 20/500 [>.............................] - ETA: 1:58 - loss: 2.2498 - regression_loss: 1.8374 - classification_loss: 0.4124 21/500 [>.............................] - ETA: 1:57 - loss: 2.2566 - regression_loss: 1.8459 - classification_loss: 0.4107 22/500 [>.............................] - ETA: 1:57 - loss: 2.3057 - regression_loss: 1.8866 - classification_loss: 0.4192 23/500 [>.............................] - ETA: 1:56 - loss: 2.2977 - regression_loss: 1.8810 - classification_loss: 0.4167 24/500 [>.............................] - ETA: 1:56 - loss: 2.2829 - regression_loss: 1.8695 - classification_loss: 0.4134 25/500 [>.............................] 
- ETA: 1:56 - loss: 2.2734 - regression_loss: 1.8647 - classification_loss: 0.4087 26/500 [>.............................] - ETA: 1:56 - loss: 2.2632 - regression_loss: 1.8608 - classification_loss: 0.4024 27/500 [>.............................] - ETA: 1:56 - loss: 2.2415 - regression_loss: 1.8452 - classification_loss: 0.3964 28/500 [>.............................] - ETA: 1:55 - loss: 2.2591 - regression_loss: 1.8616 - classification_loss: 0.3976 29/500 [>.............................] - ETA: 1:55 - loss: 2.2565 - regression_loss: 1.8590 - classification_loss: 0.3975 30/500 [>.............................] - ETA: 1:55 - loss: 2.2567 - regression_loss: 1.8578 - classification_loss: 0.3989 31/500 [>.............................] - ETA: 1:55 - loss: 2.2967 - regression_loss: 1.8897 - classification_loss: 0.4070 32/500 [>.............................] - ETA: 1:55 - loss: 2.2979 - regression_loss: 1.8918 - classification_loss: 0.4061 33/500 [>.............................] - ETA: 1:54 - loss: 2.2636 - regression_loss: 1.8662 - classification_loss: 0.3973 34/500 [=>............................] - ETA: 1:54 - loss: 2.2485 - regression_loss: 1.8542 - classification_loss: 0.3944 35/500 [=>............................] - ETA: 1:54 - loss: 2.2381 - regression_loss: 1.8477 - classification_loss: 0.3904 36/500 [=>............................] - ETA: 1:54 - loss: 2.2503 - regression_loss: 1.8505 - classification_loss: 0.3998 37/500 [=>............................] - ETA: 1:53 - loss: 2.2528 - regression_loss: 1.8528 - classification_loss: 0.3999 38/500 [=>............................] - ETA: 1:53 - loss: 2.2505 - regression_loss: 1.8509 - classification_loss: 0.3996 39/500 [=>............................] - ETA: 1:53 - loss: 2.2521 - regression_loss: 1.8498 - classification_loss: 0.4023 40/500 [=>............................] - ETA: 1:53 - loss: 2.2708 - regression_loss: 1.8674 - classification_loss: 0.4034 41/500 [=>............................] 
- ETA: 1:53 - loss: 2.2841 - regression_loss: 1.8756 - classification_loss: 0.4085
 42/500 [=>............................] - ETA: 1:52 - loss: 2.2573 - regression_loss: 1.8541 - classification_loss: 0.4031
[per-step progress-bar updates for steps 43-376 condensed; every 50th step retained]
100/500 [=====>........................] - ETA: 1:39 - loss: 2.1617 - regression_loss: 1.7634 - classification_loss: 0.3983
150/500 [========>.....................] - ETA: 1:27 - loss: 2.1612 - regression_loss: 1.7701 - classification_loss: 0.3911
200/500 [===========>..................] - ETA: 1:14 - loss: 2.1536 - regression_loss: 1.7571 - classification_loss: 0.3965
250/500 [==============>...............] - ETA: 1:02 - loss: 2.1465 - regression_loss: 1.7619 - classification_loss: 0.3846
300/500 [=================>............] - ETA: 49s - loss: 2.1479 - regression_loss: 1.7684 - classification_loss: 0.3794
350/500 [====================>.........] - ETA: 37s - loss: 2.1356 - regression_loss: 1.7588 - classification_loss: 0.3767
377/500 [=====================>........] 
- ETA: 30s - loss: 2.1386 - regression_loss: 1.7624 - classification_loss: 0.3762 378/500 [=====================>........] - ETA: 30s - loss: 2.1368 - regression_loss: 1.7578 - classification_loss: 0.3790 379/500 [=====================>........] - ETA: 30s - loss: 2.1387 - regression_loss: 1.7594 - classification_loss: 0.3792 380/500 [=====================>........] - ETA: 30s - loss: 2.1394 - regression_loss: 1.7601 - classification_loss: 0.3793 381/500 [=====================>........] - ETA: 29s - loss: 2.1382 - regression_loss: 1.7592 - classification_loss: 0.3790 382/500 [=====================>........] - ETA: 29s - loss: 2.1365 - regression_loss: 1.7578 - classification_loss: 0.3786 383/500 [=====================>........] - ETA: 29s - loss: 2.1367 - regression_loss: 1.7581 - classification_loss: 0.3786 384/500 [======================>.......] - ETA: 28s - loss: 2.1356 - regression_loss: 1.7573 - classification_loss: 0.3783 385/500 [======================>.......] - ETA: 28s - loss: 2.1343 - regression_loss: 1.7564 - classification_loss: 0.3779 386/500 [======================>.......] - ETA: 28s - loss: 2.1352 - regression_loss: 1.7570 - classification_loss: 0.3782 387/500 [======================>.......] - ETA: 28s - loss: 2.1353 - regression_loss: 1.7574 - classification_loss: 0.3779 388/500 [======================>.......] - ETA: 28s - loss: 2.1387 - regression_loss: 1.7602 - classification_loss: 0.3784 389/500 [======================>.......] - ETA: 27s - loss: 2.1392 - regression_loss: 1.7606 - classification_loss: 0.3786 390/500 [======================>.......] - ETA: 27s - loss: 2.1391 - regression_loss: 1.7604 - classification_loss: 0.3788 391/500 [======================>.......] - ETA: 27s - loss: 2.1391 - regression_loss: 1.7603 - classification_loss: 0.3788 392/500 [======================>.......] - ETA: 27s - loss: 2.1380 - regression_loss: 1.7598 - classification_loss: 0.3782 393/500 [======================>.......] 
- ETA: 26s - loss: 2.1379 - regression_loss: 1.7601 - classification_loss: 0.3779 394/500 [======================>.......] - ETA: 26s - loss: 2.1387 - regression_loss: 1.7609 - classification_loss: 0.3778 395/500 [======================>.......] - ETA: 26s - loss: 2.1387 - regression_loss: 1.7611 - classification_loss: 0.3776 396/500 [======================>.......] - ETA: 26s - loss: 2.1399 - regression_loss: 1.7620 - classification_loss: 0.3779 397/500 [======================>.......] - ETA: 25s - loss: 2.1402 - regression_loss: 1.7623 - classification_loss: 0.3779 398/500 [======================>.......] - ETA: 25s - loss: 2.1386 - regression_loss: 1.7609 - classification_loss: 0.3777 399/500 [======================>.......] - ETA: 25s - loss: 2.1391 - regression_loss: 1.7612 - classification_loss: 0.3779 400/500 [=======================>......] - ETA: 25s - loss: 2.1385 - regression_loss: 1.7607 - classification_loss: 0.3779 401/500 [=======================>......] - ETA: 24s - loss: 2.1366 - regression_loss: 1.7591 - classification_loss: 0.3775 402/500 [=======================>......] - ETA: 24s - loss: 2.1372 - regression_loss: 1.7593 - classification_loss: 0.3779 403/500 [=======================>......] - ETA: 24s - loss: 2.1368 - regression_loss: 1.7590 - classification_loss: 0.3778 404/500 [=======================>......] - ETA: 24s - loss: 2.1381 - regression_loss: 1.7590 - classification_loss: 0.3792 405/500 [=======================>......] - ETA: 23s - loss: 2.1408 - regression_loss: 1.7613 - classification_loss: 0.3796 406/500 [=======================>......] - ETA: 23s - loss: 2.1393 - regression_loss: 1.7603 - classification_loss: 0.3790 407/500 [=======================>......] - ETA: 23s - loss: 2.1397 - regression_loss: 1.7606 - classification_loss: 0.3791 408/500 [=======================>......] - ETA: 23s - loss: 2.1398 - regression_loss: 1.7606 - classification_loss: 0.3791 409/500 [=======================>......] 
- ETA: 22s - loss: 2.1404 - regression_loss: 1.7610 - classification_loss: 0.3793 410/500 [=======================>......] - ETA: 22s - loss: 2.1396 - regression_loss: 1.7605 - classification_loss: 0.3792 411/500 [=======================>......] - ETA: 22s - loss: 2.1412 - regression_loss: 1.7617 - classification_loss: 0.3795 412/500 [=======================>......] - ETA: 21s - loss: 2.1412 - regression_loss: 1.7616 - classification_loss: 0.3796 413/500 [=======================>......] - ETA: 21s - loss: 2.1411 - regression_loss: 1.7616 - classification_loss: 0.3796 414/500 [=======================>......] - ETA: 21s - loss: 2.1402 - regression_loss: 1.7610 - classification_loss: 0.3792 415/500 [=======================>......] - ETA: 21s - loss: 2.1415 - regression_loss: 1.7621 - classification_loss: 0.3795 416/500 [=======================>......] - ETA: 21s - loss: 2.1421 - regression_loss: 1.7625 - classification_loss: 0.3796 417/500 [========================>.....] - ETA: 20s - loss: 2.1420 - regression_loss: 1.7624 - classification_loss: 0.3796 418/500 [========================>.....] - ETA: 20s - loss: 2.1425 - regression_loss: 1.7625 - classification_loss: 0.3800 419/500 [========================>.....] - ETA: 20s - loss: 2.1411 - regression_loss: 1.7614 - classification_loss: 0.3797 420/500 [========================>.....] - ETA: 19s - loss: 2.1406 - regression_loss: 1.7613 - classification_loss: 0.3793 421/500 [========================>.....] - ETA: 19s - loss: 2.1429 - regression_loss: 1.7630 - classification_loss: 0.3798 422/500 [========================>.....] - ETA: 19s - loss: 2.1427 - regression_loss: 1.7629 - classification_loss: 0.3798 423/500 [========================>.....] - ETA: 19s - loss: 2.1403 - regression_loss: 1.7609 - classification_loss: 0.3794 424/500 [========================>.....] - ETA: 18s - loss: 2.1409 - regression_loss: 1.7614 - classification_loss: 0.3795 425/500 [========================>.....] 
- ETA: 18s - loss: 2.1418 - regression_loss: 1.7622 - classification_loss: 0.3796 426/500 [========================>.....] - ETA: 18s - loss: 2.1420 - regression_loss: 1.7625 - classification_loss: 0.3795 427/500 [========================>.....] - ETA: 18s - loss: 2.1414 - regression_loss: 1.7624 - classification_loss: 0.3789 428/500 [========================>.....] - ETA: 17s - loss: 2.1421 - regression_loss: 1.7631 - classification_loss: 0.3790 429/500 [========================>.....] - ETA: 17s - loss: 2.1423 - regression_loss: 1.7628 - classification_loss: 0.3795 430/500 [========================>.....] - ETA: 17s - loss: 2.1445 - regression_loss: 1.7647 - classification_loss: 0.3797 431/500 [========================>.....] - ETA: 17s - loss: 2.1446 - regression_loss: 1.7649 - classification_loss: 0.3797 432/500 [========================>.....] - ETA: 16s - loss: 2.1509 - regression_loss: 1.7656 - classification_loss: 0.3853 433/500 [========================>.....] - ETA: 16s - loss: 2.1503 - regression_loss: 1.7653 - classification_loss: 0.3850 434/500 [=========================>....] - ETA: 16s - loss: 2.1509 - regression_loss: 1.7659 - classification_loss: 0.3850 435/500 [=========================>....] - ETA: 16s - loss: 2.1506 - regression_loss: 1.7657 - classification_loss: 0.3849 436/500 [=========================>....] - ETA: 15s - loss: 2.1501 - regression_loss: 1.7654 - classification_loss: 0.3847 437/500 [=========================>....] - ETA: 15s - loss: 2.1489 - regression_loss: 1.7645 - classification_loss: 0.3843 438/500 [=========================>....] - ETA: 15s - loss: 2.1485 - regression_loss: 1.7645 - classification_loss: 0.3840 439/500 [=========================>....] - ETA: 15s - loss: 2.1459 - regression_loss: 1.7624 - classification_loss: 0.3835 440/500 [=========================>....] - ETA: 15s - loss: 2.1455 - regression_loss: 1.7621 - classification_loss: 0.3834 441/500 [=========================>....] 
- ETA: 14s - loss: 2.1459 - regression_loss: 1.7623 - classification_loss: 0.3836 442/500 [=========================>....] - ETA: 14s - loss: 2.1455 - regression_loss: 1.7620 - classification_loss: 0.3835 443/500 [=========================>....] - ETA: 14s - loss: 2.1460 - regression_loss: 1.7623 - classification_loss: 0.3837 444/500 [=========================>....] - ETA: 14s - loss: 2.1461 - regression_loss: 1.7624 - classification_loss: 0.3837 445/500 [=========================>....] - ETA: 13s - loss: 2.1433 - regression_loss: 1.7602 - classification_loss: 0.3831 446/500 [=========================>....] - ETA: 13s - loss: 2.1422 - regression_loss: 1.7591 - classification_loss: 0.3831 447/500 [=========================>....] - ETA: 13s - loss: 2.1407 - regression_loss: 1.7579 - classification_loss: 0.3827 448/500 [=========================>....] - ETA: 13s - loss: 2.1405 - regression_loss: 1.7579 - classification_loss: 0.3826 449/500 [=========================>....] - ETA: 12s - loss: 2.1392 - regression_loss: 1.7571 - classification_loss: 0.3821 450/500 [==========================>...] - ETA: 12s - loss: 2.1398 - regression_loss: 1.7574 - classification_loss: 0.3823 451/500 [==========================>...] - ETA: 12s - loss: 2.1382 - regression_loss: 1.7562 - classification_loss: 0.3819 452/500 [==========================>...] - ETA: 11s - loss: 2.1360 - regression_loss: 1.7546 - classification_loss: 0.3814 453/500 [==========================>...] - ETA: 11s - loss: 2.1371 - regression_loss: 1.7557 - classification_loss: 0.3814 454/500 [==========================>...] - ETA: 11s - loss: 2.1370 - regression_loss: 1.7557 - classification_loss: 0.3813 455/500 [==========================>...] - ETA: 11s - loss: 2.1367 - regression_loss: 1.7557 - classification_loss: 0.3810 456/500 [==========================>...] - ETA: 10s - loss: 2.1375 - regression_loss: 1.7564 - classification_loss: 0.3811 457/500 [==========================>...] 
- ETA: 10s - loss: 2.1383 - regression_loss: 1.7570 - classification_loss: 0.3812 458/500 [==========================>...] - ETA: 10s - loss: 2.1377 - regression_loss: 1.7566 - classification_loss: 0.3810 459/500 [==========================>...] - ETA: 10s - loss: 2.1384 - regression_loss: 1.7572 - classification_loss: 0.3811 460/500 [==========================>...] - ETA: 9s - loss: 2.1380 - regression_loss: 1.7569 - classification_loss: 0.3811  461/500 [==========================>...] - ETA: 9s - loss: 2.1370 - regression_loss: 1.7563 - classification_loss: 0.3807 462/500 [==========================>...] - ETA: 9s - loss: 2.1364 - regression_loss: 1.7559 - classification_loss: 0.3805 463/500 [==========================>...] - ETA: 9s - loss: 2.1361 - regression_loss: 1.7557 - classification_loss: 0.3805 464/500 [==========================>...] - ETA: 8s - loss: 2.1370 - regression_loss: 1.7562 - classification_loss: 0.3808 465/500 [==========================>...] - ETA: 8s - loss: 2.1371 - regression_loss: 1.7563 - classification_loss: 0.3808 466/500 [==========================>...] - ETA: 8s - loss: 2.1356 - regression_loss: 1.7553 - classification_loss: 0.3803 467/500 [===========================>..] - ETA: 8s - loss: 2.1366 - regression_loss: 1.7559 - classification_loss: 0.3807 468/500 [===========================>..] - ETA: 7s - loss: 2.1357 - regression_loss: 1.7552 - classification_loss: 0.3805 469/500 [===========================>..] - ETA: 7s - loss: 2.1329 - regression_loss: 1.7529 - classification_loss: 0.3800 470/500 [===========================>..] - ETA: 7s - loss: 2.1342 - regression_loss: 1.7543 - classification_loss: 0.3799 471/500 [===========================>..] - ETA: 7s - loss: 2.1329 - regression_loss: 1.7533 - classification_loss: 0.3795 472/500 [===========================>..] - ETA: 6s - loss: 2.1331 - regression_loss: 1.7535 - classification_loss: 0.3796 473/500 [===========================>..] 
- ETA: 6s - loss: 2.1329 - regression_loss: 1.7534 - classification_loss: 0.3795 474/500 [===========================>..] - ETA: 6s - loss: 2.1324 - regression_loss: 1.7531 - classification_loss: 0.3793 475/500 [===========================>..] - ETA: 6s - loss: 2.1341 - regression_loss: 1.7543 - classification_loss: 0.3798 476/500 [===========================>..] - ETA: 5s - loss: 2.1336 - regression_loss: 1.7540 - classification_loss: 0.3795 477/500 [===========================>..] - ETA: 5s - loss: 2.1337 - regression_loss: 1.7543 - classification_loss: 0.3794 478/500 [===========================>..] - ETA: 5s - loss: 2.1338 - regression_loss: 1.7545 - classification_loss: 0.3794 479/500 [===========================>..] - ETA: 5s - loss: 2.1327 - regression_loss: 1.7538 - classification_loss: 0.3789 480/500 [===========================>..] - ETA: 4s - loss: 2.1333 - regression_loss: 1.7539 - classification_loss: 0.3793 481/500 [===========================>..] - ETA: 4s - loss: 2.1343 - regression_loss: 1.7550 - classification_loss: 0.3793 482/500 [===========================>..] - ETA: 4s - loss: 2.1332 - regression_loss: 1.7542 - classification_loss: 0.3790 483/500 [===========================>..] - ETA: 4s - loss: 2.1349 - regression_loss: 1.7558 - classification_loss: 0.3791 484/500 [============================>.] - ETA: 3s - loss: 2.1334 - regression_loss: 1.7547 - classification_loss: 0.3787 485/500 [============================>.] - ETA: 3s - loss: 2.1345 - regression_loss: 1.7556 - classification_loss: 0.3789 486/500 [============================>.] - ETA: 3s - loss: 2.1349 - regression_loss: 1.7560 - classification_loss: 0.3789 487/500 [============================>.] - ETA: 3s - loss: 2.1328 - regression_loss: 1.7544 - classification_loss: 0.3784 488/500 [============================>.] - ETA: 2s - loss: 2.1310 - regression_loss: 1.7530 - classification_loss: 0.3780 489/500 [============================>.] 
- ETA: 2s - loss: 2.1318 - regression_loss: 1.7538 - classification_loss: 0.3780 490/500 [============================>.] - ETA: 2s - loss: 2.1323 - regression_loss: 1.7540 - classification_loss: 0.3783 491/500 [============================>.] - ETA: 2s - loss: 2.1311 - regression_loss: 1.7533 - classification_loss: 0.3779 492/500 [============================>.] - ETA: 1s - loss: 2.1311 - regression_loss: 1.7533 - classification_loss: 0.3778 493/500 [============================>.] - ETA: 1s - loss: 2.1312 - regression_loss: 1.7533 - classification_loss: 0.3779 494/500 [============================>.] - ETA: 1s - loss: 2.1287 - regression_loss: 1.7513 - classification_loss: 0.3774 495/500 [============================>.] - ETA: 1s - loss: 2.1278 - regression_loss: 1.7507 - classification_loss: 0.3772 496/500 [============================>.] - ETA: 0s - loss: 2.1268 - regression_loss: 1.7500 - classification_loss: 0.3769 497/500 [============================>.] - ETA: 0s - loss: 2.1279 - regression_loss: 1.7509 - classification_loss: 0.3770 498/500 [============================>.] - ETA: 0s - loss: 2.1261 - regression_loss: 1.7496 - classification_loss: 0.3766 499/500 [============================>.] - ETA: 0s - loss: 2.1268 - regression_loss: 1.7502 - classification_loss: 0.3766 500/500 [==============================] - 125s 250ms/step - loss: 2.1265 - regression_loss: 1.7500 - classification_loss: 0.3765 326 instances of class plum with average precision: 0.6544 mAP: 0.6544 Epoch 00017: saving model to ./training/snapshots/resnet50_pascal_17.h5 Epoch 18/150 1/500 [..............................] - ETA: 2:03 - loss: 2.1752 - regression_loss: 1.8884 - classification_loss: 0.2868 2/500 [..............................] - ETA: 2:05 - loss: 2.2832 - regression_loss: 1.9416 - classification_loss: 0.3416 3/500 [..............................] - ETA: 2:04 - loss: 2.2869 - regression_loss: 1.8668 - classification_loss: 0.4201 4/500 [..............................] 
- ETA: 2:05 - loss: 2.2345 - regression_loss: 1.8188 - classification_loss: 0.4157 5/500 [..............................] - ETA: 2:04 - loss: 2.2422 - regression_loss: 1.8389 - classification_loss: 0.4033 6/500 [..............................] - ETA: 2:03 - loss: 2.3756 - regression_loss: 1.9586 - classification_loss: 0.4170 7/500 [..............................] - ETA: 2:02 - loss: 2.2636 - regression_loss: 1.8805 - classification_loss: 0.3831 8/500 [..............................] - ETA: 2:01 - loss: 2.0734 - regression_loss: 1.7207 - classification_loss: 0.3527 9/500 [..............................] - ETA: 2:01 - loss: 2.0409 - regression_loss: 1.6951 - classification_loss: 0.3458 10/500 [..............................] - ETA: 2:01 - loss: 2.0419 - regression_loss: 1.6927 - classification_loss: 0.3492 11/500 [..............................] - ETA: 2:01 - loss: 2.0370 - regression_loss: 1.6925 - classification_loss: 0.3445 12/500 [..............................] - ETA: 2:01 - loss: 1.9993 - regression_loss: 1.6577 - classification_loss: 0.3416 13/500 [..............................] - ETA: 2:01 - loss: 2.0302 - regression_loss: 1.6799 - classification_loss: 0.3503 14/500 [..............................] - ETA: 2:01 - loss: 2.0127 - regression_loss: 1.6650 - classification_loss: 0.3476 15/500 [..............................] - ETA: 2:00 - loss: 2.0548 - regression_loss: 1.6978 - classification_loss: 0.3571 16/500 [..............................] - ETA: 2:00 - loss: 2.0865 - regression_loss: 1.7188 - classification_loss: 0.3676 17/500 [>.............................] - ETA: 2:00 - loss: 2.0021 - regression_loss: 1.6177 - classification_loss: 0.3844 18/500 [>.............................] - ETA: 1:59 - loss: 2.0149 - regression_loss: 1.6325 - classification_loss: 0.3825 19/500 [>.............................] - ETA: 1:59 - loss: 2.0354 - regression_loss: 1.6481 - classification_loss: 0.3872 20/500 [>.............................] 
- ETA: 1:59 - loss: 1.9627 - regression_loss: 1.5913 - classification_loss: 0.3714 21/500 [>.............................] - ETA: 1:58 - loss: 1.9396 - regression_loss: 1.5760 - classification_loss: 0.3636 22/500 [>.............................] - ETA: 1:58 - loss: 1.9052 - regression_loss: 1.5501 - classification_loss: 0.3552 23/500 [>.............................] - ETA: 1:58 - loss: 1.9201 - regression_loss: 1.5660 - classification_loss: 0.3540 24/500 [>.............................] - ETA: 1:58 - loss: 1.9456 - regression_loss: 1.5905 - classification_loss: 0.3551 25/500 [>.............................] - ETA: 1:58 - loss: 1.9596 - regression_loss: 1.6030 - classification_loss: 0.3566 26/500 [>.............................] - ETA: 1:58 - loss: 1.9589 - regression_loss: 1.6046 - classification_loss: 0.3544 27/500 [>.............................] - ETA: 1:57 - loss: 1.9667 - regression_loss: 1.6098 - classification_loss: 0.3569 28/500 [>.............................] - ETA: 1:57 - loss: 1.9615 - regression_loss: 1.6098 - classification_loss: 0.3518 29/500 [>.............................] - ETA: 1:57 - loss: 1.9562 - regression_loss: 1.6084 - classification_loss: 0.3479 30/500 [>.............................] - ETA: 1:57 - loss: 1.9252 - regression_loss: 1.5837 - classification_loss: 0.3415 31/500 [>.............................] - ETA: 1:56 - loss: 1.9474 - regression_loss: 1.5946 - classification_loss: 0.3528 32/500 [>.............................] - ETA: 1:56 - loss: 1.9540 - regression_loss: 1.5959 - classification_loss: 0.3581 33/500 [>.............................] - ETA: 1:56 - loss: 1.9203 - regression_loss: 1.5690 - classification_loss: 0.3513 34/500 [=>............................] - ETA: 1:56 - loss: 1.9261 - regression_loss: 1.5755 - classification_loss: 0.3505 35/500 [=>............................] - ETA: 1:56 - loss: 1.9311 - regression_loss: 1.5813 - classification_loss: 0.3498 36/500 [=>............................] 
- ETA: 1:55 - loss: 1.9360 - regression_loss: 1.5848 - classification_loss: 0.3511 37/500 [=>............................] - ETA: 1:55 - loss: 1.9312 - regression_loss: 1.5823 - classification_loss: 0.3489 38/500 [=>............................] - ETA: 1:54 - loss: 1.9235 - regression_loss: 1.5772 - classification_loss: 0.3463 39/500 [=>............................] - ETA: 1:54 - loss: 1.9150 - regression_loss: 1.5687 - classification_loss: 0.3462 40/500 [=>............................] - ETA: 1:54 - loss: 1.9171 - regression_loss: 1.5694 - classification_loss: 0.3478 41/500 [=>............................] - ETA: 1:54 - loss: 1.9289 - regression_loss: 1.5758 - classification_loss: 0.3531 42/500 [=>............................] - ETA: 1:53 - loss: 1.9508 - regression_loss: 1.5932 - classification_loss: 0.3576 43/500 [=>............................] - ETA: 1:53 - loss: 1.9650 - regression_loss: 1.6029 - classification_loss: 0.3621 44/500 [=>............................] - ETA: 1:53 - loss: 1.9609 - regression_loss: 1.6006 - classification_loss: 0.3603 45/500 [=>............................] - ETA: 1:53 - loss: 1.9510 - regression_loss: 1.5933 - classification_loss: 0.3577 46/500 [=>............................] - ETA: 1:52 - loss: 1.9528 - regression_loss: 1.5955 - classification_loss: 0.3573 47/500 [=>............................] - ETA: 1:52 - loss: 1.9357 - regression_loss: 1.5839 - classification_loss: 0.3518 48/500 [=>............................] - ETA: 1:52 - loss: 1.9333 - regression_loss: 1.5838 - classification_loss: 0.3495 49/500 [=>............................] - ETA: 1:52 - loss: 1.9395 - regression_loss: 1.5892 - classification_loss: 0.3503 50/500 [==>...........................] - ETA: 1:51 - loss: 1.9263 - regression_loss: 1.5803 - classification_loss: 0.3460 51/500 [==>...........................] - ETA: 1:51 - loss: 1.9359 - regression_loss: 1.5902 - classification_loss: 0.3457 52/500 [==>...........................] 
- ETA: 1:51 - loss: 1.9269 - regression_loss: 1.5842 - classification_loss: 0.3427 53/500 [==>...........................] - ETA: 1:51 - loss: 1.9241 - regression_loss: 1.5850 - classification_loss: 0.3391 54/500 [==>...........................] - ETA: 1:51 - loss: 1.9230 - regression_loss: 1.5840 - classification_loss: 0.3390 55/500 [==>...........................] - ETA: 1:50 - loss: 1.9272 - regression_loss: 1.5870 - classification_loss: 0.3402 56/500 [==>...........................] - ETA: 1:50 - loss: 1.9402 - regression_loss: 1.5950 - classification_loss: 0.3453 57/500 [==>...........................] - ETA: 1:50 - loss: 1.9356 - regression_loss: 1.5929 - classification_loss: 0.3427 58/500 [==>...........................] - ETA: 1:49 - loss: 1.9403 - regression_loss: 1.5970 - classification_loss: 0.3433 59/500 [==>...........................] - ETA: 1:49 - loss: 1.9384 - regression_loss: 1.5969 - classification_loss: 0.3415 60/500 [==>...........................] - ETA: 1:49 - loss: 1.9558 - regression_loss: 1.6103 - classification_loss: 0.3456 61/500 [==>...........................] - ETA: 1:49 - loss: 1.9706 - regression_loss: 1.6205 - classification_loss: 0.3501 62/500 [==>...........................] - ETA: 1:48 - loss: 1.9742 - regression_loss: 1.6241 - classification_loss: 0.3501 63/500 [==>...........................] - ETA: 1:48 - loss: 1.9761 - regression_loss: 1.6245 - classification_loss: 0.3516 64/500 [==>...........................] - ETA: 1:48 - loss: 1.9779 - regression_loss: 1.6270 - classification_loss: 0.3510 65/500 [==>...........................] - ETA: 1:48 - loss: 1.9777 - regression_loss: 1.6270 - classification_loss: 0.3508 66/500 [==>...........................] - ETA: 1:48 - loss: 1.9916 - regression_loss: 1.6376 - classification_loss: 0.3540 67/500 [===>..........................] - ETA: 1:47 - loss: 2.0001 - regression_loss: 1.6445 - classification_loss: 0.3556 68/500 [===>..........................] 
- ETA: 1:47 - loss: 2.0039 - regression_loss: 1.6491 - classification_loss: 0.3548 69/500 [===>..........................] - ETA: 1:47 - loss: 2.0067 - regression_loss: 1.6536 - classification_loss: 0.3531 70/500 [===>..........................] - ETA: 1:47 - loss: 2.0106 - regression_loss: 1.6571 - classification_loss: 0.3536 71/500 [===>..........................] - ETA: 1:46 - loss: 2.0050 - regression_loss: 1.6527 - classification_loss: 0.3523 72/500 [===>..........................] - ETA: 1:46 - loss: 1.9994 - regression_loss: 1.6493 - classification_loss: 0.3500 73/500 [===>..........................] - ETA: 1:46 - loss: 2.0020 - regression_loss: 1.6513 - classification_loss: 0.3507 74/500 [===>..........................] - ETA: 1:46 - loss: 1.9982 - regression_loss: 1.6290 - classification_loss: 0.3692 75/500 [===>..........................] - ETA: 1:46 - loss: 2.0039 - regression_loss: 1.6330 - classification_loss: 0.3708 76/500 [===>..........................] - ETA: 1:45 - loss: 2.0133 - regression_loss: 1.6428 - classification_loss: 0.3705 77/500 [===>..........................] - ETA: 1:45 - loss: 2.0196 - regression_loss: 1.6504 - classification_loss: 0.3692 78/500 [===>..........................] - ETA: 1:45 - loss: 2.0179 - regression_loss: 1.6516 - classification_loss: 0.3663 79/500 [===>..........................] - ETA: 1:45 - loss: 2.0169 - regression_loss: 1.6499 - classification_loss: 0.3670 80/500 [===>..........................] - ETA: 1:44 - loss: 2.0242 - regression_loss: 1.6560 - classification_loss: 0.3681 81/500 [===>..........................] - ETA: 1:44 - loss: 2.0337 - regression_loss: 1.6620 - classification_loss: 0.3717 82/500 [===>..........................] - ETA: 1:44 - loss: 2.0465 - regression_loss: 1.6727 - classification_loss: 0.3738 83/500 [===>..........................] - ETA: 1:44 - loss: 2.0526 - regression_loss: 1.6774 - classification_loss: 0.3752 84/500 [====>.........................] 
- ETA: 1:44 - loss: 2.0577 - regression_loss: 1.6808 - classification_loss: 0.3768
 85/500 [====>.........................] - ETA: 1:43 - loss: 2.0623 - regression_loss: 1.6849 - classification_loss: 0.3774
100/500 [=====>........................] - ETA: 1:40 - loss: 2.0682 - regression_loss: 1.6902 - classification_loss: 0.3780
150/500 [========>.....................] - ETA: 1:27 - loss: 2.1113 - regression_loss: 1.7338 - classification_loss: 0.3775
200/500 [===========>..................] - ETA: 1:14 - loss: 2.1072 - regression_loss: 1.7241 - classification_loss: 0.3831
250/500 [==============>...............] - ETA: 1:02 - loss: 2.1239 - regression_loss: 1.7483 - classification_loss: 0.3756
300/500 [=================>............] - ETA: 49s - loss: 2.1249 - regression_loss: 1.7491 - classification_loss: 0.3759
350/500 [====================>.........] - ETA: 37s - loss: 2.1282 - regression_loss: 1.7537 - classification_loss: 0.3745
400/500 [=======================>......] - ETA: 24s - loss: 2.1265 - regression_loss: 1.7526 - classification_loss: 0.3739
420/500 [========================>.....]
- ETA: 19s - loss: 2.1226 - regression_loss: 1.7513 - classification_loss: 0.3712 421/500 [========================>.....] - ETA: 19s - loss: 2.1222 - regression_loss: 1.7510 - classification_loss: 0.3712 422/500 [========================>.....] - ETA: 19s - loss: 2.1247 - regression_loss: 1.7519 - classification_loss: 0.3728 423/500 [========================>.....] - ETA: 19s - loss: 2.1262 - regression_loss: 1.7532 - classification_loss: 0.3730 424/500 [========================>.....] - ETA: 19s - loss: 2.1254 - regression_loss: 1.7525 - classification_loss: 0.3730 425/500 [========================>.....] - ETA: 18s - loss: 2.1243 - regression_loss: 1.7517 - classification_loss: 0.3726 426/500 [========================>.....] - ETA: 18s - loss: 2.1237 - regression_loss: 1.7512 - classification_loss: 0.3725 427/500 [========================>.....] - ETA: 18s - loss: 2.1231 - regression_loss: 1.7510 - classification_loss: 0.3721 428/500 [========================>.....] - ETA: 18s - loss: 2.1219 - regression_loss: 1.7503 - classification_loss: 0.3716 429/500 [========================>.....] - ETA: 17s - loss: 2.1209 - regression_loss: 1.7494 - classification_loss: 0.3714 430/500 [========================>.....] - ETA: 17s - loss: 2.1213 - regression_loss: 1.7497 - classification_loss: 0.3716 431/500 [========================>.....] - ETA: 17s - loss: 2.1209 - regression_loss: 1.7494 - classification_loss: 0.3716 432/500 [========================>.....] - ETA: 17s - loss: 2.1223 - regression_loss: 1.7505 - classification_loss: 0.3718 433/500 [========================>.....] - ETA: 16s - loss: 2.1241 - regression_loss: 1.7520 - classification_loss: 0.3721 434/500 [=========================>....] - ETA: 16s - loss: 2.1254 - regression_loss: 1.7527 - classification_loss: 0.3727 435/500 [=========================>....] - ETA: 16s - loss: 2.1267 - regression_loss: 1.7536 - classification_loss: 0.3731 436/500 [=========================>....] 
- ETA: 16s - loss: 2.1274 - regression_loss: 1.7540 - classification_loss: 0.3734 437/500 [=========================>....] - ETA: 15s - loss: 2.1274 - regression_loss: 1.7541 - classification_loss: 0.3733 438/500 [=========================>....] - ETA: 15s - loss: 2.1314 - regression_loss: 1.7575 - classification_loss: 0.3739 439/500 [=========================>....] - ETA: 15s - loss: 2.1313 - regression_loss: 1.7572 - classification_loss: 0.3740 440/500 [=========================>....] - ETA: 15s - loss: 2.1318 - regression_loss: 1.7577 - classification_loss: 0.3741 441/500 [=========================>....] - ETA: 14s - loss: 2.1322 - regression_loss: 1.7579 - classification_loss: 0.3743 442/500 [=========================>....] - ETA: 14s - loss: 2.1298 - regression_loss: 1.7561 - classification_loss: 0.3737 443/500 [=========================>....] - ETA: 14s - loss: 2.1294 - regression_loss: 1.7560 - classification_loss: 0.3734 444/500 [=========================>....] - ETA: 14s - loss: 2.1290 - regression_loss: 1.7557 - classification_loss: 0.3733 445/500 [=========================>....] - ETA: 13s - loss: 2.1279 - regression_loss: 1.7549 - classification_loss: 0.3729 446/500 [=========================>....] - ETA: 13s - loss: 2.1274 - regression_loss: 1.7544 - classification_loss: 0.3730 447/500 [=========================>....] - ETA: 13s - loss: 2.1254 - regression_loss: 1.7529 - classification_loss: 0.3725 448/500 [=========================>....] - ETA: 13s - loss: 2.1281 - regression_loss: 1.7546 - classification_loss: 0.3735 449/500 [=========================>....] - ETA: 12s - loss: 2.1268 - regression_loss: 1.7537 - classification_loss: 0.3731 450/500 [==========================>...] - ETA: 12s - loss: 2.1252 - regression_loss: 1.7525 - classification_loss: 0.3727 451/500 [==========================>...] - ETA: 12s - loss: 2.1231 - regression_loss: 1.7508 - classification_loss: 0.3723 452/500 [==========================>...] 
- ETA: 12s - loss: 2.1230 - regression_loss: 1.7507 - classification_loss: 0.3723 453/500 [==========================>...] - ETA: 11s - loss: 2.1251 - regression_loss: 1.7525 - classification_loss: 0.3726 454/500 [==========================>...] - ETA: 11s - loss: 2.1250 - regression_loss: 1.7525 - classification_loss: 0.3725 455/500 [==========================>...] - ETA: 11s - loss: 2.1254 - regression_loss: 1.7527 - classification_loss: 0.3727 456/500 [==========================>...] - ETA: 11s - loss: 2.1260 - regression_loss: 1.7531 - classification_loss: 0.3730 457/500 [==========================>...] - ETA: 10s - loss: 2.1238 - regression_loss: 1.7513 - classification_loss: 0.3724 458/500 [==========================>...] - ETA: 10s - loss: 2.1239 - regression_loss: 1.7515 - classification_loss: 0.3724 459/500 [==========================>...] - ETA: 10s - loss: 2.1260 - regression_loss: 1.7531 - classification_loss: 0.3729 460/500 [==========================>...] - ETA: 10s - loss: 2.1246 - regression_loss: 1.7519 - classification_loss: 0.3728 461/500 [==========================>...] - ETA: 9s - loss: 2.1252 - regression_loss: 1.7521 - classification_loss: 0.3731  462/500 [==========================>...] - ETA: 9s - loss: 2.1255 - regression_loss: 1.7524 - classification_loss: 0.3731 463/500 [==========================>...] - ETA: 9s - loss: 2.1245 - regression_loss: 1.7518 - classification_loss: 0.3727 464/500 [==========================>...] - ETA: 9s - loss: 2.1246 - regression_loss: 1.7518 - classification_loss: 0.3729 465/500 [==========================>...] - ETA: 8s - loss: 2.1253 - regression_loss: 1.7524 - classification_loss: 0.3728 466/500 [==========================>...] - ETA: 8s - loss: 2.1243 - regression_loss: 1.7516 - classification_loss: 0.3727 467/500 [===========================>..] - ETA: 8s - loss: 2.1264 - regression_loss: 1.7531 - classification_loss: 0.3733 468/500 [===========================>..] 
- ETA: 8s - loss: 2.1265 - regression_loss: 1.7532 - classification_loss: 0.3733 469/500 [===========================>..] - ETA: 7s - loss: 2.1277 - regression_loss: 1.7539 - classification_loss: 0.3738 470/500 [===========================>..] - ETA: 7s - loss: 2.1274 - regression_loss: 1.7538 - classification_loss: 0.3736 471/500 [===========================>..] - ETA: 7s - loss: 2.1271 - regression_loss: 1.7537 - classification_loss: 0.3733 472/500 [===========================>..] - ETA: 7s - loss: 2.1267 - regression_loss: 1.7535 - classification_loss: 0.3732 473/500 [===========================>..] - ETA: 6s - loss: 2.1265 - regression_loss: 1.7533 - classification_loss: 0.3731 474/500 [===========================>..] - ETA: 6s - loss: 2.1269 - regression_loss: 1.7538 - classification_loss: 0.3731 475/500 [===========================>..] - ETA: 6s - loss: 2.1269 - regression_loss: 1.7537 - classification_loss: 0.3732 476/500 [===========================>..] - ETA: 6s - loss: 2.1258 - regression_loss: 1.7528 - classification_loss: 0.3730 477/500 [===========================>..] - ETA: 5s - loss: 2.1271 - regression_loss: 1.7537 - classification_loss: 0.3733 478/500 [===========================>..] - ETA: 5s - loss: 2.1274 - regression_loss: 1.7541 - classification_loss: 0.3733 479/500 [===========================>..] - ETA: 5s - loss: 2.1270 - regression_loss: 1.7540 - classification_loss: 0.3731 480/500 [===========================>..] - ETA: 4s - loss: 2.1281 - regression_loss: 1.7550 - classification_loss: 0.3732 481/500 [===========================>..] - ETA: 4s - loss: 2.1296 - regression_loss: 1.7560 - classification_loss: 0.3736 482/500 [===========================>..] - ETA: 4s - loss: 2.1300 - regression_loss: 1.7563 - classification_loss: 0.3737 483/500 [===========================>..] - ETA: 4s - loss: 2.1303 - regression_loss: 1.7564 - classification_loss: 0.3739 484/500 [============================>.] 
- ETA: 3s - loss: 2.1313 - regression_loss: 1.7575 - classification_loss: 0.3738 485/500 [============================>.] - ETA: 3s - loss: 2.1301 - regression_loss: 1.7563 - classification_loss: 0.3737 486/500 [============================>.] - ETA: 3s - loss: 2.1288 - regression_loss: 1.7552 - classification_loss: 0.3736 487/500 [============================>.] - ETA: 3s - loss: 2.1294 - regression_loss: 1.7555 - classification_loss: 0.3739 488/500 [============================>.] - ETA: 2s - loss: 2.1306 - regression_loss: 1.7563 - classification_loss: 0.3743 489/500 [============================>.] - ETA: 2s - loss: 2.1304 - regression_loss: 1.7563 - classification_loss: 0.3741 490/500 [============================>.] - ETA: 2s - loss: 2.1310 - regression_loss: 1.7565 - classification_loss: 0.3744 491/500 [============================>.] - ETA: 2s - loss: 2.1307 - regression_loss: 1.7565 - classification_loss: 0.3742 492/500 [============================>.] - ETA: 2s - loss: 2.1291 - regression_loss: 1.7553 - classification_loss: 0.3738 493/500 [============================>.] - ETA: 1s - loss: 2.1294 - regression_loss: 1.7556 - classification_loss: 0.3738 494/500 [============================>.] - ETA: 1s - loss: 2.1283 - regression_loss: 1.7547 - classification_loss: 0.3737 495/500 [============================>.] - ETA: 1s - loss: 2.1293 - regression_loss: 1.7554 - classification_loss: 0.3739 496/500 [============================>.] - ETA: 1s - loss: 2.1298 - regression_loss: 1.7557 - classification_loss: 0.3741 497/500 [============================>.] - ETA: 0s - loss: 2.1305 - regression_loss: 1.7566 - classification_loss: 0.3739 498/500 [============================>.] - ETA: 0s - loss: 2.1299 - regression_loss: 1.7556 - classification_loss: 0.3743 499/500 [============================>.] 
- ETA: 0s - loss: 2.1296 - regression_loss: 1.7553 - classification_loss: 0.3742 500/500 [==============================] - 125s 250ms/step - loss: 2.1294 - regression_loss: 1.7553 - classification_loss: 0.3741 326 instances of class plum with average precision: 0.6586 mAP: 0.6586 Epoch 00018: saving model to ./training/snapshots/resnet50_pascal_18.h5 Epoch 19/150 1/500 [..............................] - ETA: 2:02 - loss: 2.6022 - regression_loss: 2.0300 - classification_loss: 0.5722 2/500 [..............................] - ETA: 2:01 - loss: 2.8578 - regression_loss: 2.2028 - classification_loss: 0.6550 3/500 [..............................] - ETA: 2:03 - loss: 2.6472 - regression_loss: 2.0928 - classification_loss: 0.5544 4/500 [..............................] - ETA: 2:03 - loss: 2.3569 - regression_loss: 1.8949 - classification_loss: 0.4619 5/500 [..............................] - ETA: 2:03 - loss: 2.2189 - regression_loss: 1.7992 - classification_loss: 0.4197 6/500 [..............................] - ETA: 2:03 - loss: 2.2579 - regression_loss: 1.8379 - classification_loss: 0.4200 7/500 [..............................] - ETA: 2:04 - loss: 2.2541 - regression_loss: 1.8509 - classification_loss: 0.4031 8/500 [..............................] - ETA: 2:03 - loss: 2.2256 - regression_loss: 1.8251 - classification_loss: 0.4005 9/500 [..............................] - ETA: 2:03 - loss: 2.2128 - regression_loss: 1.8127 - classification_loss: 0.4001 10/500 [..............................] - ETA: 2:03 - loss: 2.2429 - regression_loss: 1.8483 - classification_loss: 0.3946 11/500 [..............................] - ETA: 2:02 - loss: 2.3220 - regression_loss: 1.9143 - classification_loss: 0.4077 12/500 [..............................] - ETA: 2:02 - loss: 2.2159 - regression_loss: 1.8302 - classification_loss: 0.3857 13/500 [..............................] - ETA: 2:02 - loss: 2.1879 - regression_loss: 1.8097 - classification_loss: 0.3782 14/500 [..............................] 
- ETA: 2:01 - loss: 2.2330 - regression_loss: 1.8421 - classification_loss: 0.3909 15/500 [..............................] - ETA: 2:01 - loss: 2.2084 - regression_loss: 1.8249 - classification_loss: 0.3835 16/500 [..............................] - ETA: 2:01 - loss: 2.1800 - regression_loss: 1.8092 - classification_loss: 0.3709 17/500 [>.............................] - ETA: 2:01 - loss: 2.1896 - regression_loss: 1.8143 - classification_loss: 0.3752 18/500 [>.............................] - ETA: 2:01 - loss: 2.1965 - regression_loss: 1.8175 - classification_loss: 0.3790 19/500 [>.............................] - ETA: 2:00 - loss: 2.2000 - regression_loss: 1.8137 - classification_loss: 0.3863 20/500 [>.............................] - ETA: 2:00 - loss: 2.1617 - regression_loss: 1.7838 - classification_loss: 0.3779 21/500 [>.............................] - ETA: 2:00 - loss: 2.1777 - regression_loss: 1.7949 - classification_loss: 0.3828 22/500 [>.............................] - ETA: 2:00 - loss: 2.1748 - regression_loss: 1.7921 - classification_loss: 0.3827 23/500 [>.............................] - ETA: 1:59 - loss: 2.1767 - regression_loss: 1.7949 - classification_loss: 0.3818 24/500 [>.............................] - ETA: 1:59 - loss: 2.1527 - regression_loss: 1.7748 - classification_loss: 0.3779 25/500 [>.............................] - ETA: 1:59 - loss: 2.1275 - regression_loss: 1.7552 - classification_loss: 0.3723 26/500 [>.............................] - ETA: 1:58 - loss: 2.1800 - regression_loss: 1.7921 - classification_loss: 0.3879 27/500 [>.............................] - ETA: 1:58 - loss: 2.1904 - regression_loss: 1.8030 - classification_loss: 0.3874 28/500 [>.............................] - ETA: 1:58 - loss: 2.1825 - regression_loss: 1.7992 - classification_loss: 0.3834 29/500 [>.............................] - ETA: 1:58 - loss: 2.1837 - regression_loss: 1.8007 - classification_loss: 0.3830 30/500 [>.............................] 
- ETA: 1:57 - loss: 2.1400 - regression_loss: 1.7623 - classification_loss: 0.3776 31/500 [>.............................] - ETA: 1:57 - loss: 2.1251 - regression_loss: 1.7527 - classification_loss: 0.3724 32/500 [>.............................] - ETA: 1:57 - loss: 2.1254 - regression_loss: 1.7532 - classification_loss: 0.3722 33/500 [>.............................] - ETA: 1:57 - loss: 2.1396 - regression_loss: 1.7604 - classification_loss: 0.3792 34/500 [=>............................] - ETA: 1:57 - loss: 2.1317 - regression_loss: 1.7565 - classification_loss: 0.3752 35/500 [=>............................] - ETA: 1:56 - loss: 2.1209 - regression_loss: 1.7465 - classification_loss: 0.3745 36/500 [=>............................] - ETA: 1:56 - loss: 2.1214 - regression_loss: 1.7470 - classification_loss: 0.3744 37/500 [=>............................] - ETA: 1:56 - loss: 2.1464 - regression_loss: 1.7721 - classification_loss: 0.3743 38/500 [=>............................] - ETA: 1:56 - loss: 2.1239 - regression_loss: 1.7255 - classification_loss: 0.3984 39/500 [=>............................] - ETA: 1:56 - loss: 2.1371 - regression_loss: 1.7360 - classification_loss: 0.4011 40/500 [=>............................] - ETA: 1:55 - loss: 2.1051 - regression_loss: 1.7108 - classification_loss: 0.3943 41/500 [=>............................] - ETA: 1:55 - loss: 2.0876 - regression_loss: 1.6970 - classification_loss: 0.3905 42/500 [=>............................] - ETA: 1:55 - loss: 2.0914 - regression_loss: 1.7027 - classification_loss: 0.3886 43/500 [=>............................] - ETA: 1:55 - loss: 2.1044 - regression_loss: 1.7124 - classification_loss: 0.3920 44/500 [=>............................] - ETA: 1:55 - loss: 2.0887 - regression_loss: 1.7015 - classification_loss: 0.3872 45/500 [=>............................] - ETA: 1:54 - loss: 2.0886 - regression_loss: 1.7020 - classification_loss: 0.3866 46/500 [=>............................] 
- ETA: 1:54 - loss: 2.1172 - regression_loss: 1.7247 - classification_loss: 0.3926 47/500 [=>............................] - ETA: 1:54 - loss: 2.1151 - regression_loss: 1.7242 - classification_loss: 0.3909 48/500 [=>............................] - ETA: 1:53 - loss: 2.1220 - regression_loss: 1.7287 - classification_loss: 0.3933 49/500 [=>............................] - ETA: 1:53 - loss: 2.1226 - regression_loss: 1.7299 - classification_loss: 0.3927 50/500 [==>...........................] - ETA: 1:53 - loss: 2.1116 - regression_loss: 1.7203 - classification_loss: 0.3914 51/500 [==>...........................] - ETA: 1:53 - loss: 2.1205 - regression_loss: 1.7276 - classification_loss: 0.3929 52/500 [==>...........................] - ETA: 1:52 - loss: 2.1427 - regression_loss: 1.7495 - classification_loss: 0.3933 53/500 [==>...........................] - ETA: 1:52 - loss: 2.1431 - regression_loss: 1.7511 - classification_loss: 0.3919 54/500 [==>...........................] - ETA: 1:52 - loss: 2.1435 - regression_loss: 1.7528 - classification_loss: 0.3907 55/500 [==>...........................] - ETA: 1:52 - loss: 2.1692 - regression_loss: 1.7749 - classification_loss: 0.3943 56/500 [==>...........................] - ETA: 1:51 - loss: 2.1688 - regression_loss: 1.7737 - classification_loss: 0.3952 57/500 [==>...........................] - ETA: 1:51 - loss: 2.1701 - regression_loss: 1.7742 - classification_loss: 0.3959 58/500 [==>...........................] - ETA: 1:51 - loss: 2.1645 - regression_loss: 1.7710 - classification_loss: 0.3935 59/500 [==>...........................] - ETA: 1:51 - loss: 2.1457 - regression_loss: 1.7568 - classification_loss: 0.3889 60/500 [==>...........................] - ETA: 1:50 - loss: 2.1473 - regression_loss: 1.7579 - classification_loss: 0.3894 61/500 [==>...........................] - ETA: 1:50 - loss: 2.1560 - regression_loss: 1.7652 - classification_loss: 0.3908 62/500 [==>...........................] 
- ETA: 1:50 - loss: 2.1723 - regression_loss: 1.7753 - classification_loss: 0.3970 63/500 [==>...........................] - ETA: 1:49 - loss: 2.1786 - regression_loss: 1.7809 - classification_loss: 0.3978 64/500 [==>...........................] - ETA: 1:49 - loss: 2.1791 - regression_loss: 1.7822 - classification_loss: 0.3969 65/500 [==>...........................] - ETA: 1:49 - loss: 2.1666 - regression_loss: 1.7739 - classification_loss: 0.3927 66/500 [==>...........................] - ETA: 1:49 - loss: 2.1595 - regression_loss: 1.7692 - classification_loss: 0.3903 67/500 [===>..........................] - ETA: 1:48 - loss: 2.1457 - regression_loss: 1.7591 - classification_loss: 0.3867 68/500 [===>..........................] - ETA: 1:48 - loss: 2.1465 - regression_loss: 1.7602 - classification_loss: 0.3863 69/500 [===>..........................] - ETA: 1:48 - loss: 2.1372 - regression_loss: 1.7538 - classification_loss: 0.3834 70/500 [===>..........................] - ETA: 1:48 - loss: 2.1361 - regression_loss: 1.7542 - classification_loss: 0.3819 71/500 [===>..........................] - ETA: 1:48 - loss: 2.1388 - regression_loss: 1.7565 - classification_loss: 0.3823 72/500 [===>..........................] - ETA: 1:47 - loss: 2.1389 - regression_loss: 1.7572 - classification_loss: 0.3817 73/500 [===>..........................] - ETA: 1:47 - loss: 2.1286 - regression_loss: 1.7487 - classification_loss: 0.3799 74/500 [===>..........................] - ETA: 1:47 - loss: 2.1277 - regression_loss: 1.7480 - classification_loss: 0.3797 75/500 [===>..........................] - ETA: 1:46 - loss: 2.1348 - regression_loss: 1.7542 - classification_loss: 0.3805 76/500 [===>..........................] - ETA: 1:46 - loss: 2.1262 - regression_loss: 1.7475 - classification_loss: 0.3787 77/500 [===>..........................] - ETA: 1:46 - loss: 2.1221 - regression_loss: 1.7444 - classification_loss: 0.3777 78/500 [===>..........................] 
- ETA: 1:46 - loss: 2.1142 - regression_loss: 1.7382 - classification_loss: 0.3760 79/500 [===>..........................] - ETA: 1:45 - loss: 2.1164 - regression_loss: 1.7395 - classification_loss: 0.3769 80/500 [===>..........................] - ETA: 1:45 - loss: 2.1185 - regression_loss: 1.7408 - classification_loss: 0.3777 81/500 [===>..........................] - ETA: 1:45 - loss: 2.1239 - regression_loss: 1.7464 - classification_loss: 0.3774 82/500 [===>..........................] - ETA: 1:45 - loss: 2.1111 - regression_loss: 1.7369 - classification_loss: 0.3743 83/500 [===>..........................] - ETA: 1:44 - loss: 2.1112 - regression_loss: 1.7367 - classification_loss: 0.3745 84/500 [====>.........................] - ETA: 1:44 - loss: 2.1096 - regression_loss: 1.7359 - classification_loss: 0.3737 85/500 [====>.........................] - ETA: 1:44 - loss: 2.1092 - regression_loss: 1.7356 - classification_loss: 0.3736 86/500 [====>.........................] - ETA: 1:43 - loss: 2.1018 - regression_loss: 1.7306 - classification_loss: 0.3711 87/500 [====>.........................] - ETA: 1:43 - loss: 2.1014 - regression_loss: 1.7307 - classification_loss: 0.3708 88/500 [====>.........................] - ETA: 1:43 - loss: 2.0979 - regression_loss: 1.7288 - classification_loss: 0.3691 89/500 [====>.........................] - ETA: 1:43 - loss: 2.1094 - regression_loss: 1.7412 - classification_loss: 0.3682 90/500 [====>.........................] - ETA: 1:42 - loss: 2.1254 - regression_loss: 1.7558 - classification_loss: 0.3696 91/500 [====>.........................] - ETA: 1:42 - loss: 2.1184 - regression_loss: 1.7507 - classification_loss: 0.3677 92/500 [====>.........................] - ETA: 1:42 - loss: 2.1157 - regression_loss: 1.7488 - classification_loss: 0.3669 93/500 [====>.........................] - ETA: 1:42 - loss: 2.1136 - regression_loss: 1.7477 - classification_loss: 0.3659 94/500 [====>.........................] 
- ETA: 1:41 - loss: 2.1104 - regression_loss: 1.7455 - classification_loss: 0.3649 95/500 [====>.........................] - ETA: 1:41 - loss: 2.1074 - regression_loss: 1.7431 - classification_loss: 0.3643 96/500 [====>.........................] - ETA: 1:41 - loss: 2.0966 - regression_loss: 1.7348 - classification_loss: 0.3618 97/500 [====>.........................] - ETA: 1:41 - loss: 2.0936 - regression_loss: 1.7326 - classification_loss: 0.3609 98/500 [====>.........................] - ETA: 1:40 - loss: 2.0934 - regression_loss: 1.7323 - classification_loss: 0.3611 99/500 [====>.........................] - ETA: 1:40 - loss: 2.0911 - regression_loss: 1.7309 - classification_loss: 0.3601 100/500 [=====>........................] - ETA: 1:40 - loss: 2.0908 - regression_loss: 1.7304 - classification_loss: 0.3604 101/500 [=====>........................] - ETA: 1:40 - loss: 2.0829 - regression_loss: 1.7246 - classification_loss: 0.3584 102/500 [=====>........................] - ETA: 1:39 - loss: 2.0713 - regression_loss: 1.7154 - classification_loss: 0.3560 103/500 [=====>........................] - ETA: 1:39 - loss: 2.0760 - regression_loss: 1.7168 - classification_loss: 0.3592 104/500 [=====>........................] - ETA: 1:39 - loss: 2.0724 - regression_loss: 1.7138 - classification_loss: 0.3586 105/500 [=====>........................] - ETA: 1:39 - loss: 2.0695 - regression_loss: 1.7121 - classification_loss: 0.3575 106/500 [=====>........................] - ETA: 1:38 - loss: 2.0684 - regression_loss: 1.7103 - classification_loss: 0.3581 107/500 [=====>........................] - ETA: 1:38 - loss: 2.0706 - regression_loss: 1.7118 - classification_loss: 0.3588 108/500 [=====>........................] - ETA: 1:38 - loss: 2.0675 - regression_loss: 1.7094 - classification_loss: 0.3581 109/500 [=====>........................] - ETA: 1:38 - loss: 2.0701 - regression_loss: 1.7121 - classification_loss: 0.3581 110/500 [=====>........................] 
- ETA: 1:37 - loss: 2.0683 - regression_loss: 1.7111 - classification_loss: 0.3573 111/500 [=====>........................] - ETA: 1:37 - loss: 2.0621 - regression_loss: 1.7061 - classification_loss: 0.3560 112/500 [=====>........................] - ETA: 1:37 - loss: 2.0585 - regression_loss: 1.7042 - classification_loss: 0.3543 113/500 [=====>........................] - ETA: 1:37 - loss: 2.0555 - regression_loss: 1.7022 - classification_loss: 0.3533 114/500 [=====>........................] - ETA: 1:36 - loss: 2.0542 - regression_loss: 1.7007 - classification_loss: 0.3534 115/500 [=====>........................] - ETA: 1:36 - loss: 2.0473 - regression_loss: 1.6957 - classification_loss: 0.3516 116/500 [=====>........................] - ETA: 1:36 - loss: 2.0487 - regression_loss: 1.6959 - classification_loss: 0.3528 117/500 [======>.......................] - ETA: 1:35 - loss: 2.0501 - regression_loss: 1.6978 - classification_loss: 0.3523 118/500 [======>.......................] - ETA: 1:35 - loss: 2.0522 - regression_loss: 1.6995 - classification_loss: 0.3527 119/500 [======>.......................] - ETA: 1:35 - loss: 2.0557 - regression_loss: 1.7016 - classification_loss: 0.3541 120/500 [======>.......................] - ETA: 1:35 - loss: 2.0570 - regression_loss: 1.7009 - classification_loss: 0.3561 121/500 [======>.......................] - ETA: 1:34 - loss: 2.0660 - regression_loss: 1.7075 - classification_loss: 0.3585 122/500 [======>.......................] - ETA: 1:34 - loss: 2.0666 - regression_loss: 1.7083 - classification_loss: 0.3582 123/500 [======>.......................] - ETA: 1:34 - loss: 2.0645 - regression_loss: 1.7068 - classification_loss: 0.3578 124/500 [======>.......................] - ETA: 1:34 - loss: 2.0761 - regression_loss: 1.7170 - classification_loss: 0.3591 125/500 [======>.......................] - ETA: 1:33 - loss: 2.0830 - regression_loss: 1.7209 - classification_loss: 0.3621 126/500 [======>.......................] 
[per-batch progress-bar updates for steps 127-462 condensed; representative checkpoints below]
127/500 [======>.......................] - ETA: 1:33 - loss: 2.0792 - regression_loss: 1.7185 - classification_loss: 0.3606
150/500 [========>.....................] - ETA: 1:27 - loss: 2.0918 - regression_loss: 1.7260 - classification_loss: 0.3658
200/500 [===========>..................] - ETA: 1:15 - loss: 2.1043 - regression_loss: 1.7351 - classification_loss: 0.3692
250/500 [==============>...............] - ETA: 1:02 - loss: 2.1078 - regression_loss: 1.7391 - classification_loss: 0.3687
300/500 [=================>............] - ETA: 50s - loss: 2.0896 - regression_loss: 1.7157 - classification_loss: 0.3739
350/500 [====================>.........] - ETA: 37s - loss: 2.0734 - regression_loss: 1.7062 - classification_loss: 0.3671
400/500 [=======================>......] - ETA: 25s - loss: 2.0555 - regression_loss: 1.6935 - classification_loss: 0.3621
450/500 [==========================>...] - ETA: 12s - loss: 2.0625 - regression_loss: 1.6999 - classification_loss: 0.3626
461/500 [==========================>...] - ETA: 9s - loss: 2.0675 - regression_loss: 1.7046 - classification_loss: 0.3628
- ETA: 9s - loss: 2.0674 - regression_loss: 1.7046 - classification_loss: 0.3628 463/500 [==========================>...] - ETA: 9s - loss: 2.0663 - regression_loss: 1.7039 - classification_loss: 0.3624 464/500 [==========================>...] - ETA: 9s - loss: 2.0679 - regression_loss: 1.7052 - classification_loss: 0.3627 465/500 [==========================>...] - ETA: 8s - loss: 2.0690 - regression_loss: 1.7062 - classification_loss: 0.3628 466/500 [==========================>...] - ETA: 8s - loss: 2.0692 - regression_loss: 1.7064 - classification_loss: 0.3628 467/500 [===========================>..] - ETA: 8s - loss: 2.0696 - regression_loss: 1.7067 - classification_loss: 0.3629 468/500 [===========================>..] - ETA: 8s - loss: 2.0701 - regression_loss: 1.7071 - classification_loss: 0.3631 469/500 [===========================>..] - ETA: 7s - loss: 2.0684 - regression_loss: 1.7058 - classification_loss: 0.3627 470/500 [===========================>..] - ETA: 7s - loss: 2.0687 - regression_loss: 1.7062 - classification_loss: 0.3626 471/500 [===========================>..] - ETA: 7s - loss: 2.0676 - regression_loss: 1.7053 - classification_loss: 0.3623 472/500 [===========================>..] - ETA: 7s - loss: 2.0678 - regression_loss: 1.7054 - classification_loss: 0.3624 473/500 [===========================>..] - ETA: 6s - loss: 2.0671 - regression_loss: 1.7018 - classification_loss: 0.3653 474/500 [===========================>..] - ETA: 6s - loss: 2.0680 - regression_loss: 1.7024 - classification_loss: 0.3657 475/500 [===========================>..] - ETA: 6s - loss: 2.0669 - regression_loss: 1.7015 - classification_loss: 0.3654 476/500 [===========================>..] - ETA: 6s - loss: 2.0676 - regression_loss: 1.7022 - classification_loss: 0.3655 477/500 [===========================>..] - ETA: 5s - loss: 2.0682 - regression_loss: 1.7026 - classification_loss: 0.3656 478/500 [===========================>..] 
- ETA: 5s - loss: 2.0695 - regression_loss: 1.7032 - classification_loss: 0.3663 479/500 [===========================>..] - ETA: 5s - loss: 2.0695 - regression_loss: 1.7033 - classification_loss: 0.3661 480/500 [===========================>..] - ETA: 5s - loss: 2.0686 - regression_loss: 1.7027 - classification_loss: 0.3659 481/500 [===========================>..] - ETA: 4s - loss: 2.0673 - regression_loss: 1.7018 - classification_loss: 0.3654 482/500 [===========================>..] - ETA: 4s - loss: 2.0663 - regression_loss: 1.7012 - classification_loss: 0.3652 483/500 [===========================>..] - ETA: 4s - loss: 2.0674 - regression_loss: 1.7019 - classification_loss: 0.3655 484/500 [============================>.] - ETA: 4s - loss: 2.0691 - regression_loss: 1.7034 - classification_loss: 0.3656 485/500 [============================>.] - ETA: 3s - loss: 2.0685 - regression_loss: 1.7030 - classification_loss: 0.3654 486/500 [============================>.] - ETA: 3s - loss: 2.0680 - regression_loss: 1.7027 - classification_loss: 0.3653 487/500 [============================>.] - ETA: 3s - loss: 2.0687 - regression_loss: 1.7032 - classification_loss: 0.3654 488/500 [============================>.] - ETA: 3s - loss: 2.0684 - regression_loss: 1.7032 - classification_loss: 0.3651 489/500 [============================>.] - ETA: 2s - loss: 2.0680 - regression_loss: 1.7030 - classification_loss: 0.3651 490/500 [============================>.] - ETA: 2s - loss: 2.0668 - regression_loss: 1.7021 - classification_loss: 0.3647 491/500 [============================>.] - ETA: 2s - loss: 2.0657 - regression_loss: 1.7013 - classification_loss: 0.3644 492/500 [============================>.] - ETA: 2s - loss: 2.0650 - regression_loss: 1.7009 - classification_loss: 0.3641 493/500 [============================>.] - ETA: 1s - loss: 2.0659 - regression_loss: 1.7017 - classification_loss: 0.3642 494/500 [============================>.] 
- ETA: 1s - loss: 2.0661 - regression_loss: 1.7021 - classification_loss: 0.3640 495/500 [============================>.] - ETA: 1s - loss: 2.0653 - regression_loss: 1.7016 - classification_loss: 0.3637 496/500 [============================>.] - ETA: 1s - loss: 2.0651 - regression_loss: 1.7014 - classification_loss: 0.3636 497/500 [============================>.] - ETA: 0s - loss: 2.0660 - regression_loss: 1.7022 - classification_loss: 0.3638 498/500 [============================>.] - ETA: 0s - loss: 2.0656 - regression_loss: 1.7018 - classification_loss: 0.3638 499/500 [============================>.] - ETA: 0s - loss: 2.0645 - regression_loss: 1.7010 - classification_loss: 0.3635 500/500 [==============================] - 125s 251ms/step - loss: 2.0648 - regression_loss: 1.7013 - classification_loss: 0.3636 326 instances of class plum with average precision: 0.6966 mAP: 0.6966 Epoch 00019: saving model to ./training/snapshots/resnet50_pascal_19.h5 Epoch 20/150 1/500 [..............................] - ETA: 2:04 - loss: 1.5206 - regression_loss: 1.2990 - classification_loss: 0.2217 2/500 [..............................] - ETA: 2:01 - loss: 1.3922 - regression_loss: 1.2035 - classification_loss: 0.1887 3/500 [..............................] - ETA: 2:05 - loss: 1.7195 - regression_loss: 1.4410 - classification_loss: 0.2785 4/500 [..............................] - ETA: 2:01 - loss: 1.7200 - regression_loss: 1.4413 - classification_loss: 0.2787 5/500 [..............................] - ETA: 1:58 - loss: 1.8412 - regression_loss: 1.5394 - classification_loss: 0.3018 6/500 [..............................] - ETA: 1:55 - loss: 1.8320 - regression_loss: 1.5249 - classification_loss: 0.3071 7/500 [..............................] - ETA: 1:55 - loss: 1.8555 - regression_loss: 1.5459 - classification_loss: 0.3096 8/500 [..............................] - ETA: 1:56 - loss: 1.8483 - regression_loss: 1.5372 - classification_loss: 0.3111 9/500 [..............................] 
- ETA: 1:56 - loss: 1.8916 - regression_loss: 1.5904 - classification_loss: 0.3012
[per-batch progress updates for steps 10-232 of epoch 20 elided; loss rose from ~1.89 toward ~2.06]
233/500 [============>.................] 
- ETA: 1:06 - loss: 2.0621 - regression_loss: 1.7165 - classification_loss: 0.3456 234/500 [=============>................] - ETA: 1:06 - loss: 2.0654 - regression_loss: 1.7194 - classification_loss: 0.3460 235/500 [=============>................] - ETA: 1:06 - loss: 2.0608 - regression_loss: 1.7157 - classification_loss: 0.3451 236/500 [=============>................] - ETA: 1:06 - loss: 2.0607 - regression_loss: 1.7154 - classification_loss: 0.3453 237/500 [=============>................] - ETA: 1:05 - loss: 2.0599 - regression_loss: 1.7144 - classification_loss: 0.3455 238/500 [=============>................] - ETA: 1:05 - loss: 2.0625 - regression_loss: 1.7163 - classification_loss: 0.3461 239/500 [=============>................] - ETA: 1:05 - loss: 2.0634 - regression_loss: 1.7163 - classification_loss: 0.3471 240/500 [=============>................] - ETA: 1:05 - loss: 2.0629 - regression_loss: 1.7162 - classification_loss: 0.3468 241/500 [=============>................] - ETA: 1:04 - loss: 2.0653 - regression_loss: 1.7181 - classification_loss: 0.3473 242/500 [=============>................] - ETA: 1:04 - loss: 2.0632 - regression_loss: 1.7163 - classification_loss: 0.3469 243/500 [=============>................] - ETA: 1:04 - loss: 2.0674 - regression_loss: 1.7195 - classification_loss: 0.3480 244/500 [=============>................] - ETA: 1:04 - loss: 2.0649 - regression_loss: 1.7172 - classification_loss: 0.3478 245/500 [=============>................] - ETA: 1:03 - loss: 2.0653 - regression_loss: 1.7177 - classification_loss: 0.3476 246/500 [=============>................] - ETA: 1:03 - loss: 2.0656 - regression_loss: 1.7182 - classification_loss: 0.3474 247/500 [=============>................] - ETA: 1:03 - loss: 2.0634 - regression_loss: 1.7166 - classification_loss: 0.3468 248/500 [=============>................] - ETA: 1:03 - loss: 2.0649 - regression_loss: 1.7178 - classification_loss: 0.3471 249/500 [=============>................] 
- ETA: 1:02 - loss: 2.0666 - regression_loss: 1.7191 - classification_loss: 0.3474 250/500 [==============>...............] - ETA: 1:02 - loss: 2.0665 - regression_loss: 1.7193 - classification_loss: 0.3473 251/500 [==============>...............] - ETA: 1:02 - loss: 2.0670 - regression_loss: 1.7197 - classification_loss: 0.3473 252/500 [==============>...............] - ETA: 1:02 - loss: 2.0656 - regression_loss: 1.7185 - classification_loss: 0.3471 253/500 [==============>...............] - ETA: 1:01 - loss: 2.0652 - regression_loss: 1.7180 - classification_loss: 0.3472 254/500 [==============>...............] - ETA: 1:01 - loss: 2.0674 - regression_loss: 1.7204 - classification_loss: 0.3470 255/500 [==============>...............] - ETA: 1:01 - loss: 2.0684 - regression_loss: 1.7215 - classification_loss: 0.3469 256/500 [==============>...............] - ETA: 1:01 - loss: 2.0699 - regression_loss: 1.7224 - classification_loss: 0.3475 257/500 [==============>...............] - ETA: 1:00 - loss: 2.0673 - regression_loss: 1.7205 - classification_loss: 0.3468 258/500 [==============>...............] - ETA: 1:00 - loss: 2.0662 - regression_loss: 1.7199 - classification_loss: 0.3463 259/500 [==============>...............] - ETA: 1:00 - loss: 2.0662 - regression_loss: 1.7198 - classification_loss: 0.3464 260/500 [==============>...............] - ETA: 1:00 - loss: 2.0650 - regression_loss: 1.7190 - classification_loss: 0.3460 261/500 [==============>...............] - ETA: 59s - loss: 2.0652 - regression_loss: 1.7190 - classification_loss: 0.3462  262/500 [==============>...............] - ETA: 59s - loss: 2.0631 - regression_loss: 1.7175 - classification_loss: 0.3456 263/500 [==============>...............] - ETA: 59s - loss: 2.0602 - regression_loss: 1.7147 - classification_loss: 0.3456 264/500 [==============>...............] - ETA: 59s - loss: 2.0570 - regression_loss: 1.7121 - classification_loss: 0.3449 265/500 [==============>...............] 
- ETA: 58s - loss: 2.0568 - regression_loss: 1.7121 - classification_loss: 0.3447 266/500 [==============>...............] - ETA: 58s - loss: 2.0556 - regression_loss: 1.7111 - classification_loss: 0.3445 267/500 [===============>..............] - ETA: 58s - loss: 2.0555 - regression_loss: 1.7112 - classification_loss: 0.3443 268/500 [===============>..............] - ETA: 58s - loss: 2.0544 - regression_loss: 1.7103 - classification_loss: 0.3440 269/500 [===============>..............] - ETA: 57s - loss: 2.0571 - regression_loss: 1.7122 - classification_loss: 0.3449 270/500 [===============>..............] - ETA: 57s - loss: 2.0568 - regression_loss: 1.7118 - classification_loss: 0.3449 271/500 [===============>..............] - ETA: 57s - loss: 2.0590 - regression_loss: 1.7133 - classification_loss: 0.3458 272/500 [===============>..............] - ETA: 57s - loss: 2.0558 - regression_loss: 1.7107 - classification_loss: 0.3451 273/500 [===============>..............] - ETA: 56s - loss: 2.0571 - regression_loss: 1.7120 - classification_loss: 0.3451 274/500 [===============>..............] - ETA: 56s - loss: 2.0558 - regression_loss: 1.7110 - classification_loss: 0.3448 275/500 [===============>..............] - ETA: 56s - loss: 2.0540 - regression_loss: 1.7098 - classification_loss: 0.3443 276/500 [===============>..............] - ETA: 56s - loss: 2.0534 - regression_loss: 1.7091 - classification_loss: 0.3443 277/500 [===============>..............] - ETA: 55s - loss: 2.0529 - regression_loss: 1.7088 - classification_loss: 0.3441 278/500 [===============>..............] - ETA: 55s - loss: 2.0544 - regression_loss: 1.7092 - classification_loss: 0.3451 279/500 [===============>..............] - ETA: 55s - loss: 2.0522 - regression_loss: 1.7075 - classification_loss: 0.3447 280/500 [===============>..............] - ETA: 55s - loss: 2.0505 - regression_loss: 1.7061 - classification_loss: 0.3444 281/500 [===============>..............] 
- ETA: 54s - loss: 2.0488 - regression_loss: 1.7048 - classification_loss: 0.3440 282/500 [===============>..............] - ETA: 54s - loss: 2.0464 - regression_loss: 1.7030 - classification_loss: 0.3434 283/500 [===============>..............] - ETA: 54s - loss: 2.0461 - regression_loss: 1.7026 - classification_loss: 0.3436 284/500 [================>.............] - ETA: 54s - loss: 2.0429 - regression_loss: 1.7002 - classification_loss: 0.3427 285/500 [================>.............] - ETA: 53s - loss: 2.0432 - regression_loss: 1.7005 - classification_loss: 0.3427 286/500 [================>.............] - ETA: 53s - loss: 2.0455 - regression_loss: 1.7023 - classification_loss: 0.3432 287/500 [================>.............] - ETA: 53s - loss: 2.0412 - regression_loss: 1.6991 - classification_loss: 0.3422 288/500 [================>.............] - ETA: 53s - loss: 2.0426 - regression_loss: 1.7005 - classification_loss: 0.3422 289/500 [================>.............] - ETA: 52s - loss: 2.0420 - regression_loss: 1.7000 - classification_loss: 0.3420 290/500 [================>.............] - ETA: 52s - loss: 2.0450 - regression_loss: 1.7032 - classification_loss: 0.3418 291/500 [================>.............] - ETA: 52s - loss: 2.0470 - regression_loss: 1.7046 - classification_loss: 0.3424 292/500 [================>.............] - ETA: 52s - loss: 2.0474 - regression_loss: 1.7047 - classification_loss: 0.3427 293/500 [================>.............] - ETA: 51s - loss: 2.0481 - regression_loss: 1.7055 - classification_loss: 0.3426 294/500 [================>.............] - ETA: 51s - loss: 2.0473 - regression_loss: 1.7050 - classification_loss: 0.3423 295/500 [================>.............] - ETA: 51s - loss: 2.0477 - regression_loss: 1.7051 - classification_loss: 0.3426 296/500 [================>.............] - ETA: 51s - loss: 2.0473 - regression_loss: 1.7052 - classification_loss: 0.3421 297/500 [================>.............] 
- ETA: 50s - loss: 2.0486 - regression_loss: 1.7062 - classification_loss: 0.3424 298/500 [================>.............] - ETA: 50s - loss: 2.0483 - regression_loss: 1.7056 - classification_loss: 0.3426 299/500 [================>.............] - ETA: 50s - loss: 2.0483 - regression_loss: 1.7058 - classification_loss: 0.3425 300/500 [=================>............] - ETA: 50s - loss: 2.0502 - regression_loss: 1.7072 - classification_loss: 0.3430 301/500 [=================>............] - ETA: 49s - loss: 2.0489 - regression_loss: 1.7063 - classification_loss: 0.3426 302/500 [=================>............] - ETA: 49s - loss: 2.0500 - regression_loss: 1.7066 - classification_loss: 0.3433 303/500 [=================>............] - ETA: 49s - loss: 2.0496 - regression_loss: 1.7065 - classification_loss: 0.3431 304/500 [=================>............] - ETA: 49s - loss: 2.0483 - regression_loss: 1.7057 - classification_loss: 0.3426 305/500 [=================>............] - ETA: 48s - loss: 2.0546 - regression_loss: 1.7096 - classification_loss: 0.3450 306/500 [=================>............] - ETA: 48s - loss: 2.0547 - regression_loss: 1.7097 - classification_loss: 0.3451 307/500 [=================>............] - ETA: 48s - loss: 2.0554 - regression_loss: 1.7103 - classification_loss: 0.3451 308/500 [=================>............] - ETA: 48s - loss: 2.0535 - regression_loss: 1.7089 - classification_loss: 0.3445 309/500 [=================>............] - ETA: 47s - loss: 2.0543 - regression_loss: 1.7094 - classification_loss: 0.3448 310/500 [=================>............] - ETA: 47s - loss: 2.0542 - regression_loss: 1.7095 - classification_loss: 0.3446 311/500 [=================>............] - ETA: 47s - loss: 2.0541 - regression_loss: 1.7096 - classification_loss: 0.3445 312/500 [=================>............] - ETA: 47s - loss: 2.0548 - regression_loss: 1.7101 - classification_loss: 0.3447 313/500 [=================>............] 
- ETA: 46s - loss: 2.0541 - regression_loss: 1.7095 - classification_loss: 0.3446 314/500 [=================>............] - ETA: 46s - loss: 2.0592 - regression_loss: 1.7122 - classification_loss: 0.3471 315/500 [=================>............] - ETA: 46s - loss: 2.0606 - regression_loss: 1.7132 - classification_loss: 0.3474 316/500 [=================>............] - ETA: 46s - loss: 2.0606 - regression_loss: 1.7129 - classification_loss: 0.3476 317/500 [==================>...........] - ETA: 45s - loss: 2.0596 - regression_loss: 1.7124 - classification_loss: 0.3473 318/500 [==================>...........] - ETA: 45s - loss: 2.0578 - regression_loss: 1.7110 - classification_loss: 0.3468 319/500 [==================>...........] - ETA: 45s - loss: 2.0557 - regression_loss: 1.7094 - classification_loss: 0.3463 320/500 [==================>...........] - ETA: 45s - loss: 2.0552 - regression_loss: 1.7090 - classification_loss: 0.3462 321/500 [==================>...........] - ETA: 44s - loss: 2.0546 - regression_loss: 1.7085 - classification_loss: 0.3461 322/500 [==================>...........] - ETA: 44s - loss: 2.0525 - regression_loss: 1.7068 - classification_loss: 0.3457 323/500 [==================>...........] - ETA: 44s - loss: 2.0560 - regression_loss: 1.7096 - classification_loss: 0.3463 324/500 [==================>...........] - ETA: 44s - loss: 2.0567 - regression_loss: 1.7101 - classification_loss: 0.3466 325/500 [==================>...........] - ETA: 43s - loss: 2.0551 - regression_loss: 1.7090 - classification_loss: 0.3461 326/500 [==================>...........] - ETA: 43s - loss: 2.0547 - regression_loss: 1.7089 - classification_loss: 0.3458 327/500 [==================>...........] - ETA: 43s - loss: 2.0579 - regression_loss: 1.7114 - classification_loss: 0.3465 328/500 [==================>...........] - ETA: 43s - loss: 2.0615 - regression_loss: 1.7143 - classification_loss: 0.3472 329/500 [==================>...........] 
- ETA: 42s - loss: 2.0618 - regression_loss: 1.7142 - classification_loss: 0.3475 330/500 [==================>...........] - ETA: 42s - loss: 2.0626 - regression_loss: 1.7149 - classification_loss: 0.3477 331/500 [==================>...........] - ETA: 42s - loss: 2.0637 - regression_loss: 1.7160 - classification_loss: 0.3476 332/500 [==================>...........] - ETA: 42s - loss: 2.0613 - regression_loss: 1.7144 - classification_loss: 0.3470 333/500 [==================>...........] - ETA: 41s - loss: 2.0619 - regression_loss: 1.7146 - classification_loss: 0.3473 334/500 [===================>..........] - ETA: 41s - loss: 2.0590 - regression_loss: 1.7123 - classification_loss: 0.3467 335/500 [===================>..........] - ETA: 41s - loss: 2.0583 - regression_loss: 1.7117 - classification_loss: 0.3466 336/500 [===================>..........] - ETA: 41s - loss: 2.0590 - regression_loss: 1.7120 - classification_loss: 0.3470 337/500 [===================>..........] - ETA: 40s - loss: 2.0569 - regression_loss: 1.7104 - classification_loss: 0.3465 338/500 [===================>..........] - ETA: 40s - loss: 2.0608 - regression_loss: 1.7134 - classification_loss: 0.3473 339/500 [===================>..........] - ETA: 40s - loss: 2.0630 - regression_loss: 1.7152 - classification_loss: 0.3478 340/500 [===================>..........] - ETA: 40s - loss: 2.0618 - regression_loss: 1.7145 - classification_loss: 0.3473 341/500 [===================>..........] - ETA: 39s - loss: 2.0587 - regression_loss: 1.7121 - classification_loss: 0.3466 342/500 [===================>..........] - ETA: 39s - loss: 2.0570 - regression_loss: 1.7107 - classification_loss: 0.3463 343/500 [===================>..........] - ETA: 39s - loss: 2.0575 - regression_loss: 1.7106 - classification_loss: 0.3469 344/500 [===================>..........] - ETA: 39s - loss: 2.0572 - regression_loss: 1.7106 - classification_loss: 0.3466 345/500 [===================>..........] 
- ETA: 38s - loss: 2.0584 - regression_loss: 1.7115 - classification_loss: 0.3469 346/500 [===================>..........] - ETA: 38s - loss: 2.0571 - regression_loss: 1.7107 - classification_loss: 0.3465 347/500 [===================>..........] - ETA: 38s - loss: 2.0572 - regression_loss: 1.7104 - classification_loss: 0.3468 348/500 [===================>..........] - ETA: 38s - loss: 2.0580 - regression_loss: 1.7107 - classification_loss: 0.3472 349/500 [===================>..........] - ETA: 37s - loss: 2.0589 - regression_loss: 1.7114 - classification_loss: 0.3476 350/500 [====================>.........] - ETA: 37s - loss: 2.0595 - regression_loss: 1.7119 - classification_loss: 0.3477 351/500 [====================>.........] - ETA: 37s - loss: 2.0591 - regression_loss: 1.7116 - classification_loss: 0.3475 352/500 [====================>.........] - ETA: 37s - loss: 2.0592 - regression_loss: 1.7116 - classification_loss: 0.3476 353/500 [====================>.........] - ETA: 36s - loss: 2.0591 - regression_loss: 1.7115 - classification_loss: 0.3477 354/500 [====================>.........] - ETA: 36s - loss: 2.0594 - regression_loss: 1.7118 - classification_loss: 0.3477 355/500 [====================>.........] - ETA: 36s - loss: 2.0575 - regression_loss: 1.7095 - classification_loss: 0.3480 356/500 [====================>.........] - ETA: 36s - loss: 2.0573 - regression_loss: 1.7094 - classification_loss: 0.3479 357/500 [====================>.........] - ETA: 35s - loss: 2.0592 - regression_loss: 1.7110 - classification_loss: 0.3482 358/500 [====================>.........] - ETA: 35s - loss: 2.0594 - regression_loss: 1.7112 - classification_loss: 0.3482 359/500 [====================>.........] - ETA: 35s - loss: 2.0598 - regression_loss: 1.7116 - classification_loss: 0.3482 360/500 [====================>.........] - ETA: 35s - loss: 2.0600 - regression_loss: 1.7116 - classification_loss: 0.3484 361/500 [====================>.........] 
- ETA: 34s - loss: 2.0606 - regression_loss: 1.7120 - classification_loss: 0.3486 362/500 [====================>.........] - ETA: 34s - loss: 2.0571 - regression_loss: 1.7092 - classification_loss: 0.3479 363/500 [====================>.........] - ETA: 34s - loss: 2.0565 - regression_loss: 1.7089 - classification_loss: 0.3476 364/500 [====================>.........] - ETA: 34s - loss: 2.0564 - regression_loss: 1.7089 - classification_loss: 0.3475 365/500 [====================>.........] - ETA: 33s - loss: 2.0561 - regression_loss: 1.7080 - classification_loss: 0.3481 366/500 [====================>.........] - ETA: 33s - loss: 2.0568 - regression_loss: 1.7085 - classification_loss: 0.3483 367/500 [=====================>........] - ETA: 33s - loss: 2.0562 - regression_loss: 1.7081 - classification_loss: 0.3480 368/500 [=====================>........] - ETA: 33s - loss: 2.0558 - regression_loss: 1.7082 - classification_loss: 0.3476 369/500 [=====================>........] - ETA: 32s - loss: 2.0592 - regression_loss: 1.7111 - classification_loss: 0.3481 370/500 [=====================>........] - ETA: 32s - loss: 2.0610 - regression_loss: 1.7127 - classification_loss: 0.3483 371/500 [=====================>........] - ETA: 32s - loss: 2.0620 - regression_loss: 1.7135 - classification_loss: 0.3485 372/500 [=====================>........] - ETA: 32s - loss: 2.0603 - regression_loss: 1.7122 - classification_loss: 0.3481 373/500 [=====================>........] - ETA: 31s - loss: 2.0615 - regression_loss: 1.7130 - classification_loss: 0.3485 374/500 [=====================>........] - ETA: 31s - loss: 2.0619 - regression_loss: 1.7132 - classification_loss: 0.3486 375/500 [=====================>........] - ETA: 31s - loss: 2.0633 - regression_loss: 1.7141 - classification_loss: 0.3492 376/500 [=====================>........] - ETA: 31s - loss: 2.0644 - regression_loss: 1.7148 - classification_loss: 0.3495 377/500 [=====================>........] 
- ETA: 30s - loss: 2.0634 - regression_loss: 1.7141 - classification_loss: 0.3493 378/500 [=====================>........] - ETA: 30s - loss: 2.0658 - regression_loss: 1.7158 - classification_loss: 0.3500 379/500 [=====================>........] - ETA: 30s - loss: 2.0654 - regression_loss: 1.7156 - classification_loss: 0.3498 380/500 [=====================>........] - ETA: 30s - loss: 2.0656 - regression_loss: 1.7157 - classification_loss: 0.3499 381/500 [=====================>........] - ETA: 29s - loss: 2.0680 - regression_loss: 1.7176 - classification_loss: 0.3504 382/500 [=====================>........] - ETA: 29s - loss: 2.0673 - regression_loss: 1.7173 - classification_loss: 0.3500 383/500 [=====================>........] - ETA: 29s - loss: 2.0668 - regression_loss: 1.7169 - classification_loss: 0.3499 384/500 [======================>.......] - ETA: 29s - loss: 2.0672 - regression_loss: 1.7172 - classification_loss: 0.3500 385/500 [======================>.......] - ETA: 28s - loss: 2.0687 - regression_loss: 1.7185 - classification_loss: 0.3502 386/500 [======================>.......] - ETA: 28s - loss: 2.0677 - regression_loss: 1.7178 - classification_loss: 0.3499 387/500 [======================>.......] - ETA: 28s - loss: 2.0685 - regression_loss: 1.7185 - classification_loss: 0.3501 388/500 [======================>.......] - ETA: 28s - loss: 2.0683 - regression_loss: 1.7184 - classification_loss: 0.3500 389/500 [======================>.......] - ETA: 27s - loss: 2.0652 - regression_loss: 1.7158 - classification_loss: 0.3495 390/500 [======================>.......] - ETA: 27s - loss: 2.0631 - regression_loss: 1.7140 - classification_loss: 0.3491 391/500 [======================>.......] - ETA: 27s - loss: 2.0630 - regression_loss: 1.7141 - classification_loss: 0.3489 392/500 [======================>.......] - ETA: 27s - loss: 2.0621 - regression_loss: 1.7135 - classification_loss: 0.3486 393/500 [======================>.......] 
- ETA: 26s - loss: 2.0601 - regression_loss: 1.7121 - classification_loss: 0.3481 394/500 [======================>.......] - ETA: 26s - loss: 2.0585 - regression_loss: 1.7109 - classification_loss: 0.3476 395/500 [======================>.......] - ETA: 26s - loss: 2.0586 - regression_loss: 1.7110 - classification_loss: 0.3476 396/500 [======================>.......] - ETA: 26s - loss: 2.0601 - regression_loss: 1.7123 - classification_loss: 0.3478 397/500 [======================>.......] - ETA: 25s - loss: 2.0565 - regression_loss: 1.7093 - classification_loss: 0.3472 398/500 [======================>.......] - ETA: 25s - loss: 2.0567 - regression_loss: 1.7094 - classification_loss: 0.3474 399/500 [======================>.......] - ETA: 25s - loss: 2.0558 - regression_loss: 1.7086 - classification_loss: 0.3472 400/500 [=======================>......] - ETA: 25s - loss: 2.0558 - regression_loss: 1.7087 - classification_loss: 0.3471 401/500 [=======================>......] - ETA: 24s - loss: 2.0559 - regression_loss: 1.7091 - classification_loss: 0.3468 402/500 [=======================>......] - ETA: 24s - loss: 2.0551 - regression_loss: 1.7086 - classification_loss: 0.3465 403/500 [=======================>......] - ETA: 24s - loss: 2.0552 - regression_loss: 1.7085 - classification_loss: 0.3467 404/500 [=======================>......] - ETA: 24s - loss: 2.0553 - regression_loss: 1.7084 - classification_loss: 0.3469 405/500 [=======================>......] - ETA: 23s - loss: 2.0563 - regression_loss: 1.7092 - classification_loss: 0.3471 406/500 [=======================>......] - ETA: 23s - loss: 2.0539 - regression_loss: 1.7074 - classification_loss: 0.3465 407/500 [=======================>......] - ETA: 23s - loss: 2.0573 - regression_loss: 1.7102 - classification_loss: 0.3471 408/500 [=======================>......] - ETA: 23s - loss: 2.0573 - regression_loss: 1.7104 - classification_loss: 0.3470 409/500 [=======================>......] 
- ETA: 22s - loss: 2.0570 - regression_loss: 1.7099 - classification_loss: 0.3471 410/500 [=======================>......] - ETA: 22s - loss: 2.0585 - regression_loss: 1.7109 - classification_loss: 0.3476 411/500 [=======================>......] - ETA: 22s - loss: 2.0577 - regression_loss: 1.7106 - classification_loss: 0.3471 412/500 [=======================>......] - ETA: 22s - loss: 2.0595 - regression_loss: 1.7126 - classification_loss: 0.3470 413/500 [=======================>......] - ETA: 21s - loss: 2.0607 - regression_loss: 1.7135 - classification_loss: 0.3472 414/500 [=======================>......] - ETA: 21s - loss: 2.0634 - regression_loss: 1.7159 - classification_loss: 0.3475 415/500 [=======================>......] - ETA: 21s - loss: 2.0634 - regression_loss: 1.7159 - classification_loss: 0.3474 416/500 [=======================>......] - ETA: 21s - loss: 2.0645 - regression_loss: 1.7168 - classification_loss: 0.3477 417/500 [========================>.....] - ETA: 20s - loss: 2.0643 - regression_loss: 1.7164 - classification_loss: 0.3478 418/500 [========================>.....] - ETA: 20s - loss: 2.0638 - regression_loss: 1.7162 - classification_loss: 0.3476 419/500 [========================>.....] - ETA: 20s - loss: 2.0627 - regression_loss: 1.7153 - classification_loss: 0.3474 420/500 [========================>.....] - ETA: 20s - loss: 2.0599 - regression_loss: 1.7131 - classification_loss: 0.3468 421/500 [========================>.....] - ETA: 19s - loss: 2.0594 - regression_loss: 1.7127 - classification_loss: 0.3467 422/500 [========================>.....] - ETA: 19s - loss: 2.0603 - regression_loss: 1.7133 - classification_loss: 0.3471 423/500 [========================>.....] - ETA: 19s - loss: 2.0592 - regression_loss: 1.7125 - classification_loss: 0.3467 424/500 [========================>.....] - ETA: 19s - loss: 2.0588 - regression_loss: 1.7121 - classification_loss: 0.3467 425/500 [========================>.....] 
- ETA: 18s - loss: 2.0601 - regression_loss: 1.7128 - classification_loss: 0.3473 426/500 [========================>.....] - ETA: 18s - loss: 2.0608 - regression_loss: 1.7134 - classification_loss: 0.3474 427/500 [========================>.....] - ETA: 18s - loss: 2.0613 - regression_loss: 1.7135 - classification_loss: 0.3478 428/500 [========================>.....] - ETA: 18s - loss: 2.0607 - regression_loss: 1.7129 - classification_loss: 0.3479 429/500 [========================>.....] - ETA: 17s - loss: 2.0603 - regression_loss: 1.7126 - classification_loss: 0.3477 430/500 [========================>.....] - ETA: 17s - loss: 2.0593 - regression_loss: 1.7119 - classification_loss: 0.3474 431/500 [========================>.....] - ETA: 17s - loss: 2.0580 - regression_loss: 1.7109 - classification_loss: 0.3471 432/500 [========================>.....] - ETA: 17s - loss: 2.0589 - regression_loss: 1.7117 - classification_loss: 0.3472 433/500 [========================>.....] - ETA: 16s - loss: 2.0592 - regression_loss: 1.7121 - classification_loss: 0.3471 434/500 [=========================>....] - ETA: 16s - loss: 2.0593 - regression_loss: 1.7120 - classification_loss: 0.3473 435/500 [=========================>....] - ETA: 16s - loss: 2.0590 - regression_loss: 1.7116 - classification_loss: 0.3474 436/500 [=========================>....] - ETA: 16s - loss: 2.0590 - regression_loss: 1.7114 - classification_loss: 0.3475 437/500 [=========================>....] - ETA: 15s - loss: 2.0613 - regression_loss: 1.7130 - classification_loss: 0.3483 438/500 [=========================>....] - ETA: 15s - loss: 2.0590 - regression_loss: 1.7113 - classification_loss: 0.3477 439/500 [=========================>....] - ETA: 15s - loss: 2.0602 - regression_loss: 1.7123 - classification_loss: 0.3479 440/500 [=========================>....] - ETA: 15s - loss: 2.0612 - regression_loss: 1.7129 - classification_loss: 0.3484 441/500 [=========================>....] 
- ETA: 14s - loss: 2.0617 - regression_loss: 1.7133 - classification_loss: 0.3484
...
500/500 [==============================] - 125s 251ms/step - loss: 2.0717 - regression_loss: 1.7190 - classification_loss: 0.3527
326 instances of class plum with average precision: 0.7027
mAP: 0.7027
Epoch 00020: saving model to ./training/snapshots/resnet50_pascal_20.h5
Epoch 21/150
  1/500 [..............................] - ETA: 1:57 - loss: 1.5131 - regression_loss: 1.3073 - classification_loss: 0.2059
...
275/500 [===============>..............] - ETA: 56s - loss: 2.0541 - regression_loss: 1.7045 - classification_loss: 0.3496
276/500 [===============>..............] 
- ETA: 56s - loss: 2.0522 - regression_loss: 1.7031 - classification_loss: 0.3491 277/500 [===============>..............] - ETA: 55s - loss: 2.0487 - regression_loss: 1.7004 - classification_loss: 0.3483 278/500 [===============>..............] - ETA: 55s - loss: 2.0505 - regression_loss: 1.7017 - classification_loss: 0.3489 279/500 [===============>..............] - ETA: 55s - loss: 2.0516 - regression_loss: 1.7022 - classification_loss: 0.3494 280/500 [===============>..............] - ETA: 55s - loss: 2.0515 - regression_loss: 1.7020 - classification_loss: 0.3495 281/500 [===============>..............] - ETA: 54s - loss: 2.0515 - regression_loss: 1.7018 - classification_loss: 0.3497 282/500 [===============>..............] - ETA: 54s - loss: 2.0517 - regression_loss: 1.7019 - classification_loss: 0.3498 283/500 [===============>..............] - ETA: 54s - loss: 2.0543 - regression_loss: 1.7042 - classification_loss: 0.3501 284/500 [================>.............] - ETA: 54s - loss: 2.0543 - regression_loss: 1.7038 - classification_loss: 0.3505 285/500 [================>.............] - ETA: 53s - loss: 2.0544 - regression_loss: 1.7039 - classification_loss: 0.3506 286/500 [================>.............] - ETA: 53s - loss: 2.0568 - regression_loss: 1.7056 - classification_loss: 0.3513 287/500 [================>.............] - ETA: 53s - loss: 2.0525 - regression_loss: 1.7020 - classification_loss: 0.3505 288/500 [================>.............] - ETA: 53s - loss: 2.0537 - regression_loss: 1.7029 - classification_loss: 0.3508 289/500 [================>.............] - ETA: 52s - loss: 2.0519 - regression_loss: 1.7017 - classification_loss: 0.3502 290/500 [================>.............] - ETA: 52s - loss: 2.0516 - regression_loss: 1.7015 - classification_loss: 0.3501 291/500 [================>.............] - ETA: 52s - loss: 2.0509 - regression_loss: 1.7009 - classification_loss: 0.3500 292/500 [================>.............] 
- ETA: 52s - loss: 2.0562 - regression_loss: 1.7052 - classification_loss: 0.3510 293/500 [================>.............] - ETA: 51s - loss: 2.0573 - regression_loss: 1.7059 - classification_loss: 0.3514 294/500 [================>.............] - ETA: 51s - loss: 2.0571 - regression_loss: 1.7055 - classification_loss: 0.3517 295/500 [================>.............] - ETA: 51s - loss: 2.0554 - regression_loss: 1.7041 - classification_loss: 0.3513 296/500 [================>.............] - ETA: 51s - loss: 2.0553 - regression_loss: 1.7040 - classification_loss: 0.3513 297/500 [================>.............] - ETA: 50s - loss: 2.0535 - regression_loss: 1.7026 - classification_loss: 0.3510 298/500 [================>.............] - ETA: 50s - loss: 2.0552 - regression_loss: 1.7035 - classification_loss: 0.3517 299/500 [================>.............] - ETA: 50s - loss: 2.0567 - regression_loss: 1.7045 - classification_loss: 0.3522 300/500 [=================>............] - ETA: 50s - loss: 2.0574 - regression_loss: 1.7053 - classification_loss: 0.3520 301/500 [=================>............] - ETA: 49s - loss: 2.0575 - regression_loss: 1.7054 - classification_loss: 0.3522 302/500 [=================>............] - ETA: 49s - loss: 2.0577 - regression_loss: 1.7056 - classification_loss: 0.3522 303/500 [=================>............] - ETA: 49s - loss: 2.0574 - regression_loss: 1.7055 - classification_loss: 0.3519 304/500 [=================>............] - ETA: 49s - loss: 2.0576 - regression_loss: 1.7057 - classification_loss: 0.3519 305/500 [=================>............] - ETA: 48s - loss: 2.0572 - regression_loss: 1.7055 - classification_loss: 0.3516 306/500 [=================>............] - ETA: 48s - loss: 2.0567 - regression_loss: 1.7052 - classification_loss: 0.3515 307/500 [=================>............] - ETA: 48s - loss: 2.0577 - regression_loss: 1.7061 - classification_loss: 0.3517 308/500 [=================>............] 
- ETA: 48s - loss: 2.0600 - regression_loss: 1.7080 - classification_loss: 0.3519 309/500 [=================>............] - ETA: 47s - loss: 2.0554 - regression_loss: 1.7041 - classification_loss: 0.3512 310/500 [=================>............] - ETA: 47s - loss: 2.0585 - regression_loss: 1.7064 - classification_loss: 0.3520 311/500 [=================>............] - ETA: 47s - loss: 2.0588 - regression_loss: 1.7067 - classification_loss: 0.3520 312/500 [=================>............] - ETA: 47s - loss: 2.0582 - regression_loss: 1.7062 - classification_loss: 0.3519 313/500 [=================>............] - ETA: 46s - loss: 2.0574 - regression_loss: 1.7060 - classification_loss: 0.3514 314/500 [=================>............] - ETA: 46s - loss: 2.0612 - regression_loss: 1.7098 - classification_loss: 0.3514 315/500 [=================>............] - ETA: 46s - loss: 2.0616 - regression_loss: 1.7100 - classification_loss: 0.3516 316/500 [=================>............] - ETA: 46s - loss: 2.0607 - regression_loss: 1.7093 - classification_loss: 0.3514 317/500 [==================>...........] - ETA: 45s - loss: 2.0592 - regression_loss: 1.7082 - classification_loss: 0.3510 318/500 [==================>...........] - ETA: 45s - loss: 2.0606 - regression_loss: 1.7094 - classification_loss: 0.3512 319/500 [==================>...........] - ETA: 45s - loss: 2.0615 - regression_loss: 1.7101 - classification_loss: 0.3514 320/500 [==================>...........] - ETA: 45s - loss: 2.0623 - regression_loss: 1.7107 - classification_loss: 0.3515 321/500 [==================>...........] - ETA: 44s - loss: 2.0596 - regression_loss: 1.7089 - classification_loss: 0.3507 322/500 [==================>...........] - ETA: 44s - loss: 2.0601 - regression_loss: 1.7093 - classification_loss: 0.3508 323/500 [==================>...........] - ETA: 44s - loss: 2.0604 - regression_loss: 1.7096 - classification_loss: 0.3509 324/500 [==================>...........] 
- ETA: 44s - loss: 2.0588 - regression_loss: 1.7083 - classification_loss: 0.3505 325/500 [==================>...........] - ETA: 43s - loss: 2.0622 - regression_loss: 1.7109 - classification_loss: 0.3512 326/500 [==================>...........] - ETA: 43s - loss: 2.0615 - regression_loss: 1.7105 - classification_loss: 0.3510 327/500 [==================>...........] - ETA: 43s - loss: 2.0619 - regression_loss: 1.7108 - classification_loss: 0.3511 328/500 [==================>...........] - ETA: 43s - loss: 2.0621 - regression_loss: 1.7111 - classification_loss: 0.3510 329/500 [==================>...........] - ETA: 42s - loss: 2.0623 - regression_loss: 1.7113 - classification_loss: 0.3510 330/500 [==================>...........] - ETA: 42s - loss: 2.0622 - regression_loss: 1.7114 - classification_loss: 0.3508 331/500 [==================>...........] - ETA: 42s - loss: 2.0632 - regression_loss: 1.7122 - classification_loss: 0.3509 332/500 [==================>...........] - ETA: 42s - loss: 2.0616 - regression_loss: 1.7112 - classification_loss: 0.3505 333/500 [==================>...........] - ETA: 41s - loss: 2.0633 - regression_loss: 1.7124 - classification_loss: 0.3509 334/500 [===================>..........] - ETA: 41s - loss: 2.0622 - regression_loss: 1.7114 - classification_loss: 0.3508 335/500 [===================>..........] - ETA: 41s - loss: 2.0630 - regression_loss: 1.7122 - classification_loss: 0.3507 336/500 [===================>..........] - ETA: 41s - loss: 2.0638 - regression_loss: 1.7127 - classification_loss: 0.3511 337/500 [===================>..........] - ETA: 40s - loss: 2.0633 - regression_loss: 1.7123 - classification_loss: 0.3510 338/500 [===================>..........] - ETA: 40s - loss: 2.0636 - regression_loss: 1.7128 - classification_loss: 0.3508 339/500 [===================>..........] - ETA: 40s - loss: 2.0643 - regression_loss: 1.7130 - classification_loss: 0.3513 340/500 [===================>..........] 
- ETA: 40s - loss: 2.0647 - regression_loss: 1.7133 - classification_loss: 0.3514 341/500 [===================>..........] - ETA: 39s - loss: 2.0685 - regression_loss: 1.7156 - classification_loss: 0.3529 342/500 [===================>..........] - ETA: 39s - loss: 2.0658 - regression_loss: 1.7134 - classification_loss: 0.3523 343/500 [===================>..........] - ETA: 39s - loss: 2.0646 - regression_loss: 1.7126 - classification_loss: 0.3520 344/500 [===================>..........] - ETA: 39s - loss: 2.0620 - regression_loss: 1.7106 - classification_loss: 0.3513 345/500 [===================>..........] - ETA: 38s - loss: 2.0610 - regression_loss: 1.7100 - classification_loss: 0.3510 346/500 [===================>..........] - ETA: 38s - loss: 2.0620 - regression_loss: 1.7106 - classification_loss: 0.3514 347/500 [===================>..........] - ETA: 38s - loss: 2.0612 - regression_loss: 1.7102 - classification_loss: 0.3510 348/500 [===================>..........] - ETA: 38s - loss: 2.0590 - regression_loss: 1.7085 - classification_loss: 0.3505 349/500 [===================>..........] - ETA: 37s - loss: 2.0591 - regression_loss: 1.7089 - classification_loss: 0.3502 350/500 [====================>.........] - ETA: 37s - loss: 2.0559 - regression_loss: 1.7062 - classification_loss: 0.3497 351/500 [====================>.........] - ETA: 37s - loss: 2.0534 - regression_loss: 1.7043 - classification_loss: 0.3492 352/500 [====================>.........] - ETA: 37s - loss: 2.0547 - regression_loss: 1.7053 - classification_loss: 0.3494 353/500 [====================>.........] - ETA: 36s - loss: 2.0534 - regression_loss: 1.7044 - classification_loss: 0.3490 354/500 [====================>.........] - ETA: 36s - loss: 2.0534 - regression_loss: 1.7042 - classification_loss: 0.3492 355/500 [====================>.........] - ETA: 36s - loss: 2.0526 - regression_loss: 1.7040 - classification_loss: 0.3486 356/500 [====================>.........] 
- ETA: 36s - loss: 2.0505 - regression_loss: 1.7025 - classification_loss: 0.3480 357/500 [====================>.........] - ETA: 35s - loss: 2.0464 - regression_loss: 1.6991 - classification_loss: 0.3473 358/500 [====================>.........] - ETA: 35s - loss: 2.0467 - regression_loss: 1.6994 - classification_loss: 0.3473 359/500 [====================>.........] - ETA: 35s - loss: 2.0484 - regression_loss: 1.7005 - classification_loss: 0.3480 360/500 [====================>.........] - ETA: 35s - loss: 2.0480 - regression_loss: 1.7001 - classification_loss: 0.3479 361/500 [====================>.........] - ETA: 34s - loss: 2.0464 - regression_loss: 1.6989 - classification_loss: 0.3476 362/500 [====================>.........] - ETA: 34s - loss: 2.0451 - regression_loss: 1.6979 - classification_loss: 0.3472 363/500 [====================>.........] - ETA: 34s - loss: 2.0441 - regression_loss: 1.6970 - classification_loss: 0.3471 364/500 [====================>.........] - ETA: 34s - loss: 2.0447 - regression_loss: 1.6975 - classification_loss: 0.3471 365/500 [====================>.........] - ETA: 33s - loss: 2.0451 - regression_loss: 1.6980 - classification_loss: 0.3471 366/500 [====================>.........] - ETA: 33s - loss: 2.0494 - regression_loss: 1.7013 - classification_loss: 0.3481 367/500 [=====================>........] - ETA: 33s - loss: 2.0501 - regression_loss: 1.7018 - classification_loss: 0.3483 368/500 [=====================>........] - ETA: 33s - loss: 2.0530 - regression_loss: 1.7043 - classification_loss: 0.3488 369/500 [=====================>........] - ETA: 32s - loss: 2.0524 - regression_loss: 1.7038 - classification_loss: 0.3486 370/500 [=====================>........] - ETA: 32s - loss: 2.0513 - regression_loss: 1.7028 - classification_loss: 0.3485 371/500 [=====================>........] - ETA: 32s - loss: 2.0528 - regression_loss: 1.7040 - classification_loss: 0.3488 372/500 [=====================>........] 
- ETA: 32s - loss: 2.0556 - regression_loss: 1.7060 - classification_loss: 0.3496 373/500 [=====================>........] - ETA: 31s - loss: 2.0556 - regression_loss: 1.7063 - classification_loss: 0.3492 374/500 [=====================>........] - ETA: 31s - loss: 2.0539 - regression_loss: 1.7051 - classification_loss: 0.3488 375/500 [=====================>........] - ETA: 31s - loss: 2.0538 - regression_loss: 1.7050 - classification_loss: 0.3488 376/500 [=====================>........] - ETA: 31s - loss: 2.0547 - regression_loss: 1.7057 - classification_loss: 0.3490 377/500 [=====================>........] - ETA: 30s - loss: 2.0555 - regression_loss: 1.7060 - classification_loss: 0.3495 378/500 [=====================>........] - ETA: 30s - loss: 2.0540 - regression_loss: 1.7048 - classification_loss: 0.3492 379/500 [=====================>........] - ETA: 30s - loss: 2.0559 - regression_loss: 1.7060 - classification_loss: 0.3499 380/500 [=====================>........] - ETA: 30s - loss: 2.0546 - regression_loss: 1.7051 - classification_loss: 0.3494 381/500 [=====================>........] - ETA: 29s - loss: 2.0555 - regression_loss: 1.7045 - classification_loss: 0.3510 382/500 [=====================>........] - ETA: 29s - loss: 2.0575 - regression_loss: 1.7066 - classification_loss: 0.3509 383/500 [=====================>........] - ETA: 29s - loss: 2.0566 - regression_loss: 1.7061 - classification_loss: 0.3505 384/500 [======================>.......] - ETA: 29s - loss: 2.0580 - regression_loss: 1.7071 - classification_loss: 0.3508 385/500 [======================>.......] - ETA: 28s - loss: 2.0595 - regression_loss: 1.7082 - classification_loss: 0.3512 386/500 [======================>.......] - ETA: 28s - loss: 2.0582 - regression_loss: 1.7073 - classification_loss: 0.3508 387/500 [======================>.......] - ETA: 28s - loss: 2.0579 - regression_loss: 1.7072 - classification_loss: 0.3507 388/500 [======================>.......] 
- ETA: 28s - loss: 2.0586 - regression_loss: 1.7074 - classification_loss: 0.3511 389/500 [======================>.......] - ETA: 27s - loss: 2.0589 - regression_loss: 1.7074 - classification_loss: 0.3515 390/500 [======================>.......] - ETA: 27s - loss: 2.0573 - regression_loss: 1.7061 - classification_loss: 0.3512 391/500 [======================>.......] - ETA: 27s - loss: 2.0586 - regression_loss: 1.7071 - classification_loss: 0.3515 392/500 [======================>.......] - ETA: 27s - loss: 2.0575 - regression_loss: 1.7061 - classification_loss: 0.3514 393/500 [======================>.......] - ETA: 26s - loss: 2.0558 - regression_loss: 1.7049 - classification_loss: 0.3509 394/500 [======================>.......] - ETA: 26s - loss: 2.0554 - regression_loss: 1.7046 - classification_loss: 0.3508 395/500 [======================>.......] - ETA: 26s - loss: 2.0554 - regression_loss: 1.7046 - classification_loss: 0.3508 396/500 [======================>.......] - ETA: 26s - loss: 2.0532 - regression_loss: 1.7027 - classification_loss: 0.3505 397/500 [======================>.......] - ETA: 25s - loss: 2.0511 - regression_loss: 1.7010 - classification_loss: 0.3500 398/500 [======================>.......] - ETA: 25s - loss: 2.0510 - regression_loss: 1.7007 - classification_loss: 0.3502 399/500 [======================>.......] - ETA: 25s - loss: 2.0522 - regression_loss: 1.7017 - classification_loss: 0.3506 400/500 [=======================>......] - ETA: 25s - loss: 2.0503 - regression_loss: 1.7001 - classification_loss: 0.3502 401/500 [=======================>......] - ETA: 24s - loss: 2.0514 - regression_loss: 1.7010 - classification_loss: 0.3504 402/500 [=======================>......] - ETA: 24s - loss: 2.0533 - regression_loss: 1.7024 - classification_loss: 0.3509 403/500 [=======================>......] - ETA: 24s - loss: 2.0523 - regression_loss: 1.7015 - classification_loss: 0.3508 404/500 [=======================>......] 
- ETA: 24s - loss: 2.0529 - regression_loss: 1.7019 - classification_loss: 0.3510 405/500 [=======================>......] - ETA: 23s - loss: 2.0523 - regression_loss: 1.7014 - classification_loss: 0.3509 406/500 [=======================>......] - ETA: 23s - loss: 2.0529 - regression_loss: 1.7018 - classification_loss: 0.3511 407/500 [=======================>......] - ETA: 23s - loss: 2.0527 - regression_loss: 1.7016 - classification_loss: 0.3511 408/500 [=======================>......] - ETA: 23s - loss: 2.0514 - regression_loss: 1.7007 - classification_loss: 0.3506 409/500 [=======================>......] - ETA: 22s - loss: 2.0508 - regression_loss: 1.7004 - classification_loss: 0.3505 410/500 [=======================>......] - ETA: 22s - loss: 2.0509 - regression_loss: 1.7006 - classification_loss: 0.3503 411/500 [=======================>......] - ETA: 22s - loss: 2.0509 - regression_loss: 1.7006 - classification_loss: 0.3503 412/500 [=======================>......] - ETA: 22s - loss: 2.0476 - regression_loss: 1.6979 - classification_loss: 0.3497 413/500 [=======================>......] - ETA: 21s - loss: 2.0467 - regression_loss: 1.6971 - classification_loss: 0.3496 414/500 [=======================>......] - ETA: 21s - loss: 2.0465 - regression_loss: 1.6969 - classification_loss: 0.3495 415/500 [=======================>......] - ETA: 21s - loss: 2.0460 - regression_loss: 1.6966 - classification_loss: 0.3493 416/500 [=======================>......] - ETA: 21s - loss: 2.0434 - regression_loss: 1.6945 - classification_loss: 0.3490 417/500 [========================>.....] - ETA: 20s - loss: 2.0441 - regression_loss: 1.6950 - classification_loss: 0.3492 418/500 [========================>.....] - ETA: 20s - loss: 2.0422 - regression_loss: 1.6935 - classification_loss: 0.3488 419/500 [========================>.....] - ETA: 20s - loss: 2.0430 - regression_loss: 1.6942 - classification_loss: 0.3488 420/500 [========================>.....] 
- ETA: 20s - loss: 2.0449 - regression_loss: 1.6955 - classification_loss: 0.3494 421/500 [========================>.....] - ETA: 19s - loss: 2.0455 - regression_loss: 1.6963 - classification_loss: 0.3492 422/500 [========================>.....] - ETA: 19s - loss: 2.0447 - regression_loss: 1.6959 - classification_loss: 0.3488 423/500 [========================>.....] - ETA: 19s - loss: 2.0444 - regression_loss: 1.6957 - classification_loss: 0.3487 424/500 [========================>.....] - ETA: 19s - loss: 2.0437 - regression_loss: 1.6951 - classification_loss: 0.3486 425/500 [========================>.....] - ETA: 18s - loss: 2.0437 - regression_loss: 1.6952 - classification_loss: 0.3484 426/500 [========================>.....] - ETA: 18s - loss: 2.0423 - regression_loss: 1.6941 - classification_loss: 0.3482 427/500 [========================>.....] - ETA: 18s - loss: 2.0436 - regression_loss: 1.6953 - classification_loss: 0.3483 428/500 [========================>.....] - ETA: 18s - loss: 2.0432 - regression_loss: 1.6951 - classification_loss: 0.3481 429/500 [========================>.....] - ETA: 17s - loss: 2.0445 - regression_loss: 1.6964 - classification_loss: 0.3481 430/500 [========================>.....] - ETA: 17s - loss: 2.0450 - regression_loss: 1.6970 - classification_loss: 0.3480 431/500 [========================>.....] - ETA: 17s - loss: 2.0456 - regression_loss: 1.6976 - classification_loss: 0.3481 432/500 [========================>.....] - ETA: 17s - loss: 2.0457 - regression_loss: 1.6978 - classification_loss: 0.3479 433/500 [========================>.....] - ETA: 16s - loss: 2.0460 - regression_loss: 1.6984 - classification_loss: 0.3476 434/500 [=========================>....] - ETA: 16s - loss: 2.0463 - regression_loss: 1.6986 - classification_loss: 0.3477 435/500 [=========================>....] - ETA: 16s - loss: 2.0500 - regression_loss: 1.7023 - classification_loss: 0.3476 436/500 [=========================>....] 
- ETA: 16s - loss: 2.0507 - regression_loss: 1.7030 - classification_loss: 0.3477 437/500 [=========================>....] - ETA: 15s - loss: 2.0516 - regression_loss: 1.7040 - classification_loss: 0.3476 438/500 [=========================>....] - ETA: 15s - loss: 2.0504 - regression_loss: 1.7029 - classification_loss: 0.3475 439/500 [=========================>....] - ETA: 15s - loss: 2.0495 - regression_loss: 1.7021 - classification_loss: 0.3474 440/500 [=========================>....] - ETA: 15s - loss: 2.0494 - regression_loss: 1.7020 - classification_loss: 0.3474 441/500 [=========================>....] - ETA: 14s - loss: 2.0511 - regression_loss: 1.7034 - classification_loss: 0.3477 442/500 [=========================>....] - ETA: 14s - loss: 2.0484 - regression_loss: 1.7011 - classification_loss: 0.3472 443/500 [=========================>....] - ETA: 14s - loss: 2.0484 - regression_loss: 1.7012 - classification_loss: 0.3473 444/500 [=========================>....] - ETA: 14s - loss: 2.0476 - regression_loss: 1.7005 - classification_loss: 0.3471 445/500 [=========================>....] - ETA: 13s - loss: 2.0479 - regression_loss: 1.7008 - classification_loss: 0.3471 446/500 [=========================>....] - ETA: 13s - loss: 2.0488 - regression_loss: 1.7016 - classification_loss: 0.3472 447/500 [=========================>....] - ETA: 13s - loss: 2.0479 - regression_loss: 1.7009 - classification_loss: 0.3470 448/500 [=========================>....] - ETA: 13s - loss: 2.0479 - regression_loss: 1.7008 - classification_loss: 0.3470 449/500 [=========================>....] - ETA: 12s - loss: 2.0504 - regression_loss: 1.7029 - classification_loss: 0.3475 450/500 [==========================>...] - ETA: 12s - loss: 2.0512 - regression_loss: 1.7034 - classification_loss: 0.3478 451/500 [==========================>...] - ETA: 12s - loss: 2.0519 - regression_loss: 1.7040 - classification_loss: 0.3479 452/500 [==========================>...] 
- ETA: 12s - loss: 2.0517 - regression_loss: 1.7037 - classification_loss: 0.3480 453/500 [==========================>...] - ETA: 11s - loss: 2.0530 - regression_loss: 1.7048 - classification_loss: 0.3482 454/500 [==========================>...] - ETA: 11s - loss: 2.0548 - regression_loss: 1.7062 - classification_loss: 0.3486 455/500 [==========================>...] - ETA: 11s - loss: 2.0547 - regression_loss: 1.7061 - classification_loss: 0.3486 456/500 [==========================>...] - ETA: 11s - loss: 2.0557 - regression_loss: 1.7069 - classification_loss: 0.3488 457/500 [==========================>...] - ETA: 10s - loss: 2.0559 - regression_loss: 1.7072 - classification_loss: 0.3488 458/500 [==========================>...] - ETA: 10s - loss: 2.0558 - regression_loss: 1.7070 - classification_loss: 0.3488 459/500 [==========================>...] - ETA: 10s - loss: 2.0553 - regression_loss: 1.7068 - classification_loss: 0.3485 460/500 [==========================>...] - ETA: 10s - loss: 2.0528 - regression_loss: 1.7047 - classification_loss: 0.3481 461/500 [==========================>...] - ETA: 9s - loss: 2.0550 - regression_loss: 1.7064 - classification_loss: 0.3486  462/500 [==========================>...] - ETA: 9s - loss: 2.0565 - regression_loss: 1.7074 - classification_loss: 0.3491 463/500 [==========================>...] - ETA: 9s - loss: 2.0559 - regression_loss: 1.7070 - classification_loss: 0.3489 464/500 [==========================>...] - ETA: 9s - loss: 2.0557 - regression_loss: 1.7069 - classification_loss: 0.3488 465/500 [==========================>...] - ETA: 8s - loss: 2.0560 - regression_loss: 1.7070 - classification_loss: 0.3490 466/500 [==========================>...] - ETA: 8s - loss: 2.0528 - regression_loss: 1.7044 - classification_loss: 0.3485 467/500 [===========================>..] - ETA: 8s - loss: 2.0531 - regression_loss: 1.7046 - classification_loss: 0.3485 468/500 [===========================>..] 
- ETA: 8s - loss: 2.0534 - regression_loss: 1.7047 - classification_loss: 0.3488 469/500 [===========================>..] - ETA: 7s - loss: 2.0513 - regression_loss: 1.7030 - classification_loss: 0.3483 470/500 [===========================>..] - ETA: 7s - loss: 2.0513 - regression_loss: 1.7030 - classification_loss: 0.3484 471/500 [===========================>..] - ETA: 7s - loss: 2.0516 - regression_loss: 1.7033 - classification_loss: 0.3483 472/500 [===========================>..] - ETA: 7s - loss: 2.0513 - regression_loss: 1.7030 - classification_loss: 0.3482 473/500 [===========================>..] - ETA: 6s - loss: 2.0509 - regression_loss: 1.7028 - classification_loss: 0.3480 474/500 [===========================>..] - ETA: 6s - loss: 2.0520 - regression_loss: 1.7038 - classification_loss: 0.3482 475/500 [===========================>..] - ETA: 6s - loss: 2.0523 - regression_loss: 1.7042 - classification_loss: 0.3481 476/500 [===========================>..] - ETA: 6s - loss: 2.0521 - regression_loss: 1.7040 - classification_loss: 0.3481 477/500 [===========================>..] - ETA: 5s - loss: 2.0517 - regression_loss: 1.7037 - classification_loss: 0.3479 478/500 [===========================>..] - ETA: 5s - loss: 2.0494 - regression_loss: 1.7019 - classification_loss: 0.3475 479/500 [===========================>..] - ETA: 5s - loss: 2.0485 - regression_loss: 1.7011 - classification_loss: 0.3474 480/500 [===========================>..] - ETA: 5s - loss: 2.0474 - regression_loss: 1.7004 - classification_loss: 0.3471 481/500 [===========================>..] - ETA: 4s - loss: 2.0450 - regression_loss: 1.6983 - classification_loss: 0.3467 482/500 [===========================>..] - ETA: 4s - loss: 2.0438 - regression_loss: 1.6973 - classification_loss: 0.3465 483/500 [===========================>..] - ETA: 4s - loss: 2.0431 - regression_loss: 1.6968 - classification_loss: 0.3463 484/500 [============================>.] 
- ETA: 4s - loss: 2.0438 - regression_loss: 1.6973 - classification_loss: 0.3465 485/500 [============================>.] - ETA: 3s - loss: 2.0438 - regression_loss: 1.6971 - classification_loss: 0.3467 486/500 [============================>.] - ETA: 3s - loss: 2.0433 - regression_loss: 1.6968 - classification_loss: 0.3465 487/500 [============================>.] - ETA: 3s - loss: 2.0434 - regression_loss: 1.6967 - classification_loss: 0.3467 488/500 [============================>.] - ETA: 3s - loss: 2.0446 - regression_loss: 1.6976 - classification_loss: 0.3470 489/500 [============================>.] - ETA: 2s - loss: 2.0441 - regression_loss: 1.6973 - classification_loss: 0.3468 490/500 [============================>.] - ETA: 2s - loss: 2.0452 - regression_loss: 1.6983 - classification_loss: 0.3469 491/500 [============================>.] - ETA: 2s - loss: 2.0453 - regression_loss: 1.6984 - classification_loss: 0.3469 492/500 [============================>.] - ETA: 2s - loss: 2.0441 - regression_loss: 1.6975 - classification_loss: 0.3466 493/500 [============================>.] - ETA: 1s - loss: 2.0466 - regression_loss: 1.6994 - classification_loss: 0.3472 494/500 [============================>.] - ETA: 1s - loss: 2.0464 - regression_loss: 1.6992 - classification_loss: 0.3471 495/500 [============================>.] - ETA: 1s - loss: 2.0454 - regression_loss: 1.6985 - classification_loss: 0.3469 496/500 [============================>.] - ETA: 1s - loss: 2.0450 - regression_loss: 1.6983 - classification_loss: 0.3467 497/500 [============================>.] - ETA: 0s - loss: 2.0439 - regression_loss: 1.6975 - classification_loss: 0.3464 498/500 [============================>.] - ETA: 0s - loss: 2.0428 - regression_loss: 1.6967 - classification_loss: 0.3462 499/500 [============================>.] 
500/500 [==============================] - 125s 250ms/step - loss: 2.0420 - regression_loss: 1.6961 - classification_loss: 0.3459
326 instances of class plum with average precision: 0.6249
mAP: 0.6249
Epoch 00021: saving model to ./training/snapshots/resnet50_pascal_21.h5
Epoch 22/150
- ETA: 2:00 - loss: 1.9672 - regression_loss: 1.6582 - classification_loss: 0.3089 15/500 [..............................] - ETA: 2:00 - loss: 1.9542 - regression_loss: 1.6477 - classification_loss: 0.3065 16/500 [..............................] - ETA: 2:00 - loss: 1.9606 - regression_loss: 1.6468 - classification_loss: 0.3137 17/500 [>.............................] - ETA: 2:00 - loss: 1.9168 - regression_loss: 1.6110 - classification_loss: 0.3058 18/500 [>.............................] - ETA: 2:00 - loss: 1.9206 - regression_loss: 1.6124 - classification_loss: 0.3082 19/500 [>.............................] - ETA: 2:00 - loss: 1.9462 - regression_loss: 1.6360 - classification_loss: 0.3102 20/500 [>.............................] - ETA: 1:59 - loss: 1.9129 - regression_loss: 1.5542 - classification_loss: 0.3587 21/500 [>.............................] - ETA: 1:59 - loss: 1.9002 - regression_loss: 1.5454 - classification_loss: 0.3548 22/500 [>.............................] - ETA: 1:59 - loss: 1.8850 - regression_loss: 1.5390 - classification_loss: 0.3461 23/500 [>.............................] - ETA: 1:59 - loss: 1.8961 - regression_loss: 1.5522 - classification_loss: 0.3439 24/500 [>.............................] - ETA: 1:59 - loss: 1.9109 - regression_loss: 1.5680 - classification_loss: 0.3429 25/500 [>.............................] - ETA: 1:58 - loss: 1.9049 - regression_loss: 1.5649 - classification_loss: 0.3399 26/500 [>.............................] - ETA: 1:58 - loss: 1.9101 - regression_loss: 1.5689 - classification_loss: 0.3412 27/500 [>.............................] - ETA: 1:58 - loss: 1.9220 - regression_loss: 1.5762 - classification_loss: 0.3458 28/500 [>.............................] - ETA: 1:58 - loss: 1.9134 - regression_loss: 1.5701 - classification_loss: 0.3433 29/500 [>.............................] - ETA: 1:58 - loss: 1.9103 - regression_loss: 1.5694 - classification_loss: 0.3409 30/500 [>.............................] 
- ETA: 1:57 - loss: 1.9192 - regression_loss: 1.5780 - classification_loss: 0.3411 31/500 [>.............................] - ETA: 1:57 - loss: 1.8852 - regression_loss: 1.5499 - classification_loss: 0.3353 32/500 [>.............................] - ETA: 1:57 - loss: 1.8753 - regression_loss: 1.5431 - classification_loss: 0.3322 33/500 [>.............................] - ETA: 1:56 - loss: 1.8708 - regression_loss: 1.5400 - classification_loss: 0.3309 34/500 [=>............................] - ETA: 1:56 - loss: 1.8438 - regression_loss: 1.5183 - classification_loss: 0.3255 35/500 [=>............................] - ETA: 1:56 - loss: 1.8716 - regression_loss: 1.5362 - classification_loss: 0.3354 36/500 [=>............................] - ETA: 1:56 - loss: 1.8739 - regression_loss: 1.5429 - classification_loss: 0.3310 37/500 [=>............................] - ETA: 1:56 - loss: 1.8671 - regression_loss: 1.5392 - classification_loss: 0.3279 38/500 [=>............................] - ETA: 1:55 - loss: 1.8669 - regression_loss: 1.5399 - classification_loss: 0.3270 39/500 [=>............................] - ETA: 1:55 - loss: 1.8884 - regression_loss: 1.5577 - classification_loss: 0.3307 40/500 [=>............................] - ETA: 1:55 - loss: 1.9169 - regression_loss: 1.5826 - classification_loss: 0.3343 41/500 [=>............................] - ETA: 1:55 - loss: 1.9550 - regression_loss: 1.6193 - classification_loss: 0.3357 42/500 [=>............................] - ETA: 1:54 - loss: 1.9564 - regression_loss: 1.6206 - classification_loss: 0.3358 43/500 [=>............................] - ETA: 1:54 - loss: 1.9655 - regression_loss: 1.6262 - classification_loss: 0.3393 44/500 [=>............................] - ETA: 1:54 - loss: 1.9513 - regression_loss: 1.6142 - classification_loss: 0.3371 45/500 [=>............................] - ETA: 1:54 - loss: 1.9600 - regression_loss: 1.6211 - classification_loss: 0.3389 46/500 [=>............................] 
- ETA: 1:53 - loss: 1.9507 - regression_loss: 1.6141 - classification_loss: 0.3366 47/500 [=>............................] - ETA: 1:53 - loss: 1.9677 - regression_loss: 1.6261 - classification_loss: 0.3416 48/500 [=>............................] - ETA: 1:53 - loss: 1.9726 - regression_loss: 1.6304 - classification_loss: 0.3423 49/500 [=>............................] - ETA: 1:53 - loss: 1.9628 - regression_loss: 1.6197 - classification_loss: 0.3431 50/500 [==>...........................] - ETA: 1:53 - loss: 1.9473 - regression_loss: 1.6069 - classification_loss: 0.3404 51/500 [==>...........................] - ETA: 1:52 - loss: 1.9447 - regression_loss: 1.6061 - classification_loss: 0.3385 52/500 [==>...........................] - ETA: 1:52 - loss: 1.9586 - regression_loss: 1.6164 - classification_loss: 0.3421 53/500 [==>...........................] - ETA: 1:52 - loss: 1.9702 - regression_loss: 1.6248 - classification_loss: 0.3454 54/500 [==>...........................] - ETA: 1:51 - loss: 1.9806 - regression_loss: 1.6325 - classification_loss: 0.3481 55/500 [==>...........................] - ETA: 1:51 - loss: 1.9921 - regression_loss: 1.6455 - classification_loss: 0.3466 56/500 [==>...........................] - ETA: 1:51 - loss: 1.9941 - regression_loss: 1.6475 - classification_loss: 0.3466 57/500 [==>...........................] - ETA: 1:51 - loss: 1.9887 - regression_loss: 1.6443 - classification_loss: 0.3444 58/500 [==>...........................] - ETA: 1:50 - loss: 1.9953 - regression_loss: 1.6508 - classification_loss: 0.3444 59/500 [==>...........................] - ETA: 1:50 - loss: 2.0036 - regression_loss: 1.6562 - classification_loss: 0.3475 60/500 [==>...........................] - ETA: 1:49 - loss: 2.0117 - regression_loss: 1.6619 - classification_loss: 0.3499 61/500 [==>...........................] - ETA: 1:49 - loss: 2.0204 - regression_loss: 1.6676 - classification_loss: 0.3528 62/500 [==>...........................] 
- ETA: 1:48 - loss: 2.0126 - regression_loss: 1.6624 - classification_loss: 0.3502 63/500 [==>...........................] - ETA: 1:48 - loss: 2.0090 - regression_loss: 1.6609 - classification_loss: 0.3480 64/500 [==>...........................] - ETA: 1:48 - loss: 2.0194 - regression_loss: 1.6693 - classification_loss: 0.3501 65/500 [==>...........................] - ETA: 1:48 - loss: 2.0195 - regression_loss: 1.6706 - classification_loss: 0.3488 66/500 [==>...........................] - ETA: 1:48 - loss: 2.0178 - regression_loss: 1.6693 - classification_loss: 0.3485 67/500 [===>..........................] - ETA: 1:47 - loss: 2.0351 - regression_loss: 1.6854 - classification_loss: 0.3497 68/500 [===>..........................] - ETA: 1:47 - loss: 2.0331 - regression_loss: 1.6838 - classification_loss: 0.3493 69/500 [===>..........................] - ETA: 1:47 - loss: 2.0363 - regression_loss: 1.6866 - classification_loss: 0.3497 70/500 [===>..........................] - ETA: 1:47 - loss: 2.0331 - regression_loss: 1.6845 - classification_loss: 0.3486 71/500 [===>..........................] - ETA: 1:46 - loss: 2.0197 - regression_loss: 1.6736 - classification_loss: 0.3461 72/500 [===>..........................] - ETA: 1:46 - loss: 2.0165 - regression_loss: 1.6718 - classification_loss: 0.3447 73/500 [===>..........................] - ETA: 1:46 - loss: 2.0173 - regression_loss: 1.6717 - classification_loss: 0.3457 74/500 [===>..........................] - ETA: 1:46 - loss: 2.0173 - regression_loss: 1.6726 - classification_loss: 0.3447 75/500 [===>..........................] - ETA: 1:45 - loss: 2.0126 - regression_loss: 1.6696 - classification_loss: 0.3430 76/500 [===>..........................] - ETA: 1:45 - loss: 2.0107 - regression_loss: 1.6689 - classification_loss: 0.3417 77/500 [===>..........................] - ETA: 1:45 - loss: 2.0077 - regression_loss: 1.6667 - classification_loss: 0.3410 78/500 [===>..........................] 
- ETA: 1:45 - loss: 2.0098 - regression_loss: 1.6678 - classification_loss: 0.3420 79/500 [===>..........................] - ETA: 1:44 - loss: 2.0087 - regression_loss: 1.6677 - classification_loss: 0.3410 80/500 [===>..........................] - ETA: 1:44 - loss: 2.0070 - regression_loss: 1.6665 - classification_loss: 0.3405 81/500 [===>..........................] - ETA: 1:44 - loss: 2.0088 - regression_loss: 1.6688 - classification_loss: 0.3400 82/500 [===>..........................] - ETA: 1:44 - loss: 1.9927 - regression_loss: 1.6552 - classification_loss: 0.3375 83/500 [===>..........................] - ETA: 1:43 - loss: 1.9948 - regression_loss: 1.6562 - classification_loss: 0.3386 84/500 [====>.........................] - ETA: 1:43 - loss: 2.0018 - regression_loss: 1.6630 - classification_loss: 0.3388 85/500 [====>.........................] - ETA: 1:43 - loss: 1.9943 - regression_loss: 1.6576 - classification_loss: 0.3367 86/500 [====>.........................] - ETA: 1:43 - loss: 1.9850 - regression_loss: 1.6500 - classification_loss: 0.3350 87/500 [====>.........................] - ETA: 1:43 - loss: 1.9912 - regression_loss: 1.6549 - classification_loss: 0.3363 88/500 [====>.........................] - ETA: 1:42 - loss: 1.9929 - regression_loss: 1.6557 - classification_loss: 0.3371 89/500 [====>.........................] - ETA: 1:42 - loss: 1.9954 - regression_loss: 1.6579 - classification_loss: 0.3375 90/500 [====>.........................] - ETA: 1:42 - loss: 1.9905 - regression_loss: 1.6541 - classification_loss: 0.3364 91/500 [====>.........................] - ETA: 1:42 - loss: 1.9867 - regression_loss: 1.6523 - classification_loss: 0.3343 92/500 [====>.........................] - ETA: 1:41 - loss: 1.9876 - regression_loss: 1.6526 - classification_loss: 0.3350 93/500 [====>.........................] - ETA: 1:41 - loss: 1.9976 - regression_loss: 1.6612 - classification_loss: 0.3364 94/500 [====>.........................] 
- ETA: 1:41 - loss: 1.9885 - regression_loss: 1.6542 - classification_loss: 0.3343 95/500 [====>.........................] - ETA: 1:41 - loss: 1.9912 - regression_loss: 1.6562 - classification_loss: 0.3351 96/500 [====>.........................] - ETA: 1:40 - loss: 1.9976 - regression_loss: 1.6615 - classification_loss: 0.3361 97/500 [====>.........................] - ETA: 1:40 - loss: 1.9950 - regression_loss: 1.6597 - classification_loss: 0.3353 98/500 [====>.........................] - ETA: 1:40 - loss: 1.9966 - regression_loss: 1.6608 - classification_loss: 0.3357 99/500 [====>.........................] - ETA: 1:40 - loss: 1.9959 - regression_loss: 1.6610 - classification_loss: 0.3348 100/500 [=====>........................] - ETA: 1:39 - loss: 2.0042 - regression_loss: 1.6667 - classification_loss: 0.3375 101/500 [=====>........................] - ETA: 1:39 - loss: 2.0083 - regression_loss: 1.6695 - classification_loss: 0.3388 102/500 [=====>........................] - ETA: 1:39 - loss: 1.9978 - regression_loss: 1.6607 - classification_loss: 0.3371 103/500 [=====>........................] - ETA: 1:39 - loss: 1.9971 - regression_loss: 1.6599 - classification_loss: 0.3372 104/500 [=====>........................] - ETA: 1:39 - loss: 1.9957 - regression_loss: 1.6599 - classification_loss: 0.3358 105/500 [=====>........................] - ETA: 1:38 - loss: 1.9963 - regression_loss: 1.6613 - classification_loss: 0.3350 106/500 [=====>........................] - ETA: 1:38 - loss: 1.9932 - regression_loss: 1.6589 - classification_loss: 0.3342 107/500 [=====>........................] - ETA: 1:38 - loss: 1.9845 - regression_loss: 1.6524 - classification_loss: 0.3321 108/500 [=====>........................] - ETA: 1:38 - loss: 1.9827 - regression_loss: 1.6512 - classification_loss: 0.3314 109/500 [=====>........................] - ETA: 1:37 - loss: 1.9819 - regression_loss: 1.6508 - classification_loss: 0.3311 110/500 [=====>........................] 
- ETA: 1:37 - loss: 1.9855 - regression_loss: 1.6543 - classification_loss: 0.3312 111/500 [=====>........................] - ETA: 1:37 - loss: 1.9826 - regression_loss: 1.6523 - classification_loss: 0.3303 112/500 [=====>........................] - ETA: 1:37 - loss: 1.9789 - regression_loss: 1.6498 - classification_loss: 0.3291 113/500 [=====>........................] - ETA: 1:36 - loss: 1.9795 - regression_loss: 1.6504 - classification_loss: 0.3291 114/500 [=====>........................] - ETA: 1:36 - loss: 1.9754 - regression_loss: 1.6475 - classification_loss: 0.3278 115/500 [=====>........................] - ETA: 1:36 - loss: 1.9730 - regression_loss: 1.6455 - classification_loss: 0.3274 116/500 [=====>........................] - ETA: 1:36 - loss: 1.9857 - regression_loss: 1.6555 - classification_loss: 0.3302 117/500 [======>.......................] - ETA: 1:35 - loss: 1.9882 - regression_loss: 1.6576 - classification_loss: 0.3306 118/500 [======>.......................] - ETA: 1:35 - loss: 1.9864 - regression_loss: 1.6562 - classification_loss: 0.3302 119/500 [======>.......................] - ETA: 1:35 - loss: 1.9895 - regression_loss: 1.6576 - classification_loss: 0.3319 120/500 [======>.......................] - ETA: 1:35 - loss: 1.9815 - regression_loss: 1.6512 - classification_loss: 0.3304 121/500 [======>.......................] - ETA: 1:34 - loss: 1.9805 - regression_loss: 1.6509 - classification_loss: 0.3297 122/500 [======>.......................] - ETA: 1:34 - loss: 1.9728 - regression_loss: 1.6449 - classification_loss: 0.3279 123/500 [======>.......................] - ETA: 1:34 - loss: 1.9718 - regression_loss: 1.6446 - classification_loss: 0.3272 124/500 [======>.......................] - ETA: 1:34 - loss: 1.9884 - regression_loss: 1.6556 - classification_loss: 0.3328 125/500 [======>.......................] - ETA: 1:33 - loss: 1.9927 - regression_loss: 1.6587 - classification_loss: 0.3340 126/500 [======>.......................] 
- ETA: 1:33 - loss: 1.9941 - regression_loss: 1.6594 - classification_loss: 0.3347 127/500 [======>.......................] - ETA: 1:33 - loss: 1.9841 - regression_loss: 1.6513 - classification_loss: 0.3328 128/500 [======>.......................] - ETA: 1:33 - loss: 1.9937 - regression_loss: 1.6594 - classification_loss: 0.3343 129/500 [======>.......................] - ETA: 1:32 - loss: 1.9897 - regression_loss: 1.6565 - classification_loss: 0.3333 130/500 [======>.......................] - ETA: 1:32 - loss: 1.9862 - regression_loss: 1.6536 - classification_loss: 0.3325 131/500 [======>.......................] - ETA: 1:32 - loss: 1.9754 - regression_loss: 1.6445 - classification_loss: 0.3309 132/500 [======>.......................] - ETA: 1:32 - loss: 1.9782 - regression_loss: 1.6470 - classification_loss: 0.3312 133/500 [======>.......................] - ETA: 1:31 - loss: 1.9794 - regression_loss: 1.6477 - classification_loss: 0.3317 134/500 [=======>......................] - ETA: 1:31 - loss: 1.9810 - regression_loss: 1.6484 - classification_loss: 0.3327 135/500 [=======>......................] - ETA: 1:31 - loss: 1.9800 - regression_loss: 1.6472 - classification_loss: 0.3328 136/500 [=======>......................] - ETA: 1:31 - loss: 1.9799 - regression_loss: 1.6480 - classification_loss: 0.3320 137/500 [=======>......................] - ETA: 1:30 - loss: 1.9845 - regression_loss: 1.6505 - classification_loss: 0.3340 138/500 [=======>......................] - ETA: 1:30 - loss: 1.9850 - regression_loss: 1.6509 - classification_loss: 0.3341 139/500 [=======>......................] - ETA: 1:30 - loss: 1.9810 - regression_loss: 1.6483 - classification_loss: 0.3326 140/500 [=======>......................] - ETA: 1:30 - loss: 1.9785 - regression_loss: 1.6466 - classification_loss: 0.3318 141/500 [=======>......................] - ETA: 1:29 - loss: 1.9758 - regression_loss: 1.6450 - classification_loss: 0.3308 142/500 [=======>......................] 
- ETA: 1:29 - loss: 1.9829 - regression_loss: 1.6510 - classification_loss: 0.3319 143/500 [=======>......................] - ETA: 1:29 - loss: 1.9824 - regression_loss: 1.6501 - classification_loss: 0.3323 144/500 [=======>......................] - ETA: 1:29 - loss: 1.9780 - regression_loss: 1.6467 - classification_loss: 0.3313 145/500 [=======>......................] - ETA: 1:28 - loss: 1.9728 - regression_loss: 1.6427 - classification_loss: 0.3301 146/500 [=======>......................] - ETA: 1:28 - loss: 1.9721 - regression_loss: 1.6426 - classification_loss: 0.3294 147/500 [=======>......................] - ETA: 1:28 - loss: 1.9795 - regression_loss: 1.6487 - classification_loss: 0.3308 148/500 [=======>......................] - ETA: 1:28 - loss: 1.9846 - regression_loss: 1.6521 - classification_loss: 0.3325 149/500 [=======>......................] - ETA: 1:27 - loss: 1.9857 - regression_loss: 1.6526 - classification_loss: 0.3330 150/500 [========>.....................] - ETA: 1:27 - loss: 1.9892 - regression_loss: 1.6540 - classification_loss: 0.3351 151/500 [========>.....................] - ETA: 1:27 - loss: 1.9862 - regression_loss: 1.6520 - classification_loss: 0.3341 152/500 [========>.....................] - ETA: 1:27 - loss: 1.9849 - regression_loss: 1.6502 - classification_loss: 0.3347 153/500 [========>.....................] - ETA: 1:26 - loss: 1.9862 - regression_loss: 1.6490 - classification_loss: 0.3372 154/500 [========>.....................] - ETA: 1:26 - loss: 1.9912 - regression_loss: 1.6530 - classification_loss: 0.3382 155/500 [========>.....................] - ETA: 1:26 - loss: 1.9916 - regression_loss: 1.6533 - classification_loss: 0.3383 156/500 [========>.....................] - ETA: 1:26 - loss: 1.9952 - regression_loss: 1.6561 - classification_loss: 0.3391 157/500 [========>.....................] - ETA: 1:25 - loss: 1.9948 - regression_loss: 1.6557 - classification_loss: 0.3391 158/500 [========>.....................] 
- ETA: 1:25 - loss: 1.9937 - regression_loss: 1.6552 - classification_loss: 0.3385 159/500 [========>.....................] - ETA: 1:25 - loss: 1.9935 - regression_loss: 1.6556 - classification_loss: 0.3378 160/500 [========>.....................] - ETA: 1:25 - loss: 1.9955 - regression_loss: 1.6572 - classification_loss: 0.3383 161/500 [========>.....................] - ETA: 1:24 - loss: 2.0010 - regression_loss: 1.6619 - classification_loss: 0.3391 162/500 [========>.....................] - ETA: 1:24 - loss: 2.0031 - regression_loss: 1.6638 - classification_loss: 0.3393 163/500 [========>.....................] - ETA: 1:24 - loss: 2.0016 - regression_loss: 1.6625 - classification_loss: 0.3391 164/500 [========>.....................] - ETA: 1:24 - loss: 2.0029 - regression_loss: 1.6635 - classification_loss: 0.3394 165/500 [========>.....................] - ETA: 1:23 - loss: 2.0052 - regression_loss: 1.6657 - classification_loss: 0.3395 166/500 [========>.....................] - ETA: 1:23 - loss: 2.0110 - regression_loss: 1.6710 - classification_loss: 0.3400 167/500 [=========>....................] - ETA: 1:23 - loss: 2.0084 - regression_loss: 1.6686 - classification_loss: 0.3398 168/500 [=========>....................] - ETA: 1:23 - loss: 2.0102 - regression_loss: 1.6697 - classification_loss: 0.3405 169/500 [=========>....................] - ETA: 1:22 - loss: 2.0202 - regression_loss: 1.6778 - classification_loss: 0.3425 170/500 [=========>....................] - ETA: 1:22 - loss: 2.0214 - regression_loss: 1.6792 - classification_loss: 0.3422 171/500 [=========>....................] - ETA: 1:22 - loss: 2.0229 - regression_loss: 1.6802 - classification_loss: 0.3427 172/500 [=========>....................] - ETA: 1:22 - loss: 2.0222 - regression_loss: 1.6799 - classification_loss: 0.3423 173/500 [=========>....................] - ETA: 1:21 - loss: 2.0177 - regression_loss: 1.6762 - classification_loss: 0.3415 174/500 [=========>....................] 
- ETA: 1:21 - loss: 2.0212 - regression_loss: 1.6785 - classification_loss: 0.3427 175/500 [=========>....................] - ETA: 1:21 - loss: 2.0168 - regression_loss: 1.6750 - classification_loss: 0.3418 176/500 [=========>....................] - ETA: 1:21 - loss: 2.0122 - regression_loss: 1.6714 - classification_loss: 0.3408 177/500 [=========>....................] - ETA: 1:20 - loss: 2.0117 - regression_loss: 1.6708 - classification_loss: 0.3409 178/500 [=========>....................] - ETA: 1:20 - loss: 2.0152 - regression_loss: 1.6734 - classification_loss: 0.3418 179/500 [=========>....................] - ETA: 1:20 - loss: 2.0110 - regression_loss: 1.6702 - classification_loss: 0.3408 180/500 [=========>....................] - ETA: 1:20 - loss: 2.0108 - regression_loss: 1.6704 - classification_loss: 0.3404 181/500 [=========>....................] - ETA: 1:19 - loss: 2.0143 - regression_loss: 1.6726 - classification_loss: 0.3416 182/500 [=========>....................] - ETA: 1:19 - loss: 2.0164 - regression_loss: 1.6741 - classification_loss: 0.3423 183/500 [=========>....................] - ETA: 1:19 - loss: 2.0128 - regression_loss: 1.6715 - classification_loss: 0.3413 184/500 [==========>...................] - ETA: 1:19 - loss: 2.0048 - regression_loss: 1.6648 - classification_loss: 0.3401 185/500 [==========>...................] - ETA: 1:18 - loss: 2.0081 - regression_loss: 1.6674 - classification_loss: 0.3407 186/500 [==========>...................] - ETA: 1:18 - loss: 2.0197 - regression_loss: 1.6584 - classification_loss: 0.3613 187/500 [==========>...................] - ETA: 1:18 - loss: 2.0196 - regression_loss: 1.6589 - classification_loss: 0.3607 188/500 [==========>...................] - ETA: 1:18 - loss: 2.0192 - regression_loss: 1.6588 - classification_loss: 0.3605 189/500 [==========>...................] - ETA: 1:17 - loss: 2.0196 - regression_loss: 1.6589 - classification_loss: 0.3607 190/500 [==========>...................] 
- ETA: 1:17 - loss: 2.0182 - regression_loss: 1.6579 - classification_loss: 0.3603 191/500 [==========>...................] - ETA: 1:17 - loss: 2.0154 - regression_loss: 1.6558 - classification_loss: 0.3596 192/500 [==========>...................] - ETA: 1:17 - loss: 2.0118 - regression_loss: 1.6529 - classification_loss: 0.3588 193/500 [==========>...................] - ETA: 1:16 - loss: 2.0136 - regression_loss: 1.6543 - classification_loss: 0.3593 194/500 [==========>...................] - ETA: 1:16 - loss: 2.0134 - regression_loss: 1.6544 - classification_loss: 0.3590 195/500 [==========>...................] - ETA: 1:16 - loss: 2.0189 - regression_loss: 1.6597 - classification_loss: 0.3592 196/500 [==========>...................] - ETA: 1:16 - loss: 2.0169 - regression_loss: 1.6581 - classification_loss: 0.3588 197/500 [==========>...................] - ETA: 1:15 - loss: 2.0167 - regression_loss: 1.6575 - classification_loss: 0.3592 198/500 [==========>...................] - ETA: 1:15 - loss: 2.0144 - regression_loss: 1.6555 - classification_loss: 0.3589 199/500 [==========>...................] - ETA: 1:15 - loss: 2.0159 - regression_loss: 1.6572 - classification_loss: 0.3587 200/500 [===========>..................] - ETA: 1:15 - loss: 2.0137 - regression_loss: 1.6560 - classification_loss: 0.3578 201/500 [===========>..................] - ETA: 1:14 - loss: 2.0131 - regression_loss: 1.6556 - classification_loss: 0.3575 202/500 [===========>..................] - ETA: 1:14 - loss: 2.0133 - regression_loss: 1.6559 - classification_loss: 0.3574 203/500 [===========>..................] - ETA: 1:14 - loss: 2.0111 - regression_loss: 1.6544 - classification_loss: 0.3567 204/500 [===========>..................] - ETA: 1:14 - loss: 2.0116 - regression_loss: 1.6552 - classification_loss: 0.3564 205/500 [===========>..................] - ETA: 1:13 - loss: 2.0137 - regression_loss: 1.6572 - classification_loss: 0.3565 206/500 [===========>..................] 
- ETA: 1:13 - loss: 2.0153 - regression_loss: 1.6582 - classification_loss: 0.3571 207/500 [===========>..................] - ETA: 1:13 - loss: 2.0143 - regression_loss: 1.6579 - classification_loss: 0.3565 208/500 [===========>..................] - ETA: 1:13 - loss: 2.0134 - regression_loss: 1.6574 - classification_loss: 0.3560 209/500 [===========>..................] - ETA: 1:12 - loss: 2.0188 - regression_loss: 1.6613 - classification_loss: 0.3575 210/500 [===========>..................] - ETA: 1:12 - loss: 2.0214 - regression_loss: 1.6623 - classification_loss: 0.3591 211/500 [===========>..................] - ETA: 1:12 - loss: 2.0234 - regression_loss: 1.6639 - classification_loss: 0.3595 212/500 [===========>..................] - ETA: 1:12 - loss: 2.0238 - regression_loss: 1.6636 - classification_loss: 0.3601 213/500 [===========>..................] - ETA: 1:11 - loss: 2.0227 - regression_loss: 1.6627 - classification_loss: 0.3600 214/500 [===========>..................] - ETA: 1:11 - loss: 2.0230 - regression_loss: 1.6631 - classification_loss: 0.3599 215/500 [===========>..................] - ETA: 1:11 - loss: 2.0283 - regression_loss: 1.6683 - classification_loss: 0.3600 216/500 [===========>..................] - ETA: 1:11 - loss: 2.0285 - regression_loss: 1.6692 - classification_loss: 0.3592 217/500 [============>.................] - ETA: 1:10 - loss: 2.0264 - regression_loss: 1.6674 - classification_loss: 0.3589 218/500 [============>.................] - ETA: 1:10 - loss: 2.0266 - regression_loss: 1.6680 - classification_loss: 0.3586 219/500 [============>.................] - ETA: 1:10 - loss: 2.0229 - regression_loss: 1.6651 - classification_loss: 0.3578 220/500 [============>.................] - ETA: 1:10 - loss: 2.0225 - regression_loss: 1.6650 - classification_loss: 0.3575 221/500 [============>.................] - ETA: 1:09 - loss: 2.0223 - regression_loss: 1.6650 - classification_loss: 0.3574 222/500 [============>.................] 
- ETA: 1:09 - loss: 2.0244 - regression_loss: 1.6669 - classification_loss: 0.3575 223/500 [============>.................] - ETA: 1:09 - loss: 2.0279 - regression_loss: 1.6701 - classification_loss: 0.3579 224/500 [============>.................] - ETA: 1:09 - loss: 2.0298 - regression_loss: 1.6712 - classification_loss: 0.3586 225/500 [============>.................] - ETA: 1:08 - loss: 2.0298 - regression_loss: 1.6717 - classification_loss: 0.3582 226/500 [============>.................] - ETA: 1:08 - loss: 2.0300 - regression_loss: 1.6719 - classification_loss: 0.3581 227/500 [============>.................] - ETA: 1:08 - loss: 2.0280 - regression_loss: 1.6705 - classification_loss: 0.3575 228/500 [============>.................] - ETA: 1:07 - loss: 2.0311 - regression_loss: 1.6735 - classification_loss: 0.3576 229/500 [============>.................] - ETA: 1:07 - loss: 2.0308 - regression_loss: 1.6733 - classification_loss: 0.3575 230/500 [============>.................] - ETA: 1:07 - loss: 2.0328 - regression_loss: 1.6748 - classification_loss: 0.3580 231/500 [============>.................] - ETA: 1:07 - loss: 2.0322 - regression_loss: 1.6741 - classification_loss: 0.3580 232/500 [============>.................] - ETA: 1:06 - loss: 2.0321 - regression_loss: 1.6741 - classification_loss: 0.3580 233/500 [============>.................] - ETA: 1:06 - loss: 2.0315 - regression_loss: 1.6734 - classification_loss: 0.3581 234/500 [=============>................] - ETA: 1:06 - loss: 2.0346 - regression_loss: 1.6765 - classification_loss: 0.3581 235/500 [=============>................] - ETA: 1:06 - loss: 2.0320 - regression_loss: 1.6745 - classification_loss: 0.3575 236/500 [=============>................] - ETA: 1:05 - loss: 2.0360 - regression_loss: 1.6782 - classification_loss: 0.3578 237/500 [=============>................] - ETA: 1:05 - loss: 2.0353 - regression_loss: 1.6778 - classification_loss: 0.3576 238/500 [=============>................] 
- ETA: 1:05 - loss: 2.0367 - regression_loss: 1.6791 - classification_loss: 0.3576 239/500 [=============>................] - ETA: 1:05 - loss: 2.0361 - regression_loss: 1.6786 - classification_loss: 0.3575 240/500 [=============>................] - ETA: 1:04 - loss: 2.0345 - regression_loss: 1.6774 - classification_loss: 0.3571 241/500 [=============>................] - ETA: 1:04 - loss: 2.0354 - regression_loss: 1.6788 - classification_loss: 0.3566 242/500 [=============>................] - ETA: 1:04 - loss: 2.0359 - regression_loss: 1.6794 - classification_loss: 0.3565 243/500 [=============>................] - ETA: 1:04 - loss: 2.0346 - regression_loss: 1.6783 - classification_loss: 0.3563 244/500 [=============>................] - ETA: 1:03 - loss: 2.0328 - regression_loss: 1.6770 - classification_loss: 0.3558 245/500 [=============>................] - ETA: 1:03 - loss: 2.0376 - regression_loss: 1.6811 - classification_loss: 0.3565 246/500 [=============>................] - ETA: 1:03 - loss: 2.0378 - regression_loss: 1.6813 - classification_loss: 0.3566 247/500 [=============>................] - ETA: 1:03 - loss: 2.0344 - regression_loss: 1.6781 - classification_loss: 0.3563 248/500 [=============>................] - ETA: 1:02 - loss: 2.0326 - regression_loss: 1.6765 - classification_loss: 0.3561 249/500 [=============>................] - ETA: 1:02 - loss: 2.0311 - regression_loss: 1.6757 - classification_loss: 0.3554 250/500 [==============>...............] - ETA: 1:02 - loss: 2.0310 - regression_loss: 1.6758 - classification_loss: 0.3552 251/500 [==============>...............] - ETA: 1:02 - loss: 2.0320 - regression_loss: 1.6765 - classification_loss: 0.3555 252/500 [==============>...............] - ETA: 1:01 - loss: 2.0315 - regression_loss: 1.6762 - classification_loss: 0.3553 253/500 [==============>...............] - ETA: 1:01 - loss: 2.0302 - regression_loss: 1.6755 - classification_loss: 0.3547 254/500 [==============>...............] 
- ETA: 1:01 - loss: 2.0322 - regression_loss: 1.6772 - classification_loss: 0.3549
500/500 [==============================] - 125s 250ms/step - loss: 2.0255 - regression_loss: 1.6747 - classification_loss: 0.3509
326 instances of class plum with average precision: 0.6709
mAP: 0.6709
Epoch 00022: saving model to ./training/snapshots/resnet50_pascal_22.h5
Epoch 23/150
  8/500 [..............................] - ETA: 1:58 - loss: 2.1420 - regression_loss: 1.7800 - classification_loss: 0.3620
  9/500 [..............................]
- ETA: 1:58 - loss: 2.1344 - regression_loss: 1.7857 - classification_loss: 0.3487
 88/500 [====>.........................] - ETA: 1:42 - loss: 2.0062 - regression_loss: 1.6657 - classification_loss: 0.3406
 89/500 [====>.........................]
- ETA: 1:41 - loss: 2.0030 - regression_loss: 1.6633 - classification_loss: 0.3397 90/500 [====>.........................] - ETA: 1:41 - loss: 2.0024 - regression_loss: 1.6625 - classification_loss: 0.3399 91/500 [====>.........................] - ETA: 1:40 - loss: 2.0074 - regression_loss: 1.6668 - classification_loss: 0.3406 92/500 [====>.........................] - ETA: 1:40 - loss: 2.0096 - regression_loss: 1.6683 - classification_loss: 0.3412 93/500 [====>.........................] - ETA: 1:40 - loss: 1.9968 - regression_loss: 1.6582 - classification_loss: 0.3385 94/500 [====>.........................] - ETA: 1:40 - loss: 1.9922 - regression_loss: 1.6543 - classification_loss: 0.3379 95/500 [====>.........................] - ETA: 1:39 - loss: 1.9945 - regression_loss: 1.6540 - classification_loss: 0.3405 96/500 [====>.........................] - ETA: 1:39 - loss: 1.9899 - regression_loss: 1.6507 - classification_loss: 0.3392 97/500 [====>.........................] - ETA: 1:39 - loss: 1.9841 - regression_loss: 1.6452 - classification_loss: 0.3389 98/500 [====>.........................] - ETA: 1:39 - loss: 1.9900 - regression_loss: 1.6500 - classification_loss: 0.3400 99/500 [====>.........................] - ETA: 1:39 - loss: 1.9860 - regression_loss: 1.6468 - classification_loss: 0.3392 100/500 [=====>........................] - ETA: 1:38 - loss: 1.9789 - regression_loss: 1.6415 - classification_loss: 0.3374 101/500 [=====>........................] - ETA: 1:38 - loss: 1.9803 - regression_loss: 1.6425 - classification_loss: 0.3378 102/500 [=====>........................] - ETA: 1:38 - loss: 1.9833 - regression_loss: 1.6450 - classification_loss: 0.3383 103/500 [=====>........................] - ETA: 1:38 - loss: 1.9828 - regression_loss: 1.6451 - classification_loss: 0.3376 104/500 [=====>........................] - ETA: 1:37 - loss: 1.9839 - regression_loss: 1.6461 - classification_loss: 0.3378 105/500 [=====>........................] 
- ETA: 1:37 - loss: 1.9921 - regression_loss: 1.6498 - classification_loss: 0.3423 106/500 [=====>........................] - ETA: 1:37 - loss: 1.9980 - regression_loss: 1.6547 - classification_loss: 0.3433 107/500 [=====>........................] - ETA: 1:37 - loss: 2.0044 - regression_loss: 1.6593 - classification_loss: 0.3451 108/500 [=====>........................] - ETA: 1:36 - loss: 1.9927 - regression_loss: 1.6500 - classification_loss: 0.3427 109/500 [=====>........................] - ETA: 1:36 - loss: 1.9906 - regression_loss: 1.6484 - classification_loss: 0.3422 110/500 [=====>........................] - ETA: 1:36 - loss: 1.9807 - regression_loss: 1.6406 - classification_loss: 0.3401 111/500 [=====>........................] - ETA: 1:36 - loss: 1.9763 - regression_loss: 1.6373 - classification_loss: 0.3390 112/500 [=====>........................] - ETA: 1:35 - loss: 1.9761 - regression_loss: 1.6378 - classification_loss: 0.3383 113/500 [=====>........................] - ETA: 1:35 - loss: 1.9776 - regression_loss: 1.6387 - classification_loss: 0.3389 114/500 [=====>........................] - ETA: 1:35 - loss: 1.9748 - regression_loss: 1.6367 - classification_loss: 0.3381 115/500 [=====>........................] - ETA: 1:35 - loss: 1.9739 - regression_loss: 1.6362 - classification_loss: 0.3377 116/500 [=====>........................] - ETA: 1:35 - loss: 1.9801 - regression_loss: 1.6411 - classification_loss: 0.3389 117/500 [======>.......................] - ETA: 1:34 - loss: 1.9808 - regression_loss: 1.6412 - classification_loss: 0.3396 118/500 [======>.......................] - ETA: 1:34 - loss: 1.9790 - regression_loss: 1.6398 - classification_loss: 0.3392 119/500 [======>.......................] - ETA: 1:34 - loss: 1.9820 - regression_loss: 1.6416 - classification_loss: 0.3404 120/500 [======>.......................] - ETA: 1:34 - loss: 1.9780 - regression_loss: 1.6396 - classification_loss: 0.3385 121/500 [======>.......................] 
- ETA: 1:33 - loss: 1.9775 - regression_loss: 1.6393 - classification_loss: 0.3382 122/500 [======>.......................] - ETA: 1:33 - loss: 1.9733 - regression_loss: 1.6363 - classification_loss: 0.3370 123/500 [======>.......................] - ETA: 1:33 - loss: 1.9681 - regression_loss: 1.6320 - classification_loss: 0.3361 124/500 [======>.......................] - ETA: 1:33 - loss: 1.9699 - regression_loss: 1.6337 - classification_loss: 0.3362 125/500 [======>.......................] - ETA: 1:33 - loss: 1.9928 - regression_loss: 1.6405 - classification_loss: 0.3523 126/500 [======>.......................] - ETA: 1:32 - loss: 1.9954 - regression_loss: 1.6427 - classification_loss: 0.3526 127/500 [======>.......................] - ETA: 1:32 - loss: 1.9952 - regression_loss: 1.6423 - classification_loss: 0.3529 128/500 [======>.......................] - ETA: 1:32 - loss: 1.9890 - regression_loss: 1.6376 - classification_loss: 0.3514 129/500 [======>.......................] - ETA: 1:32 - loss: 1.9919 - regression_loss: 1.6402 - classification_loss: 0.3517 130/500 [======>.......................] - ETA: 1:31 - loss: 1.9938 - regression_loss: 1.6414 - classification_loss: 0.3524 131/500 [======>.......................] - ETA: 1:31 - loss: 1.9913 - regression_loss: 1.6395 - classification_loss: 0.3518 132/500 [======>.......................] - ETA: 1:31 - loss: 1.9907 - regression_loss: 1.6402 - classification_loss: 0.3505 133/500 [======>.......................] - ETA: 1:31 - loss: 1.9909 - regression_loss: 1.6407 - classification_loss: 0.3502 134/500 [=======>......................] - ETA: 1:30 - loss: 1.9977 - regression_loss: 1.6466 - classification_loss: 0.3512 135/500 [=======>......................] - ETA: 1:30 - loss: 2.0007 - regression_loss: 1.6487 - classification_loss: 0.3520 136/500 [=======>......................] - ETA: 1:30 - loss: 1.9928 - regression_loss: 1.6423 - classification_loss: 0.3506 137/500 [=======>......................] 
- ETA: 1:30 - loss: 1.9954 - regression_loss: 1.6444 - classification_loss: 0.3510 138/500 [=======>......................] - ETA: 1:29 - loss: 1.9893 - regression_loss: 1.6399 - classification_loss: 0.3493 139/500 [=======>......................] - ETA: 1:29 - loss: 1.9886 - regression_loss: 1.6393 - classification_loss: 0.3493 140/500 [=======>......................] - ETA: 1:29 - loss: 1.9911 - regression_loss: 1.6422 - classification_loss: 0.3489 141/500 [=======>......................] - ETA: 1:29 - loss: 1.9829 - regression_loss: 1.6353 - classification_loss: 0.3477 142/500 [=======>......................] - ETA: 1:28 - loss: 1.9815 - regression_loss: 1.6344 - classification_loss: 0.3471 143/500 [=======>......................] - ETA: 1:28 - loss: 1.9862 - regression_loss: 1.6383 - classification_loss: 0.3479 144/500 [=======>......................] - ETA: 1:28 - loss: 1.9862 - regression_loss: 1.6382 - classification_loss: 0.3479 145/500 [=======>......................] - ETA: 1:28 - loss: 1.9921 - regression_loss: 1.6433 - classification_loss: 0.3488 146/500 [=======>......................] - ETA: 1:27 - loss: 1.9948 - regression_loss: 1.6457 - classification_loss: 0.3491 147/500 [=======>......................] - ETA: 1:27 - loss: 1.9898 - regression_loss: 1.6423 - classification_loss: 0.3475 148/500 [=======>......................] - ETA: 1:27 - loss: 1.9888 - regression_loss: 1.6418 - classification_loss: 0.3470 149/500 [=======>......................] - ETA: 1:27 - loss: 1.9906 - regression_loss: 1.6431 - classification_loss: 0.3475 150/500 [========>.....................] - ETA: 1:27 - loss: 1.9930 - regression_loss: 1.6456 - classification_loss: 0.3474 151/500 [========>.....................] - ETA: 1:26 - loss: 1.9941 - regression_loss: 1.6462 - classification_loss: 0.3479 152/500 [========>.....................] - ETA: 1:26 - loss: 1.9934 - regression_loss: 1.6450 - classification_loss: 0.3484 153/500 [========>.....................] 
- ETA: 1:26 - loss: 1.9931 - regression_loss: 1.6448 - classification_loss: 0.3483 154/500 [========>.....................] - ETA: 1:26 - loss: 1.9917 - regression_loss: 1.6441 - classification_loss: 0.3477 155/500 [========>.....................] - ETA: 1:25 - loss: 1.9969 - regression_loss: 1.6494 - classification_loss: 0.3475 156/500 [========>.....................] - ETA: 1:25 - loss: 1.9988 - regression_loss: 1.6513 - classification_loss: 0.3475 157/500 [========>.....................] - ETA: 1:25 - loss: 2.0062 - regression_loss: 1.6574 - classification_loss: 0.3487 158/500 [========>.....................] - ETA: 1:25 - loss: 2.0092 - regression_loss: 1.6596 - classification_loss: 0.3496 159/500 [========>.....................] - ETA: 1:24 - loss: 2.0150 - regression_loss: 1.6643 - classification_loss: 0.3508 160/500 [========>.....................] - ETA: 1:24 - loss: 2.0133 - regression_loss: 1.6631 - classification_loss: 0.3502 161/500 [========>.....................] - ETA: 1:24 - loss: 2.0100 - regression_loss: 1.6608 - classification_loss: 0.3492 162/500 [========>.....................] - ETA: 1:24 - loss: 2.0141 - regression_loss: 1.6640 - classification_loss: 0.3501 163/500 [========>.....................] - ETA: 1:23 - loss: 2.0154 - regression_loss: 1.6646 - classification_loss: 0.3508 164/500 [========>.....................] - ETA: 1:23 - loss: 2.0120 - regression_loss: 1.6620 - classification_loss: 0.3500 165/500 [========>.....................] - ETA: 1:23 - loss: 2.0181 - regression_loss: 1.6674 - classification_loss: 0.3507 166/500 [========>.....................] - ETA: 1:23 - loss: 2.0166 - regression_loss: 1.6662 - classification_loss: 0.3503 167/500 [=========>....................] - ETA: 1:22 - loss: 2.0185 - regression_loss: 1.6682 - classification_loss: 0.3503 168/500 [=========>....................] - ETA: 1:22 - loss: 2.0257 - regression_loss: 1.6732 - classification_loss: 0.3525 169/500 [=========>....................] 
- ETA: 1:22 - loss: 2.0242 - regression_loss: 1.6720 - classification_loss: 0.3521 170/500 [=========>....................] - ETA: 1:22 - loss: 2.0279 - regression_loss: 1.6753 - classification_loss: 0.3526 171/500 [=========>....................] - ETA: 1:21 - loss: 2.0305 - regression_loss: 1.6777 - classification_loss: 0.3528 172/500 [=========>....................] - ETA: 1:21 - loss: 2.0307 - regression_loss: 1.6782 - classification_loss: 0.3526 173/500 [=========>....................] - ETA: 1:21 - loss: 2.0285 - regression_loss: 1.6764 - classification_loss: 0.3521 174/500 [=========>....................] - ETA: 1:21 - loss: 2.0258 - regression_loss: 1.6746 - classification_loss: 0.3512 175/500 [=========>....................] - ETA: 1:20 - loss: 2.0255 - regression_loss: 1.6747 - classification_loss: 0.3508 176/500 [=========>....................] - ETA: 1:20 - loss: 2.0306 - regression_loss: 1.6792 - classification_loss: 0.3514 177/500 [=========>....................] - ETA: 1:20 - loss: 2.0305 - regression_loss: 1.6792 - classification_loss: 0.3513 178/500 [=========>....................] - ETA: 1:20 - loss: 2.0274 - regression_loss: 1.6769 - classification_loss: 0.3506 179/500 [=========>....................] - ETA: 1:19 - loss: 2.0306 - regression_loss: 1.6787 - classification_loss: 0.3519 180/500 [=========>....................] - ETA: 1:19 - loss: 2.0255 - regression_loss: 1.6726 - classification_loss: 0.3529 181/500 [=========>....................] - ETA: 1:19 - loss: 2.0254 - regression_loss: 1.6719 - classification_loss: 0.3534 182/500 [=========>....................] - ETA: 1:19 - loss: 2.0264 - regression_loss: 1.6734 - classification_loss: 0.3530 183/500 [=========>....................] - ETA: 1:18 - loss: 2.0193 - regression_loss: 1.6671 - classification_loss: 0.3522 184/500 [==========>...................] - ETA: 1:18 - loss: 2.0242 - regression_loss: 1.6718 - classification_loss: 0.3525 185/500 [==========>...................] 
- ETA: 1:18 - loss: 2.0245 - regression_loss: 1.6720 - classification_loss: 0.3525 186/500 [==========>...................] - ETA: 1:18 - loss: 2.0200 - regression_loss: 1.6687 - classification_loss: 0.3513 187/500 [==========>...................] - ETA: 1:17 - loss: 2.0219 - regression_loss: 1.6705 - classification_loss: 0.3514 188/500 [==========>...................] - ETA: 1:17 - loss: 2.0194 - regression_loss: 1.6684 - classification_loss: 0.3509 189/500 [==========>...................] - ETA: 1:17 - loss: 2.0243 - regression_loss: 1.6726 - classification_loss: 0.3517 190/500 [==========>...................] - ETA: 1:17 - loss: 2.0232 - regression_loss: 1.6716 - classification_loss: 0.3516 191/500 [==========>...................] - ETA: 1:17 - loss: 2.0237 - regression_loss: 1.6721 - classification_loss: 0.3516 192/500 [==========>...................] - ETA: 1:16 - loss: 2.0241 - regression_loss: 1.6725 - classification_loss: 0.3516 193/500 [==========>...................] - ETA: 1:16 - loss: 2.0267 - regression_loss: 1.6741 - classification_loss: 0.3526 194/500 [==========>...................] - ETA: 1:16 - loss: 2.0266 - regression_loss: 1.6741 - classification_loss: 0.3525 195/500 [==========>...................] - ETA: 1:16 - loss: 2.0327 - regression_loss: 1.6790 - classification_loss: 0.3537 196/500 [==========>...................] - ETA: 1:15 - loss: 2.0321 - regression_loss: 1.6788 - classification_loss: 0.3534 197/500 [==========>...................] - ETA: 1:15 - loss: 2.0315 - regression_loss: 1.6780 - classification_loss: 0.3535 198/500 [==========>...................] - ETA: 1:15 - loss: 2.0308 - regression_loss: 1.6777 - classification_loss: 0.3531 199/500 [==========>...................] - ETA: 1:15 - loss: 2.0272 - regression_loss: 1.6748 - classification_loss: 0.3523 200/500 [===========>..................] - ETA: 1:14 - loss: 2.0268 - regression_loss: 1.6745 - classification_loss: 0.3523 201/500 [===========>..................] 
- ETA: 1:14 - loss: 2.0258 - regression_loss: 1.6744 - classification_loss: 0.3515 202/500 [===========>..................] - ETA: 1:14 - loss: 2.0275 - regression_loss: 1.6753 - classification_loss: 0.3521 203/500 [===========>..................] - ETA: 1:14 - loss: 2.0264 - regression_loss: 1.6745 - classification_loss: 0.3519 204/500 [===========>..................] - ETA: 1:13 - loss: 2.0255 - regression_loss: 1.6742 - classification_loss: 0.3513 205/500 [===========>..................] - ETA: 1:13 - loss: 2.0251 - regression_loss: 1.6742 - classification_loss: 0.3509 206/500 [===========>..................] - ETA: 1:13 - loss: 2.0260 - regression_loss: 1.6748 - classification_loss: 0.3512 207/500 [===========>..................] - ETA: 1:13 - loss: 2.0230 - regression_loss: 1.6727 - classification_loss: 0.3503 208/500 [===========>..................] - ETA: 1:12 - loss: 2.0216 - regression_loss: 1.6717 - classification_loss: 0.3499 209/500 [===========>..................] - ETA: 1:12 - loss: 2.0247 - regression_loss: 1.6737 - classification_loss: 0.3510 210/500 [===========>..................] - ETA: 1:12 - loss: 2.0241 - regression_loss: 1.6735 - classification_loss: 0.3506 211/500 [===========>..................] - ETA: 1:12 - loss: 2.0226 - regression_loss: 1.6720 - classification_loss: 0.3506 212/500 [===========>..................] - ETA: 1:11 - loss: 2.0216 - regression_loss: 1.6715 - classification_loss: 0.3500 213/500 [===========>..................] - ETA: 1:11 - loss: 2.0250 - regression_loss: 1.6747 - classification_loss: 0.3503 214/500 [===========>..................] - ETA: 1:11 - loss: 2.0257 - regression_loss: 1.6754 - classification_loss: 0.3503 215/500 [===========>..................] - ETA: 1:11 - loss: 2.0226 - regression_loss: 1.6732 - classification_loss: 0.3494 216/500 [===========>..................] - ETA: 1:10 - loss: 2.0217 - regression_loss: 1.6724 - classification_loss: 0.3493 217/500 [============>.................] 
- ETA: 1:10 - loss: 2.0241 - regression_loss: 1.6741 - classification_loss: 0.3500 218/500 [============>.................] - ETA: 1:10 - loss: 2.0261 - regression_loss: 1.6756 - classification_loss: 0.3505 219/500 [============>.................] - ETA: 1:10 - loss: 2.0228 - regression_loss: 1.6733 - classification_loss: 0.3496 220/500 [============>.................] - ETA: 1:09 - loss: 2.0235 - regression_loss: 1.6737 - classification_loss: 0.3498 221/500 [============>.................] - ETA: 1:09 - loss: 2.0203 - regression_loss: 1.6712 - classification_loss: 0.3491 222/500 [============>.................] - ETA: 1:09 - loss: 2.0213 - regression_loss: 1.6718 - classification_loss: 0.3495 223/500 [============>.................] - ETA: 1:09 - loss: 2.0213 - regression_loss: 1.6721 - classification_loss: 0.3492 224/500 [============>.................] - ETA: 1:08 - loss: 2.0212 - regression_loss: 1.6718 - classification_loss: 0.3493 225/500 [============>.................] - ETA: 1:08 - loss: 2.0204 - regression_loss: 1.6716 - classification_loss: 0.3488 226/500 [============>.................] - ETA: 1:08 - loss: 2.0189 - regression_loss: 1.6703 - classification_loss: 0.3486 227/500 [============>.................] - ETA: 1:08 - loss: 2.0269 - regression_loss: 1.6629 - classification_loss: 0.3640 228/500 [============>.................] - ETA: 1:07 - loss: 2.0324 - regression_loss: 1.6675 - classification_loss: 0.3648 229/500 [============>.................] - ETA: 1:07 - loss: 2.0330 - regression_loss: 1.6682 - classification_loss: 0.3648 230/500 [============>.................] - ETA: 1:07 - loss: 2.0342 - regression_loss: 1.6697 - classification_loss: 0.3645 231/500 [============>.................] - ETA: 1:07 - loss: 2.0319 - regression_loss: 1.6682 - classification_loss: 0.3637 232/500 [============>.................] - ETA: 1:06 - loss: 2.0293 - regression_loss: 1.6665 - classification_loss: 0.3628 233/500 [============>.................] 
- ETA: 1:06 - loss: 2.0246 - regression_loss: 1.6628 - classification_loss: 0.3618 234/500 [=============>................] - ETA: 1:06 - loss: 2.0241 - regression_loss: 1.6624 - classification_loss: 0.3617 235/500 [=============>................] - ETA: 1:06 - loss: 2.0214 - regression_loss: 1.6605 - classification_loss: 0.3609 236/500 [=============>................] - ETA: 1:05 - loss: 2.0189 - regression_loss: 1.6588 - classification_loss: 0.3601 237/500 [=============>................] - ETA: 1:05 - loss: 2.0150 - regression_loss: 1.6556 - classification_loss: 0.3594 238/500 [=============>................] - ETA: 1:05 - loss: 2.0163 - regression_loss: 1.6564 - classification_loss: 0.3599 239/500 [=============>................] - ETA: 1:05 - loss: 2.0146 - regression_loss: 1.6553 - classification_loss: 0.3593 240/500 [=============>................] - ETA: 1:04 - loss: 2.0128 - regression_loss: 1.6540 - classification_loss: 0.3588 241/500 [=============>................] - ETA: 1:04 - loss: 2.0111 - regression_loss: 1.6531 - classification_loss: 0.3580 242/500 [=============>................] - ETA: 1:04 - loss: 2.0078 - regression_loss: 1.6505 - classification_loss: 0.3573 243/500 [=============>................] - ETA: 1:04 - loss: 2.0080 - regression_loss: 1.6510 - classification_loss: 0.3570 244/500 [=============>................] - ETA: 1:03 - loss: 2.0097 - regression_loss: 1.6521 - classification_loss: 0.3577 245/500 [=============>................] - ETA: 1:03 - loss: 2.0057 - regression_loss: 1.6487 - classification_loss: 0.3570 246/500 [=============>................] - ETA: 1:03 - loss: 2.0074 - regression_loss: 1.6499 - classification_loss: 0.3575 247/500 [=============>................] - ETA: 1:03 - loss: 2.0073 - regression_loss: 1.6500 - classification_loss: 0.3573 248/500 [=============>................] - ETA: 1:02 - loss: 2.0116 - regression_loss: 1.6533 - classification_loss: 0.3583 249/500 [=============>................] 
- ETA: 1:02 - loss: 2.0103 - regression_loss: 1.6517 - classification_loss: 0.3586 250/500 [==============>...............] - ETA: 1:02 - loss: 2.0104 - regression_loss: 1.6521 - classification_loss: 0.3583 251/500 [==============>...............] - ETA: 1:02 - loss: 2.0094 - regression_loss: 1.6515 - classification_loss: 0.3579 252/500 [==============>...............] - ETA: 1:01 - loss: 2.0107 - regression_loss: 1.6528 - classification_loss: 0.3579 253/500 [==============>...............] - ETA: 1:01 - loss: 2.0077 - regression_loss: 1.6505 - classification_loss: 0.3572 254/500 [==============>...............] - ETA: 1:01 - loss: 2.0068 - regression_loss: 1.6500 - classification_loss: 0.3568 255/500 [==============>...............] - ETA: 1:01 - loss: 2.0126 - regression_loss: 1.6537 - classification_loss: 0.3589 256/500 [==============>...............] - ETA: 1:00 - loss: 2.0150 - regression_loss: 1.6561 - classification_loss: 0.3589 257/500 [==============>...............] - ETA: 1:00 - loss: 2.0127 - regression_loss: 1.6547 - classification_loss: 0.3580 258/500 [==============>...............] - ETA: 1:00 - loss: 2.0117 - regression_loss: 1.6541 - classification_loss: 0.3576 259/500 [==============>...............] - ETA: 1:00 - loss: 2.0127 - regression_loss: 1.6551 - classification_loss: 0.3576 260/500 [==============>...............] - ETA: 59s - loss: 2.0149 - regression_loss: 1.6569 - classification_loss: 0.3581  261/500 [==============>...............] - ETA: 59s - loss: 2.0164 - regression_loss: 1.6574 - classification_loss: 0.3590 262/500 [==============>...............] - ETA: 59s - loss: 2.0148 - regression_loss: 1.6562 - classification_loss: 0.3586 263/500 [==============>...............] - ETA: 59s - loss: 2.0188 - regression_loss: 1.6595 - classification_loss: 0.3594 264/500 [==============>...............] - ETA: 58s - loss: 2.0169 - regression_loss: 1.6582 - classification_loss: 0.3587 265/500 [==============>...............] 
- ETA: 58s - loss: 2.0162 - regression_loss: 1.6575 - classification_loss: 0.3586 266/500 [==============>...............] - ETA: 58s - loss: 2.0151 - regression_loss: 1.6567 - classification_loss: 0.3584 267/500 [===============>..............] - ETA: 58s - loss: 2.0150 - regression_loss: 1.6561 - classification_loss: 0.3589 268/500 [===============>..............] - ETA: 57s - loss: 2.0157 - regression_loss: 1.6565 - classification_loss: 0.3592 269/500 [===============>..............] - ETA: 57s - loss: 2.0167 - regression_loss: 1.6574 - classification_loss: 0.3593 270/500 [===============>..............] - ETA: 57s - loss: 2.0160 - regression_loss: 1.6568 - classification_loss: 0.3592 271/500 [===============>..............] - ETA: 57s - loss: 2.0120 - regression_loss: 1.6536 - classification_loss: 0.3584 272/500 [===============>..............] - ETA: 56s - loss: 2.0146 - regression_loss: 1.6557 - classification_loss: 0.3589 273/500 [===============>..............] - ETA: 56s - loss: 2.0137 - regression_loss: 1.6556 - classification_loss: 0.3580 274/500 [===============>..............] - ETA: 56s - loss: 2.0155 - regression_loss: 1.6570 - classification_loss: 0.3585 275/500 [===============>..............] - ETA: 56s - loss: 2.0143 - regression_loss: 1.6554 - classification_loss: 0.3588 276/500 [===============>..............] - ETA: 55s - loss: 2.0154 - regression_loss: 1.6566 - classification_loss: 0.3589 277/500 [===============>..............] - ETA: 55s - loss: 2.0161 - regression_loss: 1.6568 - classification_loss: 0.3593 278/500 [===============>..............] - ETA: 55s - loss: 2.0185 - regression_loss: 1.6583 - classification_loss: 0.3602 279/500 [===============>..............] - ETA: 55s - loss: 2.0198 - regression_loss: 1.6596 - classification_loss: 0.3602 280/500 [===============>..............] - ETA: 54s - loss: 2.0212 - regression_loss: 1.6606 - classification_loss: 0.3606 281/500 [===============>..............] 
- ETA: 54s - loss: 2.0192 - regression_loss: 1.6592 - classification_loss: 0.3600 282/500 [===============>..............] - ETA: 54s - loss: 2.0183 - regression_loss: 1.6583 - classification_loss: 0.3600 283/500 [===============>..............] - ETA: 54s - loss: 2.0167 - regression_loss: 1.6570 - classification_loss: 0.3597 284/500 [================>.............] - ETA: 53s - loss: 2.0149 - regression_loss: 1.6559 - classification_loss: 0.3590 285/500 [================>.............] - ETA: 53s - loss: 2.0162 - regression_loss: 1.6574 - classification_loss: 0.3588 286/500 [================>.............] - ETA: 53s - loss: 2.0160 - regression_loss: 1.6571 - classification_loss: 0.3590 287/500 [================>.............] - ETA: 53s - loss: 2.0144 - regression_loss: 1.6559 - classification_loss: 0.3584 288/500 [================>.............] - ETA: 52s - loss: 2.0154 - regression_loss: 1.6566 - classification_loss: 0.3588 289/500 [================>.............] - ETA: 52s - loss: 2.0159 - regression_loss: 1.6571 - classification_loss: 0.3588 290/500 [================>.............] - ETA: 52s - loss: 2.0141 - regression_loss: 1.6557 - classification_loss: 0.3584 291/500 [================>.............] - ETA: 52s - loss: 2.0158 - regression_loss: 1.6569 - classification_loss: 0.3589 292/500 [================>.............] - ETA: 51s - loss: 2.0145 - regression_loss: 1.6562 - classification_loss: 0.3583 293/500 [================>.............] - ETA: 51s - loss: 2.0121 - regression_loss: 1.6544 - classification_loss: 0.3577 294/500 [================>.............] - ETA: 51s - loss: 2.0109 - regression_loss: 1.6536 - classification_loss: 0.3572 295/500 [================>.............] - ETA: 51s - loss: 2.0118 - regression_loss: 1.6543 - classification_loss: 0.3576 296/500 [================>.............] - ETA: 50s - loss: 2.0100 - regression_loss: 1.6526 - classification_loss: 0.3574 297/500 [================>.............] 
[per-step progress output for epoch 23, steps 298–489, elided; running loss stayed between ~1.99 and ~2.02]
500/500 [==============================] - 125s 250ms/step - loss: 1.9940 - regression_loss: 1.6452 - classification_loss: 0.3488
326 instances of class plum with average precision: 0.7126
mAP: 0.7126
Epoch 00023: saving model to ./training/snapshots/resnet50_pascal_23.h5
Epoch 24/150
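The summary line above reports the total loss alongside its two components. In RetinaNet-style training the total is simply the sum of the box-regression term (smooth L1) and the classification term (focal loss), and the logged epoch-23 numbers confirm this. The snippet below is a minimal sanity check using those figures; it is illustrative only and not part of the training script:

```python
# Final epoch-23 averages copied from the 500/500 summary line above.
regression_loss = 1.6452      # smooth-L1 loss on anchor box offsets
classification_loss = 0.3488  # focal loss on anchor class labels

# The reported total loss is the sum of the two components.
total_loss = regression_loss + classification_loss
assert abs(total_loss - 1.9940) < 1e-6

# Only one class ("plum") is evaluated here, so mAP (the mean of
# per-class average precisions) equals that single class's AP.
plum_ap = 0.7126
mAP = plum_ap
```

The same decomposition holds for every per-step line in the log, since Keras prints running means of each loss output.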
[per-step progress output for epoch 24 elided; running loss stayed between ~1.87 and ~2.73 over the first ~130 steps]
- ETA: 1:31 - loss: 1.9913 - regression_loss: 1.6463 - classification_loss: 0.3450 133/500 [======>.......................] - ETA: 1:31 - loss: 1.9870 - regression_loss: 1.6433 - classification_loss: 0.3437 134/500 [=======>......................] - ETA: 1:31 - loss: 1.9850 - regression_loss: 1.6423 - classification_loss: 0.3427 135/500 [=======>......................] - ETA: 1:31 - loss: 1.9860 - regression_loss: 1.6434 - classification_loss: 0.3427 136/500 [=======>......................] - ETA: 1:30 - loss: 1.9915 - regression_loss: 1.6476 - classification_loss: 0.3439 137/500 [=======>......................] - ETA: 1:30 - loss: 1.9827 - regression_loss: 1.6407 - classification_loss: 0.3420 138/500 [=======>......................] - ETA: 1:30 - loss: 1.9859 - regression_loss: 1.6434 - classification_loss: 0.3425 139/500 [=======>......................] - ETA: 1:30 - loss: 1.9857 - regression_loss: 1.6439 - classification_loss: 0.3418 140/500 [=======>......................] - ETA: 1:30 - loss: 1.9851 - regression_loss: 1.6436 - classification_loss: 0.3415 141/500 [=======>......................] - ETA: 1:29 - loss: 1.9833 - regression_loss: 1.6425 - classification_loss: 0.3408 142/500 [=======>......................] - ETA: 1:29 - loss: 1.9838 - regression_loss: 1.6425 - classification_loss: 0.3413 143/500 [=======>......................] - ETA: 1:29 - loss: 1.9846 - regression_loss: 1.6424 - classification_loss: 0.3422 144/500 [=======>......................] - ETA: 1:29 - loss: 1.9831 - regression_loss: 1.6407 - classification_loss: 0.3424 145/500 [=======>......................] - ETA: 1:28 - loss: 1.9810 - regression_loss: 1.6386 - classification_loss: 0.3424 146/500 [=======>......................] - ETA: 1:28 - loss: 1.9765 - regression_loss: 1.6349 - classification_loss: 0.3416 147/500 [=======>......................] - ETA: 1:28 - loss: 1.9791 - regression_loss: 1.6366 - classification_loss: 0.3425 148/500 [=======>......................] 
- ETA: 1:28 - loss: 1.9765 - regression_loss: 1.6349 - classification_loss: 0.3415 149/500 [=======>......................] - ETA: 1:27 - loss: 1.9808 - regression_loss: 1.6389 - classification_loss: 0.3419 150/500 [========>.....................] - ETA: 1:27 - loss: 1.9866 - regression_loss: 1.6435 - classification_loss: 0.3431 151/500 [========>.....................] - ETA: 1:27 - loss: 1.9886 - regression_loss: 1.6451 - classification_loss: 0.3435 152/500 [========>.....................] - ETA: 1:27 - loss: 1.9825 - regression_loss: 1.6404 - classification_loss: 0.3422 153/500 [========>.....................] - ETA: 1:26 - loss: 1.9820 - regression_loss: 1.6405 - classification_loss: 0.3415 154/500 [========>.....................] - ETA: 1:26 - loss: 1.9851 - regression_loss: 1.6424 - classification_loss: 0.3427 155/500 [========>.....................] - ETA: 1:26 - loss: 1.9871 - regression_loss: 1.6438 - classification_loss: 0.3433 156/500 [========>.....................] - ETA: 1:26 - loss: 1.9856 - regression_loss: 1.6420 - classification_loss: 0.3436 157/500 [========>.....................] - ETA: 1:25 - loss: 1.9863 - regression_loss: 1.6430 - classification_loss: 0.3433 158/500 [========>.....................] - ETA: 1:25 - loss: 1.9878 - regression_loss: 1.6439 - classification_loss: 0.3438 159/500 [========>.....................] - ETA: 1:25 - loss: 1.9881 - regression_loss: 1.6442 - classification_loss: 0.3439 160/500 [========>.....................] - ETA: 1:25 - loss: 1.9875 - regression_loss: 1.6442 - classification_loss: 0.3433 161/500 [========>.....................] - ETA: 1:24 - loss: 1.9895 - regression_loss: 1.6461 - classification_loss: 0.3434 162/500 [========>.....................] - ETA: 1:24 - loss: 1.9926 - regression_loss: 1.6496 - classification_loss: 0.3430 163/500 [========>.....................] - ETA: 1:24 - loss: 1.9924 - regression_loss: 1.6494 - classification_loss: 0.3430 164/500 [========>.....................] 
- ETA: 1:24 - loss: 1.9842 - regression_loss: 1.6428 - classification_loss: 0.3414 165/500 [========>.....................] - ETA: 1:23 - loss: 1.9845 - regression_loss: 1.6431 - classification_loss: 0.3413 166/500 [========>.....................] - ETA: 1:23 - loss: 1.9824 - regression_loss: 1.6418 - classification_loss: 0.3406 167/500 [=========>....................] - ETA: 1:23 - loss: 1.9800 - regression_loss: 1.6402 - classification_loss: 0.3399 168/500 [=========>....................] - ETA: 1:23 - loss: 1.9784 - regression_loss: 1.6392 - classification_loss: 0.3392 169/500 [=========>....................] - ETA: 1:22 - loss: 1.9767 - regression_loss: 1.6380 - classification_loss: 0.3387 170/500 [=========>....................] - ETA: 1:22 - loss: 1.9798 - regression_loss: 1.6402 - classification_loss: 0.3396 171/500 [=========>....................] - ETA: 1:22 - loss: 1.9816 - regression_loss: 1.6413 - classification_loss: 0.3403 172/500 [=========>....................] - ETA: 1:22 - loss: 1.9856 - regression_loss: 1.6440 - classification_loss: 0.3416 173/500 [=========>....................] - ETA: 1:21 - loss: 1.9902 - regression_loss: 1.6468 - classification_loss: 0.3435 174/500 [=========>....................] - ETA: 1:21 - loss: 1.9891 - regression_loss: 1.6460 - classification_loss: 0.3431 175/500 [=========>....................] - ETA: 1:21 - loss: 1.9847 - regression_loss: 1.6426 - classification_loss: 0.3421 176/500 [=========>....................] - ETA: 1:21 - loss: 1.9848 - regression_loss: 1.6428 - classification_loss: 0.3420 177/500 [=========>....................] - ETA: 1:20 - loss: 1.9802 - regression_loss: 1.6384 - classification_loss: 0.3418 178/500 [=========>....................] - ETA: 1:20 - loss: 1.9868 - regression_loss: 1.6436 - classification_loss: 0.3433 179/500 [=========>....................] - ETA: 1:20 - loss: 1.9897 - regression_loss: 1.6464 - classification_loss: 0.3433 180/500 [=========>....................] 
- ETA: 1:20 - loss: 1.9877 - regression_loss: 1.6452 - classification_loss: 0.3426 181/500 [=========>....................] - ETA: 1:19 - loss: 1.9854 - regression_loss: 1.6433 - classification_loss: 0.3422 182/500 [=========>....................] - ETA: 1:19 - loss: 1.9881 - regression_loss: 1.6453 - classification_loss: 0.3428 183/500 [=========>....................] - ETA: 1:19 - loss: 1.9859 - regression_loss: 1.6440 - classification_loss: 0.3419 184/500 [==========>...................] - ETA: 1:19 - loss: 1.9881 - regression_loss: 1.6456 - classification_loss: 0.3424 185/500 [==========>...................] - ETA: 1:18 - loss: 1.9898 - regression_loss: 1.6471 - classification_loss: 0.3426 186/500 [==========>...................] - ETA: 1:18 - loss: 1.9947 - regression_loss: 1.6513 - classification_loss: 0.3435 187/500 [==========>...................] - ETA: 1:18 - loss: 1.9937 - regression_loss: 1.6506 - classification_loss: 0.3431 188/500 [==========>...................] - ETA: 1:18 - loss: 1.9956 - regression_loss: 1.6516 - classification_loss: 0.3441 189/500 [==========>...................] - ETA: 1:17 - loss: 1.9947 - regression_loss: 1.6508 - classification_loss: 0.3439 190/500 [==========>...................] - ETA: 1:17 - loss: 1.9918 - regression_loss: 1.6484 - classification_loss: 0.3434 191/500 [==========>...................] - ETA: 1:17 - loss: 1.9922 - regression_loss: 1.6494 - classification_loss: 0.3429 192/500 [==========>...................] - ETA: 1:17 - loss: 1.9907 - regression_loss: 1.6479 - classification_loss: 0.3428 193/500 [==========>...................] - ETA: 1:16 - loss: 1.9916 - regression_loss: 1.6491 - classification_loss: 0.3424 194/500 [==========>...................] - ETA: 1:16 - loss: 1.9977 - regression_loss: 1.6548 - classification_loss: 0.3429 195/500 [==========>...................] - ETA: 1:16 - loss: 1.9991 - regression_loss: 1.6559 - classification_loss: 0.3433 196/500 [==========>...................] 
- ETA: 1:16 - loss: 1.9982 - regression_loss: 1.6552 - classification_loss: 0.3431 197/500 [==========>...................] - ETA: 1:15 - loss: 1.9988 - regression_loss: 1.6552 - classification_loss: 0.3436 198/500 [==========>...................] - ETA: 1:15 - loss: 1.9959 - regression_loss: 1.6530 - classification_loss: 0.3429 199/500 [==========>...................] - ETA: 1:15 - loss: 1.9971 - regression_loss: 1.6544 - classification_loss: 0.3427 200/500 [===========>..................] - ETA: 1:15 - loss: 1.9966 - regression_loss: 1.6540 - classification_loss: 0.3426 201/500 [===========>..................] - ETA: 1:14 - loss: 1.9995 - regression_loss: 1.6563 - classification_loss: 0.3432 202/500 [===========>..................] - ETA: 1:14 - loss: 1.9996 - regression_loss: 1.6565 - classification_loss: 0.3431 203/500 [===========>..................] - ETA: 1:14 - loss: 1.9961 - regression_loss: 1.6541 - classification_loss: 0.3420 204/500 [===========>..................] - ETA: 1:14 - loss: 1.9950 - regression_loss: 1.6534 - classification_loss: 0.3416 205/500 [===========>..................] - ETA: 1:13 - loss: 1.9945 - regression_loss: 1.6529 - classification_loss: 0.3415 206/500 [===========>..................] - ETA: 1:13 - loss: 1.9963 - regression_loss: 1.6535 - classification_loss: 0.3427 207/500 [===========>..................] - ETA: 1:13 - loss: 2.0003 - regression_loss: 1.6567 - classification_loss: 0.3436 208/500 [===========>..................] - ETA: 1:13 - loss: 2.0032 - regression_loss: 1.6588 - classification_loss: 0.3444 209/500 [===========>..................] - ETA: 1:12 - loss: 2.0000 - regression_loss: 1.6565 - classification_loss: 0.3436 210/500 [===========>..................] - ETA: 1:12 - loss: 1.9991 - regression_loss: 1.6560 - classification_loss: 0.3431 211/500 [===========>..................] - ETA: 1:12 - loss: 1.9990 - regression_loss: 1.6561 - classification_loss: 0.3429 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.9984 - regression_loss: 1.6555 - classification_loss: 0.3429 213/500 [===========>..................] - ETA: 1:11 - loss: 1.9988 - regression_loss: 1.6556 - classification_loss: 0.3432 214/500 [===========>..................] - ETA: 1:11 - loss: 1.9983 - regression_loss: 1.6553 - classification_loss: 0.3431 215/500 [===========>..................] - ETA: 1:11 - loss: 1.9972 - regression_loss: 1.6547 - classification_loss: 0.3424 216/500 [===========>..................] - ETA: 1:11 - loss: 2.0011 - regression_loss: 1.6576 - classification_loss: 0.3434 217/500 [============>.................] - ETA: 1:10 - loss: 1.9987 - regression_loss: 1.6559 - classification_loss: 0.3428 218/500 [============>.................] - ETA: 1:10 - loss: 1.9946 - regression_loss: 1.6525 - classification_loss: 0.3420 219/500 [============>.................] - ETA: 1:10 - loss: 1.9927 - regression_loss: 1.6512 - classification_loss: 0.3415 220/500 [============>.................] - ETA: 1:10 - loss: 1.9925 - regression_loss: 1.6509 - classification_loss: 0.3416 221/500 [============>.................] - ETA: 1:09 - loss: 1.9979 - regression_loss: 1.6566 - classification_loss: 0.3413 222/500 [============>.................] - ETA: 1:09 - loss: 1.9998 - regression_loss: 1.6579 - classification_loss: 0.3419 223/500 [============>.................] - ETA: 1:09 - loss: 2.0016 - regression_loss: 1.6592 - classification_loss: 0.3424 224/500 [============>.................] - ETA: 1:09 - loss: 2.0019 - regression_loss: 1.6594 - classification_loss: 0.3425 225/500 [============>.................] - ETA: 1:08 - loss: 2.0024 - regression_loss: 1.6598 - classification_loss: 0.3426 226/500 [============>.................] - ETA: 1:08 - loss: 2.0046 - regression_loss: 1.6611 - classification_loss: 0.3435 227/500 [============>.................] - ETA: 1:08 - loss: 2.0052 - regression_loss: 1.6617 - classification_loss: 0.3434 228/500 [============>.................] 
- ETA: 1:08 - loss: 2.0093 - regression_loss: 1.6650 - classification_loss: 0.3443 229/500 [============>.................] - ETA: 1:07 - loss: 2.0071 - regression_loss: 1.6634 - classification_loss: 0.3438 230/500 [============>.................] - ETA: 1:07 - loss: 2.0060 - regression_loss: 1.6624 - classification_loss: 0.3437 231/500 [============>.................] - ETA: 1:07 - loss: 2.0075 - regression_loss: 1.6635 - classification_loss: 0.3441 232/500 [============>.................] - ETA: 1:07 - loss: 2.0100 - regression_loss: 1.6656 - classification_loss: 0.3444 233/500 [============>.................] - ETA: 1:06 - loss: 2.0095 - regression_loss: 1.6654 - classification_loss: 0.3441 234/500 [=============>................] - ETA: 1:06 - loss: 2.0099 - regression_loss: 1.6657 - classification_loss: 0.3442 235/500 [=============>................] - ETA: 1:06 - loss: 2.0093 - regression_loss: 1.6653 - classification_loss: 0.3440 236/500 [=============>................] - ETA: 1:06 - loss: 2.0109 - regression_loss: 1.6667 - classification_loss: 0.3441 237/500 [=============>................] - ETA: 1:05 - loss: 2.0118 - regression_loss: 1.6671 - classification_loss: 0.3447 238/500 [=============>................] - ETA: 1:05 - loss: 2.0144 - regression_loss: 1.6692 - classification_loss: 0.3452 239/500 [=============>................] - ETA: 1:05 - loss: 2.0129 - regression_loss: 1.6679 - classification_loss: 0.3450 240/500 [=============>................] - ETA: 1:05 - loss: 2.0119 - regression_loss: 1.6677 - classification_loss: 0.3442 241/500 [=============>................] - ETA: 1:04 - loss: 2.0113 - regression_loss: 1.6674 - classification_loss: 0.3439 242/500 [=============>................] - ETA: 1:04 - loss: 2.0125 - regression_loss: 1.6686 - classification_loss: 0.3439 243/500 [=============>................] - ETA: 1:04 - loss: 2.0117 - regression_loss: 1.6680 - classification_loss: 0.3437 244/500 [=============>................] 
- ETA: 1:04 - loss: 2.0116 - regression_loss: 1.6678 - classification_loss: 0.3438 245/500 [=============>................] - ETA: 1:03 - loss: 2.0114 - regression_loss: 1.6677 - classification_loss: 0.3437 246/500 [=============>................] - ETA: 1:03 - loss: 2.0141 - regression_loss: 1.6697 - classification_loss: 0.3444 247/500 [=============>................] - ETA: 1:03 - loss: 2.0135 - regression_loss: 1.6695 - classification_loss: 0.3441 248/500 [=============>................] - ETA: 1:03 - loss: 2.0126 - regression_loss: 1.6687 - classification_loss: 0.3439 249/500 [=============>................] - ETA: 1:02 - loss: 2.0127 - regression_loss: 1.6696 - classification_loss: 0.3432 250/500 [==============>...............] - ETA: 1:02 - loss: 2.0113 - regression_loss: 1.6684 - classification_loss: 0.3429 251/500 [==============>...............] - ETA: 1:02 - loss: 2.0103 - regression_loss: 1.6679 - classification_loss: 0.3423 252/500 [==============>...............] - ETA: 1:02 - loss: 2.0085 - regression_loss: 1.6664 - classification_loss: 0.3421 253/500 [==============>...............] - ETA: 1:01 - loss: 2.0080 - regression_loss: 1.6664 - classification_loss: 0.3416 254/500 [==============>...............] - ETA: 1:01 - loss: 2.0095 - regression_loss: 1.6677 - classification_loss: 0.3418 255/500 [==============>...............] - ETA: 1:01 - loss: 2.0084 - regression_loss: 1.6667 - classification_loss: 0.3416 256/500 [==============>...............] - ETA: 1:01 - loss: 2.0063 - regression_loss: 1.6650 - classification_loss: 0.3414 257/500 [==============>...............] - ETA: 1:00 - loss: 2.0066 - regression_loss: 1.6651 - classification_loss: 0.3415 258/500 [==============>...............] - ETA: 1:00 - loss: 2.0026 - regression_loss: 1.6620 - classification_loss: 0.3406 259/500 [==============>...............] - ETA: 1:00 - loss: 2.0016 - regression_loss: 1.6613 - classification_loss: 0.3403 260/500 [==============>...............] 
- ETA: 1:00 - loss: 2.0016 - regression_loss: 1.6612 - classification_loss: 0.3404 261/500 [==============>...............] - ETA: 59s - loss: 2.0025 - regression_loss: 1.6611 - classification_loss: 0.3414  262/500 [==============>...............] - ETA: 59s - loss: 1.9976 - regression_loss: 1.6571 - classification_loss: 0.3405 263/500 [==============>...............] - ETA: 59s - loss: 1.9993 - regression_loss: 1.6588 - classification_loss: 0.3405 264/500 [==============>...............] - ETA: 59s - loss: 1.9997 - regression_loss: 1.6588 - classification_loss: 0.3408 265/500 [==============>...............] - ETA: 58s - loss: 1.9983 - regression_loss: 1.6581 - classification_loss: 0.3402 266/500 [==============>...............] - ETA: 58s - loss: 1.9992 - regression_loss: 1.6586 - classification_loss: 0.3406 267/500 [===============>..............] - ETA: 58s - loss: 1.9999 - regression_loss: 1.6593 - classification_loss: 0.3406 268/500 [===============>..............] - ETA: 58s - loss: 2.0004 - regression_loss: 1.6593 - classification_loss: 0.3411 269/500 [===============>..............] - ETA: 57s - loss: 2.0005 - regression_loss: 1.6593 - classification_loss: 0.3412 270/500 [===============>..............] - ETA: 57s - loss: 2.0001 - regression_loss: 1.6590 - classification_loss: 0.3411 271/500 [===============>..............] - ETA: 57s - loss: 1.9997 - regression_loss: 1.6588 - classification_loss: 0.3410 272/500 [===============>..............] - ETA: 57s - loss: 2.0003 - regression_loss: 1.6591 - classification_loss: 0.3412 273/500 [===============>..............] - ETA: 56s - loss: 1.9961 - regression_loss: 1.6558 - classification_loss: 0.3403 274/500 [===============>..............] - ETA: 56s - loss: 2.0006 - regression_loss: 1.6596 - classification_loss: 0.3409 275/500 [===============>..............] - ETA: 56s - loss: 2.0016 - regression_loss: 1.6605 - classification_loss: 0.3411 276/500 [===============>..............] 
- ETA: 56s - loss: 2.0018 - regression_loss: 1.6607 - classification_loss: 0.3412 277/500 [===============>..............] - ETA: 55s - loss: 2.0068 - regression_loss: 1.6646 - classification_loss: 0.3422 278/500 [===============>..............] - ETA: 55s - loss: 2.0075 - regression_loss: 1.6652 - classification_loss: 0.3424 279/500 [===============>..............] - ETA: 55s - loss: 2.0041 - regression_loss: 1.6623 - classification_loss: 0.3418 280/500 [===============>..............] - ETA: 55s - loss: 2.0054 - regression_loss: 1.6636 - classification_loss: 0.3418 281/500 [===============>..............] - ETA: 54s - loss: 2.0046 - regression_loss: 1.6630 - classification_loss: 0.3417 282/500 [===============>..............] - ETA: 54s - loss: 2.0040 - regression_loss: 1.6625 - classification_loss: 0.3415 283/500 [===============>..............] - ETA: 54s - loss: 2.0038 - regression_loss: 1.6624 - classification_loss: 0.3414 284/500 [================>.............] - ETA: 54s - loss: 1.9993 - regression_loss: 1.6587 - classification_loss: 0.3406 285/500 [================>.............] - ETA: 53s - loss: 1.9999 - regression_loss: 1.6590 - classification_loss: 0.3409 286/500 [================>.............] - ETA: 53s - loss: 2.0000 - regression_loss: 1.6589 - classification_loss: 0.3410 287/500 [================>.............] - ETA: 53s - loss: 1.9991 - regression_loss: 1.6585 - classification_loss: 0.3406 288/500 [================>.............] - ETA: 53s - loss: 1.9979 - regression_loss: 1.6576 - classification_loss: 0.3403 289/500 [================>.............] - ETA: 52s - loss: 1.9993 - regression_loss: 1.6583 - classification_loss: 0.3410 290/500 [================>.............] - ETA: 52s - loss: 1.9996 - regression_loss: 1.6585 - classification_loss: 0.3411 291/500 [================>.............] - ETA: 52s - loss: 2.0001 - regression_loss: 1.6587 - classification_loss: 0.3413 292/500 [================>.............] 
- ETA: 52s - loss: 2.0015 - regression_loss: 1.6598 - classification_loss: 0.3417 293/500 [================>.............] - ETA: 51s - loss: 2.0043 - regression_loss: 1.6622 - classification_loss: 0.3421 294/500 [================>.............] - ETA: 51s - loss: 2.0025 - regression_loss: 1.6610 - classification_loss: 0.3416 295/500 [================>.............] - ETA: 51s - loss: 2.0018 - regression_loss: 1.6604 - classification_loss: 0.3414 296/500 [================>.............] - ETA: 51s - loss: 2.0017 - regression_loss: 1.6603 - classification_loss: 0.3415 297/500 [================>.............] - ETA: 50s - loss: 2.0023 - regression_loss: 1.6607 - classification_loss: 0.3415 298/500 [================>.............] - ETA: 50s - loss: 1.9989 - regression_loss: 1.6582 - classification_loss: 0.3408 299/500 [================>.............] - ETA: 50s - loss: 2.0001 - regression_loss: 1.6589 - classification_loss: 0.3412 300/500 [=================>............] - ETA: 50s - loss: 2.0035 - regression_loss: 1.6615 - classification_loss: 0.3420 301/500 [=================>............] - ETA: 49s - loss: 2.0012 - regression_loss: 1.6599 - classification_loss: 0.3413 302/500 [=================>............] - ETA: 49s - loss: 2.0016 - regression_loss: 1.6606 - classification_loss: 0.3410 303/500 [=================>............] - ETA: 49s - loss: 2.0029 - regression_loss: 1.6616 - classification_loss: 0.3413 304/500 [=================>............] - ETA: 49s - loss: 2.0043 - regression_loss: 1.6628 - classification_loss: 0.3416 305/500 [=================>............] - ETA: 48s - loss: 2.0044 - regression_loss: 1.6627 - classification_loss: 0.3417 306/500 [=================>............] - ETA: 48s - loss: 2.0031 - regression_loss: 1.6618 - classification_loss: 0.3413 307/500 [=================>............] - ETA: 48s - loss: 2.0019 - regression_loss: 1.6610 - classification_loss: 0.3409 308/500 [=================>............] 
- ETA: 48s - loss: 2.0004 - regression_loss: 1.6597 - classification_loss: 0.3407 309/500 [=================>............] - ETA: 47s - loss: 1.9993 - regression_loss: 1.6589 - classification_loss: 0.3404 310/500 [=================>............] - ETA: 47s - loss: 1.9989 - regression_loss: 1.6585 - classification_loss: 0.3404 311/500 [=================>............] - ETA: 47s - loss: 2.0014 - regression_loss: 1.6605 - classification_loss: 0.3409 312/500 [=================>............] - ETA: 47s - loss: 2.0002 - regression_loss: 1.6595 - classification_loss: 0.3407 313/500 [=================>............] - ETA: 46s - loss: 1.9980 - regression_loss: 1.6576 - classification_loss: 0.3404 314/500 [=================>............] - ETA: 46s - loss: 1.9968 - regression_loss: 1.6566 - classification_loss: 0.3402 315/500 [=================>............] - ETA: 46s - loss: 1.9959 - regression_loss: 1.6562 - classification_loss: 0.3397 316/500 [=================>............] - ETA: 46s - loss: 1.9951 - regression_loss: 1.6555 - classification_loss: 0.3395 317/500 [==================>...........] - ETA: 45s - loss: 1.9961 - regression_loss: 1.6565 - classification_loss: 0.3396 318/500 [==================>...........] - ETA: 45s - loss: 1.9963 - regression_loss: 1.6569 - classification_loss: 0.3394 319/500 [==================>...........] - ETA: 45s - loss: 1.9983 - regression_loss: 1.6581 - classification_loss: 0.3402 320/500 [==================>...........] - ETA: 45s - loss: 1.9953 - regression_loss: 1.6558 - classification_loss: 0.3396 321/500 [==================>...........] - ETA: 44s - loss: 1.9951 - regression_loss: 1.6555 - classification_loss: 0.3396 322/500 [==================>...........] - ETA: 44s - loss: 1.9932 - regression_loss: 1.6540 - classification_loss: 0.3392 323/500 [==================>...........] - ETA: 44s - loss: 1.9923 - regression_loss: 1.6531 - classification_loss: 0.3391 324/500 [==================>...........] 
- ETA: 44s - loss: 1.9917 - regression_loss: 1.6529 - classification_loss: 0.3389 325/500 [==================>...........] - ETA: 43s - loss: 1.9909 - regression_loss: 1.6523 - classification_loss: 0.3386 326/500 [==================>...........] - ETA: 43s - loss: 1.9873 - regression_loss: 1.6493 - classification_loss: 0.3379 327/500 [==================>...........] - ETA: 43s - loss: 1.9865 - regression_loss: 1.6488 - classification_loss: 0.3377 328/500 [==================>...........] - ETA: 43s - loss: 1.9875 - regression_loss: 1.6498 - classification_loss: 0.3378 329/500 [==================>...........] - ETA: 42s - loss: 1.9875 - regression_loss: 1.6497 - classification_loss: 0.3378 330/500 [==================>...........] - ETA: 42s - loss: 1.9848 - regression_loss: 1.6477 - classification_loss: 0.3371 331/500 [==================>...........] - ETA: 42s - loss: 1.9843 - regression_loss: 1.6474 - classification_loss: 0.3370 332/500 [==================>...........] - ETA: 42s - loss: 1.9852 - regression_loss: 1.6480 - classification_loss: 0.3372 333/500 [==================>...........] - ETA: 41s - loss: 1.9858 - regression_loss: 1.6486 - classification_loss: 0.3371 334/500 [===================>..........] - ETA: 41s - loss: 1.9869 - regression_loss: 1.6496 - classification_loss: 0.3373 335/500 [===================>..........] - ETA: 41s - loss: 1.9847 - regression_loss: 1.6478 - classification_loss: 0.3368 336/500 [===================>..........] - ETA: 41s - loss: 1.9846 - regression_loss: 1.6478 - classification_loss: 0.3368 337/500 [===================>..........] - ETA: 40s - loss: 1.9857 - regression_loss: 1.6484 - classification_loss: 0.3373 338/500 [===================>..........] - ETA: 40s - loss: 1.9844 - regression_loss: 1.6474 - classification_loss: 0.3370 339/500 [===================>..........] - ETA: 40s - loss: 1.9839 - regression_loss: 1.6472 - classification_loss: 0.3367 340/500 [===================>..........] 
500/500 [==============================] - 125s 251ms/step - loss: 1.9937 - regression_loss: 1.6477 - classification_loss: 0.3460
326 instances of class plum with average precision: 0.6908
mAP: 0.6908
Epoch 00024: saving model to ./training/snapshots/resnet50_pascal_24.h5
Epoch 25/150
- ETA: 1:21 - loss: 1.9730 - regression_loss: 1.6400 - classification_loss: 0.3330 175/500 [=========>....................] - ETA: 1:21 - loss: 1.9727 - regression_loss: 1.6399 - classification_loss: 0.3328 176/500 [=========>....................] - ETA: 1:20 - loss: 1.9704 - regression_loss: 1.6381 - classification_loss: 0.3323 177/500 [=========>....................] - ETA: 1:20 - loss: 1.9683 - regression_loss: 1.6359 - classification_loss: 0.3324 178/500 [=========>....................] - ETA: 1:20 - loss: 1.9686 - regression_loss: 1.6358 - classification_loss: 0.3329 179/500 [=========>....................] - ETA: 1:20 - loss: 1.9724 - regression_loss: 1.6392 - classification_loss: 0.3332 180/500 [=========>....................] - ETA: 1:19 - loss: 1.9717 - regression_loss: 1.6389 - classification_loss: 0.3329 181/500 [=========>....................] - ETA: 1:19 - loss: 1.9749 - regression_loss: 1.6414 - classification_loss: 0.3335 182/500 [=========>....................] - ETA: 1:19 - loss: 1.9786 - regression_loss: 1.6436 - classification_loss: 0.3349 183/500 [=========>....................] - ETA: 1:19 - loss: 1.9763 - regression_loss: 1.6419 - classification_loss: 0.3344 184/500 [==========>...................] - ETA: 1:18 - loss: 1.9732 - regression_loss: 1.6397 - classification_loss: 0.3335 185/500 [==========>...................] - ETA: 1:18 - loss: 1.9718 - regression_loss: 1.6389 - classification_loss: 0.3329 186/500 [==========>...................] - ETA: 1:18 - loss: 1.9757 - regression_loss: 1.6419 - classification_loss: 0.3338 187/500 [==========>...................] - ETA: 1:18 - loss: 1.9755 - regression_loss: 1.6420 - classification_loss: 0.3335 188/500 [==========>...................] - ETA: 1:17 - loss: 1.9696 - regression_loss: 1.6374 - classification_loss: 0.3322 189/500 [==========>...................] - ETA: 1:17 - loss: 1.9676 - regression_loss: 1.6358 - classification_loss: 0.3318 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.9722 - regression_loss: 1.6401 - classification_loss: 0.3320 191/500 [==========>...................] - ETA: 1:17 - loss: 1.9703 - regression_loss: 1.6387 - classification_loss: 0.3316 192/500 [==========>...................] - ETA: 1:16 - loss: 1.9692 - regression_loss: 1.6382 - classification_loss: 0.3310 193/500 [==========>...................] - ETA: 1:16 - loss: 1.9721 - regression_loss: 1.6400 - classification_loss: 0.3321 194/500 [==========>...................] - ETA: 1:16 - loss: 1.9709 - regression_loss: 1.6394 - classification_loss: 0.3315 195/500 [==========>...................] - ETA: 1:16 - loss: 1.9712 - regression_loss: 1.6396 - classification_loss: 0.3316 196/500 [==========>...................] - ETA: 1:15 - loss: 1.9662 - regression_loss: 1.6358 - classification_loss: 0.3304 197/500 [==========>...................] - ETA: 1:15 - loss: 1.9688 - regression_loss: 1.6377 - classification_loss: 0.3311 198/500 [==========>...................] - ETA: 1:15 - loss: 1.9691 - regression_loss: 1.6379 - classification_loss: 0.3312 199/500 [==========>...................] - ETA: 1:15 - loss: 1.9703 - regression_loss: 1.6391 - classification_loss: 0.3312 200/500 [===========>..................] - ETA: 1:14 - loss: 1.9699 - regression_loss: 1.6387 - classification_loss: 0.3313 201/500 [===========>..................] - ETA: 1:14 - loss: 1.9706 - regression_loss: 1.6392 - classification_loss: 0.3314 202/500 [===========>..................] - ETA: 1:14 - loss: 1.9683 - regression_loss: 1.6377 - classification_loss: 0.3306 203/500 [===========>..................] - ETA: 1:14 - loss: 1.9685 - regression_loss: 1.6379 - classification_loss: 0.3306 204/500 [===========>..................] - ETA: 1:13 - loss: 1.9715 - regression_loss: 1.6396 - classification_loss: 0.3320 205/500 [===========>..................] - ETA: 1:13 - loss: 1.9749 - regression_loss: 1.6427 - classification_loss: 0.3322 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.9761 - regression_loss: 1.6441 - classification_loss: 0.3321 207/500 [===========>..................] - ETA: 1:13 - loss: 1.9766 - regression_loss: 1.6446 - classification_loss: 0.3320 208/500 [===========>..................] - ETA: 1:12 - loss: 1.9797 - regression_loss: 1.6473 - classification_loss: 0.3324 209/500 [===========>..................] - ETA: 1:12 - loss: 1.9800 - regression_loss: 1.6476 - classification_loss: 0.3323 210/500 [===========>..................] - ETA: 1:12 - loss: 1.9783 - regression_loss: 1.6465 - classification_loss: 0.3319 211/500 [===========>..................] - ETA: 1:12 - loss: 1.9769 - regression_loss: 1.6457 - classification_loss: 0.3313 212/500 [===========>..................] - ETA: 1:11 - loss: 1.9745 - regression_loss: 1.6438 - classification_loss: 0.3306 213/500 [===========>..................] - ETA: 1:11 - loss: 1.9720 - regression_loss: 1.6422 - classification_loss: 0.3298 214/500 [===========>..................] - ETA: 1:11 - loss: 1.9704 - regression_loss: 1.6411 - classification_loss: 0.3294 215/500 [===========>..................] - ETA: 1:11 - loss: 1.9719 - regression_loss: 1.6425 - classification_loss: 0.3294 216/500 [===========>..................] - ETA: 1:10 - loss: 1.9715 - regression_loss: 1.6421 - classification_loss: 0.3294 217/500 [============>.................] - ETA: 1:10 - loss: 1.9708 - regression_loss: 1.6415 - classification_loss: 0.3293 218/500 [============>.................] - ETA: 1:10 - loss: 1.9706 - regression_loss: 1.6414 - classification_loss: 0.3291 219/500 [============>.................] - ETA: 1:10 - loss: 1.9708 - regression_loss: 1.6418 - classification_loss: 0.3290 220/500 [============>.................] - ETA: 1:09 - loss: 1.9711 - regression_loss: 1.6422 - classification_loss: 0.3289 221/500 [============>.................] - ETA: 1:09 - loss: 1.9679 - regression_loss: 1.6394 - classification_loss: 0.3285 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.9677 - regression_loss: 1.6391 - classification_loss: 0.3286 223/500 [============>.................] - ETA: 1:09 - loss: 1.9721 - regression_loss: 1.6422 - classification_loss: 0.3299 224/500 [============>.................] - ETA: 1:08 - loss: 1.9727 - regression_loss: 1.6430 - classification_loss: 0.3297 225/500 [============>.................] - ETA: 1:08 - loss: 1.9679 - regression_loss: 1.6357 - classification_loss: 0.3322 226/500 [============>.................] - ETA: 1:08 - loss: 1.9724 - regression_loss: 1.6401 - classification_loss: 0.3324 227/500 [============>.................] - ETA: 1:08 - loss: 1.9728 - regression_loss: 1.6401 - classification_loss: 0.3326 228/500 [============>.................] - ETA: 1:07 - loss: 1.9728 - regression_loss: 1.6403 - classification_loss: 0.3325 229/500 [============>.................] - ETA: 1:07 - loss: 1.9752 - regression_loss: 1.6423 - classification_loss: 0.3328 230/500 [============>.................] - ETA: 1:07 - loss: 1.9759 - regression_loss: 1.6430 - classification_loss: 0.3329 231/500 [============>.................] - ETA: 1:07 - loss: 1.9748 - regression_loss: 1.6423 - classification_loss: 0.3325 232/500 [============>.................] - ETA: 1:06 - loss: 1.9747 - regression_loss: 1.6424 - classification_loss: 0.3324 233/500 [============>.................] - ETA: 1:06 - loss: 1.9752 - regression_loss: 1.6436 - classification_loss: 0.3316 234/500 [=============>................] - ETA: 1:06 - loss: 1.9745 - regression_loss: 1.6425 - classification_loss: 0.3320 235/500 [=============>................] - ETA: 1:06 - loss: 1.9700 - regression_loss: 1.6392 - classification_loss: 0.3308 236/500 [=============>................] - ETA: 1:05 - loss: 1.9712 - regression_loss: 1.6400 - classification_loss: 0.3313 237/500 [=============>................] - ETA: 1:05 - loss: 1.9716 - regression_loss: 1.6401 - classification_loss: 0.3314 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.9699 - regression_loss: 1.6390 - classification_loss: 0.3309 239/500 [=============>................] - ETA: 1:05 - loss: 1.9697 - regression_loss: 1.6389 - classification_loss: 0.3308 240/500 [=============>................] - ETA: 1:04 - loss: 1.9691 - regression_loss: 1.6386 - classification_loss: 0.3305 241/500 [=============>................] - ETA: 1:04 - loss: 1.9652 - regression_loss: 1.6354 - classification_loss: 0.3298 242/500 [=============>................] - ETA: 1:04 - loss: 1.9633 - regression_loss: 1.6341 - classification_loss: 0.3291 243/500 [=============>................] - ETA: 1:04 - loss: 1.9656 - regression_loss: 1.6360 - classification_loss: 0.3296 244/500 [=============>................] - ETA: 1:03 - loss: 1.9663 - regression_loss: 1.6366 - classification_loss: 0.3297 245/500 [=============>................] - ETA: 1:03 - loss: 1.9664 - regression_loss: 1.6364 - classification_loss: 0.3300 246/500 [=============>................] - ETA: 1:03 - loss: 1.9633 - regression_loss: 1.6341 - classification_loss: 0.3292 247/500 [=============>................] - ETA: 1:03 - loss: 1.9616 - regression_loss: 1.6331 - classification_loss: 0.3285 248/500 [=============>................] - ETA: 1:02 - loss: 1.9624 - regression_loss: 1.6341 - classification_loss: 0.3284 249/500 [=============>................] - ETA: 1:02 - loss: 1.9623 - regression_loss: 1.6342 - classification_loss: 0.3281 250/500 [==============>...............] - ETA: 1:02 - loss: 1.9653 - regression_loss: 1.6359 - classification_loss: 0.3294 251/500 [==============>...............] - ETA: 1:02 - loss: 1.9649 - regression_loss: 1.6352 - classification_loss: 0.3296 252/500 [==============>...............] - ETA: 1:01 - loss: 1.9621 - regression_loss: 1.6331 - classification_loss: 0.3290 253/500 [==============>...............] - ETA: 1:01 - loss: 1.9649 - regression_loss: 1.6352 - classification_loss: 0.3297 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.9616 - regression_loss: 1.6325 - classification_loss: 0.3291 255/500 [==============>...............] - ETA: 1:01 - loss: 1.9591 - regression_loss: 1.6307 - classification_loss: 0.3284 256/500 [==============>...............] - ETA: 1:00 - loss: 1.9617 - regression_loss: 1.6327 - classification_loss: 0.3290 257/500 [==============>...............] - ETA: 1:00 - loss: 1.9567 - regression_loss: 1.6286 - classification_loss: 0.3281 258/500 [==============>...............] - ETA: 1:00 - loss: 1.9637 - regression_loss: 1.6338 - classification_loss: 0.3300 259/500 [==============>...............] - ETA: 1:00 - loss: 1.9645 - regression_loss: 1.6346 - classification_loss: 0.3299 260/500 [==============>...............] - ETA: 59s - loss: 1.9644 - regression_loss: 1.6344 - classification_loss: 0.3300  261/500 [==============>...............] - ETA: 59s - loss: 1.9684 - regression_loss: 1.6359 - classification_loss: 0.3325 262/500 [==============>...............] - ETA: 59s - loss: 1.9720 - regression_loss: 1.6388 - classification_loss: 0.3332 263/500 [==============>...............] - ETA: 59s - loss: 1.9729 - regression_loss: 1.6394 - classification_loss: 0.3336 264/500 [==============>...............] - ETA: 58s - loss: 1.9715 - regression_loss: 1.6386 - classification_loss: 0.3329 265/500 [==============>...............] - ETA: 58s - loss: 1.9729 - regression_loss: 1.6393 - classification_loss: 0.3336 266/500 [==============>...............] - ETA: 58s - loss: 1.9731 - regression_loss: 1.6394 - classification_loss: 0.3337 267/500 [===============>..............] - ETA: 58s - loss: 1.9721 - regression_loss: 1.6389 - classification_loss: 0.3332 268/500 [===============>..............] - ETA: 57s - loss: 1.9728 - regression_loss: 1.6394 - classification_loss: 0.3334 269/500 [===============>..............] - ETA: 57s - loss: 1.9747 - regression_loss: 1.6409 - classification_loss: 0.3338 270/500 [===============>..............] 
- ETA: 57s - loss: 1.9736 - regression_loss: 1.6403 - classification_loss: 0.3333 271/500 [===============>..............] - ETA: 57s - loss: 1.9757 - regression_loss: 1.6419 - classification_loss: 0.3338 272/500 [===============>..............] - ETA: 56s - loss: 1.9727 - regression_loss: 1.6395 - classification_loss: 0.3331 273/500 [===============>..............] - ETA: 56s - loss: 1.9716 - regression_loss: 1.6388 - classification_loss: 0.3328 274/500 [===============>..............] - ETA: 56s - loss: 1.9723 - regression_loss: 1.6395 - classification_loss: 0.3328 275/500 [===============>..............] - ETA: 56s - loss: 1.9733 - regression_loss: 1.6407 - classification_loss: 0.3326 276/500 [===============>..............] - ETA: 55s - loss: 1.9756 - regression_loss: 1.6421 - classification_loss: 0.3335 277/500 [===============>..............] - ETA: 55s - loss: 1.9757 - regression_loss: 1.6420 - classification_loss: 0.3337 278/500 [===============>..............] - ETA: 55s - loss: 1.9732 - regression_loss: 1.6401 - classification_loss: 0.3330 279/500 [===============>..............] - ETA: 55s - loss: 1.9717 - regression_loss: 1.6393 - classification_loss: 0.3324 280/500 [===============>..............] - ETA: 54s - loss: 1.9732 - regression_loss: 1.6403 - classification_loss: 0.3329 281/500 [===============>..............] - ETA: 54s - loss: 1.9727 - regression_loss: 1.6400 - classification_loss: 0.3327 282/500 [===============>..............] - ETA: 54s - loss: 1.9705 - regression_loss: 1.6381 - classification_loss: 0.3323 283/500 [===============>..............] - ETA: 54s - loss: 1.9678 - regression_loss: 1.6361 - classification_loss: 0.3317 284/500 [================>.............] - ETA: 53s - loss: 1.9689 - regression_loss: 1.6367 - classification_loss: 0.3322 285/500 [================>.............] - ETA: 53s - loss: 1.9694 - regression_loss: 1.6372 - classification_loss: 0.3322 286/500 [================>.............] 
- ETA: 53s - loss: 1.9682 - regression_loss: 1.6365 - classification_loss: 0.3318 287/500 [================>.............] - ETA: 53s - loss: 1.9690 - regression_loss: 1.6369 - classification_loss: 0.3320 288/500 [================>.............] - ETA: 52s - loss: 1.9690 - regression_loss: 1.6366 - classification_loss: 0.3324 289/500 [================>.............] - ETA: 52s - loss: 1.9689 - regression_loss: 1.6366 - classification_loss: 0.3323 290/500 [================>.............] - ETA: 52s - loss: 1.9690 - regression_loss: 1.6368 - classification_loss: 0.3322 291/500 [================>.............] - ETA: 52s - loss: 1.9690 - regression_loss: 1.6372 - classification_loss: 0.3319 292/500 [================>.............] - ETA: 51s - loss: 1.9684 - regression_loss: 1.6367 - classification_loss: 0.3317 293/500 [================>.............] - ETA: 51s - loss: 1.9669 - regression_loss: 1.6356 - classification_loss: 0.3314 294/500 [================>.............] - ETA: 51s - loss: 1.9685 - regression_loss: 1.6365 - classification_loss: 0.3320 295/500 [================>.............] - ETA: 51s - loss: 1.9683 - regression_loss: 1.6362 - classification_loss: 0.3321 296/500 [================>.............] - ETA: 50s - loss: 1.9685 - regression_loss: 1.6364 - classification_loss: 0.3321 297/500 [================>.............] - ETA: 50s - loss: 1.9683 - regression_loss: 1.6367 - classification_loss: 0.3317 298/500 [================>.............] - ETA: 50s - loss: 1.9724 - regression_loss: 1.6404 - classification_loss: 0.3321 299/500 [================>.............] - ETA: 50s - loss: 1.9750 - regression_loss: 1.6424 - classification_loss: 0.3327 300/500 [=================>............] - ETA: 49s - loss: 1.9753 - regression_loss: 1.6427 - classification_loss: 0.3325 301/500 [=================>............] - ETA: 49s - loss: 1.9747 - regression_loss: 1.6424 - classification_loss: 0.3324 302/500 [=================>............] 
- ETA: 49s - loss: 1.9743 - regression_loss: 1.6421 - classification_loss: 0.3323 303/500 [=================>............] - ETA: 49s - loss: 1.9750 - regression_loss: 1.6426 - classification_loss: 0.3324 304/500 [=================>............] - ETA: 48s - loss: 1.9757 - regression_loss: 1.6428 - classification_loss: 0.3329 305/500 [=================>............] - ETA: 48s - loss: 1.9753 - regression_loss: 1.6425 - classification_loss: 0.3328 306/500 [=================>............] - ETA: 48s - loss: 1.9744 - regression_loss: 1.6421 - classification_loss: 0.3324 307/500 [=================>............] - ETA: 48s - loss: 1.9761 - regression_loss: 1.6434 - classification_loss: 0.3327 308/500 [=================>............] - ETA: 47s - loss: 1.9763 - regression_loss: 1.6435 - classification_loss: 0.3328 309/500 [=================>............] - ETA: 47s - loss: 1.9774 - regression_loss: 1.6441 - classification_loss: 0.3333 310/500 [=================>............] - ETA: 47s - loss: 1.9786 - regression_loss: 1.6447 - classification_loss: 0.3339 311/500 [=================>............] - ETA: 47s - loss: 1.9800 - regression_loss: 1.6461 - classification_loss: 0.3339 312/500 [=================>............] - ETA: 46s - loss: 1.9808 - regression_loss: 1.6468 - classification_loss: 0.3340 313/500 [=================>............] - ETA: 46s - loss: 1.9794 - regression_loss: 1.6459 - classification_loss: 0.3335 314/500 [=================>............] - ETA: 46s - loss: 1.9787 - regression_loss: 1.6454 - classification_loss: 0.3334 315/500 [=================>............] - ETA: 46s - loss: 1.9783 - regression_loss: 1.6451 - classification_loss: 0.3332 316/500 [=================>............] - ETA: 45s - loss: 1.9761 - regression_loss: 1.6434 - classification_loss: 0.3327 317/500 [==================>...........] - ETA: 45s - loss: 1.9754 - regression_loss: 1.6429 - classification_loss: 0.3324 318/500 [==================>...........] 
- ETA: 45s - loss: 1.9745 - regression_loss: 1.6423 - classification_loss: 0.3322 319/500 [==================>...........] - ETA: 45s - loss: 1.9739 - regression_loss: 1.6419 - classification_loss: 0.3320 320/500 [==================>...........] - ETA: 44s - loss: 1.9747 - regression_loss: 1.6425 - classification_loss: 0.3322 321/500 [==================>...........] - ETA: 44s - loss: 1.9746 - regression_loss: 1.6423 - classification_loss: 0.3322 322/500 [==================>...........] - ETA: 44s - loss: 1.9743 - regression_loss: 1.6422 - classification_loss: 0.3321 323/500 [==================>...........] - ETA: 44s - loss: 1.9754 - regression_loss: 1.6431 - classification_loss: 0.3323 324/500 [==================>...........] - ETA: 43s - loss: 1.9752 - regression_loss: 1.6431 - classification_loss: 0.3321 325/500 [==================>...........] - ETA: 43s - loss: 1.9770 - regression_loss: 1.6452 - classification_loss: 0.3318 326/500 [==================>...........] - ETA: 43s - loss: 1.9772 - regression_loss: 1.6456 - classification_loss: 0.3317 327/500 [==================>...........] - ETA: 43s - loss: 1.9765 - regression_loss: 1.6452 - classification_loss: 0.3313 328/500 [==================>...........] - ETA: 42s - loss: 1.9771 - regression_loss: 1.6454 - classification_loss: 0.3317 329/500 [==================>...........] - ETA: 42s - loss: 1.9763 - regression_loss: 1.6448 - classification_loss: 0.3316 330/500 [==================>...........] - ETA: 42s - loss: 1.9761 - regression_loss: 1.6446 - classification_loss: 0.3315 331/500 [==================>...........] - ETA: 42s - loss: 1.9774 - regression_loss: 1.6454 - classification_loss: 0.3319 332/500 [==================>...........] - ETA: 41s - loss: 1.9783 - regression_loss: 1.6459 - classification_loss: 0.3324 333/500 [==================>...........] - ETA: 41s - loss: 1.9766 - regression_loss: 1.6445 - classification_loss: 0.3321 334/500 [===================>..........] 
- ETA: 41s - loss: 1.9762 - regression_loss: 1.6441 - classification_loss: 0.3321 335/500 [===================>..........] - ETA: 41s - loss: 1.9783 - regression_loss: 1.6460 - classification_loss: 0.3324 336/500 [===================>..........] - ETA: 40s - loss: 1.9766 - regression_loss: 1.6443 - classification_loss: 0.3323 337/500 [===================>..........] - ETA: 40s - loss: 1.9765 - regression_loss: 1.6444 - classification_loss: 0.3322 338/500 [===================>..........] - ETA: 40s - loss: 1.9753 - regression_loss: 1.6434 - classification_loss: 0.3319 339/500 [===================>..........] - ETA: 40s - loss: 1.9754 - regression_loss: 1.6436 - classification_loss: 0.3318 340/500 [===================>..........] - ETA: 39s - loss: 1.9727 - regression_loss: 1.6410 - classification_loss: 0.3317 341/500 [===================>..........] - ETA: 39s - loss: 1.9735 - regression_loss: 1.6415 - classification_loss: 0.3321 342/500 [===================>..........] - ETA: 39s - loss: 1.9700 - regression_loss: 1.6385 - classification_loss: 0.3315 343/500 [===================>..........] - ETA: 39s - loss: 1.9696 - regression_loss: 1.6380 - classification_loss: 0.3316 344/500 [===================>..........] - ETA: 38s - loss: 1.9686 - regression_loss: 1.6374 - classification_loss: 0.3312 345/500 [===================>..........] - ETA: 38s - loss: 1.9689 - regression_loss: 1.6378 - classification_loss: 0.3312 346/500 [===================>..........] - ETA: 38s - loss: 1.9705 - regression_loss: 1.6390 - classification_loss: 0.3315 347/500 [===================>..........] - ETA: 38s - loss: 1.9712 - regression_loss: 1.6395 - classification_loss: 0.3317 348/500 [===================>..........] - ETA: 37s - loss: 1.9719 - regression_loss: 1.6401 - classification_loss: 0.3318 349/500 [===================>..........] - ETA: 37s - loss: 1.9710 - regression_loss: 1.6392 - classification_loss: 0.3317 350/500 [====================>.........] 
- ETA: 37s - loss: 1.9735 - regression_loss: 1.6409 - classification_loss: 0.3326 351/500 [====================>.........] - ETA: 37s - loss: 1.9747 - regression_loss: 1.6418 - classification_loss: 0.3329 352/500 [====================>.........] - ETA: 36s - loss: 1.9771 - regression_loss: 1.6436 - classification_loss: 0.3335 353/500 [====================>.........] - ETA: 36s - loss: 1.9806 - regression_loss: 1.6464 - classification_loss: 0.3342 354/500 [====================>.........] - ETA: 36s - loss: 1.9793 - regression_loss: 1.6455 - classification_loss: 0.3338 355/500 [====================>.........] - ETA: 36s - loss: 1.9807 - regression_loss: 1.6465 - classification_loss: 0.3342 356/500 [====================>.........] - ETA: 35s - loss: 1.9806 - regression_loss: 1.6466 - classification_loss: 0.3340 357/500 [====================>.........] - ETA: 35s - loss: 1.9808 - regression_loss: 1.6467 - classification_loss: 0.3341 358/500 [====================>.........] - ETA: 35s - loss: 1.9792 - regression_loss: 1.6455 - classification_loss: 0.3337 359/500 [====================>.........] - ETA: 35s - loss: 1.9783 - regression_loss: 1.6449 - classification_loss: 0.3334 360/500 [====================>.........] - ETA: 34s - loss: 1.9770 - regression_loss: 1.6439 - classification_loss: 0.3331 361/500 [====================>.........] - ETA: 34s - loss: 1.9800 - regression_loss: 1.6458 - classification_loss: 0.3342 362/500 [====================>.........] - ETA: 34s - loss: 1.9785 - regression_loss: 1.6412 - classification_loss: 0.3372 363/500 [====================>.........] - ETA: 34s - loss: 1.9788 - regression_loss: 1.6415 - classification_loss: 0.3373 364/500 [====================>.........] - ETA: 33s - loss: 1.9794 - regression_loss: 1.6422 - classification_loss: 0.3372 365/500 [====================>.........] - ETA: 33s - loss: 1.9796 - regression_loss: 1.6424 - classification_loss: 0.3372 366/500 [====================>.........] 
- ETA: 33s - loss: 1.9796 - regression_loss: 1.6429 - classification_loss: 0.3368 367/500 [=====================>........] - ETA: 33s - loss: 1.9793 - regression_loss: 1.6426 - classification_loss: 0.3367 368/500 [=====================>........] - ETA: 32s - loss: 1.9786 - regression_loss: 1.6425 - classification_loss: 0.3362 369/500 [=====================>........] - ETA: 32s - loss: 1.9793 - regression_loss: 1.6430 - classification_loss: 0.3362 370/500 [=====================>........] - ETA: 32s - loss: 1.9796 - regression_loss: 1.6436 - classification_loss: 0.3361 371/500 [=====================>........] - ETA: 32s - loss: 1.9789 - regression_loss: 1.6431 - classification_loss: 0.3358 372/500 [=====================>........] - ETA: 31s - loss: 1.9802 - regression_loss: 1.6440 - classification_loss: 0.3362 373/500 [=====================>........] - ETA: 31s - loss: 1.9768 - regression_loss: 1.6412 - classification_loss: 0.3356 374/500 [=====================>........] - ETA: 31s - loss: 1.9777 - regression_loss: 1.6418 - classification_loss: 0.3359 375/500 [=====================>........] - ETA: 31s - loss: 1.9768 - regression_loss: 1.6411 - classification_loss: 0.3357 376/500 [=====================>........] - ETA: 30s - loss: 1.9750 - regression_loss: 1.6398 - classification_loss: 0.3352 377/500 [=====================>........] - ETA: 30s - loss: 1.9753 - regression_loss: 1.6401 - classification_loss: 0.3352 378/500 [=====================>........] - ETA: 30s - loss: 1.9725 - regression_loss: 1.6379 - classification_loss: 0.3346 379/500 [=====================>........] - ETA: 30s - loss: 1.9694 - regression_loss: 1.6354 - classification_loss: 0.3341 380/500 [=====================>........] - ETA: 29s - loss: 1.9675 - regression_loss: 1.6337 - classification_loss: 0.3338 381/500 [=====================>........] - ETA: 29s - loss: 1.9671 - regression_loss: 1.6332 - classification_loss: 0.3339 382/500 [=====================>........] 
[per-batch progress updates for epoch 25, batches 383-499, elided]
500/500 [==============================] - 125s 250ms/step - loss: 1.9790 - regression_loss: 1.6350 - classification_loss: 0.3440
326 instances of class plum with average precision: 0.7280
mAP: 0.7280
Epoch 00025: saving model to ./training/snapshots/resnet50_pascal_25.h5
Epoch 26/150
[per-batch progress updates for epoch 26, batches 1-217, elided]
- ETA: 1:15 - loss: 1.9822 - regression_loss: 1.6343 - classification_loss: 0.3478 202/500 [===========>..................] - ETA: 1:14 - loss: 1.9818 - regression_loss: 1.6338 - classification_loss: 0.3480 203/500 [===========>..................] - ETA: 1:14 - loss: 1.9828 - regression_loss: 1.6349 - classification_loss: 0.3479 204/500 [===========>..................] - ETA: 1:14 - loss: 1.9835 - regression_loss: 1.6356 - classification_loss: 0.3479 205/500 [===========>..................] - ETA: 1:14 - loss: 1.9826 - regression_loss: 1.6348 - classification_loss: 0.3477 206/500 [===========>..................] - ETA: 1:13 - loss: 1.9827 - regression_loss: 1.6350 - classification_loss: 0.3477 207/500 [===========>..................] - ETA: 1:13 - loss: 1.9819 - regression_loss: 1.6347 - classification_loss: 0.3472 208/500 [===========>..................] - ETA: 1:13 - loss: 1.9831 - regression_loss: 1.6354 - classification_loss: 0.3477 209/500 [===========>..................] - ETA: 1:13 - loss: 1.9801 - regression_loss: 1.6332 - classification_loss: 0.3469 210/500 [===========>..................] - ETA: 1:12 - loss: 1.9839 - regression_loss: 1.6360 - classification_loss: 0.3479 211/500 [===========>..................] - ETA: 1:12 - loss: 1.9862 - regression_loss: 1.6380 - classification_loss: 0.3482 212/500 [===========>..................] - ETA: 1:12 - loss: 1.9855 - regression_loss: 1.6377 - classification_loss: 0.3478 213/500 [===========>..................] - ETA: 1:12 - loss: 1.9794 - regression_loss: 1.6326 - classification_loss: 0.3469 214/500 [===========>..................] - ETA: 1:11 - loss: 1.9803 - regression_loss: 1.6333 - classification_loss: 0.3470 215/500 [===========>..................] - ETA: 1:11 - loss: 1.9785 - regression_loss: 1.6320 - classification_loss: 0.3465 216/500 [===========>..................] - ETA: 1:11 - loss: 1.9796 - regression_loss: 1.6328 - classification_loss: 0.3468 217/500 [============>.................] 
- ETA: 1:11 - loss: 1.9788 - regression_loss: 1.6324 - classification_loss: 0.3464 218/500 [============>.................] - ETA: 1:10 - loss: 1.9813 - regression_loss: 1.6350 - classification_loss: 0.3463 219/500 [============>.................] - ETA: 1:10 - loss: 1.9863 - regression_loss: 1.6391 - classification_loss: 0.3472 220/500 [============>.................] - ETA: 1:10 - loss: 1.9857 - regression_loss: 1.6388 - classification_loss: 0.3468 221/500 [============>.................] - ETA: 1:10 - loss: 1.9856 - regression_loss: 1.6389 - classification_loss: 0.3467 222/500 [============>.................] - ETA: 1:09 - loss: 1.9822 - regression_loss: 1.6315 - classification_loss: 0.3507 223/500 [============>.................] - ETA: 1:09 - loss: 1.9842 - regression_loss: 1.6327 - classification_loss: 0.3515 224/500 [============>.................] - ETA: 1:09 - loss: 1.9879 - regression_loss: 1.6355 - classification_loss: 0.3524 225/500 [============>.................] - ETA: 1:09 - loss: 1.9858 - regression_loss: 1.6342 - classification_loss: 0.3516 226/500 [============>.................] - ETA: 1:08 - loss: 1.9893 - regression_loss: 1.6371 - classification_loss: 0.3523 227/500 [============>.................] - ETA: 1:08 - loss: 1.9920 - regression_loss: 1.6397 - classification_loss: 0.3523 228/500 [============>.................] - ETA: 1:08 - loss: 1.9941 - regression_loss: 1.6415 - classification_loss: 0.3526 229/500 [============>.................] - ETA: 1:08 - loss: 1.9929 - regression_loss: 1.6409 - classification_loss: 0.3520 230/500 [============>.................] - ETA: 1:07 - loss: 1.9937 - regression_loss: 1.6418 - classification_loss: 0.3519 231/500 [============>.................] - ETA: 1:07 - loss: 1.9933 - regression_loss: 1.6419 - classification_loss: 0.3514 232/500 [============>.................] - ETA: 1:07 - loss: 1.9930 - regression_loss: 1.6417 - classification_loss: 0.3513 233/500 [============>.................] 
- ETA: 1:07 - loss: 1.9985 - regression_loss: 1.6466 - classification_loss: 0.3519 234/500 [=============>................] - ETA: 1:06 - loss: 1.9959 - regression_loss: 1.6446 - classification_loss: 0.3512 235/500 [=============>................] - ETA: 1:06 - loss: 1.9990 - regression_loss: 1.6473 - classification_loss: 0.3518 236/500 [=============>................] - ETA: 1:06 - loss: 2.0002 - regression_loss: 1.6485 - classification_loss: 0.3517 237/500 [=============>................] - ETA: 1:06 - loss: 1.9993 - regression_loss: 1.6479 - classification_loss: 0.3515 238/500 [=============>................] - ETA: 1:05 - loss: 1.9985 - regression_loss: 1.6474 - classification_loss: 0.3510 239/500 [=============>................] - ETA: 1:05 - loss: 1.9987 - regression_loss: 1.6475 - classification_loss: 0.3512 240/500 [=============>................] - ETA: 1:05 - loss: 1.9989 - regression_loss: 1.6480 - classification_loss: 0.3508 241/500 [=============>................] - ETA: 1:05 - loss: 2.0009 - regression_loss: 1.6497 - classification_loss: 0.3512 242/500 [=============>................] - ETA: 1:04 - loss: 2.0041 - regression_loss: 1.6520 - classification_loss: 0.3521 243/500 [=============>................] - ETA: 1:04 - loss: 2.0064 - regression_loss: 1.6542 - classification_loss: 0.3522 244/500 [=============>................] - ETA: 1:04 - loss: 2.0072 - regression_loss: 1.6549 - classification_loss: 0.3523 245/500 [=============>................] - ETA: 1:04 - loss: 2.0101 - regression_loss: 1.6573 - classification_loss: 0.3528 246/500 [=============>................] - ETA: 1:03 - loss: 2.0106 - regression_loss: 1.6570 - classification_loss: 0.3535 247/500 [=============>................] - ETA: 1:03 - loss: 2.0052 - regression_loss: 1.6524 - classification_loss: 0.3528 248/500 [=============>................] - ETA: 1:03 - loss: 2.0034 - regression_loss: 1.6511 - classification_loss: 0.3523 249/500 [=============>................] 
- ETA: 1:03 - loss: 1.9992 - regression_loss: 1.6478 - classification_loss: 0.3513 250/500 [==============>...............] - ETA: 1:02 - loss: 1.9962 - regression_loss: 1.6457 - classification_loss: 0.3505 251/500 [==============>...............] - ETA: 1:02 - loss: 1.9954 - regression_loss: 1.6450 - classification_loss: 0.3504 252/500 [==============>...............] - ETA: 1:02 - loss: 1.9964 - regression_loss: 1.6458 - classification_loss: 0.3506 253/500 [==============>...............] - ETA: 1:02 - loss: 1.9957 - regression_loss: 1.6453 - classification_loss: 0.3504 254/500 [==============>...............] - ETA: 1:01 - loss: 1.9969 - regression_loss: 1.6465 - classification_loss: 0.3504 255/500 [==============>...............] - ETA: 1:01 - loss: 1.9960 - regression_loss: 1.6457 - classification_loss: 0.3503 256/500 [==============>...............] - ETA: 1:01 - loss: 1.9935 - regression_loss: 1.6438 - classification_loss: 0.3497 257/500 [==============>...............] - ETA: 1:01 - loss: 1.9915 - regression_loss: 1.6423 - classification_loss: 0.3493 258/500 [==============>...............] - ETA: 1:00 - loss: 1.9919 - regression_loss: 1.6429 - classification_loss: 0.3490 259/500 [==============>...............] - ETA: 1:00 - loss: 1.9911 - regression_loss: 1.6421 - classification_loss: 0.3490 260/500 [==============>...............] - ETA: 1:00 - loss: 1.9901 - regression_loss: 1.6410 - classification_loss: 0.3490 261/500 [==============>...............] - ETA: 1:00 - loss: 1.9913 - regression_loss: 1.6429 - classification_loss: 0.3484 262/500 [==============>...............] - ETA: 59s - loss: 1.9912 - regression_loss: 1.6428 - classification_loss: 0.3484  263/500 [==============>...............] - ETA: 59s - loss: 1.9918 - regression_loss: 1.6433 - classification_loss: 0.3485 264/500 [==============>...............] - ETA: 59s - loss: 1.9972 - regression_loss: 1.6486 - classification_loss: 0.3486 265/500 [==============>...............] 
- ETA: 59s - loss: 1.9982 - regression_loss: 1.6492 - classification_loss: 0.3491 266/500 [==============>...............] - ETA: 58s - loss: 2.0025 - regression_loss: 1.6524 - classification_loss: 0.3501 267/500 [===============>..............] - ETA: 58s - loss: 1.9983 - regression_loss: 1.6491 - classification_loss: 0.3492 268/500 [===============>..............] - ETA: 58s - loss: 1.9982 - regression_loss: 1.6491 - classification_loss: 0.3491 269/500 [===============>..............] - ETA: 58s - loss: 1.9980 - regression_loss: 1.6495 - classification_loss: 0.3486 270/500 [===============>..............] - ETA: 57s - loss: 1.9936 - regression_loss: 1.6460 - classification_loss: 0.3476 271/500 [===============>..............] - ETA: 57s - loss: 1.9919 - regression_loss: 1.6447 - classification_loss: 0.3472 272/500 [===============>..............] - ETA: 57s - loss: 1.9940 - regression_loss: 1.6470 - classification_loss: 0.3470 273/500 [===============>..............] - ETA: 57s - loss: 1.9923 - regression_loss: 1.6459 - classification_loss: 0.3464 274/500 [===============>..............] - ETA: 56s - loss: 1.9928 - regression_loss: 1.6464 - classification_loss: 0.3464 275/500 [===============>..............] - ETA: 56s - loss: 1.9929 - regression_loss: 1.6466 - classification_loss: 0.3463 276/500 [===============>..............] - ETA: 56s - loss: 1.9922 - regression_loss: 1.6465 - classification_loss: 0.3456 277/500 [===============>..............] - ETA: 56s - loss: 1.9886 - regression_loss: 1.6436 - classification_loss: 0.3449 278/500 [===============>..............] - ETA: 55s - loss: 1.9882 - regression_loss: 1.6435 - classification_loss: 0.3447 279/500 [===============>..............] - ETA: 55s - loss: 1.9853 - regression_loss: 1.6411 - classification_loss: 0.3442 280/500 [===============>..............] - ETA: 55s - loss: 1.9869 - regression_loss: 1.6421 - classification_loss: 0.3448 281/500 [===============>..............] 
- ETA: 55s - loss: 1.9838 - regression_loss: 1.6397 - classification_loss: 0.3441 282/500 [===============>..............] - ETA: 54s - loss: 1.9826 - regression_loss: 1.6390 - classification_loss: 0.3437 283/500 [===============>..............] - ETA: 54s - loss: 1.9834 - regression_loss: 1.6395 - classification_loss: 0.3439 284/500 [================>.............] - ETA: 54s - loss: 1.9882 - regression_loss: 1.6435 - classification_loss: 0.3446 285/500 [================>.............] - ETA: 54s - loss: 1.9875 - regression_loss: 1.6435 - classification_loss: 0.3441 286/500 [================>.............] - ETA: 53s - loss: 1.9865 - regression_loss: 1.6427 - classification_loss: 0.3438 287/500 [================>.............] - ETA: 53s - loss: 1.9829 - regression_loss: 1.6400 - classification_loss: 0.3429 288/500 [================>.............] - ETA: 53s - loss: 1.9823 - regression_loss: 1.6396 - classification_loss: 0.3427 289/500 [================>.............] - ETA: 52s - loss: 1.9795 - regression_loss: 1.6375 - classification_loss: 0.3421 290/500 [================>.............] - ETA: 52s - loss: 1.9807 - regression_loss: 1.6385 - classification_loss: 0.3422 291/500 [================>.............] - ETA: 52s - loss: 1.9818 - regression_loss: 1.6392 - classification_loss: 0.3426 292/500 [================>.............] - ETA: 52s - loss: 1.9866 - regression_loss: 1.6436 - classification_loss: 0.3430 293/500 [================>.............] - ETA: 51s - loss: 1.9862 - regression_loss: 1.6433 - classification_loss: 0.3428 294/500 [================>.............] - ETA: 51s - loss: 1.9849 - regression_loss: 1.6425 - classification_loss: 0.3424 295/500 [================>.............] - ETA: 51s - loss: 1.9855 - regression_loss: 1.6432 - classification_loss: 0.3424 296/500 [================>.............] - ETA: 51s - loss: 1.9835 - regression_loss: 1.6418 - classification_loss: 0.3417 297/500 [================>.............] 
- ETA: 50s - loss: 1.9827 - regression_loss: 1.6411 - classification_loss: 0.3416 298/500 [================>.............] - ETA: 50s - loss: 1.9822 - regression_loss: 1.6408 - classification_loss: 0.3414 299/500 [================>.............] - ETA: 50s - loss: 1.9829 - regression_loss: 1.6417 - classification_loss: 0.3413 300/500 [=================>............] - ETA: 50s - loss: 1.9829 - regression_loss: 1.6418 - classification_loss: 0.3411 301/500 [=================>............] - ETA: 49s - loss: 1.9812 - regression_loss: 1.6403 - classification_loss: 0.3408 302/500 [=================>............] - ETA: 49s - loss: 1.9794 - regression_loss: 1.6391 - classification_loss: 0.3403 303/500 [=================>............] - ETA: 49s - loss: 1.9787 - regression_loss: 1.6379 - classification_loss: 0.3408 304/500 [=================>............] - ETA: 49s - loss: 1.9782 - regression_loss: 1.6377 - classification_loss: 0.3405 305/500 [=================>............] - ETA: 48s - loss: 1.9777 - regression_loss: 1.6371 - classification_loss: 0.3405 306/500 [=================>............] - ETA: 48s - loss: 1.9773 - regression_loss: 1.6367 - classification_loss: 0.3405 307/500 [=================>............] - ETA: 48s - loss: 1.9746 - regression_loss: 1.6347 - classification_loss: 0.3400 308/500 [=================>............] - ETA: 48s - loss: 1.9736 - regression_loss: 1.6338 - classification_loss: 0.3398 309/500 [=================>............] - ETA: 47s - loss: 1.9741 - regression_loss: 1.6340 - classification_loss: 0.3402 310/500 [=================>............] - ETA: 47s - loss: 1.9707 - regression_loss: 1.6309 - classification_loss: 0.3398 311/500 [=================>............] - ETA: 47s - loss: 1.9715 - regression_loss: 1.6316 - classification_loss: 0.3399 312/500 [=================>............] - ETA: 47s - loss: 1.9747 - regression_loss: 1.6345 - classification_loss: 0.3402 313/500 [=================>............] 
- ETA: 46s - loss: 1.9714 - regression_loss: 1.6320 - classification_loss: 0.3394 314/500 [=================>............] - ETA: 46s - loss: 1.9708 - regression_loss: 1.6313 - classification_loss: 0.3395 315/500 [=================>............] - ETA: 46s - loss: 1.9745 - regression_loss: 1.6347 - classification_loss: 0.3398 316/500 [=================>............] - ETA: 46s - loss: 1.9726 - regression_loss: 1.6334 - classification_loss: 0.3392 317/500 [==================>...........] - ETA: 45s - loss: 1.9743 - regression_loss: 1.6344 - classification_loss: 0.3399 318/500 [==================>...........] - ETA: 45s - loss: 1.9731 - regression_loss: 1.6336 - classification_loss: 0.3396 319/500 [==================>...........] - ETA: 45s - loss: 1.9709 - regression_loss: 1.6317 - classification_loss: 0.3391 320/500 [==================>...........] - ETA: 45s - loss: 1.9692 - regression_loss: 1.6306 - classification_loss: 0.3386 321/500 [==================>...........] - ETA: 44s - loss: 1.9700 - regression_loss: 1.6313 - classification_loss: 0.3387 322/500 [==================>...........] - ETA: 44s - loss: 1.9693 - regression_loss: 1.6309 - classification_loss: 0.3385 323/500 [==================>...........] - ETA: 44s - loss: 1.9687 - regression_loss: 1.6307 - classification_loss: 0.3380 324/500 [==================>...........] - ETA: 44s - loss: 1.9686 - regression_loss: 1.6310 - classification_loss: 0.3376 325/500 [==================>...........] - ETA: 43s - loss: 1.9671 - regression_loss: 1.6298 - classification_loss: 0.3373 326/500 [==================>...........] - ETA: 43s - loss: 1.9680 - regression_loss: 1.6303 - classification_loss: 0.3377 327/500 [==================>...........] - ETA: 43s - loss: 1.9682 - regression_loss: 1.6301 - classification_loss: 0.3381 328/500 [==================>...........] - ETA: 43s - loss: 1.9665 - regression_loss: 1.6291 - classification_loss: 0.3374 329/500 [==================>...........] 
- ETA: 42s - loss: 1.9671 - regression_loss: 1.6298 - classification_loss: 0.3373 330/500 [==================>...........] - ETA: 42s - loss: 1.9641 - regression_loss: 1.6276 - classification_loss: 0.3365 331/500 [==================>...........] - ETA: 42s - loss: 1.9659 - regression_loss: 1.6288 - classification_loss: 0.3371 332/500 [==================>...........] - ETA: 42s - loss: 1.9663 - regression_loss: 1.6290 - classification_loss: 0.3373 333/500 [==================>...........] - ETA: 41s - loss: 1.9663 - regression_loss: 1.6292 - classification_loss: 0.3370 334/500 [===================>..........] - ETA: 41s - loss: 1.9678 - regression_loss: 1.6302 - classification_loss: 0.3376 335/500 [===================>..........] - ETA: 41s - loss: 1.9686 - regression_loss: 1.6309 - classification_loss: 0.3377 336/500 [===================>..........] - ETA: 41s - loss: 1.9688 - regression_loss: 1.6312 - classification_loss: 0.3376 337/500 [===================>..........] - ETA: 40s - loss: 1.9679 - regression_loss: 1.6305 - classification_loss: 0.3374 338/500 [===================>..........] - ETA: 40s - loss: 1.9667 - regression_loss: 1.6295 - classification_loss: 0.3372 339/500 [===================>..........] - ETA: 40s - loss: 1.9674 - regression_loss: 1.6299 - classification_loss: 0.3375 340/500 [===================>..........] - ETA: 40s - loss: 1.9674 - regression_loss: 1.6299 - classification_loss: 0.3376 341/500 [===================>..........] - ETA: 39s - loss: 1.9674 - regression_loss: 1.6301 - classification_loss: 0.3374 342/500 [===================>..........] - ETA: 39s - loss: 1.9662 - regression_loss: 1.6292 - classification_loss: 0.3370 343/500 [===================>..........] - ETA: 39s - loss: 1.9682 - regression_loss: 1.6309 - classification_loss: 0.3373 344/500 [===================>..........] - ETA: 39s - loss: 1.9684 - regression_loss: 1.6312 - classification_loss: 0.3372 345/500 [===================>..........] 
- ETA: 38s - loss: 1.9678 - regression_loss: 1.6308 - classification_loss: 0.3370 346/500 [===================>..........] - ETA: 38s - loss: 1.9750 - regression_loss: 1.6360 - classification_loss: 0.3391 347/500 [===================>..........] - ETA: 38s - loss: 1.9759 - regression_loss: 1.6366 - classification_loss: 0.3393 348/500 [===================>..........] - ETA: 38s - loss: 1.9759 - regression_loss: 1.6366 - classification_loss: 0.3393 349/500 [===================>..........] - ETA: 37s - loss: 1.9759 - regression_loss: 1.6368 - classification_loss: 0.3391 350/500 [====================>.........] - ETA: 37s - loss: 1.9768 - regression_loss: 1.6374 - classification_loss: 0.3393 351/500 [====================>.........] - ETA: 37s - loss: 1.9763 - regression_loss: 1.6369 - classification_loss: 0.3393 352/500 [====================>.........] - ETA: 37s - loss: 1.9780 - regression_loss: 1.6386 - classification_loss: 0.3394 353/500 [====================>.........] - ETA: 36s - loss: 1.9770 - regression_loss: 1.6379 - classification_loss: 0.3391 354/500 [====================>.........] - ETA: 36s - loss: 1.9778 - regression_loss: 1.6386 - classification_loss: 0.3392 355/500 [====================>.........] - ETA: 36s - loss: 1.9773 - regression_loss: 1.6383 - classification_loss: 0.3390 356/500 [====================>.........] - ETA: 36s - loss: 1.9757 - regression_loss: 1.6371 - classification_loss: 0.3386 357/500 [====================>.........] - ETA: 35s - loss: 1.9730 - regression_loss: 1.6346 - classification_loss: 0.3384 358/500 [====================>.........] - ETA: 35s - loss: 1.9718 - regression_loss: 1.6339 - classification_loss: 0.3379 359/500 [====================>.........] - ETA: 35s - loss: 1.9704 - regression_loss: 1.6329 - classification_loss: 0.3376 360/500 [====================>.........] - ETA: 35s - loss: 1.9676 - regression_loss: 1.6306 - classification_loss: 0.3370 361/500 [====================>.........] 
- ETA: 34s - loss: 1.9678 - regression_loss: 1.6304 - classification_loss: 0.3374 362/500 [====================>.........] - ETA: 34s - loss: 1.9672 - regression_loss: 1.6302 - classification_loss: 0.3370 363/500 [====================>.........] - ETA: 34s - loss: 1.9691 - regression_loss: 1.6316 - classification_loss: 0.3375 364/500 [====================>.........] - ETA: 34s - loss: 1.9685 - regression_loss: 1.6312 - classification_loss: 0.3374 365/500 [====================>.........] - ETA: 33s - loss: 1.9698 - regression_loss: 1.6320 - classification_loss: 0.3377 366/500 [====================>.........] - ETA: 33s - loss: 1.9704 - regression_loss: 1.6327 - classification_loss: 0.3377 367/500 [=====================>........] - ETA: 33s - loss: 1.9700 - regression_loss: 1.6325 - classification_loss: 0.3375 368/500 [=====================>........] - ETA: 33s - loss: 1.9714 - regression_loss: 1.6338 - classification_loss: 0.3376 369/500 [=====================>........] - ETA: 32s - loss: 1.9749 - regression_loss: 1.6367 - classification_loss: 0.3382 370/500 [=====================>........] - ETA: 32s - loss: 1.9785 - regression_loss: 1.6402 - classification_loss: 0.3383 371/500 [=====================>........] - ETA: 32s - loss: 1.9779 - regression_loss: 1.6396 - classification_loss: 0.3382 372/500 [=====================>........] - ETA: 32s - loss: 1.9784 - regression_loss: 1.6401 - classification_loss: 0.3382 373/500 [=====================>........] - ETA: 31s - loss: 1.9776 - regression_loss: 1.6396 - classification_loss: 0.3380 374/500 [=====================>........] - ETA: 31s - loss: 1.9773 - regression_loss: 1.6395 - classification_loss: 0.3378 375/500 [=====================>........] - ETA: 31s - loss: 1.9778 - regression_loss: 1.6399 - classification_loss: 0.3379 376/500 [=====================>........] - ETA: 31s - loss: 1.9770 - regression_loss: 1.6394 - classification_loss: 0.3376 377/500 [=====================>........] 
- ETA: 30s - loss: 1.9758 - regression_loss: 1.6387 - classification_loss: 0.3371 378/500 [=====================>........] - ETA: 30s - loss: 1.9746 - regression_loss: 1.6379 - classification_loss: 0.3367 379/500 [=====================>........] - ETA: 30s - loss: 1.9730 - regression_loss: 1.6367 - classification_loss: 0.3362 380/500 [=====================>........] - ETA: 30s - loss: 1.9715 - regression_loss: 1.6358 - classification_loss: 0.3357 381/500 [=====================>........] - ETA: 29s - loss: 1.9705 - regression_loss: 1.6351 - classification_loss: 0.3354 382/500 [=====================>........] - ETA: 29s - loss: 1.9708 - regression_loss: 1.6356 - classification_loss: 0.3352 383/500 [=====================>........] - ETA: 29s - loss: 1.9682 - regression_loss: 1.6334 - classification_loss: 0.3347 384/500 [======================>.......] - ETA: 29s - loss: 1.9729 - regression_loss: 1.6368 - classification_loss: 0.3361 385/500 [======================>.......] - ETA: 28s - loss: 1.9720 - regression_loss: 1.6362 - classification_loss: 0.3359 386/500 [======================>.......] - ETA: 28s - loss: 1.9717 - regression_loss: 1.6357 - classification_loss: 0.3361 387/500 [======================>.......] - ETA: 28s - loss: 1.9734 - regression_loss: 1.6371 - classification_loss: 0.3363 388/500 [======================>.......] - ETA: 28s - loss: 1.9732 - regression_loss: 1.6370 - classification_loss: 0.3363 389/500 [======================>.......] - ETA: 27s - loss: 1.9748 - regression_loss: 1.6380 - classification_loss: 0.3368 390/500 [======================>.......] - ETA: 27s - loss: 1.9744 - regression_loss: 1.6377 - classification_loss: 0.3367 391/500 [======================>.......] - ETA: 27s - loss: 1.9768 - regression_loss: 1.6394 - classification_loss: 0.3374 392/500 [======================>.......] - ETA: 27s - loss: 1.9781 - regression_loss: 1.6403 - classification_loss: 0.3378 393/500 [======================>.......] 
- ETA: 26s - loss: 1.9790 - regression_loss: 1.6408 - classification_loss: 0.3383 394/500 [======================>.......] - ETA: 26s - loss: 1.9789 - regression_loss: 1.6406 - classification_loss: 0.3383 395/500 [======================>.......] - ETA: 26s - loss: 1.9795 - regression_loss: 1.6407 - classification_loss: 0.3388 396/500 [======================>.......] - ETA: 26s - loss: 1.9774 - regression_loss: 1.6391 - classification_loss: 0.3383 397/500 [======================>.......] - ETA: 25s - loss: 1.9768 - regression_loss: 1.6385 - classification_loss: 0.3383 398/500 [======================>.......] - ETA: 25s - loss: 1.9766 - regression_loss: 1.6384 - classification_loss: 0.3382 399/500 [======================>.......] - ETA: 25s - loss: 1.9767 - regression_loss: 1.6388 - classification_loss: 0.3379 400/500 [=======================>......] - ETA: 25s - loss: 1.9755 - regression_loss: 1.6381 - classification_loss: 0.3374 401/500 [=======================>......] - ETA: 24s - loss: 1.9750 - regression_loss: 1.6377 - classification_loss: 0.3373 402/500 [=======================>......] - ETA: 24s - loss: 1.9759 - regression_loss: 1.6383 - classification_loss: 0.3376 403/500 [=======================>......] - ETA: 24s - loss: 1.9748 - regression_loss: 1.6375 - classification_loss: 0.3373 404/500 [=======================>......] - ETA: 24s - loss: 1.9762 - regression_loss: 1.6387 - classification_loss: 0.3374 405/500 [=======================>......] - ETA: 23s - loss: 1.9772 - regression_loss: 1.6394 - classification_loss: 0.3377 406/500 [=======================>......] - ETA: 23s - loss: 1.9769 - regression_loss: 1.6392 - classification_loss: 0.3377 407/500 [=======================>......] - ETA: 23s - loss: 1.9759 - regression_loss: 1.6385 - classification_loss: 0.3374 408/500 [=======================>......] - ETA: 23s - loss: 1.9746 - regression_loss: 1.6373 - classification_loss: 0.3373 409/500 [=======================>......] 
- ETA: 22s - loss: 1.9760 - regression_loss: 1.6382 - classification_loss: 0.3377 410/500 [=======================>......] - ETA: 22s - loss: 1.9788 - regression_loss: 1.6342 - classification_loss: 0.3446 411/500 [=======================>......] - ETA: 22s - loss: 1.9764 - regression_loss: 1.6323 - classification_loss: 0.3441 412/500 [=======================>......] - ETA: 22s - loss: 1.9775 - regression_loss: 1.6332 - classification_loss: 0.3443 413/500 [=======================>......] - ETA: 21s - loss: 1.9774 - regression_loss: 1.6331 - classification_loss: 0.3443 414/500 [=======================>......] - ETA: 21s - loss: 1.9767 - regression_loss: 1.6325 - classification_loss: 0.3442 415/500 [=======================>......] - ETA: 21s - loss: 1.9770 - regression_loss: 1.6332 - classification_loss: 0.3438 416/500 [=======================>......] - ETA: 21s - loss: 1.9774 - regression_loss: 1.6338 - classification_loss: 0.3436 417/500 [========================>.....] - ETA: 20s - loss: 1.9770 - regression_loss: 1.6335 - classification_loss: 0.3435 418/500 [========================>.....] - ETA: 20s - loss: 1.9770 - regression_loss: 1.6336 - classification_loss: 0.3435 419/500 [========================>.....] - ETA: 20s - loss: 1.9768 - regression_loss: 1.6335 - classification_loss: 0.3432 420/500 [========================>.....] - ETA: 20s - loss: 1.9761 - regression_loss: 1.6332 - classification_loss: 0.3430 421/500 [========================>.....] - ETA: 19s - loss: 1.9736 - regression_loss: 1.6312 - classification_loss: 0.3424 422/500 [========================>.....] - ETA: 19s - loss: 1.9723 - regression_loss: 1.6301 - classification_loss: 0.3421 423/500 [========================>.....] - ETA: 19s - loss: 1.9729 - regression_loss: 1.6309 - classification_loss: 0.3420 424/500 [========================>.....] - ETA: 19s - loss: 1.9735 - regression_loss: 1.6312 - classification_loss: 0.3423 425/500 [========================>.....] 
[in-place progress updates for steps 426–499/500 of epoch 26 trimmed; running loss held near 1.97–1.98 throughout]
500/500 [==============================] - 125s 251ms/step - loss: 1.9800 - regression_loss: 1.6348 - classification_loss: 0.3451
326 instances of class plum with average precision: 0.6872
mAP: 0.6872
Epoch 00026: saving model to ./training/snapshots/resnet50_pascal_26.h5
Epoch 27/150
[in-place progress updates for steps 1–259/500 trimmed; running loss settled from 2.2137 at step 1 to 1.9708 by step 259 (regression_loss: 1.6411, classification_loss: 0.3297), at roughly 250 ms/step]
- ETA: 59s - loss: 1.9717 - regression_loss: 1.6417 - classification_loss: 0.3300  261/500 [==============>...............] - ETA: 59s - loss: 1.9710 - regression_loss: 1.6413 - classification_loss: 0.3297 262/500 [==============>...............] - ETA: 59s - loss: 1.9698 - regression_loss: 1.6402 - classification_loss: 0.3295 263/500 [==============>...............] - ETA: 59s - loss: 1.9736 - regression_loss: 1.6433 - classification_loss: 0.3303 264/500 [==============>...............] - ETA: 58s - loss: 1.9746 - regression_loss: 1.6443 - classification_loss: 0.3302 265/500 [==============>...............] - ETA: 58s - loss: 1.9758 - regression_loss: 1.6453 - classification_loss: 0.3305 266/500 [==============>...............] - ETA: 58s - loss: 1.9762 - regression_loss: 1.6456 - classification_loss: 0.3305 267/500 [===============>..............] - ETA: 58s - loss: 1.9729 - regression_loss: 1.6432 - classification_loss: 0.3298 268/500 [===============>..............] - ETA: 57s - loss: 1.9700 - regression_loss: 1.6408 - classification_loss: 0.3292 269/500 [===============>..............] - ETA: 57s - loss: 1.9665 - regression_loss: 1.6380 - classification_loss: 0.3285 270/500 [===============>..............] - ETA: 57s - loss: 1.9655 - regression_loss: 1.6374 - classification_loss: 0.3281 271/500 [===============>..............] - ETA: 57s - loss: 1.9662 - regression_loss: 1.6382 - classification_loss: 0.3280 272/500 [===============>..............] - ETA: 56s - loss: 1.9664 - regression_loss: 1.6383 - classification_loss: 0.3281 273/500 [===============>..............] - ETA: 56s - loss: 1.9650 - regression_loss: 1.6375 - classification_loss: 0.3276 274/500 [===============>..............] - ETA: 56s - loss: 1.9670 - regression_loss: 1.6388 - classification_loss: 0.3282 275/500 [===============>..............] - ETA: 56s - loss: 1.9675 - regression_loss: 1.6394 - classification_loss: 0.3281 276/500 [===============>..............] 
- ETA: 55s - loss: 1.9707 - regression_loss: 1.6416 - classification_loss: 0.3292 277/500 [===============>..............] - ETA: 55s - loss: 1.9734 - regression_loss: 1.6437 - classification_loss: 0.3297 278/500 [===============>..............] - ETA: 55s - loss: 1.9716 - regression_loss: 1.6424 - classification_loss: 0.3292 279/500 [===============>..............] - ETA: 55s - loss: 1.9710 - regression_loss: 1.6366 - classification_loss: 0.3344 280/500 [===============>..............] - ETA: 54s - loss: 1.9710 - regression_loss: 1.6364 - classification_loss: 0.3346 281/500 [===============>..............] - ETA: 54s - loss: 1.9744 - regression_loss: 1.6389 - classification_loss: 0.3355 282/500 [===============>..............] - ETA: 54s - loss: 1.9742 - regression_loss: 1.6388 - classification_loss: 0.3354 283/500 [===============>..............] - ETA: 54s - loss: 1.9754 - regression_loss: 1.6393 - classification_loss: 0.3361 284/500 [================>.............] - ETA: 53s - loss: 1.9747 - regression_loss: 1.6387 - classification_loss: 0.3360 285/500 [================>.............] - ETA: 53s - loss: 1.9758 - regression_loss: 1.6394 - classification_loss: 0.3364 286/500 [================>.............] - ETA: 53s - loss: 1.9756 - regression_loss: 1.6392 - classification_loss: 0.3364 287/500 [================>.............] - ETA: 53s - loss: 1.9718 - regression_loss: 1.6361 - classification_loss: 0.3357 288/500 [================>.............] - ETA: 52s - loss: 1.9740 - regression_loss: 1.6378 - classification_loss: 0.3362 289/500 [================>.............] - ETA: 52s - loss: 1.9722 - regression_loss: 1.6362 - classification_loss: 0.3359 290/500 [================>.............] - ETA: 52s - loss: 1.9720 - regression_loss: 1.6357 - classification_loss: 0.3363 291/500 [================>.............] - ETA: 52s - loss: 1.9725 - regression_loss: 1.6360 - classification_loss: 0.3365 292/500 [================>.............] 
- ETA: 51s - loss: 1.9724 - regression_loss: 1.6358 - classification_loss: 0.3367 293/500 [================>.............] - ETA: 51s - loss: 1.9715 - regression_loss: 1.6351 - classification_loss: 0.3364 294/500 [================>.............] - ETA: 51s - loss: 1.9709 - regression_loss: 1.6347 - classification_loss: 0.3361 295/500 [================>.............] - ETA: 51s - loss: 1.9677 - regression_loss: 1.6319 - classification_loss: 0.3357 296/500 [================>.............] - ETA: 50s - loss: 1.9713 - regression_loss: 1.6349 - classification_loss: 0.3365 297/500 [================>.............] - ETA: 50s - loss: 1.9734 - regression_loss: 1.6363 - classification_loss: 0.3371 298/500 [================>.............] - ETA: 50s - loss: 1.9779 - regression_loss: 1.6393 - classification_loss: 0.3386 299/500 [================>.............] - ETA: 50s - loss: 1.9766 - regression_loss: 1.6384 - classification_loss: 0.3383 300/500 [=================>............] - ETA: 49s - loss: 1.9750 - regression_loss: 1.6371 - classification_loss: 0.3379 301/500 [=================>............] - ETA: 49s - loss: 1.9768 - regression_loss: 1.6384 - classification_loss: 0.3385 302/500 [=================>............] - ETA: 49s - loss: 1.9766 - regression_loss: 1.6383 - classification_loss: 0.3383 303/500 [=================>............] - ETA: 49s - loss: 1.9767 - regression_loss: 1.6384 - classification_loss: 0.3384 304/500 [=================>............] - ETA: 49s - loss: 1.9746 - regression_loss: 1.6364 - classification_loss: 0.3382 305/500 [=================>............] - ETA: 48s - loss: 1.9709 - regression_loss: 1.6333 - classification_loss: 0.3376 306/500 [=================>............] - ETA: 48s - loss: 1.9696 - regression_loss: 1.6323 - classification_loss: 0.3373 307/500 [=================>............] - ETA: 48s - loss: 1.9699 - regression_loss: 1.6330 - classification_loss: 0.3368 308/500 [=================>............] 
- ETA: 48s - loss: 1.9673 - regression_loss: 1.6311 - classification_loss: 0.3362 309/500 [=================>............] - ETA: 47s - loss: 1.9707 - regression_loss: 1.6337 - classification_loss: 0.3370 310/500 [=================>............] - ETA: 47s - loss: 1.9690 - regression_loss: 1.6324 - classification_loss: 0.3366 311/500 [=================>............] - ETA: 47s - loss: 1.9659 - regression_loss: 1.6300 - classification_loss: 0.3359 312/500 [=================>............] - ETA: 46s - loss: 1.9644 - regression_loss: 1.6287 - classification_loss: 0.3357 313/500 [=================>............] - ETA: 46s - loss: 1.9656 - regression_loss: 1.6296 - classification_loss: 0.3360 314/500 [=================>............] - ETA: 46s - loss: 1.9661 - regression_loss: 1.6299 - classification_loss: 0.3362 315/500 [=================>............] - ETA: 46s - loss: 1.9668 - regression_loss: 1.6306 - classification_loss: 0.3362 316/500 [=================>............] - ETA: 46s - loss: 1.9675 - regression_loss: 1.6314 - classification_loss: 0.3362 317/500 [==================>...........] - ETA: 45s - loss: 1.9687 - regression_loss: 1.6324 - classification_loss: 0.3364 318/500 [==================>...........] - ETA: 45s - loss: 1.9659 - regression_loss: 1.6301 - classification_loss: 0.3358 319/500 [==================>...........] - ETA: 45s - loss: 1.9653 - regression_loss: 1.6298 - classification_loss: 0.3356 320/500 [==================>...........] - ETA: 45s - loss: 1.9665 - regression_loss: 1.6309 - classification_loss: 0.3356 321/500 [==================>...........] - ETA: 44s - loss: 1.9653 - regression_loss: 1.6299 - classification_loss: 0.3353 322/500 [==================>...........] - ETA: 44s - loss: 1.9664 - regression_loss: 1.6316 - classification_loss: 0.3348 323/500 [==================>...........] - ETA: 44s - loss: 1.9673 - regression_loss: 1.6324 - classification_loss: 0.3349 324/500 [==================>...........] 
- ETA: 43s - loss: 1.9688 - regression_loss: 1.6335 - classification_loss: 0.3353 325/500 [==================>...........] - ETA: 43s - loss: 1.9699 - regression_loss: 1.6344 - classification_loss: 0.3355 326/500 [==================>...........] - ETA: 43s - loss: 1.9718 - regression_loss: 1.6361 - classification_loss: 0.3356 327/500 [==================>...........] - ETA: 43s - loss: 1.9718 - regression_loss: 1.6365 - classification_loss: 0.3353 328/500 [==================>...........] - ETA: 43s - loss: 1.9710 - regression_loss: 1.6356 - classification_loss: 0.3354 329/500 [==================>...........] - ETA: 42s - loss: 1.9729 - regression_loss: 1.6375 - classification_loss: 0.3354 330/500 [==================>...........] - ETA: 42s - loss: 1.9714 - regression_loss: 1.6364 - classification_loss: 0.3350 331/500 [==================>...........] - ETA: 42s - loss: 1.9718 - regression_loss: 1.6367 - classification_loss: 0.3351 332/500 [==================>...........] - ETA: 41s - loss: 1.9710 - regression_loss: 1.6363 - classification_loss: 0.3347 333/500 [==================>...........] - ETA: 41s - loss: 1.9713 - regression_loss: 1.6364 - classification_loss: 0.3349 334/500 [===================>..........] - ETA: 41s - loss: 1.9713 - regression_loss: 1.6363 - classification_loss: 0.3350 335/500 [===================>..........] - ETA: 41s - loss: 1.9703 - regression_loss: 1.6356 - classification_loss: 0.3347 336/500 [===================>..........] - ETA: 40s - loss: 1.9711 - regression_loss: 1.6363 - classification_loss: 0.3348 337/500 [===================>..........] - ETA: 40s - loss: 1.9724 - regression_loss: 1.6372 - classification_loss: 0.3353 338/500 [===================>..........] - ETA: 40s - loss: 1.9689 - regression_loss: 1.6344 - classification_loss: 0.3346 339/500 [===================>..........] - ETA: 40s - loss: 1.9685 - regression_loss: 1.6343 - classification_loss: 0.3342 340/500 [===================>..........] 
- ETA: 39s - loss: 1.9692 - regression_loss: 1.6348 - classification_loss: 0.3344 341/500 [===================>..........] - ETA: 39s - loss: 1.9697 - regression_loss: 1.6350 - classification_loss: 0.3346 342/500 [===================>..........] - ETA: 39s - loss: 1.9694 - regression_loss: 1.6352 - classification_loss: 0.3342 343/500 [===================>..........] - ETA: 39s - loss: 1.9684 - regression_loss: 1.6345 - classification_loss: 0.3339 344/500 [===================>..........] - ETA: 38s - loss: 1.9663 - regression_loss: 1.6328 - classification_loss: 0.3334 345/500 [===================>..........] - ETA: 38s - loss: 1.9659 - regression_loss: 1.6328 - classification_loss: 0.3331 346/500 [===================>..........] - ETA: 38s - loss: 1.9665 - regression_loss: 1.6331 - classification_loss: 0.3334 347/500 [===================>..........] - ETA: 38s - loss: 1.9685 - regression_loss: 1.6348 - classification_loss: 0.3337 348/500 [===================>..........] - ETA: 37s - loss: 1.9650 - regression_loss: 1.6318 - classification_loss: 0.3332 349/500 [===================>..........] - ETA: 37s - loss: 1.9637 - regression_loss: 1.6306 - classification_loss: 0.3330 350/500 [====================>.........] - ETA: 37s - loss: 1.9647 - regression_loss: 1.6313 - classification_loss: 0.3335 351/500 [====================>.........] - ETA: 37s - loss: 1.9637 - regression_loss: 1.6306 - classification_loss: 0.3331 352/500 [====================>.........] - ETA: 36s - loss: 1.9632 - regression_loss: 1.6303 - classification_loss: 0.3329 353/500 [====================>.........] - ETA: 36s - loss: 1.9625 - regression_loss: 1.6298 - classification_loss: 0.3327 354/500 [====================>.........] - ETA: 36s - loss: 1.9626 - regression_loss: 1.6300 - classification_loss: 0.3326 355/500 [====================>.........] - ETA: 36s - loss: 1.9631 - regression_loss: 1.6305 - classification_loss: 0.3326 356/500 [====================>.........] 
- ETA: 35s - loss: 1.9618 - regression_loss: 1.6295 - classification_loss: 0.3323 357/500 [====================>.........] - ETA: 35s - loss: 1.9605 - regression_loss: 1.6288 - classification_loss: 0.3318 358/500 [====================>.........] - ETA: 35s - loss: 1.9598 - regression_loss: 1.6284 - classification_loss: 0.3315 359/500 [====================>.........] - ETA: 35s - loss: 1.9626 - regression_loss: 1.6305 - classification_loss: 0.3321 360/500 [====================>.........] - ETA: 34s - loss: 1.9624 - regression_loss: 1.6304 - classification_loss: 0.3319 361/500 [====================>.........] - ETA: 34s - loss: 1.9604 - regression_loss: 1.6290 - classification_loss: 0.3314 362/500 [====================>.........] - ETA: 34s - loss: 1.9604 - regression_loss: 1.6291 - classification_loss: 0.3313 363/500 [====================>.........] - ETA: 34s - loss: 1.9611 - regression_loss: 1.6296 - classification_loss: 0.3316 364/500 [====================>.........] - ETA: 33s - loss: 1.9631 - regression_loss: 1.6312 - classification_loss: 0.3319 365/500 [====================>.........] - ETA: 33s - loss: 1.9622 - regression_loss: 1.6306 - classification_loss: 0.3316 366/500 [====================>.........] - ETA: 33s - loss: 1.9624 - regression_loss: 1.6308 - classification_loss: 0.3315 367/500 [=====================>........] - ETA: 33s - loss: 1.9610 - regression_loss: 1.6298 - classification_loss: 0.3312 368/500 [=====================>........] - ETA: 32s - loss: 1.9613 - regression_loss: 1.6302 - classification_loss: 0.3311 369/500 [=====================>........] - ETA: 32s - loss: 1.9625 - regression_loss: 1.6311 - classification_loss: 0.3314 370/500 [=====================>........] - ETA: 32s - loss: 1.9628 - regression_loss: 1.6313 - classification_loss: 0.3316 371/500 [=====================>........] - ETA: 32s - loss: 1.9631 - regression_loss: 1.6316 - classification_loss: 0.3315 372/500 [=====================>........] 
- ETA: 31s - loss: 1.9629 - regression_loss: 1.6315 - classification_loss: 0.3314 373/500 [=====================>........] - ETA: 31s - loss: 1.9633 - regression_loss: 1.6321 - classification_loss: 0.3312 374/500 [=====================>........] - ETA: 31s - loss: 1.9633 - regression_loss: 1.6319 - classification_loss: 0.3314 375/500 [=====================>........] - ETA: 31s - loss: 1.9622 - regression_loss: 1.6309 - classification_loss: 0.3313 376/500 [=====================>........] - ETA: 30s - loss: 1.9631 - regression_loss: 1.6315 - classification_loss: 0.3316 377/500 [=====================>........] - ETA: 30s - loss: 1.9643 - regression_loss: 1.6324 - classification_loss: 0.3319 378/500 [=====================>........] - ETA: 30s - loss: 1.9627 - regression_loss: 1.6313 - classification_loss: 0.3314 379/500 [=====================>........] - ETA: 30s - loss: 1.9642 - regression_loss: 1.6324 - classification_loss: 0.3318 380/500 [=====================>........] - ETA: 29s - loss: 1.9637 - regression_loss: 1.6321 - classification_loss: 0.3316 381/500 [=====================>........] - ETA: 29s - loss: 1.9638 - regression_loss: 1.6325 - classification_loss: 0.3313 382/500 [=====================>........] - ETA: 29s - loss: 1.9640 - regression_loss: 1.6326 - classification_loss: 0.3314 383/500 [=====================>........] - ETA: 29s - loss: 1.9607 - regression_loss: 1.6299 - classification_loss: 0.3308 384/500 [======================>.......] - ETA: 28s - loss: 1.9621 - regression_loss: 1.6309 - classification_loss: 0.3312 385/500 [======================>.......] - ETA: 28s - loss: 1.9615 - regression_loss: 1.6302 - classification_loss: 0.3313 386/500 [======================>.......] - ETA: 28s - loss: 1.9612 - regression_loss: 1.6299 - classification_loss: 0.3314 387/500 [======================>.......] - ETA: 28s - loss: 1.9630 - regression_loss: 1.6315 - classification_loss: 0.3315 388/500 [======================>.......] 
- ETA: 27s - loss: 1.9622 - regression_loss: 1.6311 - classification_loss: 0.3312 389/500 [======================>.......] - ETA: 27s - loss: 1.9643 - regression_loss: 1.6329 - classification_loss: 0.3314 390/500 [======================>.......] - ETA: 27s - loss: 1.9656 - regression_loss: 1.6338 - classification_loss: 0.3317 391/500 [======================>.......] - ETA: 27s - loss: 1.9642 - regression_loss: 1.6329 - classification_loss: 0.3313 392/500 [======================>.......] - ETA: 26s - loss: 1.9661 - regression_loss: 1.6349 - classification_loss: 0.3313 393/500 [======================>.......] - ETA: 26s - loss: 1.9662 - regression_loss: 1.6349 - classification_loss: 0.3313 394/500 [======================>.......] - ETA: 26s - loss: 1.9653 - regression_loss: 1.6340 - classification_loss: 0.3313 395/500 [======================>.......] - ETA: 26s - loss: 1.9684 - regression_loss: 1.6370 - classification_loss: 0.3314 396/500 [======================>.......] - ETA: 25s - loss: 1.9691 - regression_loss: 1.6377 - classification_loss: 0.3313 397/500 [======================>.......] - ETA: 25s - loss: 1.9681 - regression_loss: 1.6369 - classification_loss: 0.3312 398/500 [======================>.......] - ETA: 25s - loss: 1.9677 - regression_loss: 1.6368 - classification_loss: 0.3309 399/500 [======================>.......] - ETA: 25s - loss: 1.9656 - regression_loss: 1.6352 - classification_loss: 0.3304 400/500 [=======================>......] - ETA: 24s - loss: 1.9649 - regression_loss: 1.6347 - classification_loss: 0.3302 401/500 [=======================>......] - ETA: 24s - loss: 1.9655 - regression_loss: 1.6351 - classification_loss: 0.3304 402/500 [=======================>......] - ETA: 24s - loss: 1.9657 - regression_loss: 1.6356 - classification_loss: 0.3301 403/500 [=======================>......] - ETA: 24s - loss: 1.9671 - regression_loss: 1.6368 - classification_loss: 0.3303 404/500 [=======================>......] 
- ETA: 23s - loss: 1.9668 - regression_loss: 1.6367 - classification_loss: 0.3301 405/500 [=======================>......] - ETA: 23s - loss: 1.9677 - regression_loss: 1.6373 - classification_loss: 0.3304 406/500 [=======================>......] - ETA: 23s - loss: 1.9682 - regression_loss: 1.6376 - classification_loss: 0.3306 407/500 [=======================>......] - ETA: 23s - loss: 1.9680 - regression_loss: 1.6374 - classification_loss: 0.3306 408/500 [=======================>......] - ETA: 22s - loss: 1.9681 - regression_loss: 1.6372 - classification_loss: 0.3309 409/500 [=======================>......] - ETA: 22s - loss: 1.9694 - regression_loss: 1.6381 - classification_loss: 0.3312 410/500 [=======================>......] - ETA: 22s - loss: 1.9694 - regression_loss: 1.6384 - classification_loss: 0.3310 411/500 [=======================>......] - ETA: 22s - loss: 1.9678 - regression_loss: 1.6370 - classification_loss: 0.3308 412/500 [=======================>......] - ETA: 21s - loss: 1.9662 - regression_loss: 1.6357 - classification_loss: 0.3305 413/500 [=======================>......] - ETA: 21s - loss: 1.9685 - regression_loss: 1.6372 - classification_loss: 0.3313 414/500 [=======================>......] - ETA: 21s - loss: 1.9659 - regression_loss: 1.6352 - classification_loss: 0.3307 415/500 [=======================>......] - ETA: 21s - loss: 1.9654 - regression_loss: 1.6348 - classification_loss: 0.3306 416/500 [=======================>......] - ETA: 20s - loss: 1.9675 - regression_loss: 1.6366 - classification_loss: 0.3310 417/500 [========================>.....] - ETA: 20s - loss: 1.9687 - regression_loss: 1.6377 - classification_loss: 0.3311 418/500 [========================>.....] - ETA: 20s - loss: 1.9683 - regression_loss: 1.6374 - classification_loss: 0.3310 419/500 [========================>.....] - ETA: 20s - loss: 1.9685 - regression_loss: 1.6376 - classification_loss: 0.3309 420/500 [========================>.....] 
- ETA: 19s - loss: 1.9676 - regression_loss: 1.6369 - classification_loss: 0.3308 421/500 [========================>.....] - ETA: 19s - loss: 1.9669 - regression_loss: 1.6363 - classification_loss: 0.3306 422/500 [========================>.....] - ETA: 19s - loss: 1.9660 - regression_loss: 1.6356 - classification_loss: 0.3304 423/500 [========================>.....] - ETA: 19s - loss: 1.9688 - regression_loss: 1.6376 - classification_loss: 0.3312 424/500 [========================>.....] - ETA: 18s - loss: 1.9687 - regression_loss: 1.6376 - classification_loss: 0.3311 425/500 [========================>.....] - ETA: 18s - loss: 1.9690 - regression_loss: 1.6377 - classification_loss: 0.3313 426/500 [========================>.....] - ETA: 18s - loss: 1.9663 - regression_loss: 1.6355 - classification_loss: 0.3308 427/500 [========================>.....] - ETA: 18s - loss: 1.9681 - regression_loss: 1.6370 - classification_loss: 0.3310 428/500 [========================>.....] - ETA: 17s - loss: 1.9656 - regression_loss: 1.6352 - classification_loss: 0.3305 429/500 [========================>.....] - ETA: 17s - loss: 1.9655 - regression_loss: 1.6351 - classification_loss: 0.3304 430/500 [========================>.....] - ETA: 17s - loss: 1.9637 - regression_loss: 1.6338 - classification_loss: 0.3299 431/500 [========================>.....] - ETA: 17s - loss: 1.9626 - regression_loss: 1.6330 - classification_loss: 0.3296 432/500 [========================>.....] - ETA: 16s - loss: 1.9622 - regression_loss: 1.6327 - classification_loss: 0.3295 433/500 [========================>.....] - ETA: 16s - loss: 1.9659 - regression_loss: 1.6353 - classification_loss: 0.3306 434/500 [=========================>....] - ETA: 16s - loss: 1.9648 - regression_loss: 1.6345 - classification_loss: 0.3303 435/500 [=========================>....] - ETA: 16s - loss: 1.9652 - regression_loss: 1.6348 - classification_loss: 0.3303 436/500 [=========================>....] 
- ETA: 15s - loss: 1.9657 - regression_loss: 1.6353 - classification_loss: 0.3304 437/500 [=========================>....] - ETA: 15s - loss: 1.9649 - regression_loss: 1.6348 - classification_loss: 0.3301 438/500 [=========================>....] - ETA: 15s - loss: 1.9648 - regression_loss: 1.6347 - classification_loss: 0.3302 439/500 [=========================>....] - ETA: 15s - loss: 1.9653 - regression_loss: 1.6351 - classification_loss: 0.3303 440/500 [=========================>....] - ETA: 14s - loss: 1.9649 - regression_loss: 1.6349 - classification_loss: 0.3300 441/500 [=========================>....] - ETA: 14s - loss: 1.9636 - regression_loss: 1.6339 - classification_loss: 0.3297 442/500 [=========================>....] - ETA: 14s - loss: 1.9654 - regression_loss: 1.6353 - classification_loss: 0.3300 443/500 [=========================>....] - ETA: 14s - loss: 1.9658 - regression_loss: 1.6357 - classification_loss: 0.3301 444/500 [=========================>....] - ETA: 13s - loss: 1.9665 - regression_loss: 1.6364 - classification_loss: 0.3301 445/500 [=========================>....] - ETA: 13s - loss: 1.9674 - regression_loss: 1.6370 - classification_loss: 0.3304 446/500 [=========================>....] - ETA: 13s - loss: 1.9679 - regression_loss: 1.6373 - classification_loss: 0.3306 447/500 [=========================>....] - ETA: 13s - loss: 1.9690 - regression_loss: 1.6380 - classification_loss: 0.3309 448/500 [=========================>....] - ETA: 12s - loss: 1.9684 - regression_loss: 1.6376 - classification_loss: 0.3307 449/500 [=========================>....] - ETA: 12s - loss: 1.9677 - regression_loss: 1.6372 - classification_loss: 0.3305 450/500 [==========================>...] - ETA: 12s - loss: 1.9679 - regression_loss: 1.6375 - classification_loss: 0.3304 451/500 [==========================>...] - ETA: 12s - loss: 1.9682 - regression_loss: 1.6378 - classification_loss: 0.3304 452/500 [==========================>...] 
- ETA: 11s - loss: 1.9689 - regression_loss: 1.6383 - classification_loss: 0.3306 453/500 [==========================>...] - ETA: 11s - loss: 1.9693 - regression_loss: 1.6386 - classification_loss: 0.3307 454/500 [==========================>...] - ETA: 11s - loss: 1.9687 - regression_loss: 1.6381 - classification_loss: 0.3306 455/500 [==========================>...] - ETA: 11s - loss: 1.9698 - regression_loss: 1.6386 - classification_loss: 0.3311 456/500 [==========================>...] - ETA: 10s - loss: 1.9688 - regression_loss: 1.6380 - classification_loss: 0.3309 457/500 [==========================>...] - ETA: 10s - loss: 1.9694 - regression_loss: 1.6385 - classification_loss: 0.3309 458/500 [==========================>...] - ETA: 10s - loss: 1.9692 - regression_loss: 1.6385 - classification_loss: 0.3307 459/500 [==========================>...] - ETA: 10s - loss: 1.9675 - regression_loss: 1.6371 - classification_loss: 0.3304 460/500 [==========================>...] - ETA: 9s - loss: 1.9664 - regression_loss: 1.6363 - classification_loss: 0.3301  461/500 [==========================>...] - ETA: 9s - loss: 1.9658 - regression_loss: 1.6359 - classification_loss: 0.3299 462/500 [==========================>...] - ETA: 9s - loss: 1.9648 - regression_loss: 1.6349 - classification_loss: 0.3298 463/500 [==========================>...] - ETA: 9s - loss: 1.9633 - regression_loss: 1.6339 - classification_loss: 0.3294 464/500 [==========================>...] - ETA: 8s - loss: 1.9632 - regression_loss: 1.6339 - classification_loss: 0.3293 465/500 [==========================>...] - ETA: 8s - loss: 1.9628 - regression_loss: 1.6336 - classification_loss: 0.3291 466/500 [==========================>...] - ETA: 8s - loss: 1.9619 - regression_loss: 1.6331 - classification_loss: 0.3289 467/500 [===========================>..] - ETA: 8s - loss: 1.9632 - regression_loss: 1.6341 - classification_loss: 0.3291 468/500 [===========================>..] 
500/500 [==============================] - 125s 250ms/step - loss: 1.9639 - regression_loss: 1.6343 - classification_loss: 0.3296
326 instances of class plum with average precision: 0.6967
mAP: 0.6967
Epoch 00027: saving model to ./training/snapshots/resnet50_pascal_27.h5
Epoch 28/150
- ETA: 49s - loss: 1.9070 - regression_loss: 1.5870 - classification_loss: 0.3200 303/500 [=================>............] - ETA: 49s - loss: 1.9058 - regression_loss: 1.5863 - classification_loss: 0.3195 304/500 [=================>............] - ETA: 48s - loss: 1.9060 - regression_loss: 1.5864 - classification_loss: 0.3196 305/500 [=================>............] - ETA: 48s - loss: 1.9043 - regression_loss: 1.5850 - classification_loss: 0.3193 306/500 [=================>............] - ETA: 48s - loss: 1.9057 - regression_loss: 1.5853 - classification_loss: 0.3204 307/500 [=================>............] - ETA: 48s - loss: 1.9088 - regression_loss: 1.5876 - classification_loss: 0.3212 308/500 [=================>............] - ETA: 47s - loss: 1.9115 - regression_loss: 1.5901 - classification_loss: 0.3215 309/500 [=================>............] - ETA: 47s - loss: 1.9095 - regression_loss: 1.5885 - classification_loss: 0.3209 310/500 [=================>............] - ETA: 47s - loss: 1.9114 - regression_loss: 1.5897 - classification_loss: 0.3217 311/500 [=================>............] - ETA: 47s - loss: 1.9128 - regression_loss: 1.5907 - classification_loss: 0.3221 312/500 [=================>............] - ETA: 46s - loss: 1.9127 - regression_loss: 1.5906 - classification_loss: 0.3221 313/500 [=================>............] - ETA: 46s - loss: 1.9129 - regression_loss: 1.5911 - classification_loss: 0.3218 314/500 [=================>............] - ETA: 46s - loss: 1.9128 - regression_loss: 1.5914 - classification_loss: 0.3214 315/500 [=================>............] - ETA: 46s - loss: 1.9122 - regression_loss: 1.5910 - classification_loss: 0.3211 316/500 [=================>............] - ETA: 45s - loss: 1.9100 - regression_loss: 1.5893 - classification_loss: 0.3207 317/500 [==================>...........] - ETA: 45s - loss: 1.9107 - regression_loss: 1.5899 - classification_loss: 0.3209 318/500 [==================>...........] 
- ETA: 45s - loss: 1.9140 - regression_loss: 1.5924 - classification_loss: 0.3217 319/500 [==================>...........] - ETA: 45s - loss: 1.9139 - regression_loss: 1.5923 - classification_loss: 0.3216 320/500 [==================>...........] - ETA: 44s - loss: 1.9140 - regression_loss: 1.5924 - classification_loss: 0.3217 321/500 [==================>...........] - ETA: 44s - loss: 1.9128 - regression_loss: 1.5915 - classification_loss: 0.3214 322/500 [==================>...........] - ETA: 44s - loss: 1.9120 - regression_loss: 1.5910 - classification_loss: 0.3210 323/500 [==================>...........] - ETA: 44s - loss: 1.9128 - regression_loss: 1.5916 - classification_loss: 0.3213 324/500 [==================>...........] - ETA: 43s - loss: 1.9157 - regression_loss: 1.5938 - classification_loss: 0.3219 325/500 [==================>...........] - ETA: 43s - loss: 1.9151 - regression_loss: 1.5932 - classification_loss: 0.3219 326/500 [==================>...........] - ETA: 43s - loss: 1.9136 - regression_loss: 1.5920 - classification_loss: 0.3216 327/500 [==================>...........] - ETA: 43s - loss: 1.9176 - regression_loss: 1.5953 - classification_loss: 0.3222 328/500 [==================>...........] - ETA: 42s - loss: 1.9181 - regression_loss: 1.5961 - classification_loss: 0.3221 329/500 [==================>...........] - ETA: 42s - loss: 1.9180 - regression_loss: 1.5961 - classification_loss: 0.3218 330/500 [==================>...........] - ETA: 42s - loss: 1.9155 - regression_loss: 1.5943 - classification_loss: 0.3213 331/500 [==================>...........] - ETA: 42s - loss: 1.9149 - regression_loss: 1.5939 - classification_loss: 0.3210 332/500 [==================>...........] - ETA: 41s - loss: 1.9124 - regression_loss: 1.5921 - classification_loss: 0.3203 333/500 [==================>...........] - ETA: 41s - loss: 1.9135 - regression_loss: 1.5929 - classification_loss: 0.3206 334/500 [===================>..........] 
- ETA: 41s - loss: 1.9146 - regression_loss: 1.5941 - classification_loss: 0.3204 335/500 [===================>..........] - ETA: 41s - loss: 1.9141 - regression_loss: 1.5936 - classification_loss: 0.3205 336/500 [===================>..........] - ETA: 40s - loss: 1.9146 - regression_loss: 1.5939 - classification_loss: 0.3206 337/500 [===================>..........] - ETA: 40s - loss: 1.9150 - regression_loss: 1.5941 - classification_loss: 0.3209 338/500 [===================>..........] - ETA: 40s - loss: 1.9172 - regression_loss: 1.5961 - classification_loss: 0.3211 339/500 [===================>..........] - ETA: 40s - loss: 1.9163 - regression_loss: 1.5955 - classification_loss: 0.3208 340/500 [===================>..........] - ETA: 40s - loss: 1.9151 - regression_loss: 1.5947 - classification_loss: 0.3204 341/500 [===================>..........] - ETA: 39s - loss: 1.9135 - regression_loss: 1.5935 - classification_loss: 0.3201 342/500 [===================>..........] - ETA: 39s - loss: 1.9140 - regression_loss: 1.5937 - classification_loss: 0.3203 343/500 [===================>..........] - ETA: 39s - loss: 1.9142 - regression_loss: 1.5940 - classification_loss: 0.3202 344/500 [===================>..........] - ETA: 39s - loss: 1.9139 - regression_loss: 1.5938 - classification_loss: 0.3201 345/500 [===================>..........] - ETA: 38s - loss: 1.9110 - regression_loss: 1.5912 - classification_loss: 0.3198 346/500 [===================>..........] - ETA: 38s - loss: 1.9122 - regression_loss: 1.5921 - classification_loss: 0.3201 347/500 [===================>..........] - ETA: 38s - loss: 1.9143 - regression_loss: 1.5939 - classification_loss: 0.3205 348/500 [===================>..........] - ETA: 38s - loss: 1.9111 - regression_loss: 1.5912 - classification_loss: 0.3199 349/500 [===================>..........] - ETA: 37s - loss: 1.9108 - regression_loss: 1.5911 - classification_loss: 0.3197 350/500 [====================>.........] 
- ETA: 37s - loss: 1.9092 - regression_loss: 1.5898 - classification_loss: 0.3195 351/500 [====================>.........] - ETA: 37s - loss: 1.9122 - regression_loss: 1.5922 - classification_loss: 0.3200 352/500 [====================>.........] - ETA: 37s - loss: 1.9105 - regression_loss: 1.5909 - classification_loss: 0.3196 353/500 [====================>.........] - ETA: 36s - loss: 1.9079 - regression_loss: 1.5888 - classification_loss: 0.3192 354/500 [====================>.........] - ETA: 36s - loss: 1.9075 - regression_loss: 1.5878 - classification_loss: 0.3197 355/500 [====================>.........] - ETA: 36s - loss: 1.9092 - regression_loss: 1.5889 - classification_loss: 0.3203 356/500 [====================>.........] - ETA: 35s - loss: 1.9092 - regression_loss: 1.5886 - classification_loss: 0.3205 357/500 [====================>.........] - ETA: 35s - loss: 1.9100 - regression_loss: 1.5896 - classification_loss: 0.3204 358/500 [====================>.........] - ETA: 35s - loss: 1.9102 - regression_loss: 1.5898 - classification_loss: 0.3204 359/500 [====================>.........] - ETA: 35s - loss: 1.9121 - regression_loss: 1.5910 - classification_loss: 0.3211 360/500 [====================>.........] - ETA: 34s - loss: 1.9127 - regression_loss: 1.5916 - classification_loss: 0.3212 361/500 [====================>.........] - ETA: 34s - loss: 1.9127 - regression_loss: 1.5916 - classification_loss: 0.3211 362/500 [====================>.........] - ETA: 34s - loss: 1.9122 - regression_loss: 1.5912 - classification_loss: 0.3210 363/500 [====================>.........] - ETA: 34s - loss: 1.9129 - regression_loss: 1.5918 - classification_loss: 0.3211 364/500 [====================>.........] - ETA: 33s - loss: 1.9125 - regression_loss: 1.5917 - classification_loss: 0.3208 365/500 [====================>.........] - ETA: 33s - loss: 1.9124 - regression_loss: 1.5917 - classification_loss: 0.3208 366/500 [====================>.........] 
- ETA: 33s - loss: 1.9122 - regression_loss: 1.5916 - classification_loss: 0.3206 367/500 [=====================>........] - ETA: 33s - loss: 1.9149 - regression_loss: 1.5928 - classification_loss: 0.3221 368/500 [=====================>........] - ETA: 33s - loss: 1.9158 - regression_loss: 1.5934 - classification_loss: 0.3224 369/500 [=====================>........] - ETA: 32s - loss: 1.9158 - regression_loss: 1.5936 - classification_loss: 0.3222 370/500 [=====================>........] - ETA: 32s - loss: 1.9163 - regression_loss: 1.5938 - classification_loss: 0.3225 371/500 [=====================>........] - ETA: 32s - loss: 1.9150 - regression_loss: 1.5926 - classification_loss: 0.3224 372/500 [=====================>........] - ETA: 32s - loss: 1.9155 - regression_loss: 1.5929 - classification_loss: 0.3226 373/500 [=====================>........] - ETA: 31s - loss: 1.9170 - regression_loss: 1.5942 - classification_loss: 0.3229 374/500 [=====================>........] - ETA: 31s - loss: 1.9160 - regression_loss: 1.5935 - classification_loss: 0.3225 375/500 [=====================>........] - ETA: 31s - loss: 1.9182 - regression_loss: 1.5950 - classification_loss: 0.3232 376/500 [=====================>........] - ETA: 31s - loss: 1.9196 - regression_loss: 1.5959 - classification_loss: 0.3237 377/500 [=====================>........] - ETA: 30s - loss: 1.9197 - regression_loss: 1.5949 - classification_loss: 0.3248 378/500 [=====================>........] - ETA: 30s - loss: 1.9200 - regression_loss: 1.5954 - classification_loss: 0.3246 379/500 [=====================>........] - ETA: 30s - loss: 1.9229 - regression_loss: 1.5980 - classification_loss: 0.3249 380/500 [=====================>........] - ETA: 30s - loss: 1.9241 - regression_loss: 1.5985 - classification_loss: 0.3256 381/500 [=====================>........] - ETA: 29s - loss: 1.9246 - regression_loss: 1.5990 - classification_loss: 0.3256 382/500 [=====================>........] 
- ETA: 29s - loss: 1.9263 - regression_loss: 1.6004 - classification_loss: 0.3258 383/500 [=====================>........] - ETA: 29s - loss: 1.9258 - regression_loss: 1.6001 - classification_loss: 0.3256 384/500 [======================>.......] - ETA: 29s - loss: 1.9264 - regression_loss: 1.6006 - classification_loss: 0.3258 385/500 [======================>.......] - ETA: 28s - loss: 1.9262 - regression_loss: 1.6005 - classification_loss: 0.3257 386/500 [======================>.......] - ETA: 28s - loss: 1.9266 - regression_loss: 1.6010 - classification_loss: 0.3256 387/500 [======================>.......] - ETA: 28s - loss: 1.9260 - regression_loss: 1.6007 - classification_loss: 0.3253 388/500 [======================>.......] - ETA: 28s - loss: 1.9259 - regression_loss: 1.6007 - classification_loss: 0.3252 389/500 [======================>.......] - ETA: 27s - loss: 1.9272 - regression_loss: 1.6020 - classification_loss: 0.3252 390/500 [======================>.......] - ETA: 27s - loss: 1.9268 - regression_loss: 1.6017 - classification_loss: 0.3251 391/500 [======================>.......] - ETA: 27s - loss: 1.9265 - regression_loss: 1.6017 - classification_loss: 0.3248 392/500 [======================>.......] - ETA: 27s - loss: 1.9269 - regression_loss: 1.6019 - classification_loss: 0.3250 393/500 [======================>.......] - ETA: 26s - loss: 1.9267 - regression_loss: 1.6018 - classification_loss: 0.3250 394/500 [======================>.......] - ETA: 26s - loss: 1.9271 - regression_loss: 1.6020 - classification_loss: 0.3252 395/500 [======================>.......] - ETA: 26s - loss: 1.9262 - regression_loss: 1.6013 - classification_loss: 0.3248 396/500 [======================>.......] - ETA: 26s - loss: 1.9267 - regression_loss: 1.6017 - classification_loss: 0.3250 397/500 [======================>.......] - ETA: 25s - loss: 1.9269 - regression_loss: 1.6019 - classification_loss: 0.3250 398/500 [======================>.......] 
- ETA: 25s - loss: 1.9288 - regression_loss: 1.6032 - classification_loss: 0.3256 399/500 [======================>.......] - ETA: 25s - loss: 1.9293 - regression_loss: 1.6038 - classification_loss: 0.3256 400/500 [=======================>......] - ETA: 25s - loss: 1.9306 - regression_loss: 1.6049 - classification_loss: 0.3257 401/500 [=======================>......] - ETA: 24s - loss: 1.9293 - regression_loss: 1.6034 - classification_loss: 0.3258 402/500 [=======================>......] - ETA: 24s - loss: 1.9312 - regression_loss: 1.6050 - classification_loss: 0.3262 403/500 [=======================>......] - ETA: 24s - loss: 1.9312 - regression_loss: 1.6051 - classification_loss: 0.3262 404/500 [=======================>......] - ETA: 24s - loss: 1.9333 - regression_loss: 1.6070 - classification_loss: 0.3263 405/500 [=======================>......] - ETA: 23s - loss: 1.9325 - regression_loss: 1.6065 - classification_loss: 0.3260 406/500 [=======================>......] - ETA: 23s - loss: 1.9312 - regression_loss: 1.6055 - classification_loss: 0.3257 407/500 [=======================>......] - ETA: 23s - loss: 1.9305 - regression_loss: 1.6050 - classification_loss: 0.3255 408/500 [=======================>......] - ETA: 22s - loss: 1.9289 - regression_loss: 1.6037 - classification_loss: 0.3252 409/500 [=======================>......] - ETA: 22s - loss: 1.9280 - regression_loss: 1.6029 - classification_loss: 0.3251 410/500 [=======================>......] - ETA: 22s - loss: 1.9266 - regression_loss: 1.6017 - classification_loss: 0.3249 411/500 [=======================>......] - ETA: 22s - loss: 1.9253 - regression_loss: 1.6008 - classification_loss: 0.3245 412/500 [=======================>......] - ETA: 21s - loss: 1.9257 - regression_loss: 1.6012 - classification_loss: 0.3245 413/500 [=======================>......] - ETA: 21s - loss: 1.9247 - regression_loss: 1.6004 - classification_loss: 0.3243 414/500 [=======================>......] 
- ETA: 21s - loss: 1.9264 - regression_loss: 1.6021 - classification_loss: 0.3243 415/500 [=======================>......] - ETA: 21s - loss: 1.9266 - regression_loss: 1.6022 - classification_loss: 0.3244 416/500 [=======================>......] - ETA: 20s - loss: 1.9280 - regression_loss: 1.6033 - classification_loss: 0.3248 417/500 [========================>.....] - ETA: 20s - loss: 1.9287 - regression_loss: 1.6037 - classification_loss: 0.3250 418/500 [========================>.....] - ETA: 20s - loss: 1.9281 - regression_loss: 1.6032 - classification_loss: 0.3248 419/500 [========================>.....] - ETA: 20s - loss: 1.9280 - regression_loss: 1.6033 - classification_loss: 0.3247 420/500 [========================>.....] - ETA: 19s - loss: 1.9279 - regression_loss: 1.6033 - classification_loss: 0.3246 421/500 [========================>.....] - ETA: 19s - loss: 1.9249 - regression_loss: 1.6007 - classification_loss: 0.3241 422/500 [========================>.....] - ETA: 19s - loss: 1.9260 - regression_loss: 1.6016 - classification_loss: 0.3244 423/500 [========================>.....] - ETA: 19s - loss: 1.9267 - regression_loss: 1.6019 - classification_loss: 0.3248 424/500 [========================>.....] - ETA: 18s - loss: 1.9251 - regression_loss: 1.6009 - classification_loss: 0.3243 425/500 [========================>.....] - ETA: 18s - loss: 1.9252 - regression_loss: 1.6010 - classification_loss: 0.3241 426/500 [========================>.....] - ETA: 18s - loss: 1.9255 - regression_loss: 1.6013 - classification_loss: 0.3242 427/500 [========================>.....] - ETA: 18s - loss: 1.9229 - regression_loss: 1.5992 - classification_loss: 0.3237 428/500 [========================>.....] - ETA: 17s - loss: 1.9234 - regression_loss: 1.5994 - classification_loss: 0.3239 429/500 [========================>.....] - ETA: 17s - loss: 1.9237 - regression_loss: 1.5998 - classification_loss: 0.3239 430/500 [========================>.....] 
- ETA: 17s - loss: 1.9229 - regression_loss: 1.5992 - classification_loss: 0.3238 431/500 [========================>.....] - ETA: 17s - loss: 1.9223 - regression_loss: 1.5986 - classification_loss: 0.3237 432/500 [========================>.....] - ETA: 16s - loss: 1.9211 - regression_loss: 1.5977 - classification_loss: 0.3234 433/500 [========================>.....] - ETA: 16s - loss: 1.9224 - regression_loss: 1.5988 - classification_loss: 0.3235 434/500 [=========================>....] - ETA: 16s - loss: 1.9220 - regression_loss: 1.5985 - classification_loss: 0.3235 435/500 [=========================>....] - ETA: 16s - loss: 1.9222 - regression_loss: 1.5988 - classification_loss: 0.3234 436/500 [=========================>....] - ETA: 15s - loss: 1.9229 - regression_loss: 1.5994 - classification_loss: 0.3235 437/500 [=========================>....] - ETA: 15s - loss: 1.9226 - regression_loss: 1.5992 - classification_loss: 0.3234 438/500 [=========================>....] - ETA: 15s - loss: 1.9234 - regression_loss: 1.5998 - classification_loss: 0.3236 439/500 [=========================>....] - ETA: 15s - loss: 1.9254 - regression_loss: 1.6015 - classification_loss: 0.3238 440/500 [=========================>....] - ETA: 14s - loss: 1.9241 - regression_loss: 1.6004 - classification_loss: 0.3237 441/500 [=========================>....] - ETA: 14s - loss: 1.9237 - regression_loss: 1.6002 - classification_loss: 0.3236 442/500 [=========================>....] - ETA: 14s - loss: 1.9242 - regression_loss: 1.6006 - classification_loss: 0.3236 443/500 [=========================>....] - ETA: 14s - loss: 1.9253 - regression_loss: 1.6016 - classification_loss: 0.3237 444/500 [=========================>....] - ETA: 13s - loss: 1.9265 - regression_loss: 1.6023 - classification_loss: 0.3241 445/500 [=========================>....] - ETA: 13s - loss: 1.9261 - regression_loss: 1.6020 - classification_loss: 0.3240 446/500 [=========================>....] 
- ETA: 13s - loss: 1.9260 - regression_loss: 1.6020 - classification_loss: 0.3239 447/500 [=========================>....] - ETA: 13s - loss: 1.9251 - regression_loss: 1.6006 - classification_loss: 0.3246 448/500 [=========================>....] - ETA: 12s - loss: 1.9241 - regression_loss: 1.5997 - classification_loss: 0.3243 449/500 [=========================>....] - ETA: 12s - loss: 1.9234 - regression_loss: 1.5992 - classification_loss: 0.3242 450/500 [==========================>...] - ETA: 12s - loss: 1.9214 - regression_loss: 1.5977 - classification_loss: 0.3237 451/500 [==========================>...] - ETA: 12s - loss: 1.9203 - regression_loss: 1.5969 - classification_loss: 0.3234 452/500 [==========================>...] - ETA: 11s - loss: 1.9173 - regression_loss: 1.5944 - classification_loss: 0.3229 453/500 [==========================>...] - ETA: 11s - loss: 1.9184 - regression_loss: 1.5952 - classification_loss: 0.3232 454/500 [==========================>...] - ETA: 11s - loss: 1.9198 - regression_loss: 1.5961 - classification_loss: 0.3237 455/500 [==========================>...] - ETA: 11s - loss: 1.9199 - regression_loss: 1.5959 - classification_loss: 0.3240 456/500 [==========================>...] - ETA: 10s - loss: 1.9198 - regression_loss: 1.5958 - classification_loss: 0.3241 457/500 [==========================>...] - ETA: 10s - loss: 1.9240 - regression_loss: 1.5986 - classification_loss: 0.3254 458/500 [==========================>...] - ETA: 10s - loss: 1.9218 - regression_loss: 1.5966 - classification_loss: 0.3251 459/500 [==========================>...] - ETA: 10s - loss: 1.9204 - regression_loss: 1.5956 - classification_loss: 0.3247 460/500 [==========================>...] - ETA: 9s - loss: 1.9211 - regression_loss: 1.5965 - classification_loss: 0.3246  461/500 [==========================>...] - ETA: 9s - loss: 1.9219 - regression_loss: 1.5968 - classification_loss: 0.3252 462/500 [==========================>...] 
- ETA: 9s - loss: 1.9213 - regression_loss: 1.5964 - classification_loss: 0.3249 463/500 [==========================>...] - ETA: 9s - loss: 1.9222 - regression_loss: 1.5969 - classification_loss: 0.3253 464/500 [==========================>...] - ETA: 8s - loss: 1.9218 - regression_loss: 1.5967 - classification_loss: 0.3250 465/500 [==========================>...] - ETA: 8s - loss: 1.9224 - regression_loss: 1.5971 - classification_loss: 0.3252 466/500 [==========================>...] - ETA: 8s - loss: 1.9221 - regression_loss: 1.5970 - classification_loss: 0.3251 467/500 [===========================>..] - ETA: 8s - loss: 1.9236 - regression_loss: 1.5983 - classification_loss: 0.3253 468/500 [===========================>..] - ETA: 7s - loss: 1.9228 - regression_loss: 1.5975 - classification_loss: 0.3253 469/500 [===========================>..] - ETA: 7s - loss: 1.9227 - regression_loss: 1.5976 - classification_loss: 0.3251 470/500 [===========================>..] - ETA: 7s - loss: 1.9239 - regression_loss: 1.5986 - classification_loss: 0.3253 471/500 [===========================>..] - ETA: 7s - loss: 1.9240 - regression_loss: 1.5986 - classification_loss: 0.3253 472/500 [===========================>..] - ETA: 6s - loss: 1.9248 - regression_loss: 1.5996 - classification_loss: 0.3252 473/500 [===========================>..] - ETA: 6s - loss: 1.9255 - regression_loss: 1.6002 - classification_loss: 0.3253 474/500 [===========================>..] - ETA: 6s - loss: 1.9254 - regression_loss: 1.6002 - classification_loss: 0.3252 475/500 [===========================>..] - ETA: 6s - loss: 1.9246 - regression_loss: 1.5996 - classification_loss: 0.3250 476/500 [===========================>..] - ETA: 5s - loss: 1.9250 - regression_loss: 1.6002 - classification_loss: 0.3249 477/500 [===========================>..] - ETA: 5s - loss: 1.9253 - regression_loss: 1.6004 - classification_loss: 0.3250 478/500 [===========================>..] 
- ETA: 5s - loss: 1.9246 - regression_loss: 1.5997 - classification_loss: 0.3249 479/500 [===========================>..] - ETA: 5s - loss: 1.9238 - regression_loss: 1.5990 - classification_loss: 0.3248 480/500 [===========================>..] - ETA: 5s - loss: 1.9243 - regression_loss: 1.5996 - classification_loss: 0.3248 481/500 [===========================>..] - ETA: 4s - loss: 1.9250 - regression_loss: 1.6002 - classification_loss: 0.3248 482/500 [===========================>..] - ETA: 4s - loss: 1.9235 - regression_loss: 1.5992 - classification_loss: 0.3243 483/500 [===========================>..] - ETA: 4s - loss: 1.9245 - regression_loss: 1.6000 - classification_loss: 0.3245 484/500 [============================>.] - ETA: 4s - loss: 1.9254 - regression_loss: 1.6006 - classification_loss: 0.3248 485/500 [============================>.] - ETA: 3s - loss: 1.9256 - regression_loss: 1.6007 - classification_loss: 0.3250 486/500 [============================>.] - ETA: 3s - loss: 1.9258 - regression_loss: 1.6006 - classification_loss: 0.3252 487/500 [============================>.] - ETA: 3s - loss: 1.9275 - regression_loss: 1.6021 - classification_loss: 0.3255 488/500 [============================>.] - ETA: 2s - loss: 1.9287 - regression_loss: 1.6029 - classification_loss: 0.3257 489/500 [============================>.] - ETA: 2s - loss: 1.9276 - regression_loss: 1.6023 - classification_loss: 0.3253 490/500 [============================>.] - ETA: 2s - loss: 1.9277 - regression_loss: 1.6023 - classification_loss: 0.3255 491/500 [============================>.] - ETA: 2s - loss: 1.9292 - regression_loss: 1.6036 - classification_loss: 0.3256 492/500 [============================>.] - ETA: 2s - loss: 1.9278 - regression_loss: 1.6025 - classification_loss: 0.3252 493/500 [============================>.] - ETA: 1s - loss: 1.9280 - regression_loss: 1.6027 - classification_loss: 0.3253 494/500 [============================>.] 
500/500 [==============================] - 125s 250ms/step - loss: 1.9260 - regression_loss: 1.6017 - classification_loss: 0.3243
326 instances of class plum with average precision: 0.7159
mAP: 0.7159
Epoch 00028: saving model to ./training/snapshots/resnet50_pascal_28.h5
Epoch 29/150
- ETA: 1:47 - loss: 1.9732 - regression_loss: 1.6447 - classification_loss: 0.3285 74/500 [===>..........................] - ETA: 1:46 - loss: 1.9728 - regression_loss: 1.6455 - classification_loss: 0.3273 75/500 [===>..........................] - ETA: 1:46 - loss: 1.9700 - regression_loss: 1.6443 - classification_loss: 0.3257 76/500 [===>..........................] - ETA: 1:46 - loss: 1.9620 - regression_loss: 1.6373 - classification_loss: 0.3247 77/500 [===>..........................] - ETA: 1:46 - loss: 1.9595 - regression_loss: 1.6344 - classification_loss: 0.3251 78/500 [===>..........................] - ETA: 1:45 - loss: 1.9628 - regression_loss: 1.6379 - classification_loss: 0.3249 79/500 [===>..........................] - ETA: 1:45 - loss: 1.9632 - regression_loss: 1.6382 - classification_loss: 0.3250 80/500 [===>..........................] - ETA: 1:45 - loss: 1.9678 - regression_loss: 1.6419 - classification_loss: 0.3258 81/500 [===>..........................] - ETA: 1:44 - loss: 1.9647 - regression_loss: 1.6398 - classification_loss: 0.3249 82/500 [===>..........................] - ETA: 1:44 - loss: 1.9707 - regression_loss: 1.6449 - classification_loss: 0.3258 83/500 [===>..........................] - ETA: 1:43 - loss: 1.9713 - regression_loss: 1.6443 - classification_loss: 0.3270 84/500 [====>.........................] - ETA: 1:43 - loss: 1.9660 - regression_loss: 1.6388 - classification_loss: 0.3272 85/500 [====>.........................] - ETA: 1:43 - loss: 1.9667 - regression_loss: 1.6377 - classification_loss: 0.3290 86/500 [====>.........................] - ETA: 1:43 - loss: 1.9702 - regression_loss: 1.6389 - classification_loss: 0.3312 87/500 [====>.........................] - ETA: 1:42 - loss: 1.9750 - regression_loss: 1.6418 - classification_loss: 0.3332 88/500 [====>.........................] - ETA: 1:42 - loss: 1.9713 - regression_loss: 1.6392 - classification_loss: 0.3321 89/500 [====>.........................] 
- ETA: 1:42 - loss: 1.9577 - regression_loss: 1.6279 - classification_loss: 0.3298 90/500 [====>.........................] - ETA: 1:42 - loss: 1.9556 - regression_loss: 1.6272 - classification_loss: 0.3285 91/500 [====>.........................] - ETA: 1:41 - loss: 1.9634 - regression_loss: 1.6323 - classification_loss: 0.3312 92/500 [====>.........................] - ETA: 1:41 - loss: 1.9658 - regression_loss: 1.6339 - classification_loss: 0.3319 93/500 [====>.........................] - ETA: 1:41 - loss: 1.9630 - regression_loss: 1.6320 - classification_loss: 0.3309 94/500 [====>.........................] - ETA: 1:41 - loss: 1.9653 - regression_loss: 1.6336 - classification_loss: 0.3317 95/500 [====>.........................] - ETA: 1:40 - loss: 1.9679 - regression_loss: 1.6362 - classification_loss: 0.3317 96/500 [====>.........................] - ETA: 1:40 - loss: 1.9797 - regression_loss: 1.6447 - classification_loss: 0.3350 97/500 [====>.........................] - ETA: 1:40 - loss: 1.9752 - regression_loss: 1.6406 - classification_loss: 0.3346 98/500 [====>.........................] - ETA: 1:40 - loss: 1.9769 - regression_loss: 1.6411 - classification_loss: 0.3358 99/500 [====>.........................] - ETA: 1:39 - loss: 1.9800 - regression_loss: 1.6428 - classification_loss: 0.3372 100/500 [=====>........................] - ETA: 1:39 - loss: 1.9832 - regression_loss: 1.6456 - classification_loss: 0.3377 101/500 [=====>........................] - ETA: 1:39 - loss: 1.9889 - regression_loss: 1.6499 - classification_loss: 0.3390 102/500 [=====>........................] - ETA: 1:39 - loss: 1.9854 - regression_loss: 1.6473 - classification_loss: 0.3381 103/500 [=====>........................] - ETA: 1:38 - loss: 1.9832 - regression_loss: 1.6457 - classification_loss: 0.3375 104/500 [=====>........................] - ETA: 1:38 - loss: 1.9863 - regression_loss: 1.6465 - classification_loss: 0.3398 105/500 [=====>........................] 
- ETA: 1:38 - loss: 1.9812 - regression_loss: 1.6427 - classification_loss: 0.3385 106/500 [=====>........................] - ETA: 1:38 - loss: 1.9811 - regression_loss: 1.6412 - classification_loss: 0.3399 107/500 [=====>........................] - ETA: 1:37 - loss: 1.9822 - regression_loss: 1.6415 - classification_loss: 0.3406 108/500 [=====>........................] - ETA: 1:37 - loss: 1.9813 - regression_loss: 1.6407 - classification_loss: 0.3405 109/500 [=====>........................] - ETA: 1:37 - loss: 1.9737 - regression_loss: 1.6349 - classification_loss: 0.3388 110/500 [=====>........................] - ETA: 1:37 - loss: 1.9726 - regression_loss: 1.6346 - classification_loss: 0.3380 111/500 [=====>........................] - ETA: 1:36 - loss: 1.9691 - regression_loss: 1.6323 - classification_loss: 0.3367 112/500 [=====>........................] - ETA: 1:36 - loss: 1.9643 - regression_loss: 1.6288 - classification_loss: 0.3355 113/500 [=====>........................] - ETA: 1:36 - loss: 1.9540 - regression_loss: 1.6204 - classification_loss: 0.3336 114/500 [=====>........................] - ETA: 1:36 - loss: 1.9648 - regression_loss: 1.6284 - classification_loss: 0.3364 115/500 [=====>........................] - ETA: 1:35 - loss: 1.9602 - regression_loss: 1.6249 - classification_loss: 0.3352 116/500 [=====>........................] - ETA: 1:35 - loss: 1.9576 - regression_loss: 1.6231 - classification_loss: 0.3345 117/500 [======>.......................] - ETA: 1:35 - loss: 1.9585 - regression_loss: 1.6235 - classification_loss: 0.3350 118/500 [======>.......................] - ETA: 1:35 - loss: 1.9502 - regression_loss: 1.6172 - classification_loss: 0.3331 119/500 [======>.......................] - ETA: 1:34 - loss: 1.9520 - regression_loss: 1.6188 - classification_loss: 0.3332 120/500 [======>.......................] - ETA: 1:34 - loss: 1.9553 - regression_loss: 1.6224 - classification_loss: 0.3330 121/500 [======>.......................] 
- ETA: 1:34 - loss: 1.9501 - regression_loss: 1.6186 - classification_loss: 0.3314 122/500 [======>.......................] - ETA: 1:34 - loss: 1.9459 - regression_loss: 1.6157 - classification_loss: 0.3303 123/500 [======>.......................] - ETA: 1:33 - loss: 1.9513 - regression_loss: 1.6194 - classification_loss: 0.3320 124/500 [======>.......................] - ETA: 1:33 - loss: 1.9485 - regression_loss: 1.6168 - classification_loss: 0.3317 125/500 [======>.......................] - ETA: 1:33 - loss: 1.9501 - regression_loss: 1.6177 - classification_loss: 0.3324 126/500 [======>.......................] - ETA: 1:33 - loss: 1.9505 - regression_loss: 1.6179 - classification_loss: 0.3326 127/500 [======>.......................] - ETA: 1:32 - loss: 1.9532 - regression_loss: 1.6211 - classification_loss: 0.3322 128/500 [======>.......................] - ETA: 1:32 - loss: 1.9532 - regression_loss: 1.6206 - classification_loss: 0.3325 129/500 [======>.......................] - ETA: 1:32 - loss: 1.9550 - regression_loss: 1.6221 - classification_loss: 0.3330 130/500 [======>.......................] - ETA: 1:32 - loss: 1.9533 - regression_loss: 1.6202 - classification_loss: 0.3331 131/500 [======>.......................] - ETA: 1:31 - loss: 1.9492 - regression_loss: 1.6169 - classification_loss: 0.3323 132/500 [======>.......................] - ETA: 1:31 - loss: 1.9486 - regression_loss: 1.6168 - classification_loss: 0.3318 133/500 [======>.......................] - ETA: 1:31 - loss: 1.9528 - regression_loss: 1.6206 - classification_loss: 0.3322 134/500 [=======>......................] - ETA: 1:31 - loss: 1.9520 - regression_loss: 1.6196 - classification_loss: 0.3324 135/500 [=======>......................] - ETA: 1:30 - loss: 1.9523 - regression_loss: 1.6190 - classification_loss: 0.3333 136/500 [=======>......................] - ETA: 1:30 - loss: 1.9482 - regression_loss: 1.6157 - classification_loss: 0.3325 137/500 [=======>......................] 
- ETA: 1:30 - loss: 1.9416 - regression_loss: 1.6108 - classification_loss: 0.3309 138/500 [=======>......................] - ETA: 1:30 - loss: 1.9374 - regression_loss: 1.6073 - classification_loss: 0.3301 139/500 [=======>......................] - ETA: 1:29 - loss: 1.9381 - regression_loss: 1.6079 - classification_loss: 0.3301 140/500 [=======>......................] - ETA: 1:29 - loss: 1.9296 - regression_loss: 1.6006 - classification_loss: 0.3290 141/500 [=======>......................] - ETA: 1:29 - loss: 1.9286 - regression_loss: 1.5999 - classification_loss: 0.3287 142/500 [=======>......................] - ETA: 1:29 - loss: 1.9306 - regression_loss: 1.6028 - classification_loss: 0.3278 143/500 [=======>......................] - ETA: 1:28 - loss: 1.9233 - regression_loss: 1.5971 - classification_loss: 0.3262 144/500 [=======>......................] - ETA: 1:28 - loss: 1.9216 - regression_loss: 1.5961 - classification_loss: 0.3255 145/500 [=======>......................] - ETA: 1:28 - loss: 1.9168 - regression_loss: 1.5924 - classification_loss: 0.3244 146/500 [=======>......................] - ETA: 1:28 - loss: 1.9162 - regression_loss: 1.5919 - classification_loss: 0.3242 147/500 [=======>......................] - ETA: 1:27 - loss: 1.9160 - regression_loss: 1.5920 - classification_loss: 0.3240 148/500 [=======>......................] - ETA: 1:27 - loss: 1.9188 - regression_loss: 1.5955 - classification_loss: 0.3233 149/500 [=======>......................] - ETA: 1:27 - loss: 1.9196 - regression_loss: 1.5956 - classification_loss: 0.3239 150/500 [========>.....................] - ETA: 1:27 - loss: 1.9233 - regression_loss: 1.5985 - classification_loss: 0.3248 151/500 [========>.....................] - ETA: 1:26 - loss: 1.9264 - regression_loss: 1.6003 - classification_loss: 0.3261 152/500 [========>.....................] - ETA: 1:26 - loss: 1.9234 - regression_loss: 1.5980 - classification_loss: 0.3254 153/500 [========>.....................] 
- ETA: 1:26 - loss: 1.9211 - regression_loss: 1.5963 - classification_loss: 0.3247 154/500 [========>.....................] - ETA: 1:26 - loss: 1.9194 - regression_loss: 1.5955 - classification_loss: 0.3239 155/500 [========>.....................] - ETA: 1:25 - loss: 1.9178 - regression_loss: 1.5944 - classification_loss: 0.3235 156/500 [========>.....................] - ETA: 1:25 - loss: 1.9189 - regression_loss: 1.5952 - classification_loss: 0.3237 157/500 [========>.....................] - ETA: 1:25 - loss: 1.9149 - regression_loss: 1.5921 - classification_loss: 0.3228 158/500 [========>.....................] - ETA: 1:25 - loss: 1.9167 - regression_loss: 1.5935 - classification_loss: 0.3232 159/500 [========>.....................] - ETA: 1:24 - loss: 1.9257 - regression_loss: 1.6005 - classification_loss: 0.3252 160/500 [========>.....................] - ETA: 1:24 - loss: 1.9246 - regression_loss: 1.5990 - classification_loss: 0.3256 161/500 [========>.....................] - ETA: 1:24 - loss: 1.9258 - regression_loss: 1.6004 - classification_loss: 0.3254 162/500 [========>.....................] - ETA: 1:24 - loss: 1.9289 - regression_loss: 1.6028 - classification_loss: 0.3262 163/500 [========>.....................] - ETA: 1:23 - loss: 1.9263 - regression_loss: 1.6009 - classification_loss: 0.3254 164/500 [========>.....................] - ETA: 1:23 - loss: 1.9277 - regression_loss: 1.6022 - classification_loss: 0.3255 165/500 [========>.....................] - ETA: 1:23 - loss: 1.9255 - regression_loss: 1.6001 - classification_loss: 0.3254 166/500 [========>.....................] - ETA: 1:23 - loss: 1.9223 - regression_loss: 1.5975 - classification_loss: 0.3248 167/500 [=========>....................] - ETA: 1:22 - loss: 1.9153 - regression_loss: 1.5918 - classification_loss: 0.3235 168/500 [=========>....................] - ETA: 1:22 - loss: 1.9152 - regression_loss: 1.5917 - classification_loss: 0.3235 169/500 [=========>....................] 
- ETA: 1:22 - loss: 1.9104 - regression_loss: 1.5880 - classification_loss: 0.3224 170/500 [=========>....................] - ETA: 1:22 - loss: 1.9090 - regression_loss: 1.5869 - classification_loss: 0.3221 171/500 [=========>....................] - ETA: 1:21 - loss: 1.9067 - regression_loss: 1.5852 - classification_loss: 0.3214 172/500 [=========>....................] - ETA: 1:21 - loss: 1.9035 - regression_loss: 1.5830 - classification_loss: 0.3205 173/500 [=========>....................] - ETA: 1:21 - loss: 1.9054 - regression_loss: 1.5849 - classification_loss: 0.3204 174/500 [=========>....................] - ETA: 1:21 - loss: 1.9025 - regression_loss: 1.5829 - classification_loss: 0.3196 175/500 [=========>....................] - ETA: 1:20 - loss: 1.9018 - regression_loss: 1.5827 - classification_loss: 0.3191 176/500 [=========>....................] - ETA: 1:20 - loss: 1.9022 - regression_loss: 1.5834 - classification_loss: 0.3188 177/500 [=========>....................] - ETA: 1:20 - loss: 1.9032 - regression_loss: 1.5845 - classification_loss: 0.3186 178/500 [=========>....................] - ETA: 1:20 - loss: 1.9025 - regression_loss: 1.5847 - classification_loss: 0.3178 179/500 [=========>....................] - ETA: 1:19 - loss: 1.9037 - regression_loss: 1.5858 - classification_loss: 0.3179 180/500 [=========>....................] - ETA: 1:19 - loss: 1.9050 - regression_loss: 1.5871 - classification_loss: 0.3179 181/500 [=========>....................] - ETA: 1:19 - loss: 1.9097 - regression_loss: 1.5907 - classification_loss: 0.3190 182/500 [=========>....................] - ETA: 1:19 - loss: 1.9085 - regression_loss: 1.5897 - classification_loss: 0.3188 183/500 [=========>....................] - ETA: 1:18 - loss: 1.9128 - regression_loss: 1.5932 - classification_loss: 0.3197 184/500 [==========>...................] - ETA: 1:18 - loss: 1.9068 - regression_loss: 1.5883 - classification_loss: 0.3185 185/500 [==========>...................] 
- ETA: 1:18 - loss: 1.9059 - regression_loss: 1.5876 - classification_loss: 0.3182 186/500 [==========>...................] - ETA: 1:18 - loss: 1.9059 - regression_loss: 1.5876 - classification_loss: 0.3183 187/500 [==========>...................] - ETA: 1:17 - loss: 1.9045 - regression_loss: 1.5867 - classification_loss: 0.3178 188/500 [==========>...................] - ETA: 1:17 - loss: 1.9040 - regression_loss: 1.5866 - classification_loss: 0.3175 189/500 [==========>...................] - ETA: 1:17 - loss: 1.9118 - regression_loss: 1.5937 - classification_loss: 0.3181 190/500 [==========>...................] - ETA: 1:17 - loss: 1.9087 - regression_loss: 1.5910 - classification_loss: 0.3177 191/500 [==========>...................] - ETA: 1:16 - loss: 1.9090 - regression_loss: 1.5908 - classification_loss: 0.3182 192/500 [==========>...................] - ETA: 1:16 - loss: 1.9107 - regression_loss: 1.5920 - classification_loss: 0.3187 193/500 [==========>...................] - ETA: 1:16 - loss: 1.9105 - regression_loss: 1.5917 - classification_loss: 0.3188 194/500 [==========>...................] - ETA: 1:16 - loss: 1.9048 - regression_loss: 1.5835 - classification_loss: 0.3212 195/500 [==========>...................] - ETA: 1:15 - loss: 1.9013 - regression_loss: 1.5809 - classification_loss: 0.3203 196/500 [==========>...................] - ETA: 1:15 - loss: 1.8991 - regression_loss: 1.5795 - classification_loss: 0.3196 197/500 [==========>...................] - ETA: 1:15 - loss: 1.8978 - regression_loss: 1.5789 - classification_loss: 0.3189 198/500 [==========>...................] - ETA: 1:15 - loss: 1.8991 - regression_loss: 1.5796 - classification_loss: 0.3195 199/500 [==========>...................] - ETA: 1:15 - loss: 1.9027 - regression_loss: 1.5818 - classification_loss: 0.3208 200/500 [===========>..................] - ETA: 1:14 - loss: 1.9100 - regression_loss: 1.5892 - classification_loss: 0.3208 201/500 [===========>..................] 
- ETA: 1:14 - loss: 1.9157 - regression_loss: 1.5937 - classification_loss: 0.3220 202/500 [===========>..................] - ETA: 1:14 - loss: 1.9179 - regression_loss: 1.5950 - classification_loss: 0.3229 203/500 [===========>..................] - ETA: 1:14 - loss: 1.9173 - regression_loss: 1.5944 - classification_loss: 0.3230 204/500 [===========>..................] - ETA: 1:13 - loss: 1.9199 - regression_loss: 1.5961 - classification_loss: 0.3238 205/500 [===========>..................] - ETA: 1:13 - loss: 1.9231 - regression_loss: 1.5982 - classification_loss: 0.3248 206/500 [===========>..................] - ETA: 1:13 - loss: 1.9208 - regression_loss: 1.5966 - classification_loss: 0.3242 207/500 [===========>..................] - ETA: 1:13 - loss: 1.9172 - regression_loss: 1.5936 - classification_loss: 0.3235 208/500 [===========>..................] - ETA: 1:12 - loss: 1.9209 - regression_loss: 1.5956 - classification_loss: 0.3252 209/500 [===========>..................] - ETA: 1:12 - loss: 1.9218 - regression_loss: 1.5973 - classification_loss: 0.3245 210/500 [===========>..................] - ETA: 1:12 - loss: 1.9214 - regression_loss: 1.5969 - classification_loss: 0.3245 211/500 [===========>..................] - ETA: 1:12 - loss: 1.9216 - regression_loss: 1.5970 - classification_loss: 0.3247 212/500 [===========>..................] - ETA: 1:11 - loss: 1.9186 - regression_loss: 1.5947 - classification_loss: 0.3240 213/500 [===========>..................] - ETA: 1:11 - loss: 1.9189 - regression_loss: 1.5952 - classification_loss: 0.3237 214/500 [===========>..................] - ETA: 1:11 - loss: 1.9175 - regression_loss: 1.5942 - classification_loss: 0.3232 215/500 [===========>..................] - ETA: 1:11 - loss: 1.9159 - regression_loss: 1.5925 - classification_loss: 0.3233 216/500 [===========>..................] - ETA: 1:10 - loss: 1.9197 - regression_loss: 1.5954 - classification_loss: 0.3243 217/500 [============>.................] 
- ETA: 1:10 - loss: 1.9214 - regression_loss: 1.5967 - classification_loss: 0.3248 218/500 [============>.................] - ETA: 1:10 - loss: 1.9190 - regression_loss: 1.5949 - classification_loss: 0.3241 219/500 [============>.................] - ETA: 1:10 - loss: 1.9185 - regression_loss: 1.5948 - classification_loss: 0.3236 220/500 [============>.................] - ETA: 1:09 - loss: 1.9164 - regression_loss: 1.5934 - classification_loss: 0.3230 221/500 [============>.................] - ETA: 1:09 - loss: 1.9150 - regression_loss: 1.5924 - classification_loss: 0.3226 222/500 [============>.................] - ETA: 1:09 - loss: 1.9182 - regression_loss: 1.5947 - classification_loss: 0.3235 223/500 [============>.................] - ETA: 1:09 - loss: 1.9171 - regression_loss: 1.5940 - classification_loss: 0.3232 224/500 [============>.................] - ETA: 1:08 - loss: 1.9188 - regression_loss: 1.5952 - classification_loss: 0.3237 225/500 [============>.................] - ETA: 1:08 - loss: 1.9183 - regression_loss: 1.5946 - classification_loss: 0.3237 226/500 [============>.................] - ETA: 1:08 - loss: 1.9182 - regression_loss: 1.5948 - classification_loss: 0.3234 227/500 [============>.................] - ETA: 1:08 - loss: 1.9173 - regression_loss: 1.5940 - classification_loss: 0.3233 228/500 [============>.................] - ETA: 1:07 - loss: 1.9178 - regression_loss: 1.5946 - classification_loss: 0.3232 229/500 [============>.................] - ETA: 1:07 - loss: 1.9192 - regression_loss: 1.5961 - classification_loss: 0.3231 230/500 [============>.................] - ETA: 1:07 - loss: 1.9162 - regression_loss: 1.5937 - classification_loss: 0.3225 231/500 [============>.................] - ETA: 1:07 - loss: 1.9214 - regression_loss: 1.5986 - classification_loss: 0.3228 232/500 [============>.................] - ETA: 1:06 - loss: 1.9204 - regression_loss: 1.5981 - classification_loss: 0.3224 233/500 [============>.................] 
- ETA: 1:06 - loss: 1.9198 - regression_loss: 1.5973 - classification_loss: 0.3224 234/500 [=============>................] - ETA: 1:06 - loss: 1.9191 - regression_loss: 1.5969 - classification_loss: 0.3222 235/500 [=============>................] - ETA: 1:06 - loss: 1.9218 - regression_loss: 1.5987 - classification_loss: 0.3231 236/500 [=============>................] - ETA: 1:05 - loss: 1.9207 - regression_loss: 1.5980 - classification_loss: 0.3227 237/500 [=============>................] - ETA: 1:05 - loss: 1.9219 - regression_loss: 1.5990 - classification_loss: 0.3229 238/500 [=============>................] - ETA: 1:05 - loss: 1.9235 - regression_loss: 1.6004 - classification_loss: 0.3231 239/500 [=============>................] - ETA: 1:05 - loss: 1.9259 - regression_loss: 1.6024 - classification_loss: 0.3235 240/500 [=============>................] - ETA: 1:04 - loss: 1.9207 - regression_loss: 1.5983 - classification_loss: 0.3224 241/500 [=============>................] - ETA: 1:04 - loss: 1.9198 - regression_loss: 1.5976 - classification_loss: 0.3221 242/500 [=============>................] - ETA: 1:04 - loss: 1.9162 - regression_loss: 1.5948 - classification_loss: 0.3214 243/500 [=============>................] - ETA: 1:04 - loss: 1.9152 - regression_loss: 1.5938 - classification_loss: 0.3214 244/500 [=============>................] - ETA: 1:03 - loss: 1.9178 - regression_loss: 1.5959 - classification_loss: 0.3219 245/500 [=============>................] - ETA: 1:03 - loss: 1.9179 - regression_loss: 1.5959 - classification_loss: 0.3220 246/500 [=============>................] - ETA: 1:03 - loss: 1.9148 - regression_loss: 1.5934 - classification_loss: 0.3214 247/500 [=============>................] - ETA: 1:03 - loss: 1.9154 - regression_loss: 1.5938 - classification_loss: 0.3216 248/500 [=============>................] - ETA: 1:02 - loss: 1.9156 - regression_loss: 1.5940 - classification_loss: 0.3216 249/500 [=============>................] 
- ETA: 1:02 - loss: 1.9146 - regression_loss: 1.5930 - classification_loss: 0.3215 250/500 [==============>...............] - ETA: 1:02 - loss: 1.9160 - regression_loss: 1.5948 - classification_loss: 0.3212 251/500 [==============>...............] - ETA: 1:02 - loss: 1.9178 - regression_loss: 1.5962 - classification_loss: 0.3216 252/500 [==============>...............] - ETA: 1:01 - loss: 1.9186 - regression_loss: 1.5965 - classification_loss: 0.3221 253/500 [==============>...............] - ETA: 1:01 - loss: 1.9178 - regression_loss: 1.5961 - classification_loss: 0.3217 254/500 [==============>...............] - ETA: 1:01 - loss: 1.9182 - regression_loss: 1.5967 - classification_loss: 0.3215 255/500 [==============>...............] - ETA: 1:01 - loss: 1.9200 - regression_loss: 1.5981 - classification_loss: 0.3219 256/500 [==============>...............] - ETA: 1:00 - loss: 1.9227 - regression_loss: 1.6001 - classification_loss: 0.3226 257/500 [==============>...............] - ETA: 1:00 - loss: 1.9229 - regression_loss: 1.6003 - classification_loss: 0.3226 258/500 [==============>...............] - ETA: 1:00 - loss: 1.9207 - regression_loss: 1.5987 - classification_loss: 0.3220 259/500 [==============>...............] - ETA: 1:00 - loss: 1.9187 - regression_loss: 1.5974 - classification_loss: 0.3214 260/500 [==============>...............] - ETA: 59s - loss: 1.9209 - regression_loss: 1.5992 - classification_loss: 0.3217  261/500 [==============>...............] - ETA: 59s - loss: 1.9240 - regression_loss: 1.6018 - classification_loss: 0.3222 262/500 [==============>...............] - ETA: 59s - loss: 1.9240 - regression_loss: 1.6021 - classification_loss: 0.3220 263/500 [==============>...............] - ETA: 59s - loss: 1.9243 - regression_loss: 1.6023 - classification_loss: 0.3220 264/500 [==============>...............] - ETA: 58s - loss: 1.9233 - regression_loss: 1.6016 - classification_loss: 0.3216 265/500 [==============>...............] 
- ETA: 58s - loss: 1.9244 - regression_loss: 1.6026 - classification_loss: 0.3218 266/500 [==============>...............] - ETA: 58s - loss: 1.9248 - regression_loss: 1.6032 - classification_loss: 0.3217 267/500 [===============>..............] - ETA: 58s - loss: 1.9285 - regression_loss: 1.6065 - classification_loss: 0.3220 268/500 [===============>..............] - ETA: 57s - loss: 1.9309 - regression_loss: 1.6087 - classification_loss: 0.3223 269/500 [===============>..............] - ETA: 57s - loss: 1.9272 - regression_loss: 1.6059 - classification_loss: 0.3214 270/500 [===============>..............] - ETA: 57s - loss: 1.9288 - regression_loss: 1.6069 - classification_loss: 0.3219 271/500 [===============>..............] - ETA: 57s - loss: 1.9296 - regression_loss: 1.6073 - classification_loss: 0.3223 272/500 [===============>..............] - ETA: 56s - loss: 1.9334 - regression_loss: 1.6099 - classification_loss: 0.3235 273/500 [===============>..............] - ETA: 56s - loss: 1.9333 - regression_loss: 1.6097 - classification_loss: 0.3235 274/500 [===============>..............] - ETA: 56s - loss: 1.9338 - regression_loss: 1.6102 - classification_loss: 0.3236 275/500 [===============>..............] - ETA: 56s - loss: 1.9342 - regression_loss: 1.6107 - classification_loss: 0.3236 276/500 [===============>..............] - ETA: 55s - loss: 1.9339 - regression_loss: 1.6105 - classification_loss: 0.3234 277/500 [===============>..............] - ETA: 55s - loss: 1.9344 - regression_loss: 1.6110 - classification_loss: 0.3234 278/500 [===============>..............] - ETA: 55s - loss: 1.9351 - regression_loss: 1.6113 - classification_loss: 0.3237 279/500 [===============>..............] - ETA: 55s - loss: 1.9359 - regression_loss: 1.6118 - classification_loss: 0.3240 280/500 [===============>..............] - ETA: 54s - loss: 1.9331 - regression_loss: 1.6098 - classification_loss: 0.3234 281/500 [===============>..............] 
[per-batch progress output for epoch 29 omitted; running loss decreased from ~1.93 at step 282 to ~1.90 by step 500]
500/500 [==============================] - 125s 249ms/step - loss: 1.9015 - regression_loss: 1.5857 - classification_loss: 0.3158
326 instances of class plum with average precision: 0.7259
mAP: 0.7259
Epoch 00029: saving model to ./training/snapshots/resnet50_pascal_29.h5
Epoch 30/150
[per-batch progress output for epoch 30 omitted; running loss ~1.92 at step 116/500]
- ETA: 1:35 - loss: 1.9156 - regression_loss: 1.5899 - classification_loss: 0.3257 117/500 [======>.......................] - ETA: 1:35 - loss: 1.9256 - regression_loss: 1.5974 - classification_loss: 0.3282 118/500 [======>.......................] - ETA: 1:35 - loss: 1.9235 - regression_loss: 1.5960 - classification_loss: 0.3276 119/500 [======>.......................] - ETA: 1:34 - loss: 1.9213 - regression_loss: 1.5945 - classification_loss: 0.3268 120/500 [======>.......................] - ETA: 1:34 - loss: 1.9205 - regression_loss: 1.5946 - classification_loss: 0.3259 121/500 [======>.......................] - ETA: 1:34 - loss: 1.9246 - regression_loss: 1.5976 - classification_loss: 0.3271 122/500 [======>.......................] - ETA: 1:34 - loss: 1.9235 - regression_loss: 1.5962 - classification_loss: 0.3273 123/500 [======>.......................] - ETA: 1:34 - loss: 1.9273 - regression_loss: 1.5992 - classification_loss: 0.3281 124/500 [======>.......................] - ETA: 1:33 - loss: 1.9363 - regression_loss: 1.6042 - classification_loss: 0.3321 125/500 [======>.......................] - ETA: 1:33 - loss: 1.9390 - regression_loss: 1.6072 - classification_loss: 0.3317 126/500 [======>.......................] - ETA: 1:33 - loss: 1.9398 - regression_loss: 1.6080 - classification_loss: 0.3319 127/500 [======>.......................] - ETA: 1:33 - loss: 1.9379 - regression_loss: 1.6066 - classification_loss: 0.3312 128/500 [======>.......................] - ETA: 1:32 - loss: 1.9427 - regression_loss: 1.6094 - classification_loss: 0.3333 129/500 [======>.......................] - ETA: 1:32 - loss: 1.9437 - regression_loss: 1.6100 - classification_loss: 0.3337 130/500 [======>.......................] - ETA: 1:32 - loss: 1.9466 - regression_loss: 1.6116 - classification_loss: 0.3351 131/500 [======>.......................] - ETA: 1:31 - loss: 1.9421 - regression_loss: 1.6082 - classification_loss: 0.3339 132/500 [======>.......................] 
- ETA: 1:31 - loss: 1.9503 - regression_loss: 1.6160 - classification_loss: 0.3343 133/500 [======>.......................] - ETA: 1:31 - loss: 1.9447 - regression_loss: 1.6120 - classification_loss: 0.3327 134/500 [=======>......................] - ETA: 1:31 - loss: 1.9372 - regression_loss: 1.6063 - classification_loss: 0.3308 135/500 [=======>......................] - ETA: 1:30 - loss: 1.9313 - regression_loss: 1.6016 - classification_loss: 0.3297 136/500 [=======>......................] - ETA: 1:30 - loss: 1.9321 - regression_loss: 1.6014 - classification_loss: 0.3307 137/500 [=======>......................] - ETA: 1:30 - loss: 1.9337 - regression_loss: 1.6027 - classification_loss: 0.3310 138/500 [=======>......................] - ETA: 1:30 - loss: 1.9389 - regression_loss: 1.6068 - classification_loss: 0.3321 139/500 [=======>......................] - ETA: 1:30 - loss: 1.9338 - regression_loss: 1.6026 - classification_loss: 0.3312 140/500 [=======>......................] - ETA: 1:29 - loss: 1.9290 - regression_loss: 1.5988 - classification_loss: 0.3302 141/500 [=======>......................] - ETA: 1:29 - loss: 1.9267 - regression_loss: 1.5973 - classification_loss: 0.3293 142/500 [=======>......................] - ETA: 1:29 - loss: 1.9220 - regression_loss: 1.5936 - classification_loss: 0.3285 143/500 [=======>......................] - ETA: 1:29 - loss: 1.9257 - regression_loss: 1.5962 - classification_loss: 0.3295 144/500 [=======>......................] - ETA: 1:28 - loss: 1.9246 - regression_loss: 1.5947 - classification_loss: 0.3300 145/500 [=======>......................] - ETA: 1:28 - loss: 1.9238 - regression_loss: 1.5945 - classification_loss: 0.3293 146/500 [=======>......................] - ETA: 1:28 - loss: 1.9218 - regression_loss: 1.5933 - classification_loss: 0.3285 147/500 [=======>......................] - ETA: 1:28 - loss: 1.9209 - regression_loss: 1.5931 - classification_loss: 0.3279 148/500 [=======>......................] 
- ETA: 1:27 - loss: 1.9217 - regression_loss: 1.5940 - classification_loss: 0.3276 149/500 [=======>......................] - ETA: 1:27 - loss: 1.9197 - regression_loss: 1.5927 - classification_loss: 0.3271 150/500 [========>.....................] - ETA: 1:27 - loss: 1.9194 - regression_loss: 1.5927 - classification_loss: 0.3266 151/500 [========>.....................] - ETA: 1:27 - loss: 1.9160 - regression_loss: 1.5902 - classification_loss: 0.3258 152/500 [========>.....................] - ETA: 1:26 - loss: 1.9192 - regression_loss: 1.5942 - classification_loss: 0.3250 153/500 [========>.....................] - ETA: 1:26 - loss: 1.9191 - regression_loss: 1.5944 - classification_loss: 0.3247 154/500 [========>.....................] - ETA: 1:26 - loss: 1.9189 - regression_loss: 1.5942 - classification_loss: 0.3247 155/500 [========>.....................] - ETA: 1:26 - loss: 1.9178 - regression_loss: 1.5930 - classification_loss: 0.3248 156/500 [========>.....................] - ETA: 1:25 - loss: 1.9138 - regression_loss: 1.5897 - classification_loss: 0.3241 157/500 [========>.....................] - ETA: 1:25 - loss: 1.9152 - regression_loss: 1.5910 - classification_loss: 0.3242 158/500 [========>.....................] - ETA: 1:25 - loss: 1.9235 - regression_loss: 1.5978 - classification_loss: 0.3257 159/500 [========>.....................] - ETA: 1:25 - loss: 1.9226 - regression_loss: 1.5967 - classification_loss: 0.3259 160/500 [========>.....................] - ETA: 1:24 - loss: 1.9175 - regression_loss: 1.5928 - classification_loss: 0.3247 161/500 [========>.....................] - ETA: 1:24 - loss: 1.9217 - regression_loss: 1.5972 - classification_loss: 0.3245 162/500 [========>.....................] - ETA: 1:24 - loss: 1.9179 - regression_loss: 1.5943 - classification_loss: 0.3236 163/500 [========>.....................] - ETA: 1:24 - loss: 1.9127 - regression_loss: 1.5902 - classification_loss: 0.3225 164/500 [========>.....................] 
- ETA: 1:23 - loss: 1.9063 - regression_loss: 1.5849 - classification_loss: 0.3214 165/500 [========>.....................] - ETA: 1:23 - loss: 1.9082 - regression_loss: 1.5857 - classification_loss: 0.3224 166/500 [========>.....................] - ETA: 1:23 - loss: 1.9046 - regression_loss: 1.5824 - classification_loss: 0.3222 167/500 [=========>....................] - ETA: 1:23 - loss: 1.9043 - regression_loss: 1.5822 - classification_loss: 0.3221 168/500 [=========>....................] - ETA: 1:22 - loss: 1.9068 - regression_loss: 1.5847 - classification_loss: 0.3221 169/500 [=========>....................] - ETA: 1:22 - loss: 1.9082 - regression_loss: 1.5856 - classification_loss: 0.3226 170/500 [=========>....................] - ETA: 1:22 - loss: 1.9059 - regression_loss: 1.5842 - classification_loss: 0.3217 171/500 [=========>....................] - ETA: 1:22 - loss: 1.9081 - regression_loss: 1.5861 - classification_loss: 0.3220 172/500 [=========>....................] - ETA: 1:21 - loss: 1.9045 - regression_loss: 1.5836 - classification_loss: 0.3209 173/500 [=========>....................] - ETA: 1:21 - loss: 1.9047 - regression_loss: 1.5847 - classification_loss: 0.3200 174/500 [=========>....................] - ETA: 1:21 - loss: 1.9028 - regression_loss: 1.5836 - classification_loss: 0.3192 175/500 [=========>....................] - ETA: 1:21 - loss: 1.9033 - regression_loss: 1.5838 - classification_loss: 0.3195 176/500 [=========>....................] - ETA: 1:20 - loss: 1.8984 - regression_loss: 1.5798 - classification_loss: 0.3185 177/500 [=========>....................] - ETA: 1:20 - loss: 1.8993 - regression_loss: 1.5810 - classification_loss: 0.3184 178/500 [=========>....................] - ETA: 1:20 - loss: 1.8978 - regression_loss: 1.5798 - classification_loss: 0.3180 179/500 [=========>....................] - ETA: 1:20 - loss: 1.8973 - regression_loss: 1.5796 - classification_loss: 0.3177 180/500 [=========>....................] 
- ETA: 1:19 - loss: 1.8961 - regression_loss: 1.5789 - classification_loss: 0.3173 181/500 [=========>....................] - ETA: 1:19 - loss: 1.8965 - regression_loss: 1.5790 - classification_loss: 0.3175 182/500 [=========>....................] - ETA: 1:19 - loss: 1.9005 - regression_loss: 1.5821 - classification_loss: 0.3183 183/500 [=========>....................] - ETA: 1:19 - loss: 1.8977 - regression_loss: 1.5798 - classification_loss: 0.3179 184/500 [==========>...................] - ETA: 1:18 - loss: 1.8998 - regression_loss: 1.5817 - classification_loss: 0.3181 185/500 [==========>...................] - ETA: 1:18 - loss: 1.8936 - regression_loss: 1.5767 - classification_loss: 0.3168 186/500 [==========>...................] - ETA: 1:18 - loss: 1.8964 - regression_loss: 1.5794 - classification_loss: 0.3170 187/500 [==========>...................] - ETA: 1:18 - loss: 1.8971 - regression_loss: 1.5794 - classification_loss: 0.3177 188/500 [==========>...................] - ETA: 1:17 - loss: 1.9004 - regression_loss: 1.5824 - classification_loss: 0.3180 189/500 [==========>...................] - ETA: 1:17 - loss: 1.9038 - regression_loss: 1.5847 - classification_loss: 0.3191 190/500 [==========>...................] - ETA: 1:17 - loss: 1.9078 - regression_loss: 1.5869 - classification_loss: 0.3209 191/500 [==========>...................] - ETA: 1:17 - loss: 1.9086 - regression_loss: 1.5880 - classification_loss: 0.3206 192/500 [==========>...................] - ETA: 1:16 - loss: 1.9104 - regression_loss: 1.5892 - classification_loss: 0.3212 193/500 [==========>...................] - ETA: 1:16 - loss: 1.9105 - regression_loss: 1.5892 - classification_loss: 0.3213 194/500 [==========>...................] - ETA: 1:16 - loss: 1.9126 - regression_loss: 1.5910 - classification_loss: 0.3216 195/500 [==========>...................] - ETA: 1:16 - loss: 1.9153 - regression_loss: 1.5934 - classification_loss: 0.3219 196/500 [==========>...................] 
- ETA: 1:15 - loss: 1.9138 - regression_loss: 1.5926 - classification_loss: 0.3212 197/500 [==========>...................] - ETA: 1:15 - loss: 1.9135 - regression_loss: 1.5925 - classification_loss: 0.3210 198/500 [==========>...................] - ETA: 1:15 - loss: 1.9133 - regression_loss: 1.5929 - classification_loss: 0.3204 199/500 [==========>...................] - ETA: 1:15 - loss: 1.9126 - regression_loss: 1.5920 - classification_loss: 0.3207 200/500 [===========>..................] - ETA: 1:14 - loss: 1.9146 - regression_loss: 1.5939 - classification_loss: 0.3207 201/500 [===========>..................] - ETA: 1:14 - loss: 1.9145 - regression_loss: 1.5939 - classification_loss: 0.3206 202/500 [===========>..................] - ETA: 1:14 - loss: 1.9114 - regression_loss: 1.5915 - classification_loss: 0.3200 203/500 [===========>..................] - ETA: 1:14 - loss: 1.9157 - regression_loss: 1.5947 - classification_loss: 0.3210 204/500 [===========>..................] - ETA: 1:13 - loss: 1.9154 - regression_loss: 1.5946 - classification_loss: 0.3208 205/500 [===========>..................] - ETA: 1:13 - loss: 1.9134 - regression_loss: 1.5932 - classification_loss: 0.3203 206/500 [===========>..................] - ETA: 1:13 - loss: 1.9100 - regression_loss: 1.5905 - classification_loss: 0.3195 207/500 [===========>..................] - ETA: 1:13 - loss: 1.9098 - regression_loss: 1.5908 - classification_loss: 0.3190 208/500 [===========>..................] - ETA: 1:13 - loss: 1.9080 - regression_loss: 1.5896 - classification_loss: 0.3184 209/500 [===========>..................] - ETA: 1:12 - loss: 1.9049 - regression_loss: 1.5872 - classification_loss: 0.3177 210/500 [===========>..................] - ETA: 1:12 - loss: 1.9049 - regression_loss: 1.5872 - classification_loss: 0.3177 211/500 [===========>..................] - ETA: 1:12 - loss: 1.9021 - regression_loss: 1.5845 - classification_loss: 0.3176 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.9024 - regression_loss: 1.5845 - classification_loss: 0.3179 213/500 [===========>..................] - ETA: 1:11 - loss: 1.8955 - regression_loss: 1.5785 - classification_loss: 0.3169 214/500 [===========>..................] - ETA: 1:11 - loss: 1.8897 - regression_loss: 1.5738 - classification_loss: 0.3159 215/500 [===========>..................] - ETA: 1:11 - loss: 1.8896 - regression_loss: 1.5737 - classification_loss: 0.3159 216/500 [===========>..................] - ETA: 1:10 - loss: 1.8924 - regression_loss: 1.5758 - classification_loss: 0.3166 217/500 [============>.................] - ETA: 1:10 - loss: 1.8906 - regression_loss: 1.5741 - classification_loss: 0.3165 218/500 [============>.................] - ETA: 1:10 - loss: 1.8929 - regression_loss: 1.5758 - classification_loss: 0.3170 219/500 [============>.................] - ETA: 1:10 - loss: 1.8910 - regression_loss: 1.5742 - classification_loss: 0.3169 220/500 [============>.................] - ETA: 1:10 - loss: 1.8912 - regression_loss: 1.5749 - classification_loss: 0.3163 221/500 [============>.................] - ETA: 1:09 - loss: 1.8934 - regression_loss: 1.5767 - classification_loss: 0.3167 222/500 [============>.................] - ETA: 1:09 - loss: 1.8995 - regression_loss: 1.5816 - classification_loss: 0.3179 223/500 [============>.................] - ETA: 1:09 - loss: 1.9000 - regression_loss: 1.5821 - classification_loss: 0.3179 224/500 [============>.................] - ETA: 1:09 - loss: 1.9031 - regression_loss: 1.5845 - classification_loss: 0.3186 225/500 [============>.................] - ETA: 1:08 - loss: 1.9029 - regression_loss: 1.5847 - classification_loss: 0.3183 226/500 [============>.................] - ETA: 1:08 - loss: 1.9034 - regression_loss: 1.5851 - classification_loss: 0.3183 227/500 [============>.................] - ETA: 1:08 - loss: 1.9030 - regression_loss: 1.5849 - classification_loss: 0.3181 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.9018 - regression_loss: 1.5841 - classification_loss: 0.3176 229/500 [============>.................] - ETA: 1:07 - loss: 1.9039 - regression_loss: 1.5857 - classification_loss: 0.3182 230/500 [============>.................] - ETA: 1:07 - loss: 1.9055 - regression_loss: 1.5873 - classification_loss: 0.3183 231/500 [============>.................] - ETA: 1:07 - loss: 1.9025 - regression_loss: 1.5850 - classification_loss: 0.3175 232/500 [============>.................] - ETA: 1:07 - loss: 1.9038 - regression_loss: 1.5864 - classification_loss: 0.3173 233/500 [============>.................] - ETA: 1:06 - loss: 1.9061 - regression_loss: 1.5881 - classification_loss: 0.3180 234/500 [=============>................] - ETA: 1:06 - loss: 1.9037 - regression_loss: 1.5863 - classification_loss: 0.3174 235/500 [=============>................] - ETA: 1:06 - loss: 1.9027 - regression_loss: 1.5855 - classification_loss: 0.3172 236/500 [=============>................] - ETA: 1:06 - loss: 1.9044 - regression_loss: 1.5867 - classification_loss: 0.3177 237/500 [=============>................] - ETA: 1:05 - loss: 1.9005 - regression_loss: 1.5836 - classification_loss: 0.3169 238/500 [=============>................] - ETA: 1:05 - loss: 1.9012 - regression_loss: 1.5839 - classification_loss: 0.3173 239/500 [=============>................] - ETA: 1:05 - loss: 1.9016 - regression_loss: 1.5850 - classification_loss: 0.3166 240/500 [=============>................] - ETA: 1:05 - loss: 1.9021 - regression_loss: 1.5853 - classification_loss: 0.3168 241/500 [=============>................] - ETA: 1:04 - loss: 1.9044 - regression_loss: 1.5872 - classification_loss: 0.3172 242/500 [=============>................] - ETA: 1:04 - loss: 1.9068 - regression_loss: 1.5893 - classification_loss: 0.3175 243/500 [=============>................] - ETA: 1:04 - loss: 1.9068 - regression_loss: 1.5894 - classification_loss: 0.3174 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.9048 - regression_loss: 1.5881 - classification_loss: 0.3168 245/500 [=============>................] - ETA: 1:03 - loss: 1.9062 - regression_loss: 1.5893 - classification_loss: 0.3169 246/500 [=============>................] - ETA: 1:03 - loss: 1.9068 - regression_loss: 1.5897 - classification_loss: 0.3171 247/500 [=============>................] - ETA: 1:03 - loss: 1.9087 - regression_loss: 1.5912 - classification_loss: 0.3175 248/500 [=============>................] - ETA: 1:03 - loss: 1.9097 - regression_loss: 1.5920 - classification_loss: 0.3177 249/500 [=============>................] - ETA: 1:02 - loss: 1.9116 - regression_loss: 1.5937 - classification_loss: 0.3179 250/500 [==============>...............] - ETA: 1:02 - loss: 1.9092 - regression_loss: 1.5918 - classification_loss: 0.3174 251/500 [==============>...............] - ETA: 1:02 - loss: 1.9068 - regression_loss: 1.5901 - classification_loss: 0.3167 252/500 [==============>...............] - ETA: 1:02 - loss: 1.9074 - regression_loss: 1.5908 - classification_loss: 0.3166 253/500 [==============>...............] - ETA: 1:01 - loss: 1.9080 - regression_loss: 1.5916 - classification_loss: 0.3164 254/500 [==============>...............] - ETA: 1:01 - loss: 1.9061 - regression_loss: 1.5901 - classification_loss: 0.3161 255/500 [==============>...............] - ETA: 1:01 - loss: 1.9057 - regression_loss: 1.5897 - classification_loss: 0.3160 256/500 [==============>...............] - ETA: 1:00 - loss: 1.9094 - regression_loss: 1.5835 - classification_loss: 0.3260 257/500 [==============>...............] - ETA: 1:00 - loss: 1.9083 - regression_loss: 1.5823 - classification_loss: 0.3260 258/500 [==============>...............] - ETA: 1:00 - loss: 1.9092 - regression_loss: 1.5833 - classification_loss: 0.3259 259/500 [==============>...............] - ETA: 1:00 - loss: 1.9082 - regression_loss: 1.5829 - classification_loss: 0.3254 260/500 [==============>...............] 
- ETA: 59s - loss: 1.9070 - regression_loss: 1.5819 - classification_loss: 0.3251  261/500 [==============>...............] - ETA: 59s - loss: 1.9094 - regression_loss: 1.5837 - classification_loss: 0.3257 262/500 [==============>...............] - ETA: 59s - loss: 1.9100 - regression_loss: 1.5843 - classification_loss: 0.3257 263/500 [==============>...............] - ETA: 59s - loss: 1.9081 - regression_loss: 1.5829 - classification_loss: 0.3253 264/500 [==============>...............] - ETA: 58s - loss: 1.9062 - regression_loss: 1.5810 - classification_loss: 0.3252 265/500 [==============>...............] - ETA: 58s - loss: 1.9062 - regression_loss: 1.5811 - classification_loss: 0.3252 266/500 [==============>...............] - ETA: 58s - loss: 1.9050 - regression_loss: 1.5803 - classification_loss: 0.3248 267/500 [===============>..............] - ETA: 58s - loss: 1.9049 - regression_loss: 1.5801 - classification_loss: 0.3249 268/500 [===============>..............] - ETA: 57s - loss: 1.9050 - regression_loss: 1.5807 - classification_loss: 0.3244 269/500 [===============>..............] - ETA: 57s - loss: 1.9030 - regression_loss: 1.5791 - classification_loss: 0.3239 270/500 [===============>..............] - ETA: 57s - loss: 1.9022 - regression_loss: 1.5785 - classification_loss: 0.3237 271/500 [===============>..............] - ETA: 57s - loss: 1.9033 - regression_loss: 1.5796 - classification_loss: 0.3237 272/500 [===============>..............] - ETA: 56s - loss: 1.9057 - regression_loss: 1.5815 - classification_loss: 0.3242 273/500 [===============>..............] - ETA: 56s - loss: 1.9041 - regression_loss: 1.5805 - classification_loss: 0.3237 274/500 [===============>..............] - ETA: 56s - loss: 1.9058 - regression_loss: 1.5816 - classification_loss: 0.3242 275/500 [===============>..............] - ETA: 56s - loss: 1.9107 - regression_loss: 1.5854 - classification_loss: 0.3253 276/500 [===============>..............] 
- ETA: 55s - loss: 1.9102 - regression_loss: 1.5854 - classification_loss: 0.3249 277/500 [===============>..............] - ETA: 55s - loss: 1.9105 - regression_loss: 1.5856 - classification_loss: 0.3249 278/500 [===============>..............] - ETA: 55s - loss: 1.9129 - regression_loss: 1.5876 - classification_loss: 0.3254 279/500 [===============>..............] - ETA: 55s - loss: 1.9123 - regression_loss: 1.5870 - classification_loss: 0.3253 280/500 [===============>..............] - ETA: 54s - loss: 1.9111 - regression_loss: 1.5861 - classification_loss: 0.3250 281/500 [===============>..............] - ETA: 54s - loss: 1.9098 - regression_loss: 1.5851 - classification_loss: 0.3247 282/500 [===============>..............] - ETA: 54s - loss: 1.9144 - regression_loss: 1.5890 - classification_loss: 0.3254 283/500 [===============>..............] - ETA: 54s - loss: 1.9143 - regression_loss: 1.5890 - classification_loss: 0.3253 284/500 [================>.............] - ETA: 53s - loss: 1.9151 - regression_loss: 1.5898 - classification_loss: 0.3253 285/500 [================>.............] - ETA: 53s - loss: 1.9161 - regression_loss: 1.5908 - classification_loss: 0.3253 286/500 [================>.............] - ETA: 53s - loss: 1.9147 - regression_loss: 1.5901 - classification_loss: 0.3246 287/500 [================>.............] - ETA: 53s - loss: 1.9141 - regression_loss: 1.5898 - classification_loss: 0.3243 288/500 [================>.............] - ETA: 52s - loss: 1.9146 - regression_loss: 1.5900 - classification_loss: 0.3245 289/500 [================>.............] - ETA: 52s - loss: 1.9145 - regression_loss: 1.5902 - classification_loss: 0.3243 290/500 [================>.............] - ETA: 52s - loss: 1.9122 - regression_loss: 1.5885 - classification_loss: 0.3237 291/500 [================>.............] - ETA: 52s - loss: 1.9136 - regression_loss: 1.5894 - classification_loss: 0.3242 292/500 [================>.............] 
- ETA: 51s - loss: 1.9147 - regression_loss: 1.5904 - classification_loss: 0.3242 293/500 [================>.............] - ETA: 51s - loss: 1.9163 - regression_loss: 1.5918 - classification_loss: 0.3244 294/500 [================>.............] - ETA: 51s - loss: 1.9196 - regression_loss: 1.5947 - classification_loss: 0.3249 295/500 [================>.............] - ETA: 51s - loss: 1.9207 - regression_loss: 1.5959 - classification_loss: 0.3248 296/500 [================>.............] - ETA: 50s - loss: 1.9198 - regression_loss: 1.5955 - classification_loss: 0.3243 297/500 [================>.............] - ETA: 50s - loss: 1.9190 - regression_loss: 1.5949 - classification_loss: 0.3241 298/500 [================>.............] - ETA: 50s - loss: 1.9176 - regression_loss: 1.5940 - classification_loss: 0.3235 299/500 [================>.............] - ETA: 50s - loss: 1.9164 - regression_loss: 1.5934 - classification_loss: 0.3230 300/500 [=================>............] - ETA: 49s - loss: 1.9138 - regression_loss: 1.5913 - classification_loss: 0.3224 301/500 [=================>............] - ETA: 49s - loss: 1.9120 - regression_loss: 1.5900 - classification_loss: 0.3220 302/500 [=================>............] - ETA: 49s - loss: 1.9148 - regression_loss: 1.5923 - classification_loss: 0.3225 303/500 [=================>............] - ETA: 49s - loss: 1.9164 - regression_loss: 1.5933 - classification_loss: 0.3231 304/500 [=================>............] - ETA: 48s - loss: 1.9169 - regression_loss: 1.5939 - classification_loss: 0.3230 305/500 [=================>............] - ETA: 48s - loss: 1.9158 - regression_loss: 1.5928 - classification_loss: 0.3230 306/500 [=================>............] - ETA: 48s - loss: 1.9146 - regression_loss: 1.5920 - classification_loss: 0.3226 307/500 [=================>............] - ETA: 48s - loss: 1.9129 - regression_loss: 1.5907 - classification_loss: 0.3222 308/500 [=================>............] 
- ETA: 47s - loss: 1.9143 - regression_loss: 1.5921 - classification_loss: 0.3222 309/500 [=================>............] - ETA: 47s - loss: 1.9130 - regression_loss: 1.5913 - classification_loss: 0.3216 310/500 [=================>............] - ETA: 47s - loss: 1.9145 - regression_loss: 1.5922 - classification_loss: 0.3223 311/500 [=================>............] - ETA: 47s - loss: 1.9149 - regression_loss: 1.5924 - classification_loss: 0.3224 312/500 [=================>............] - ETA: 46s - loss: 1.9150 - regression_loss: 1.5924 - classification_loss: 0.3226 313/500 [=================>............] - ETA: 46s - loss: 1.9150 - regression_loss: 1.5922 - classification_loss: 0.3228 314/500 [=================>............] - ETA: 46s - loss: 1.9195 - regression_loss: 1.5961 - classification_loss: 0.3234 315/500 [=================>............] - ETA: 46s - loss: 1.9201 - regression_loss: 1.5965 - classification_loss: 0.3236 316/500 [=================>............] - ETA: 45s - loss: 1.9211 - regression_loss: 1.5973 - classification_loss: 0.3238 317/500 [==================>...........] - ETA: 45s - loss: 1.9175 - regression_loss: 1.5944 - classification_loss: 0.3231 318/500 [==================>...........] - ETA: 45s - loss: 1.9150 - regression_loss: 1.5924 - classification_loss: 0.3226 319/500 [==================>...........] - ETA: 45s - loss: 1.9159 - regression_loss: 1.5930 - classification_loss: 0.3229 320/500 [==================>...........] - ETA: 44s - loss: 1.9139 - regression_loss: 1.5912 - classification_loss: 0.3227 321/500 [==================>...........] - ETA: 44s - loss: 1.9170 - regression_loss: 1.5947 - classification_loss: 0.3223 322/500 [==================>...........] - ETA: 44s - loss: 1.9163 - regression_loss: 1.5940 - classification_loss: 0.3222 323/500 [==================>...........] - ETA: 44s - loss: 1.9156 - regression_loss: 1.5934 - classification_loss: 0.3222 324/500 [==================>...........] 
500/500 [==============================] - 125s 250ms/step - loss: 1.9090 - regression_loss: 1.5901 - classification_loss: 0.3190
326 instances of class plum with average precision: 0.7475
mAP: 0.7475
Epoch 00030: saving model to ./training/snapshots/resnet50_pascal_30.h5
Epoch 31/150
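The epoch summary above reports a total `loss` alongside its two components, `regression_loss` and `classification_loss`; the total is their sum (up to the rounding the progress bar applies). A minimal sketch of pulling these numbers out of a captured log line — the helper name and regex are ours, not part of the training script:

```python
import re

# Matches the loss triple printed by the Keras progress bar, e.g.
# "... - loss: 1.9090 - regression_loss: 1.5901 - classification_loss: 0.3190"
LINE_RE = re.compile(
    r"loss: (?P<loss>\d+\.\d+) - "
    r"regression_loss: (?P<reg>\d+\.\d+) - "
    r"classification_loss: (?P<cls>\d+\.\d+)"
)

def parse_losses(line):
    """Return (loss, regression_loss, classification_loss), or None if absent."""
    m = LINE_RE.search(line)
    if m is None:
        return None
    return tuple(float(m.group(k)) for k in ("loss", "reg", "cls"))

line = ("500/500 [==============================] - 125s 250ms/step - "
        "loss: 1.9090 - regression_loss: 1.5901 - classification_loss: 0.3190")
loss, reg, cls = parse_losses(line)
# Total equals the sum of the parts up to log rounding (1.5901 + 0.3190 = 1.9091).
assert abs(loss - (reg + cls)) < 1e-3
```

Running the same parser over every step line makes it easy to plot the running averages instead of scrolling the raw progress output.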
- ETA: 1:25 - loss: 1.9069 - regression_loss: 1.5879 - classification_loss: 0.3190 159/500 [========>.....................] - ETA: 1:24 - loss: 1.9081 - regression_loss: 1.5892 - classification_loss: 0.3189 160/500 [========>.....................] - ETA: 1:24 - loss: 1.9016 - regression_loss: 1.5843 - classification_loss: 0.3173 161/500 [========>.....................] - ETA: 1:24 - loss: 1.9030 - regression_loss: 1.5854 - classification_loss: 0.3176 162/500 [========>.....................] - ETA: 1:24 - loss: 1.9047 - regression_loss: 1.5867 - classification_loss: 0.3180 163/500 [========>.....................] - ETA: 1:23 - loss: 1.9035 - regression_loss: 1.5855 - classification_loss: 0.3180 164/500 [========>.....................] - ETA: 1:23 - loss: 1.9052 - regression_loss: 1.5863 - classification_loss: 0.3189 165/500 [========>.....................] - ETA: 1:23 - loss: 1.9056 - regression_loss: 1.5862 - classification_loss: 0.3194 166/500 [========>.....................] - ETA: 1:23 - loss: 1.9046 - regression_loss: 1.5852 - classification_loss: 0.3194 167/500 [=========>....................] - ETA: 1:22 - loss: 1.9052 - regression_loss: 1.5858 - classification_loss: 0.3194 168/500 [=========>....................] - ETA: 1:22 - loss: 1.9070 - regression_loss: 1.5866 - classification_loss: 0.3204 169/500 [=========>....................] - ETA: 1:22 - loss: 1.9063 - regression_loss: 1.5856 - classification_loss: 0.3207 170/500 [=========>....................] - ETA: 1:22 - loss: 1.9040 - regression_loss: 1.5839 - classification_loss: 0.3201 171/500 [=========>....................] - ETA: 1:22 - loss: 1.9053 - regression_loss: 1.5850 - classification_loss: 0.3204 172/500 [=========>....................] - ETA: 1:21 - loss: 1.9057 - regression_loss: 1.5854 - classification_loss: 0.3204 173/500 [=========>....................] - ETA: 1:21 - loss: 1.9063 - regression_loss: 1.5854 - classification_loss: 0.3209 174/500 [=========>....................] 
- ETA: 1:21 - loss: 1.9034 - regression_loss: 1.5835 - classification_loss: 0.3200 175/500 [=========>....................] - ETA: 1:21 - loss: 1.8998 - regression_loss: 1.5808 - classification_loss: 0.3189 176/500 [=========>....................] - ETA: 1:20 - loss: 1.9010 - regression_loss: 1.5815 - classification_loss: 0.3195 177/500 [=========>....................] - ETA: 1:20 - loss: 1.8997 - regression_loss: 1.5804 - classification_loss: 0.3193 178/500 [=========>....................] - ETA: 1:20 - loss: 1.8962 - regression_loss: 1.5778 - classification_loss: 0.3184 179/500 [=========>....................] - ETA: 1:20 - loss: 1.8988 - regression_loss: 1.5801 - classification_loss: 0.3187 180/500 [=========>....................] - ETA: 1:19 - loss: 1.8982 - regression_loss: 1.5797 - classification_loss: 0.3185 181/500 [=========>....................] - ETA: 1:19 - loss: 1.9054 - regression_loss: 1.5854 - classification_loss: 0.3199 182/500 [=========>....................] - ETA: 1:19 - loss: 1.9009 - regression_loss: 1.5820 - classification_loss: 0.3189 183/500 [=========>....................] - ETA: 1:19 - loss: 1.9020 - regression_loss: 1.5830 - classification_loss: 0.3189 184/500 [==========>...................] - ETA: 1:18 - loss: 1.9007 - regression_loss: 1.5820 - classification_loss: 0.3187 185/500 [==========>...................] - ETA: 1:18 - loss: 1.9025 - regression_loss: 1.5833 - classification_loss: 0.3192 186/500 [==========>...................] - ETA: 1:18 - loss: 1.9015 - regression_loss: 1.5827 - classification_loss: 0.3188 187/500 [==========>...................] - ETA: 1:18 - loss: 1.8999 - regression_loss: 1.5803 - classification_loss: 0.3196 188/500 [==========>...................] - ETA: 1:17 - loss: 1.9018 - regression_loss: 1.5815 - classification_loss: 0.3203 189/500 [==========>...................] - ETA: 1:17 - loss: 1.8999 - regression_loss: 1.5805 - classification_loss: 0.3194 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.8981 - regression_loss: 1.5790 - classification_loss: 0.3191 191/500 [==========>...................] - ETA: 1:17 - loss: 1.8934 - regression_loss: 1.5754 - classification_loss: 0.3181 192/500 [==========>...................] - ETA: 1:16 - loss: 1.8937 - regression_loss: 1.5755 - classification_loss: 0.3182 193/500 [==========>...................] - ETA: 1:16 - loss: 1.8933 - regression_loss: 1.5753 - classification_loss: 0.3180 194/500 [==========>...................] - ETA: 1:16 - loss: 1.8914 - regression_loss: 1.5739 - classification_loss: 0.3175 195/500 [==========>...................] - ETA: 1:16 - loss: 1.8853 - regression_loss: 1.5688 - classification_loss: 0.3164 196/500 [==========>...................] - ETA: 1:15 - loss: 1.8863 - regression_loss: 1.5694 - classification_loss: 0.3169 197/500 [==========>...................] - ETA: 1:15 - loss: 1.8861 - regression_loss: 1.5695 - classification_loss: 0.3165 198/500 [==========>...................] - ETA: 1:15 - loss: 1.8842 - regression_loss: 1.5683 - classification_loss: 0.3159 199/500 [==========>...................] - ETA: 1:15 - loss: 1.8864 - regression_loss: 1.5700 - classification_loss: 0.3163 200/500 [===========>..................] - ETA: 1:14 - loss: 1.8885 - regression_loss: 1.5714 - classification_loss: 0.3171 201/500 [===========>..................] - ETA: 1:14 - loss: 1.8943 - regression_loss: 1.5768 - classification_loss: 0.3175 202/500 [===========>..................] - ETA: 1:14 - loss: 1.8951 - regression_loss: 1.5776 - classification_loss: 0.3174 203/500 [===========>..................] - ETA: 1:14 - loss: 1.8915 - regression_loss: 1.5750 - classification_loss: 0.3166 204/500 [===========>..................] - ETA: 1:13 - loss: 1.8918 - regression_loss: 1.5755 - classification_loss: 0.3163 205/500 [===========>..................] - ETA: 1:13 - loss: 1.8931 - regression_loss: 1.5765 - classification_loss: 0.3166 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.8911 - regression_loss: 1.5749 - classification_loss: 0.3162 207/500 [===========>..................] - ETA: 1:13 - loss: 1.8926 - regression_loss: 1.5761 - classification_loss: 0.3165 208/500 [===========>..................] - ETA: 1:12 - loss: 1.8911 - regression_loss: 1.5750 - classification_loss: 0.3161 209/500 [===========>..................] - ETA: 1:12 - loss: 1.8886 - regression_loss: 1.5732 - classification_loss: 0.3154 210/500 [===========>..................] - ETA: 1:12 - loss: 1.8930 - regression_loss: 1.5760 - classification_loss: 0.3170 211/500 [===========>..................] - ETA: 1:12 - loss: 1.8938 - regression_loss: 1.5764 - classification_loss: 0.3173 212/500 [===========>..................] - ETA: 1:11 - loss: 1.8953 - regression_loss: 1.5771 - classification_loss: 0.3182 213/500 [===========>..................] - ETA: 1:11 - loss: 1.8954 - regression_loss: 1.5773 - classification_loss: 0.3181 214/500 [===========>..................] - ETA: 1:11 - loss: 1.8918 - regression_loss: 1.5745 - classification_loss: 0.3173 215/500 [===========>..................] - ETA: 1:11 - loss: 1.8902 - regression_loss: 1.5733 - classification_loss: 0.3169 216/500 [===========>..................] - ETA: 1:10 - loss: 1.8899 - regression_loss: 1.5730 - classification_loss: 0.3169 217/500 [============>.................] - ETA: 1:10 - loss: 1.8915 - regression_loss: 1.5742 - classification_loss: 0.3173 218/500 [============>.................] - ETA: 1:10 - loss: 1.8951 - regression_loss: 1.5774 - classification_loss: 0.3177 219/500 [============>.................] - ETA: 1:10 - loss: 1.8903 - regression_loss: 1.5737 - classification_loss: 0.3166 220/500 [============>.................] - ETA: 1:09 - loss: 1.8915 - regression_loss: 1.5748 - classification_loss: 0.3168 221/500 [============>.................] - ETA: 1:09 - loss: 1.8924 - regression_loss: 1.5755 - classification_loss: 0.3169 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.8922 - regression_loss: 1.5756 - classification_loss: 0.3167 223/500 [============>.................] - ETA: 1:09 - loss: 1.8919 - regression_loss: 1.5756 - classification_loss: 0.3164 224/500 [============>.................] - ETA: 1:08 - loss: 1.8890 - regression_loss: 1.5731 - classification_loss: 0.3159 225/500 [============>.................] - ETA: 1:08 - loss: 1.8872 - regression_loss: 1.5716 - classification_loss: 0.3156 226/500 [============>.................] - ETA: 1:08 - loss: 1.8845 - regression_loss: 1.5691 - classification_loss: 0.3153 227/500 [============>.................] - ETA: 1:08 - loss: 1.8846 - regression_loss: 1.5689 - classification_loss: 0.3157 228/500 [============>.................] - ETA: 1:07 - loss: 1.8853 - regression_loss: 1.5695 - classification_loss: 0.3158 229/500 [============>.................] - ETA: 1:07 - loss: 1.8854 - regression_loss: 1.5700 - classification_loss: 0.3153 230/500 [============>.................] - ETA: 1:07 - loss: 1.8832 - regression_loss: 1.5686 - classification_loss: 0.3147 231/500 [============>.................] - ETA: 1:07 - loss: 1.8802 - regression_loss: 1.5661 - classification_loss: 0.3142 232/500 [============>.................] - ETA: 1:06 - loss: 1.8801 - regression_loss: 1.5660 - classification_loss: 0.3140 233/500 [============>.................] - ETA: 1:06 - loss: 1.8806 - regression_loss: 1.5667 - classification_loss: 0.3139 234/500 [=============>................] - ETA: 1:06 - loss: 1.8783 - regression_loss: 1.5650 - classification_loss: 0.3133 235/500 [=============>................] - ETA: 1:06 - loss: 1.8766 - regression_loss: 1.5639 - classification_loss: 0.3128 236/500 [=============>................] - ETA: 1:05 - loss: 1.8754 - regression_loss: 1.5631 - classification_loss: 0.3123 237/500 [=============>................] - ETA: 1:05 - loss: 1.8754 - regression_loss: 1.5629 - classification_loss: 0.3126 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.8757 - regression_loss: 1.5633 - classification_loss: 0.3124 239/500 [=============>................] - ETA: 1:05 - loss: 1.8765 - regression_loss: 1.5636 - classification_loss: 0.3129 240/500 [=============>................] - ETA: 1:04 - loss: 1.8770 - regression_loss: 1.5640 - classification_loss: 0.3130 241/500 [=============>................] - ETA: 1:04 - loss: 1.8779 - regression_loss: 1.5647 - classification_loss: 0.3132 242/500 [=============>................] - ETA: 1:04 - loss: 1.8768 - regression_loss: 1.5640 - classification_loss: 0.3129 243/500 [=============>................] - ETA: 1:04 - loss: 1.8763 - regression_loss: 1.5632 - classification_loss: 0.3131 244/500 [=============>................] - ETA: 1:03 - loss: 1.8742 - regression_loss: 1.5613 - classification_loss: 0.3130 245/500 [=============>................] - ETA: 1:03 - loss: 1.8728 - regression_loss: 1.5601 - classification_loss: 0.3127 246/500 [=============>................] - ETA: 1:03 - loss: 1.8715 - regression_loss: 1.5594 - classification_loss: 0.3122 247/500 [=============>................] - ETA: 1:03 - loss: 1.8678 - regression_loss: 1.5566 - classification_loss: 0.3112 248/500 [=============>................] - ETA: 1:02 - loss: 1.8674 - regression_loss: 1.5564 - classification_loss: 0.3111 249/500 [=============>................] - ETA: 1:02 - loss: 1.8666 - regression_loss: 1.5558 - classification_loss: 0.3108 250/500 [==============>...............] - ETA: 1:02 - loss: 1.8656 - regression_loss: 1.5554 - classification_loss: 0.3102 251/500 [==============>...............] - ETA: 1:02 - loss: 1.8655 - regression_loss: 1.5557 - classification_loss: 0.3097 252/500 [==============>...............] - ETA: 1:01 - loss: 1.8669 - regression_loss: 1.5569 - classification_loss: 0.3100 253/500 [==============>...............] - ETA: 1:01 - loss: 1.8651 - regression_loss: 1.5559 - classification_loss: 0.3093 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.8657 - regression_loss: 1.5564 - classification_loss: 0.3093 255/500 [==============>...............] - ETA: 1:01 - loss: 1.8654 - regression_loss: 1.5563 - classification_loss: 0.3090 256/500 [==============>...............] - ETA: 1:00 - loss: 1.8639 - regression_loss: 1.5551 - classification_loss: 0.3088 257/500 [==============>...............] - ETA: 1:00 - loss: 1.8609 - regression_loss: 1.5525 - classification_loss: 0.3084 258/500 [==============>...............] - ETA: 1:00 - loss: 1.8602 - regression_loss: 1.5522 - classification_loss: 0.3080 259/500 [==============>...............] - ETA: 1:00 - loss: 1.8580 - regression_loss: 1.5506 - classification_loss: 0.3074 260/500 [==============>...............] - ETA: 59s - loss: 1.8566 - regression_loss: 1.5496 - classification_loss: 0.3070  261/500 [==============>...............] - ETA: 59s - loss: 1.8566 - regression_loss: 1.5498 - classification_loss: 0.3068 262/500 [==============>...............] - ETA: 59s - loss: 1.8585 - regression_loss: 1.5510 - classification_loss: 0.3074 263/500 [==============>...............] - ETA: 59s - loss: 1.8588 - regression_loss: 1.5514 - classification_loss: 0.3074 264/500 [==============>...............] - ETA: 58s - loss: 1.8600 - regression_loss: 1.5524 - classification_loss: 0.3076 265/500 [==============>...............] - ETA: 58s - loss: 1.8631 - regression_loss: 1.5548 - classification_loss: 0.3082 266/500 [==============>...............] - ETA: 58s - loss: 1.8599 - regression_loss: 1.5516 - classification_loss: 0.3083 267/500 [===============>..............] - ETA: 58s - loss: 1.8666 - regression_loss: 1.5575 - classification_loss: 0.3092 268/500 [===============>..............] - ETA: 57s - loss: 1.8660 - regression_loss: 1.5570 - classification_loss: 0.3089 269/500 [===============>..............] - ETA: 57s - loss: 1.8648 - regression_loss: 1.5564 - classification_loss: 0.3084 270/500 [===============>..............] 
- ETA: 57s - loss: 1.8642 - regression_loss: 1.5558 - classification_loss: 0.3084 271/500 [===============>..............] - ETA: 57s - loss: 1.8633 - regression_loss: 1.5551 - classification_loss: 0.3082 272/500 [===============>..............] - ETA: 56s - loss: 1.8646 - regression_loss: 1.5569 - classification_loss: 0.3077 273/500 [===============>..............] - ETA: 56s - loss: 1.8623 - regression_loss: 1.5551 - classification_loss: 0.3072 274/500 [===============>..............] - ETA: 56s - loss: 1.8613 - regression_loss: 1.5547 - classification_loss: 0.3066 275/500 [===============>..............] - ETA: 56s - loss: 1.8601 - regression_loss: 1.5540 - classification_loss: 0.3061 276/500 [===============>..............] - ETA: 55s - loss: 1.8624 - regression_loss: 1.5556 - classification_loss: 0.3067 277/500 [===============>..............] - ETA: 55s - loss: 1.8664 - regression_loss: 1.5585 - classification_loss: 0.3079 278/500 [===============>..............] - ETA: 55s - loss: 1.8664 - regression_loss: 1.5582 - classification_loss: 0.3081 279/500 [===============>..............] - ETA: 55s - loss: 1.8681 - regression_loss: 1.5593 - classification_loss: 0.3087 280/500 [===============>..............] - ETA: 54s - loss: 1.8672 - regression_loss: 1.5589 - classification_loss: 0.3084 281/500 [===============>..............] - ETA: 54s - loss: 1.8678 - regression_loss: 1.5595 - classification_loss: 0.3083 282/500 [===============>..............] - ETA: 54s - loss: 1.8692 - regression_loss: 1.5606 - classification_loss: 0.3086 283/500 [===============>..............] - ETA: 54s - loss: 1.8709 - regression_loss: 1.5622 - classification_loss: 0.3087 284/500 [================>.............] - ETA: 53s - loss: 1.8698 - regression_loss: 1.5614 - classification_loss: 0.3084 285/500 [================>.............] - ETA: 53s - loss: 1.8737 - regression_loss: 1.5644 - classification_loss: 0.3094 286/500 [================>.............] 
- ETA: 53s - loss: 1.8750 - regression_loss: 1.5654 - classification_loss: 0.3096 287/500 [================>.............] - ETA: 53s - loss: 1.8744 - regression_loss: 1.5649 - classification_loss: 0.3095 288/500 [================>.............] - ETA: 52s - loss: 1.8736 - regression_loss: 1.5645 - classification_loss: 0.3091 289/500 [================>.............] - ETA: 52s - loss: 1.8740 - regression_loss: 1.5649 - classification_loss: 0.3091 290/500 [================>.............] - ETA: 52s - loss: 1.8715 - regression_loss: 1.5628 - classification_loss: 0.3087 291/500 [================>.............] - ETA: 52s - loss: 1.8691 - regression_loss: 1.5609 - classification_loss: 0.3082 292/500 [================>.............] - ETA: 51s - loss: 1.8677 - regression_loss: 1.5595 - classification_loss: 0.3081 293/500 [================>.............] - ETA: 51s - loss: 1.8693 - regression_loss: 1.5608 - classification_loss: 0.3085 294/500 [================>.............] - ETA: 51s - loss: 1.8706 - regression_loss: 1.5619 - classification_loss: 0.3087 295/500 [================>.............] - ETA: 51s - loss: 1.8702 - regression_loss: 1.5617 - classification_loss: 0.3085 296/500 [================>.............] - ETA: 50s - loss: 1.8702 - regression_loss: 1.5617 - classification_loss: 0.3084 297/500 [================>.............] - ETA: 50s - loss: 1.8682 - regression_loss: 1.5604 - classification_loss: 0.3079 298/500 [================>.............] - ETA: 50s - loss: 1.8713 - regression_loss: 1.5631 - classification_loss: 0.3082 299/500 [================>.............] - ETA: 50s - loss: 1.8694 - regression_loss: 1.5616 - classification_loss: 0.3078 300/500 [=================>............] - ETA: 49s - loss: 1.8793 - regression_loss: 1.5700 - classification_loss: 0.3093 301/500 [=================>............] - ETA: 49s - loss: 1.8792 - regression_loss: 1.5702 - classification_loss: 0.3091 302/500 [=================>............] 
- ETA: 49s - loss: 1.8814 - regression_loss: 1.5713 - classification_loss: 0.3102 303/500 [=================>............] - ETA: 49s - loss: 1.8786 - regression_loss: 1.5691 - classification_loss: 0.3095 304/500 [=================>............] - ETA: 48s - loss: 1.8777 - regression_loss: 1.5684 - classification_loss: 0.3093 305/500 [=================>............] - ETA: 48s - loss: 1.8757 - regression_loss: 1.5668 - classification_loss: 0.3089 306/500 [=================>............] - ETA: 48s - loss: 1.8764 - regression_loss: 1.5674 - classification_loss: 0.3090 307/500 [=================>............] - ETA: 48s - loss: 1.8764 - regression_loss: 1.5673 - classification_loss: 0.3091 308/500 [=================>............] - ETA: 47s - loss: 1.8735 - regression_loss: 1.5648 - classification_loss: 0.3087 309/500 [=================>............] - ETA: 47s - loss: 1.8746 - regression_loss: 1.5659 - classification_loss: 0.3087 310/500 [=================>............] - ETA: 47s - loss: 1.8742 - regression_loss: 1.5654 - classification_loss: 0.3087 311/500 [=================>............] - ETA: 47s - loss: 1.8758 - regression_loss: 1.5671 - classification_loss: 0.3086 312/500 [=================>............] - ETA: 46s - loss: 1.8767 - regression_loss: 1.5677 - classification_loss: 0.3090 313/500 [=================>............] - ETA: 46s - loss: 1.8761 - regression_loss: 1.5676 - classification_loss: 0.3085 314/500 [=================>............] - ETA: 46s - loss: 1.8725 - regression_loss: 1.5647 - classification_loss: 0.3079 315/500 [=================>............] - ETA: 46s - loss: 1.8717 - regression_loss: 1.5641 - classification_loss: 0.3076 316/500 [=================>............] - ETA: 45s - loss: 1.8710 - regression_loss: 1.5635 - classification_loss: 0.3074 317/500 [==================>...........] - ETA: 45s - loss: 1.8697 - regression_loss: 1.5625 - classification_loss: 0.3072 318/500 [==================>...........] 
- ETA: 45s - loss: 1.8699 - regression_loss: 1.5630 - classification_loss: 0.3069 319/500 [==================>...........] - ETA: 45s - loss: 1.8709 - regression_loss: 1.5639 - classification_loss: 0.3070 320/500 [==================>...........] - ETA: 44s - loss: 1.8715 - regression_loss: 1.5641 - classification_loss: 0.3074 321/500 [==================>...........] - ETA: 44s - loss: 1.8724 - regression_loss: 1.5649 - classification_loss: 0.3075 322/500 [==================>...........] - ETA: 44s - loss: 1.8720 - regression_loss: 1.5646 - classification_loss: 0.3074 323/500 [==================>...........] - ETA: 44s - loss: 1.8694 - regression_loss: 1.5626 - classification_loss: 0.3068 324/500 [==================>...........] - ETA: 43s - loss: 1.8675 - regression_loss: 1.5612 - classification_loss: 0.3063 325/500 [==================>...........] - ETA: 43s - loss: 1.8682 - regression_loss: 1.5620 - classification_loss: 0.3062 326/500 [==================>...........] - ETA: 43s - loss: 1.8657 - regression_loss: 1.5600 - classification_loss: 0.3057 327/500 [==================>...........] - ETA: 43s - loss: 1.8662 - regression_loss: 1.5604 - classification_loss: 0.3058 328/500 [==================>...........] - ETA: 42s - loss: 1.8674 - regression_loss: 1.5612 - classification_loss: 0.3062 329/500 [==================>...........] - ETA: 42s - loss: 1.8648 - regression_loss: 1.5590 - classification_loss: 0.3057 330/500 [==================>...........] - ETA: 42s - loss: 1.8637 - regression_loss: 1.5584 - classification_loss: 0.3053 331/500 [==================>...........] - ETA: 42s - loss: 1.8648 - regression_loss: 1.5594 - classification_loss: 0.3055 332/500 [==================>...........] - ETA: 41s - loss: 1.8670 - regression_loss: 1.5608 - classification_loss: 0.3063 333/500 [==================>...........] - ETA: 41s - loss: 1.8714 - regression_loss: 1.5643 - classification_loss: 0.3071 334/500 [===================>..........] 
- ETA: 41s - loss: 1.8712 - regression_loss: 1.5640 - classification_loss: 0.3072 335/500 [===================>..........] - ETA: 41s - loss: 1.8711 - regression_loss: 1.5641 - classification_loss: 0.3070 336/500 [===================>..........] - ETA: 40s - loss: 1.8711 - regression_loss: 1.5641 - classification_loss: 0.3071 337/500 [===================>..........] - ETA: 40s - loss: 1.8715 - regression_loss: 1.5645 - classification_loss: 0.3070 338/500 [===================>..........] - ETA: 40s - loss: 1.8722 - regression_loss: 1.5655 - classification_loss: 0.3068 339/500 [===================>..........] - ETA: 40s - loss: 1.8709 - regression_loss: 1.5647 - classification_loss: 0.3063 340/500 [===================>..........] - ETA: 39s - loss: 1.8698 - regression_loss: 1.5639 - classification_loss: 0.3059 341/500 [===================>..........] - ETA: 39s - loss: 1.8707 - regression_loss: 1.5644 - classification_loss: 0.3062 342/500 [===================>..........] - ETA: 39s - loss: 1.8715 - regression_loss: 1.5648 - classification_loss: 0.3066 343/500 [===================>..........] - ETA: 39s - loss: 1.8704 - regression_loss: 1.5639 - classification_loss: 0.3064 344/500 [===================>..........] - ETA: 38s - loss: 1.8709 - regression_loss: 1.5644 - classification_loss: 0.3065 345/500 [===================>..........] - ETA: 38s - loss: 1.8753 - regression_loss: 1.5677 - classification_loss: 0.3076 346/500 [===================>..........] - ETA: 38s - loss: 1.8756 - regression_loss: 1.5680 - classification_loss: 0.3075 347/500 [===================>..........] - ETA: 38s - loss: 1.8751 - regression_loss: 1.5678 - classification_loss: 0.3073 348/500 [===================>..........] - ETA: 37s - loss: 1.8734 - regression_loss: 1.5666 - classification_loss: 0.3068 349/500 [===================>..........] - ETA: 37s - loss: 1.8736 - regression_loss: 1.5669 - classification_loss: 0.3067 350/500 [====================>.........] 
- ETA: 37s - loss: 1.8736 - regression_loss: 1.5666 - classification_loss: 0.3070 351/500 [====================>.........] - ETA: 37s - loss: 1.8739 - regression_loss: 1.5669 - classification_loss: 0.3070 352/500 [====================>.........] - ETA: 36s - loss: 1.8758 - regression_loss: 1.5684 - classification_loss: 0.3074 353/500 [====================>.........] - ETA: 36s - loss: 1.8753 - regression_loss: 1.5679 - classification_loss: 0.3074 354/500 [====================>.........] - ETA: 36s - loss: 1.8767 - regression_loss: 1.5693 - classification_loss: 0.3075 355/500 [====================>.........] - ETA: 36s - loss: 1.8783 - regression_loss: 1.5701 - classification_loss: 0.3082 356/500 [====================>.........] - ETA: 35s - loss: 1.8776 - regression_loss: 1.5696 - classification_loss: 0.3080 357/500 [====================>.........] - ETA: 35s - loss: 1.8784 - regression_loss: 1.5699 - classification_loss: 0.3085 358/500 [====================>.........] - ETA: 35s - loss: 1.8803 - regression_loss: 1.5719 - classification_loss: 0.3083 359/500 [====================>.........] - ETA: 35s - loss: 1.8811 - regression_loss: 1.5725 - classification_loss: 0.3086 360/500 [====================>.........] - ETA: 34s - loss: 1.8792 - regression_loss: 1.5709 - classification_loss: 0.3083 361/500 [====================>.........] - ETA: 34s - loss: 1.8788 - regression_loss: 1.5708 - classification_loss: 0.3081 362/500 [====================>.........] - ETA: 34s - loss: 1.8818 - regression_loss: 1.5732 - classification_loss: 0.3086 363/500 [====================>.........] - ETA: 34s - loss: 1.8811 - regression_loss: 1.5728 - classification_loss: 0.3083 364/500 [====================>.........] - ETA: 33s - loss: 1.8820 - regression_loss: 1.5733 - classification_loss: 0.3087 365/500 [====================>.........] - ETA: 33s - loss: 1.8827 - regression_loss: 1.5739 - classification_loss: 0.3088 366/500 [====================>.........] 
500/500 [==============================] - 125s 249ms/step - loss: 1.8802 - regression_loss: 1.5655 - classification_loss: 0.3147
326 instances of class plum with average precision: 0.7406
mAP: 0.7406
Epoch 00031: saving model to ./training/snapshots/resnet50_pascal_31.h5
Epoch 32/150
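The `loss` column in each progress line is the sum of the two sub-losses that keras-retinanet trains with (smooth-L1 box regression plus focal classification). A quick sanity-check sketch that parses the epoch-31 summary line above and verifies the decomposition (the regex and variable names here are illustrative, not part of the library):

```python
import re

# Final summary line of epoch 31, copied from the log above.
line = ("500/500 [==============================] - 125s 249ms/step - "
        "loss: 1.8802 - regression_loss: 1.5655 - classification_loss: 0.3147")

# Extract every "name: value" metric pair from the line.
metrics = {k: float(v) for k, v in re.findall(r"([a-z_]+): ([\d.]+)", line)}

# The reported total loss is regression_loss + classification_loss
# (up to the 4-decimal rounding Keras applies when printing).
total = metrics["loss"]
parts = metrics["regression_loss"] + metrics["classification_loss"]
```

Here 1.5655 + 0.3147 matches the printed 1.8802, so neither head is silently dominating the total.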
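The evaluation lines report average precision per class as the area under the precision-recall curve; with a single class (plum) in this dataset, mAP is simply that class's AP, which is why the two numbers agree. A minimal sketch of the VOC-style all-point AP computation (a re-implementation for illustration; the library's own evaluation code may differ in detail):

```python
import numpy as np

def compute_ap(recall, precision):
    """Area under the precision-recall curve (all-point interpolation)."""
    # Pad so the curve starts at recall 0 and ends at recall 1.
    mrec = np.concatenate(([0.0], recall, [1.0]))
    mpre = np.concatenate(([0.0], precision, [0.0]))
    # Make precision monotonically non-increasing from right to left.
    for i in range(mpre.size - 1, 0, -1):
        mpre[i - 1] = max(mpre[i - 1], mpre[i])
    # Sum rectangle areas wherever recall changes.
    idx = np.where(mrec[1:] != mrec[:-1])[0]
    return float(np.sum((mrec[idx + 1] - mrec[idx]) * mpre[idx + 1]))
```

With per-class APs in hand, mAP is their unweighted mean over classes, so a one-class evaluation like this one reports mAP equal to the class AP (0.7406 here).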
200/500 [===========>..................] - ETA: 1:14 - loss: 1.8613 - regression_loss: 1.5264 - classification_loss: 0.3349 201/500 [===========>..................] 
- ETA: 1:14 - loss: 1.8633 - regression_loss: 1.5278 - classification_loss: 0.3354 202/500 [===========>..................] - ETA: 1:14 - loss: 1.8589 - regression_loss: 1.5248 - classification_loss: 0.3341 203/500 [===========>..................] - ETA: 1:14 - loss: 1.8579 - regression_loss: 1.5244 - classification_loss: 0.3336 204/500 [===========>..................] - ETA: 1:13 - loss: 1.8565 - regression_loss: 1.5236 - classification_loss: 0.3329 205/500 [===========>..................] - ETA: 1:13 - loss: 1.8561 - regression_loss: 1.5236 - classification_loss: 0.3326 206/500 [===========>..................] - ETA: 1:13 - loss: 1.8572 - regression_loss: 1.5243 - classification_loss: 0.3329 207/500 [===========>..................] - ETA: 1:13 - loss: 1.8524 - regression_loss: 1.5206 - classification_loss: 0.3318 208/500 [===========>..................] - ETA: 1:13 - loss: 1.8470 - regression_loss: 1.5163 - classification_loss: 0.3308 209/500 [===========>..................] - ETA: 1:12 - loss: 1.8484 - regression_loss: 1.5180 - classification_loss: 0.3304 210/500 [===========>..................] - ETA: 1:12 - loss: 1.8511 - regression_loss: 1.5200 - classification_loss: 0.3311 211/500 [===========>..................] - ETA: 1:12 - loss: 1.8494 - regression_loss: 1.5183 - classification_loss: 0.3311 212/500 [===========>..................] - ETA: 1:12 - loss: 1.8537 - regression_loss: 1.5212 - classification_loss: 0.3325 213/500 [===========>..................] - ETA: 1:11 - loss: 1.8538 - regression_loss: 1.5213 - classification_loss: 0.3325 214/500 [===========>..................] - ETA: 1:11 - loss: 1.8540 - regression_loss: 1.5212 - classification_loss: 0.3328 215/500 [===========>..................] - ETA: 1:11 - loss: 1.8554 - regression_loss: 1.5226 - classification_loss: 0.3328 216/500 [===========>..................] - ETA: 1:10 - loss: 1.8571 - regression_loss: 1.5241 - classification_loss: 0.3330 217/500 [============>.................] 
- ETA: 1:10 - loss: 1.8586 - regression_loss: 1.5252 - classification_loss: 0.3334 218/500 [============>.................] - ETA: 1:10 - loss: 1.8596 - regression_loss: 1.5260 - classification_loss: 0.3336 219/500 [============>.................] - ETA: 1:10 - loss: 1.8622 - regression_loss: 1.5285 - classification_loss: 0.3337 220/500 [============>.................] - ETA: 1:09 - loss: 1.8675 - regression_loss: 1.5332 - classification_loss: 0.3342 221/500 [============>.................] - ETA: 1:09 - loss: 1.8692 - regression_loss: 1.5346 - classification_loss: 0.3346 222/500 [============>.................] - ETA: 1:09 - loss: 1.8700 - regression_loss: 1.5357 - classification_loss: 0.3344 223/500 [============>.................] - ETA: 1:09 - loss: 1.8694 - regression_loss: 1.5352 - classification_loss: 0.3342 224/500 [============>.................] - ETA: 1:08 - loss: 1.8696 - regression_loss: 1.5356 - classification_loss: 0.3340 225/500 [============>.................] - ETA: 1:08 - loss: 1.8752 - regression_loss: 1.5398 - classification_loss: 0.3354 226/500 [============>.................] - ETA: 1:08 - loss: 1.8767 - regression_loss: 1.5411 - classification_loss: 0.3356 227/500 [============>.................] - ETA: 1:08 - loss: 1.8790 - regression_loss: 1.5427 - classification_loss: 0.3363 228/500 [============>.................] - ETA: 1:07 - loss: 1.8792 - regression_loss: 1.5434 - classification_loss: 0.3357 229/500 [============>.................] - ETA: 1:07 - loss: 1.8746 - regression_loss: 1.5399 - classification_loss: 0.3347 230/500 [============>.................] - ETA: 1:07 - loss: 1.8737 - regression_loss: 1.5393 - classification_loss: 0.3344 231/500 [============>.................] - ETA: 1:07 - loss: 1.8754 - regression_loss: 1.5408 - classification_loss: 0.3346 232/500 [============>.................] - ETA: 1:06 - loss: 1.8768 - regression_loss: 1.5419 - classification_loss: 0.3349 233/500 [============>.................] 
- ETA: 1:06 - loss: 1.8766 - regression_loss: 1.5418 - classification_loss: 0.3349 234/500 [=============>................] - ETA: 1:06 - loss: 1.8746 - regression_loss: 1.5402 - classification_loss: 0.3345 235/500 [=============>................] - ETA: 1:06 - loss: 1.8748 - regression_loss: 1.5405 - classification_loss: 0.3343 236/500 [=============>................] - ETA: 1:05 - loss: 1.8737 - regression_loss: 1.5394 - classification_loss: 0.3343 237/500 [=============>................] - ETA: 1:05 - loss: 1.8745 - regression_loss: 1.5403 - classification_loss: 0.3341 238/500 [=============>................] - ETA: 1:05 - loss: 1.8732 - regression_loss: 1.5394 - classification_loss: 0.3338 239/500 [=============>................] - ETA: 1:05 - loss: 1.8725 - regression_loss: 1.5387 - classification_loss: 0.3338 240/500 [=============>................] - ETA: 1:04 - loss: 1.8711 - regression_loss: 1.5378 - classification_loss: 0.3333 241/500 [=============>................] - ETA: 1:04 - loss: 1.8713 - regression_loss: 1.5381 - classification_loss: 0.3333 242/500 [=============>................] - ETA: 1:04 - loss: 1.8701 - regression_loss: 1.5372 - classification_loss: 0.3329 243/500 [=============>................] - ETA: 1:04 - loss: 1.8695 - regression_loss: 1.5368 - classification_loss: 0.3327 244/500 [=============>................] - ETA: 1:03 - loss: 1.8714 - regression_loss: 1.5390 - classification_loss: 0.3325 245/500 [=============>................] - ETA: 1:03 - loss: 1.8739 - regression_loss: 1.5410 - classification_loss: 0.3329 246/500 [=============>................] - ETA: 1:03 - loss: 1.8700 - regression_loss: 1.5380 - classification_loss: 0.3320 247/500 [=============>................] - ETA: 1:03 - loss: 1.8691 - regression_loss: 1.5369 - classification_loss: 0.3322 248/500 [=============>................] - ETA: 1:02 - loss: 1.8723 - regression_loss: 1.5397 - classification_loss: 0.3326 249/500 [=============>................] 
- ETA: 1:02 - loss: 1.8689 - regression_loss: 1.5371 - classification_loss: 0.3318 250/500 [==============>...............] - ETA: 1:02 - loss: 1.8692 - regression_loss: 1.5376 - classification_loss: 0.3316 251/500 [==============>...............] - ETA: 1:02 - loss: 1.8648 - regression_loss: 1.5341 - classification_loss: 0.3307 252/500 [==============>...............] - ETA: 1:01 - loss: 1.8645 - regression_loss: 1.5341 - classification_loss: 0.3304 253/500 [==============>...............] - ETA: 1:01 - loss: 1.8679 - regression_loss: 1.5365 - classification_loss: 0.3313 254/500 [==============>...............] - ETA: 1:01 - loss: 1.8692 - regression_loss: 1.5379 - classification_loss: 0.3313 255/500 [==============>...............] - ETA: 1:01 - loss: 1.8675 - regression_loss: 1.5368 - classification_loss: 0.3307 256/500 [==============>...............] - ETA: 1:00 - loss: 1.8706 - regression_loss: 1.5393 - classification_loss: 0.3313 257/500 [==============>...............] - ETA: 1:00 - loss: 1.8678 - regression_loss: 1.5371 - classification_loss: 0.3307 258/500 [==============>...............] - ETA: 1:00 - loss: 1.8706 - regression_loss: 1.5392 - classification_loss: 0.3314 259/500 [==============>...............] - ETA: 1:00 - loss: 1.8716 - regression_loss: 1.5398 - classification_loss: 0.3319 260/500 [==============>...............] - ETA: 59s - loss: 1.8702 - regression_loss: 1.5388 - classification_loss: 0.3314  261/500 [==============>...............] - ETA: 59s - loss: 1.8670 - regression_loss: 1.5365 - classification_loss: 0.3305 262/500 [==============>...............] - ETA: 59s - loss: 1.8661 - regression_loss: 1.5360 - classification_loss: 0.3300 263/500 [==============>...............] - ETA: 59s - loss: 1.8688 - regression_loss: 1.5385 - classification_loss: 0.3303 264/500 [==============>...............] - ETA: 58s - loss: 1.8700 - regression_loss: 1.5393 - classification_loss: 0.3307 265/500 [==============>...............] 
- ETA: 58s - loss: 1.8708 - regression_loss: 1.5398 - classification_loss: 0.3310 266/500 [==============>...............] - ETA: 58s - loss: 1.8714 - regression_loss: 1.5401 - classification_loss: 0.3313 267/500 [===============>..............] - ETA: 58s - loss: 1.8706 - regression_loss: 1.5398 - classification_loss: 0.3308 268/500 [===============>..............] - ETA: 57s - loss: 1.8699 - regression_loss: 1.5398 - classification_loss: 0.3301 269/500 [===============>..............] - ETA: 57s - loss: 1.8696 - regression_loss: 1.5397 - classification_loss: 0.3299 270/500 [===============>..............] - ETA: 57s - loss: 1.8668 - regression_loss: 1.5376 - classification_loss: 0.3291 271/500 [===============>..............] - ETA: 57s - loss: 1.8651 - regression_loss: 1.5365 - classification_loss: 0.3286 272/500 [===============>..............] - ETA: 56s - loss: 1.8631 - regression_loss: 1.5348 - classification_loss: 0.3282 273/500 [===============>..............] - ETA: 56s - loss: 1.8640 - regression_loss: 1.5358 - classification_loss: 0.3282 274/500 [===============>..............] - ETA: 56s - loss: 1.8663 - regression_loss: 1.5376 - classification_loss: 0.3287 275/500 [===============>..............] - ETA: 56s - loss: 1.8663 - regression_loss: 1.5377 - classification_loss: 0.3286 276/500 [===============>..............] - ETA: 56s - loss: 1.8652 - regression_loss: 1.5371 - classification_loss: 0.3281 277/500 [===============>..............] - ETA: 55s - loss: 1.8658 - regression_loss: 1.5378 - classification_loss: 0.3281 278/500 [===============>..............] - ETA: 55s - loss: 1.8648 - regression_loss: 1.5370 - classification_loss: 0.3277 279/500 [===============>..............] - ETA: 55s - loss: 1.8638 - regression_loss: 1.5365 - classification_loss: 0.3273 280/500 [===============>..............] - ETA: 55s - loss: 1.8628 - regression_loss: 1.5359 - classification_loss: 0.3269 281/500 [===============>..............] 
- ETA: 54s - loss: 1.8658 - regression_loss: 1.5380 - classification_loss: 0.3278 282/500 [===============>..............] - ETA: 54s - loss: 1.8654 - regression_loss: 1.5377 - classification_loss: 0.3277 283/500 [===============>..............] - ETA: 54s - loss: 1.8651 - regression_loss: 1.5375 - classification_loss: 0.3276 284/500 [================>.............] - ETA: 54s - loss: 1.8636 - regression_loss: 1.5365 - classification_loss: 0.3271 285/500 [================>.............] - ETA: 53s - loss: 1.8633 - regression_loss: 1.5366 - classification_loss: 0.3268 286/500 [================>.............] - ETA: 53s - loss: 1.8623 - regression_loss: 1.5359 - classification_loss: 0.3264 287/500 [================>.............] - ETA: 53s - loss: 1.8622 - regression_loss: 1.5361 - classification_loss: 0.3262 288/500 [================>.............] - ETA: 53s - loss: 1.8623 - regression_loss: 1.5358 - classification_loss: 0.3265 289/500 [================>.............] - ETA: 52s - loss: 1.8608 - regression_loss: 1.5349 - classification_loss: 0.3259 290/500 [================>.............] - ETA: 52s - loss: 1.8649 - regression_loss: 1.5379 - classification_loss: 0.3270 291/500 [================>.............] - ETA: 52s - loss: 1.8636 - regression_loss: 1.5371 - classification_loss: 0.3265 292/500 [================>.............] - ETA: 51s - loss: 1.8636 - regression_loss: 1.5371 - classification_loss: 0.3265 293/500 [================>.............] - ETA: 51s - loss: 1.8649 - regression_loss: 1.5380 - classification_loss: 0.3269 294/500 [================>.............] - ETA: 51s - loss: 1.8633 - regression_loss: 1.5369 - classification_loss: 0.3263 295/500 [================>.............] - ETA: 51s - loss: 1.8638 - regression_loss: 1.5377 - classification_loss: 0.3261 296/500 [================>.............] - ETA: 50s - loss: 1.8620 - regression_loss: 1.5365 - classification_loss: 0.3255 297/500 [================>.............] 
- ETA: 50s - loss: 1.8647 - regression_loss: 1.5386 - classification_loss: 0.3261 298/500 [================>.............] - ETA: 50s - loss: 1.8647 - regression_loss: 1.5393 - classification_loss: 0.3255 299/500 [================>.............] - ETA: 50s - loss: 1.8645 - regression_loss: 1.5392 - classification_loss: 0.3253 300/500 [=================>............] - ETA: 49s - loss: 1.8665 - regression_loss: 1.5406 - classification_loss: 0.3259 301/500 [=================>............] - ETA: 49s - loss: 1.8677 - regression_loss: 1.5413 - classification_loss: 0.3263 302/500 [=================>............] - ETA: 49s - loss: 1.8691 - regression_loss: 1.5424 - classification_loss: 0.3267 303/500 [=================>............] - ETA: 49s - loss: 1.8695 - regression_loss: 1.5429 - classification_loss: 0.3266 304/500 [=================>............] - ETA: 48s - loss: 1.8685 - regression_loss: 1.5423 - classification_loss: 0.3262 305/500 [=================>............] - ETA: 48s - loss: 1.8680 - regression_loss: 1.5419 - classification_loss: 0.3261 306/500 [=================>............] - ETA: 48s - loss: 1.8669 - regression_loss: 1.5412 - classification_loss: 0.3257 307/500 [=================>............] - ETA: 48s - loss: 1.8671 - regression_loss: 1.5415 - classification_loss: 0.3256 308/500 [=================>............] - ETA: 47s - loss: 1.8669 - regression_loss: 1.5414 - classification_loss: 0.3255 309/500 [=================>............] - ETA: 47s - loss: 1.8653 - regression_loss: 1.5403 - classification_loss: 0.3250 310/500 [=================>............] - ETA: 47s - loss: 1.8672 - regression_loss: 1.5417 - classification_loss: 0.3255 311/500 [=================>............] - ETA: 47s - loss: 1.8665 - regression_loss: 1.5413 - classification_loss: 0.3252 312/500 [=================>............] - ETA: 46s - loss: 1.8671 - regression_loss: 1.5418 - classification_loss: 0.3253 313/500 [=================>............] 
- ETA: 46s - loss: 1.8691 - regression_loss: 1.5437 - classification_loss: 0.3254 314/500 [=================>............] - ETA: 46s - loss: 1.8711 - regression_loss: 1.5454 - classification_loss: 0.3258 315/500 [=================>............] - ETA: 46s - loss: 1.8753 - regression_loss: 1.5490 - classification_loss: 0.3263 316/500 [=================>............] - ETA: 45s - loss: 1.8719 - regression_loss: 1.5459 - classification_loss: 0.3260 317/500 [==================>...........] - ETA: 45s - loss: 1.8736 - regression_loss: 1.5476 - classification_loss: 0.3260 318/500 [==================>...........] - ETA: 45s - loss: 1.8728 - regression_loss: 1.5469 - classification_loss: 0.3259 319/500 [==================>...........] - ETA: 45s - loss: 1.8729 - regression_loss: 1.5468 - classification_loss: 0.3261 320/500 [==================>...........] - ETA: 44s - loss: 1.8732 - regression_loss: 1.5471 - classification_loss: 0.3261 321/500 [==================>...........] - ETA: 44s - loss: 1.8728 - regression_loss: 1.5470 - classification_loss: 0.3257 322/500 [==================>...........] - ETA: 44s - loss: 1.8734 - regression_loss: 1.5477 - classification_loss: 0.3257 323/500 [==================>...........] - ETA: 44s - loss: 1.8752 - regression_loss: 1.5492 - classification_loss: 0.3260 324/500 [==================>...........] - ETA: 43s - loss: 1.8731 - regression_loss: 1.5478 - classification_loss: 0.3253 325/500 [==================>...........] - ETA: 43s - loss: 1.8731 - regression_loss: 1.5479 - classification_loss: 0.3252 326/500 [==================>...........] - ETA: 43s - loss: 1.8708 - regression_loss: 1.5461 - classification_loss: 0.3248 327/500 [==================>...........] - ETA: 43s - loss: 1.8701 - regression_loss: 1.5454 - classification_loss: 0.3247 328/500 [==================>...........] - ETA: 42s - loss: 1.8722 - regression_loss: 1.5470 - classification_loss: 0.3253 329/500 [==================>...........] 
- ETA: 42s - loss: 1.8707 - regression_loss: 1.5458 - classification_loss: 0.3249 330/500 [==================>...........] - ETA: 42s - loss: 1.8710 - regression_loss: 1.5461 - classification_loss: 0.3250 331/500 [==================>...........] - ETA: 42s - loss: 1.8702 - regression_loss: 1.5455 - classification_loss: 0.3247 332/500 [==================>...........] - ETA: 41s - loss: 1.8708 - regression_loss: 1.5462 - classification_loss: 0.3246 333/500 [==================>...........] - ETA: 41s - loss: 1.8696 - regression_loss: 1.5454 - classification_loss: 0.3243 334/500 [===================>..........] - ETA: 41s - loss: 1.8728 - regression_loss: 1.5479 - classification_loss: 0.3249 335/500 [===================>..........] - ETA: 41s - loss: 1.8711 - regression_loss: 1.5467 - classification_loss: 0.3244 336/500 [===================>..........] - ETA: 41s - loss: 1.8729 - regression_loss: 1.5482 - classification_loss: 0.3246 337/500 [===================>..........] - ETA: 40s - loss: 1.8717 - regression_loss: 1.5474 - classification_loss: 0.3243 338/500 [===================>..........] - ETA: 40s - loss: 1.8695 - regression_loss: 1.5457 - classification_loss: 0.3237 339/500 [===================>..........] - ETA: 40s - loss: 1.8700 - regression_loss: 1.5460 - classification_loss: 0.3239 340/500 [===================>..........] - ETA: 39s - loss: 1.8677 - regression_loss: 1.5442 - classification_loss: 0.3235 341/500 [===================>..........] - ETA: 39s - loss: 1.8684 - regression_loss: 1.5448 - classification_loss: 0.3236 342/500 [===================>..........] - ETA: 39s - loss: 1.8688 - regression_loss: 1.5454 - classification_loss: 0.3234 343/500 [===================>..........] - ETA: 39s - loss: 1.8675 - regression_loss: 1.5444 - classification_loss: 0.3231 344/500 [===================>..........] - ETA: 38s - loss: 1.8688 - regression_loss: 1.5454 - classification_loss: 0.3234 345/500 [===================>..........] 
- ETA: 38s - loss: 1.8684 - regression_loss: 1.5450 - classification_loss: 0.3234 346/500 [===================>..........] - ETA: 38s - loss: 1.8694 - regression_loss: 1.5458 - classification_loss: 0.3237 347/500 [===================>..........] - ETA: 38s - loss: 1.8694 - regression_loss: 1.5460 - classification_loss: 0.3233 348/500 [===================>..........] - ETA: 37s - loss: 1.8695 - regression_loss: 1.5460 - classification_loss: 0.3234 349/500 [===================>..........] - ETA: 37s - loss: 1.8688 - regression_loss: 1.5455 - classification_loss: 0.3233 350/500 [====================>.........] - ETA: 37s - loss: 1.8683 - regression_loss: 1.5453 - classification_loss: 0.3230 351/500 [====================>.........] - ETA: 37s - loss: 1.8707 - regression_loss: 1.5474 - classification_loss: 0.3233 352/500 [====================>.........] - ETA: 36s - loss: 1.8709 - regression_loss: 1.5477 - classification_loss: 0.3232 353/500 [====================>.........] - ETA: 36s - loss: 1.8683 - regression_loss: 1.5456 - classification_loss: 0.3227 354/500 [====================>.........] - ETA: 36s - loss: 1.8687 - regression_loss: 1.5456 - classification_loss: 0.3231 355/500 [====================>.........] - ETA: 36s - loss: 1.8701 - regression_loss: 1.5472 - classification_loss: 0.3229 356/500 [====================>.........] - ETA: 35s - loss: 1.8711 - regression_loss: 1.5479 - classification_loss: 0.3232 357/500 [====================>.........] - ETA: 35s - loss: 1.8708 - regression_loss: 1.5476 - classification_loss: 0.3232 358/500 [====================>.........] - ETA: 35s - loss: 1.8708 - regression_loss: 1.5474 - classification_loss: 0.3234 359/500 [====================>.........] - ETA: 35s - loss: 1.8696 - regression_loss: 1.5466 - classification_loss: 0.3230 360/500 [====================>.........] - ETA: 34s - loss: 1.8678 - regression_loss: 1.5453 - classification_loss: 0.3225 361/500 [====================>.........] 
- ETA: 34s - loss: 1.8672 - regression_loss: 1.5449 - classification_loss: 0.3223 362/500 [====================>.........] - ETA: 34s - loss: 1.8672 - regression_loss: 1.5449 - classification_loss: 0.3222 363/500 [====================>.........] - ETA: 34s - loss: 1.8689 - regression_loss: 1.5465 - classification_loss: 0.3224 364/500 [====================>.........] - ETA: 33s - loss: 1.8686 - regression_loss: 1.5463 - classification_loss: 0.3223 365/500 [====================>.........] - ETA: 33s - loss: 1.8683 - regression_loss: 1.5461 - classification_loss: 0.3221 366/500 [====================>.........] - ETA: 33s - loss: 1.8694 - regression_loss: 1.5471 - classification_loss: 0.3223 367/500 [=====================>........] - ETA: 33s - loss: 1.8717 - regression_loss: 1.5496 - classification_loss: 0.3222 368/500 [=====================>........] - ETA: 32s - loss: 1.8725 - regression_loss: 1.5502 - classification_loss: 0.3223 369/500 [=====================>........] - ETA: 32s - loss: 1.8731 - regression_loss: 1.5506 - classification_loss: 0.3226 370/500 [=====================>........] - ETA: 32s - loss: 1.8735 - regression_loss: 1.5512 - classification_loss: 0.3223 371/500 [=====================>........] - ETA: 32s - loss: 1.8731 - regression_loss: 1.5509 - classification_loss: 0.3223 372/500 [=====================>........] - ETA: 31s - loss: 1.8745 - regression_loss: 1.5519 - classification_loss: 0.3225 373/500 [=====================>........] - ETA: 31s - loss: 1.8734 - regression_loss: 1.5511 - classification_loss: 0.3223 374/500 [=====================>........] - ETA: 31s - loss: 1.8748 - regression_loss: 1.5521 - classification_loss: 0.3227 375/500 [=====================>........] - ETA: 31s - loss: 1.8742 - regression_loss: 1.5516 - classification_loss: 0.3226 376/500 [=====================>........] - ETA: 30s - loss: 1.8721 - regression_loss: 1.5499 - classification_loss: 0.3222 377/500 [=====================>........] 
- ETA: 30s - loss: 1.8719 - regression_loss: 1.5499 - classification_loss: 0.3220 378/500 [=====================>........] - ETA: 30s - loss: 1.8713 - regression_loss: 1.5497 - classification_loss: 0.3216 379/500 [=====================>........] - ETA: 30s - loss: 1.8699 - regression_loss: 1.5486 - classification_loss: 0.3212 380/500 [=====================>........] - ETA: 29s - loss: 1.8704 - regression_loss: 1.5491 - classification_loss: 0.3213 381/500 [=====================>........] - ETA: 29s - loss: 1.8690 - regression_loss: 1.5482 - classification_loss: 0.3209 382/500 [=====================>........] - ETA: 29s - loss: 1.8693 - regression_loss: 1.5484 - classification_loss: 0.3209 383/500 [=====================>........] - ETA: 29s - loss: 1.8694 - regression_loss: 1.5486 - classification_loss: 0.3208 384/500 [======================>.......] - ETA: 28s - loss: 1.8688 - regression_loss: 1.5482 - classification_loss: 0.3206 385/500 [======================>.......] - ETA: 28s - loss: 1.8677 - regression_loss: 1.5474 - classification_loss: 0.3203 386/500 [======================>.......] - ETA: 28s - loss: 1.8670 - regression_loss: 1.5467 - classification_loss: 0.3202 387/500 [======================>.......] - ETA: 28s - loss: 1.8641 - regression_loss: 1.5444 - classification_loss: 0.3197 388/500 [======================>.......] - ETA: 27s - loss: 1.8637 - regression_loss: 1.5442 - classification_loss: 0.3195 389/500 [======================>.......] - ETA: 27s - loss: 1.8631 - regression_loss: 1.5440 - classification_loss: 0.3191 390/500 [======================>.......] - ETA: 27s - loss: 1.8638 - regression_loss: 1.5444 - classification_loss: 0.3193 391/500 [======================>.......] - ETA: 27s - loss: 1.8631 - regression_loss: 1.5438 - classification_loss: 0.3193 392/500 [======================>.......] - ETA: 26s - loss: 1.8657 - regression_loss: 1.5456 - classification_loss: 0.3201 393/500 [======================>.......] 
- ETA: 26s - loss: 1.8633 - regression_loss: 1.5436 - classification_loss: 0.3197 394/500 [======================>.......] - ETA: 26s - loss: 1.8633 - regression_loss: 1.5436 - classification_loss: 0.3196 395/500 [======================>.......] - ETA: 26s - loss: 1.8651 - regression_loss: 1.5447 - classification_loss: 0.3204 396/500 [======================>.......] - ETA: 25s - loss: 1.8640 - regression_loss: 1.5440 - classification_loss: 0.3201 397/500 [======================>.......] - ETA: 25s - loss: 1.8635 - regression_loss: 1.5436 - classification_loss: 0.3199 398/500 [======================>.......] - ETA: 25s - loss: 1.8638 - regression_loss: 1.5437 - classification_loss: 0.3201 399/500 [======================>.......] - ETA: 25s - loss: 1.8615 - regression_loss: 1.5420 - classification_loss: 0.3196 400/500 [=======================>......] - ETA: 24s - loss: 1.8626 - regression_loss: 1.5432 - classification_loss: 0.3195 401/500 [=======================>......] - ETA: 24s - loss: 1.8613 - regression_loss: 1.5423 - classification_loss: 0.3190 402/500 [=======================>......] - ETA: 24s - loss: 1.8610 - regression_loss: 1.5422 - classification_loss: 0.3188 403/500 [=======================>......] - ETA: 24s - loss: 1.8609 - regression_loss: 1.5423 - classification_loss: 0.3186 404/500 [=======================>......] - ETA: 23s - loss: 1.8617 - regression_loss: 1.5432 - classification_loss: 0.3185 405/500 [=======================>......] - ETA: 23s - loss: 1.8610 - regression_loss: 1.5427 - classification_loss: 0.3183 406/500 [=======================>......] - ETA: 23s - loss: 1.8593 - regression_loss: 1.5415 - classification_loss: 0.3178 407/500 [=======================>......] - ETA: 23s - loss: 1.8580 - regression_loss: 1.5407 - classification_loss: 0.3173 408/500 [=======================>......] - ETA: 22s - loss: 1.8595 - regression_loss: 1.5418 - classification_loss: 0.3177 409/500 [=======================>......] 
500/500 [==============================] - 125s 250ms/step - loss: 1.8629 - regression_loss: 1.5461 - classification_loss: 0.3168
326 instances of class plum with average precision: 0.7498
mAP: 0.7498
Epoch 00032: saving model to ./training/snapshots/resnet50_pascal_32.h5
Epoch 33/150
- ETA: 1:04 - loss: 1.8739 - regression_loss: 1.5647 - classification_loss: 0.3092 245/500 [=============>................] - ETA: 1:03 - loss: 1.8723 - regression_loss: 1.5636 - classification_loss: 0.3087 246/500 [=============>................] - ETA: 1:03 - loss: 1.8714 - regression_loss: 1.5630 - classification_loss: 0.3084 247/500 [=============>................] - ETA: 1:03 - loss: 1.8687 - regression_loss: 1.5606 - classification_loss: 0.3082 248/500 [=============>................] - ETA: 1:03 - loss: 1.8690 - regression_loss: 1.5609 - classification_loss: 0.3081 249/500 [=============>................] - ETA: 1:02 - loss: 1.8652 - regression_loss: 1.5572 - classification_loss: 0.3081 250/500 [==============>...............] - ETA: 1:02 - loss: 1.8636 - regression_loss: 1.5562 - classification_loss: 0.3074 251/500 [==============>...............] - ETA: 1:02 - loss: 1.8657 - regression_loss: 1.5577 - classification_loss: 0.3080 252/500 [==============>...............] - ETA: 1:02 - loss: 1.8656 - regression_loss: 1.5579 - classification_loss: 0.3077 253/500 [==============>...............] - ETA: 1:01 - loss: 1.8637 - regression_loss: 1.5564 - classification_loss: 0.3072 254/500 [==============>...............] - ETA: 1:01 - loss: 1.8636 - regression_loss: 1.5565 - classification_loss: 0.3071 255/500 [==============>...............] - ETA: 1:01 - loss: 1.8647 - regression_loss: 1.5573 - classification_loss: 0.3074 256/500 [==============>...............] - ETA: 1:01 - loss: 1.8633 - regression_loss: 1.5562 - classification_loss: 0.3070 257/500 [==============>...............] - ETA: 1:00 - loss: 1.8636 - regression_loss: 1.5563 - classification_loss: 0.3073 258/500 [==============>...............] - ETA: 1:00 - loss: 1.8601 - regression_loss: 1.5536 - classification_loss: 0.3065 259/500 [==============>...............] - ETA: 1:00 - loss: 1.8600 - regression_loss: 1.5536 - classification_loss: 0.3064 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.8585 - regression_loss: 1.5523 - classification_loss: 0.3062 261/500 [==============>...............] - ETA: 59s - loss: 1.8592 - regression_loss: 1.5529 - classification_loss: 0.3063  262/500 [==============>...............] - ETA: 59s - loss: 1.8600 - regression_loss: 1.5538 - classification_loss: 0.3062 263/500 [==============>...............] - ETA: 59s - loss: 1.8572 - regression_loss: 1.5515 - classification_loss: 0.3057 264/500 [==============>...............] - ETA: 59s - loss: 1.8587 - regression_loss: 1.5529 - classification_loss: 0.3058 265/500 [==============>...............] - ETA: 58s - loss: 1.8671 - regression_loss: 1.5582 - classification_loss: 0.3089 266/500 [==============>...............] - ETA: 58s - loss: 1.8649 - regression_loss: 1.5565 - classification_loss: 0.3084 267/500 [===============>..............] - ETA: 58s - loss: 1.8654 - regression_loss: 1.5569 - classification_loss: 0.3085 268/500 [===============>..............] - ETA: 58s - loss: 1.8674 - regression_loss: 1.5587 - classification_loss: 0.3087 269/500 [===============>..............] - ETA: 57s - loss: 1.8681 - regression_loss: 1.5590 - classification_loss: 0.3091 270/500 [===============>..............] - ETA: 57s - loss: 1.8666 - regression_loss: 1.5579 - classification_loss: 0.3087 271/500 [===============>..............] - ETA: 57s - loss: 1.8676 - regression_loss: 1.5589 - classification_loss: 0.3087 272/500 [===============>..............] - ETA: 57s - loss: 1.8663 - regression_loss: 1.5580 - classification_loss: 0.3083 273/500 [===============>..............] - ETA: 56s - loss: 1.8657 - regression_loss: 1.5576 - classification_loss: 0.3081 274/500 [===============>..............] - ETA: 56s - loss: 1.8661 - regression_loss: 1.5582 - classification_loss: 0.3079 275/500 [===============>..............] - ETA: 56s - loss: 1.8760 - regression_loss: 1.5635 - classification_loss: 0.3126 276/500 [===============>..............] 
- ETA: 56s - loss: 1.8772 - regression_loss: 1.5644 - classification_loss: 0.3128 277/500 [===============>..............] - ETA: 55s - loss: 1.8776 - regression_loss: 1.5646 - classification_loss: 0.3130 278/500 [===============>..............] - ETA: 55s - loss: 1.8747 - regression_loss: 1.5621 - classification_loss: 0.3126 279/500 [===============>..............] - ETA: 55s - loss: 1.8733 - regression_loss: 1.5609 - classification_loss: 0.3124 280/500 [===============>..............] - ETA: 55s - loss: 1.8719 - regression_loss: 1.5600 - classification_loss: 0.3120 281/500 [===============>..............] - ETA: 54s - loss: 1.8725 - regression_loss: 1.5604 - classification_loss: 0.3121 282/500 [===============>..............] - ETA: 54s - loss: 1.8743 - regression_loss: 1.5619 - classification_loss: 0.3124 283/500 [===============>..............] - ETA: 54s - loss: 1.8736 - regression_loss: 1.5615 - classification_loss: 0.3121 284/500 [================>.............] - ETA: 54s - loss: 1.8743 - regression_loss: 1.5622 - classification_loss: 0.3121 285/500 [================>.............] - ETA: 53s - loss: 1.8751 - regression_loss: 1.5633 - classification_loss: 0.3117 286/500 [================>.............] - ETA: 53s - loss: 1.8769 - regression_loss: 1.5646 - classification_loss: 0.3123 287/500 [================>.............] - ETA: 53s - loss: 1.8772 - regression_loss: 1.5648 - classification_loss: 0.3124 288/500 [================>.............] - ETA: 53s - loss: 1.8785 - regression_loss: 1.5659 - classification_loss: 0.3126 289/500 [================>.............] - ETA: 52s - loss: 1.8804 - regression_loss: 1.5670 - classification_loss: 0.3133 290/500 [================>.............] - ETA: 52s - loss: 1.8786 - regression_loss: 1.5658 - classification_loss: 0.3129 291/500 [================>.............] - ETA: 52s - loss: 1.8779 - regression_loss: 1.5651 - classification_loss: 0.3128 292/500 [================>.............] 
- ETA: 52s - loss: 1.8742 - regression_loss: 1.5619 - classification_loss: 0.3123 293/500 [================>.............] - ETA: 51s - loss: 1.8736 - regression_loss: 1.5611 - classification_loss: 0.3125 294/500 [================>.............] - ETA: 51s - loss: 1.8737 - regression_loss: 1.5612 - classification_loss: 0.3125 295/500 [================>.............] - ETA: 51s - loss: 1.8716 - regression_loss: 1.5597 - classification_loss: 0.3119 296/500 [================>.............] - ETA: 51s - loss: 1.8720 - regression_loss: 1.5599 - classification_loss: 0.3121 297/500 [================>.............] - ETA: 50s - loss: 1.8701 - regression_loss: 1.5584 - classification_loss: 0.3117 298/500 [================>.............] - ETA: 50s - loss: 1.8734 - regression_loss: 1.5608 - classification_loss: 0.3126 299/500 [================>.............] - ETA: 50s - loss: 1.8756 - regression_loss: 1.5626 - classification_loss: 0.3130 300/500 [=================>............] - ETA: 50s - loss: 1.8765 - regression_loss: 1.5634 - classification_loss: 0.3130 301/500 [=================>............] - ETA: 49s - loss: 1.8771 - regression_loss: 1.5641 - classification_loss: 0.3131 302/500 [=================>............] - ETA: 49s - loss: 1.8753 - regression_loss: 1.5626 - classification_loss: 0.3128 303/500 [=================>............] - ETA: 49s - loss: 1.8768 - regression_loss: 1.5637 - classification_loss: 0.3131 304/500 [=================>............] - ETA: 49s - loss: 1.8792 - regression_loss: 1.5665 - classification_loss: 0.3127 305/500 [=================>............] - ETA: 48s - loss: 1.8814 - regression_loss: 1.5683 - classification_loss: 0.3132 306/500 [=================>............] - ETA: 48s - loss: 1.8819 - regression_loss: 1.5690 - classification_loss: 0.3129 307/500 [=================>............] - ETA: 48s - loss: 1.8833 - regression_loss: 1.5700 - classification_loss: 0.3133 308/500 [=================>............] 
- ETA: 48s - loss: 1.8817 - regression_loss: 1.5687 - classification_loss: 0.3129 309/500 [=================>............] - ETA: 47s - loss: 1.8814 - regression_loss: 1.5687 - classification_loss: 0.3128 310/500 [=================>............] - ETA: 47s - loss: 1.8823 - regression_loss: 1.5694 - classification_loss: 0.3129 311/500 [=================>............] - ETA: 47s - loss: 1.8851 - regression_loss: 1.5721 - classification_loss: 0.3131 312/500 [=================>............] - ETA: 47s - loss: 1.8859 - regression_loss: 1.5730 - classification_loss: 0.3129 313/500 [=================>............] - ETA: 46s - loss: 1.8859 - regression_loss: 1.5730 - classification_loss: 0.3128 314/500 [=================>............] - ETA: 46s - loss: 1.8871 - regression_loss: 1.5739 - classification_loss: 0.3133 315/500 [=================>............] - ETA: 46s - loss: 1.8878 - regression_loss: 1.5749 - classification_loss: 0.3130 316/500 [=================>............] - ETA: 46s - loss: 1.8883 - regression_loss: 1.5753 - classification_loss: 0.3131 317/500 [==================>...........] - ETA: 45s - loss: 1.8889 - regression_loss: 1.5758 - classification_loss: 0.3132 318/500 [==================>...........] - ETA: 45s - loss: 1.8900 - regression_loss: 1.5771 - classification_loss: 0.3129 319/500 [==================>...........] - ETA: 45s - loss: 1.8900 - regression_loss: 1.5773 - classification_loss: 0.3126 320/500 [==================>...........] - ETA: 45s - loss: 1.8903 - regression_loss: 1.5776 - classification_loss: 0.3126 321/500 [==================>...........] - ETA: 44s - loss: 1.8916 - regression_loss: 1.5786 - classification_loss: 0.3130 322/500 [==================>...........] - ETA: 44s - loss: 1.8913 - regression_loss: 1.5786 - classification_loss: 0.3127 323/500 [==================>...........] - ETA: 44s - loss: 1.8912 - regression_loss: 1.5786 - classification_loss: 0.3126 324/500 [==================>...........] 
- ETA: 44s - loss: 1.8895 - regression_loss: 1.5769 - classification_loss: 0.3126 325/500 [==================>...........] - ETA: 43s - loss: 1.8893 - regression_loss: 1.5768 - classification_loss: 0.3125 326/500 [==================>...........] - ETA: 43s - loss: 1.8881 - regression_loss: 1.5759 - classification_loss: 0.3122 327/500 [==================>...........] - ETA: 43s - loss: 1.8911 - regression_loss: 1.5786 - classification_loss: 0.3125 328/500 [==================>...........] - ETA: 43s - loss: 1.8928 - regression_loss: 1.5798 - classification_loss: 0.3130 329/500 [==================>...........] - ETA: 42s - loss: 1.8949 - regression_loss: 1.5804 - classification_loss: 0.3145 330/500 [==================>...........] - ETA: 42s - loss: 1.8941 - regression_loss: 1.5799 - classification_loss: 0.3142 331/500 [==================>...........] - ETA: 42s - loss: 1.8926 - regression_loss: 1.5788 - classification_loss: 0.3137 332/500 [==================>...........] - ETA: 42s - loss: 1.8905 - regression_loss: 1.5773 - classification_loss: 0.3132 333/500 [==================>...........] - ETA: 41s - loss: 1.8908 - regression_loss: 1.5774 - classification_loss: 0.3134 334/500 [===================>..........] - ETA: 41s - loss: 1.8902 - regression_loss: 1.5769 - classification_loss: 0.3133 335/500 [===================>..........] - ETA: 41s - loss: 1.8889 - regression_loss: 1.5759 - classification_loss: 0.3130 336/500 [===================>..........] - ETA: 41s - loss: 1.8886 - regression_loss: 1.5756 - classification_loss: 0.3130 337/500 [===================>..........] - ETA: 40s - loss: 1.8883 - regression_loss: 1.5752 - classification_loss: 0.3131 338/500 [===================>..........] - ETA: 40s - loss: 1.8872 - regression_loss: 1.5742 - classification_loss: 0.3129 339/500 [===================>..........] - ETA: 40s - loss: 1.8892 - regression_loss: 1.5758 - classification_loss: 0.3134 340/500 [===================>..........] 
- ETA: 40s - loss: 1.8885 - regression_loss: 1.5754 - classification_loss: 0.3131 341/500 [===================>..........] - ETA: 39s - loss: 1.8890 - regression_loss: 1.5761 - classification_loss: 0.3128 342/500 [===================>..........] - ETA: 39s - loss: 1.8915 - regression_loss: 1.5783 - classification_loss: 0.3132 343/500 [===================>..........] - ETA: 39s - loss: 1.8905 - regression_loss: 1.5775 - classification_loss: 0.3130 344/500 [===================>..........] - ETA: 39s - loss: 1.8903 - regression_loss: 1.5773 - classification_loss: 0.3130 345/500 [===================>..........] - ETA: 38s - loss: 1.8909 - regression_loss: 1.5777 - classification_loss: 0.3132 346/500 [===================>..........] - ETA: 38s - loss: 1.8900 - regression_loss: 1.5770 - classification_loss: 0.3130 347/500 [===================>..........] - ETA: 38s - loss: 1.8899 - regression_loss: 1.5768 - classification_loss: 0.3130 348/500 [===================>..........] - ETA: 38s - loss: 1.8884 - regression_loss: 1.5758 - classification_loss: 0.3126 349/500 [===================>..........] - ETA: 37s - loss: 1.8872 - regression_loss: 1.5749 - classification_loss: 0.3124 350/500 [====================>.........] - ETA: 37s - loss: 1.8838 - regression_loss: 1.5721 - classification_loss: 0.3116 351/500 [====================>.........] - ETA: 37s - loss: 1.8847 - regression_loss: 1.5729 - classification_loss: 0.3118 352/500 [====================>.........] - ETA: 37s - loss: 1.8829 - regression_loss: 1.5715 - classification_loss: 0.3114 353/500 [====================>.........] - ETA: 36s - loss: 1.8827 - regression_loss: 1.5714 - classification_loss: 0.3114 354/500 [====================>.........] - ETA: 36s - loss: 1.8799 - regression_loss: 1.5690 - classification_loss: 0.3108 355/500 [====================>.........] - ETA: 36s - loss: 1.8794 - regression_loss: 1.5688 - classification_loss: 0.3106 356/500 [====================>.........] 
- ETA: 36s - loss: 1.8786 - regression_loss: 1.5679 - classification_loss: 0.3107 357/500 [====================>.........] - ETA: 35s - loss: 1.8782 - regression_loss: 1.5677 - classification_loss: 0.3105 358/500 [====================>.........] - ETA: 35s - loss: 1.8795 - regression_loss: 1.5684 - classification_loss: 0.3112 359/500 [====================>.........] - ETA: 35s - loss: 1.8786 - regression_loss: 1.5678 - classification_loss: 0.3109 360/500 [====================>.........] - ETA: 35s - loss: 1.8772 - regression_loss: 1.5666 - classification_loss: 0.3106 361/500 [====================>.........] - ETA: 34s - loss: 1.8780 - regression_loss: 1.5668 - classification_loss: 0.3112 362/500 [====================>.........] - ETA: 34s - loss: 1.8748 - regression_loss: 1.5642 - classification_loss: 0.3106 363/500 [====================>.........] - ETA: 34s - loss: 1.8750 - regression_loss: 1.5642 - classification_loss: 0.3108 364/500 [====================>.........] - ETA: 34s - loss: 1.8742 - regression_loss: 1.5636 - classification_loss: 0.3106 365/500 [====================>.........] - ETA: 33s - loss: 1.8724 - regression_loss: 1.5622 - classification_loss: 0.3102 366/500 [====================>.........] - ETA: 33s - loss: 1.8694 - regression_loss: 1.5598 - classification_loss: 0.3096 367/500 [=====================>........] - ETA: 33s - loss: 1.8705 - regression_loss: 1.5606 - classification_loss: 0.3099 368/500 [=====================>........] - ETA: 33s - loss: 1.8724 - regression_loss: 1.5620 - classification_loss: 0.3103 369/500 [=====================>........] - ETA: 32s - loss: 1.8730 - regression_loss: 1.5626 - classification_loss: 0.3104 370/500 [=====================>........] - ETA: 32s - loss: 1.8695 - regression_loss: 1.5597 - classification_loss: 0.3097 371/500 [=====================>........] - ETA: 32s - loss: 1.8706 - regression_loss: 1.5607 - classification_loss: 0.3099 372/500 [=====================>........] 
- ETA: 32s - loss: 1.8716 - regression_loss: 1.5617 - classification_loss: 0.3099 373/500 [=====================>........] - ETA: 31s - loss: 1.8680 - regression_loss: 1.5587 - classification_loss: 0.3093 374/500 [=====================>........] - ETA: 31s - loss: 1.8683 - regression_loss: 1.5591 - classification_loss: 0.3093 375/500 [=====================>........] - ETA: 31s - loss: 1.8682 - regression_loss: 1.5590 - classification_loss: 0.3092 376/500 [=====================>........] - ETA: 31s - loss: 1.8670 - regression_loss: 1.5581 - classification_loss: 0.3089 377/500 [=====================>........] - ETA: 30s - loss: 1.8654 - regression_loss: 1.5569 - classification_loss: 0.3085 378/500 [=====================>........] - ETA: 30s - loss: 1.8647 - regression_loss: 1.5564 - classification_loss: 0.3083 379/500 [=====================>........] - ETA: 30s - loss: 1.8639 - regression_loss: 1.5560 - classification_loss: 0.3079 380/500 [=====================>........] - ETA: 30s - loss: 1.8638 - regression_loss: 1.5559 - classification_loss: 0.3079 381/500 [=====================>........] - ETA: 29s - loss: 1.8620 - regression_loss: 1.5546 - classification_loss: 0.3075 382/500 [=====================>........] - ETA: 29s - loss: 1.8598 - regression_loss: 1.5528 - classification_loss: 0.3071 383/500 [=====================>........] - ETA: 29s - loss: 1.8580 - regression_loss: 1.5513 - classification_loss: 0.3067 384/500 [======================>.......] - ETA: 29s - loss: 1.8576 - regression_loss: 1.5513 - classification_loss: 0.3062 385/500 [======================>.......] - ETA: 28s - loss: 1.8566 - regression_loss: 1.5505 - classification_loss: 0.3060 386/500 [======================>.......] - ETA: 28s - loss: 1.8559 - regression_loss: 1.5500 - classification_loss: 0.3058 387/500 [======================>.......] - ETA: 28s - loss: 1.8549 - regression_loss: 1.5493 - classification_loss: 0.3056 388/500 [======================>.......] 
- ETA: 28s - loss: 1.8570 - regression_loss: 1.5509 - classification_loss: 0.3061 389/500 [======================>.......] - ETA: 27s - loss: 1.8573 - regression_loss: 1.5511 - classification_loss: 0.3062 390/500 [======================>.......] - ETA: 27s - loss: 1.8569 - regression_loss: 1.5508 - classification_loss: 0.3061 391/500 [======================>.......] - ETA: 27s - loss: 1.8567 - regression_loss: 1.5509 - classification_loss: 0.3058 392/500 [======================>.......] - ETA: 27s - loss: 1.8553 - regression_loss: 1.5498 - classification_loss: 0.3055 393/500 [======================>.......] - ETA: 26s - loss: 1.8559 - regression_loss: 1.5504 - classification_loss: 0.3056 394/500 [======================>.......] - ETA: 26s - loss: 1.8533 - regression_loss: 1.5482 - classification_loss: 0.3051 395/500 [======================>.......] - ETA: 26s - loss: 1.8502 - regression_loss: 1.5457 - classification_loss: 0.3045 396/500 [======================>.......] - ETA: 26s - loss: 1.8509 - regression_loss: 1.5466 - classification_loss: 0.3043 397/500 [======================>.......] - ETA: 25s - loss: 1.8520 - regression_loss: 1.5475 - classification_loss: 0.3045 398/500 [======================>.......] - ETA: 25s - loss: 1.8513 - regression_loss: 1.5470 - classification_loss: 0.3043 399/500 [======================>.......] - ETA: 25s - loss: 1.8518 - regression_loss: 1.5474 - classification_loss: 0.3044 400/500 [=======================>......] - ETA: 25s - loss: 1.8527 - regression_loss: 1.5479 - classification_loss: 0.3048 401/500 [=======================>......] - ETA: 24s - loss: 1.8530 - regression_loss: 1.5485 - classification_loss: 0.3045 402/500 [=======================>......] - ETA: 24s - loss: 1.8496 - regression_loss: 1.5458 - classification_loss: 0.3039 403/500 [=======================>......] - ETA: 24s - loss: 1.8484 - regression_loss: 1.5448 - classification_loss: 0.3036 404/500 [=======================>......] 
- ETA: 24s - loss: 1.8491 - regression_loss: 1.5453 - classification_loss: 0.3038 405/500 [=======================>......] - ETA: 23s - loss: 1.8478 - regression_loss: 1.5443 - classification_loss: 0.3035 406/500 [=======================>......] - ETA: 23s - loss: 1.8470 - regression_loss: 1.5436 - classification_loss: 0.3035 407/500 [=======================>......] - ETA: 23s - loss: 1.8465 - regression_loss: 1.5431 - classification_loss: 0.3033 408/500 [=======================>......] - ETA: 23s - loss: 1.8440 - regression_loss: 1.5410 - classification_loss: 0.3030 409/500 [=======================>......] - ETA: 22s - loss: 1.8425 - regression_loss: 1.5398 - classification_loss: 0.3027 410/500 [=======================>......] - ETA: 22s - loss: 1.8438 - regression_loss: 1.5409 - classification_loss: 0.3029 411/500 [=======================>......] - ETA: 22s - loss: 1.8447 - regression_loss: 1.5413 - classification_loss: 0.3034 412/500 [=======================>......] - ETA: 22s - loss: 1.8439 - regression_loss: 1.5376 - classification_loss: 0.3063 413/500 [=======================>......] - ETA: 21s - loss: 1.8462 - regression_loss: 1.5399 - classification_loss: 0.3064 414/500 [=======================>......] - ETA: 21s - loss: 1.8498 - regression_loss: 1.5428 - classification_loss: 0.3071 415/500 [=======================>......] - ETA: 21s - loss: 1.8502 - regression_loss: 1.5430 - classification_loss: 0.3072 416/500 [=======================>......] - ETA: 21s - loss: 1.8478 - regression_loss: 1.5410 - classification_loss: 0.3068 417/500 [========================>.....] - ETA: 20s - loss: 1.8485 - regression_loss: 1.5415 - classification_loss: 0.3070 418/500 [========================>.....] - ETA: 20s - loss: 1.8482 - regression_loss: 1.5412 - classification_loss: 0.3070 419/500 [========================>.....] - ETA: 20s - loss: 1.8486 - regression_loss: 1.5417 - classification_loss: 0.3069 420/500 [========================>.....] 
- ETA: 20s - loss: 1.8491 - regression_loss: 1.5420 - classification_loss: 0.3071 421/500 [========================>.....] - ETA: 19s - loss: 1.8486 - regression_loss: 1.5415 - classification_loss: 0.3071 422/500 [========================>.....] - ETA: 19s - loss: 1.8493 - regression_loss: 1.5419 - classification_loss: 0.3074 423/500 [========================>.....] - ETA: 19s - loss: 1.8543 - regression_loss: 1.5461 - classification_loss: 0.3082 424/500 [========================>.....] - ETA: 19s - loss: 1.8541 - regression_loss: 1.5461 - classification_loss: 0.3080 425/500 [========================>.....] - ETA: 18s - loss: 1.8521 - regression_loss: 1.5424 - classification_loss: 0.3097 426/500 [========================>.....] - ETA: 18s - loss: 1.8524 - regression_loss: 1.5428 - classification_loss: 0.3095 427/500 [========================>.....] - ETA: 18s - loss: 1.8519 - regression_loss: 1.5424 - classification_loss: 0.3095 428/500 [========================>.....] - ETA: 18s - loss: 1.8509 - regression_loss: 1.5416 - classification_loss: 0.3093 429/500 [========================>.....] - ETA: 17s - loss: 1.8487 - regression_loss: 1.5398 - classification_loss: 0.3089 430/500 [========================>.....] - ETA: 17s - loss: 1.8498 - regression_loss: 1.5403 - classification_loss: 0.3095 431/500 [========================>.....] - ETA: 17s - loss: 1.8490 - regression_loss: 1.5395 - classification_loss: 0.3095 432/500 [========================>.....] - ETA: 17s - loss: 1.8510 - regression_loss: 1.5407 - classification_loss: 0.3103 433/500 [========================>.....] - ETA: 16s - loss: 1.8520 - regression_loss: 1.5418 - classification_loss: 0.3102 434/500 [=========================>....] - ETA: 16s - loss: 1.8509 - regression_loss: 1.5410 - classification_loss: 0.3099 435/500 [=========================>....] - ETA: 16s - loss: 1.8512 - regression_loss: 1.5412 - classification_loss: 0.3100 436/500 [=========================>....] 
- ETA: 16s - loss: 1.8505 - regression_loss: 1.5404 - classification_loss: 0.3101 437/500 [=========================>....] - ETA: 15s - loss: 1.8503 - regression_loss: 1.5402 - classification_loss: 0.3101 438/500 [=========================>....] - ETA: 15s - loss: 1.8495 - regression_loss: 1.5397 - classification_loss: 0.3097 439/500 [=========================>....] - ETA: 15s - loss: 1.8486 - regression_loss: 1.5390 - classification_loss: 0.3096 440/500 [=========================>....] - ETA: 15s - loss: 1.8491 - regression_loss: 1.5394 - classification_loss: 0.3098 441/500 [=========================>....] - ETA: 14s - loss: 1.8472 - regression_loss: 1.5378 - classification_loss: 0.3094 442/500 [=========================>....] - ETA: 14s - loss: 1.8480 - regression_loss: 1.5385 - classification_loss: 0.3095 443/500 [=========================>....] - ETA: 14s - loss: 1.8483 - regression_loss: 1.5388 - classification_loss: 0.3095 444/500 [=========================>....] - ETA: 14s - loss: 1.8483 - regression_loss: 1.5389 - classification_loss: 0.3094 445/500 [=========================>....] - ETA: 13s - loss: 1.8491 - regression_loss: 1.5397 - classification_loss: 0.3094 446/500 [=========================>....] - ETA: 13s - loss: 1.8488 - regression_loss: 1.5397 - classification_loss: 0.3092 447/500 [=========================>....] - ETA: 13s - loss: 1.8481 - regression_loss: 1.5392 - classification_loss: 0.3089 448/500 [=========================>....] - ETA: 13s - loss: 1.8488 - regression_loss: 1.5395 - classification_loss: 0.3093 449/500 [=========================>....] - ETA: 12s - loss: 1.8492 - regression_loss: 1.5398 - classification_loss: 0.3094 450/500 [==========================>...] - ETA: 12s - loss: 1.8492 - regression_loss: 1.5398 - classification_loss: 0.3094 451/500 [==========================>...] - ETA: 12s - loss: 1.8485 - regression_loss: 1.5394 - classification_loss: 0.3091 452/500 [==========================>...] 
[epoch 33, batches 453-499: per-batch progress condensed; loss settled near 1.86]
500/500 [==============================] - 125s 251ms/step - loss: 1.8609 - regression_loss: 1.5487 - classification_loss: 0.3122
326 instances of class plum with average precision: 0.7109
mAP: 0.7109
Epoch 00033: saving model to ./training/snapshots/resnet50_pascal_33.h5
Epoch 34/150
[epoch 34, batches 1-285: per-batch progress condensed; loss fluctuating around 1.87 (regression ~1.55, classification ~0.32)]
286/500 [================>.............]
- ETA: 53s - loss: 1.8708 - regression_loss: 1.5499 - classification_loss: 0.3208 287/500 [================>.............] - ETA: 53s - loss: 1.8693 - regression_loss: 1.5491 - classification_loss: 0.3202 288/500 [================>.............] - ETA: 53s - loss: 1.8707 - regression_loss: 1.5505 - classification_loss: 0.3202 289/500 [================>.............] - ETA: 52s - loss: 1.8692 - regression_loss: 1.5497 - classification_loss: 0.3195 290/500 [================>.............] - ETA: 52s - loss: 1.8681 - regression_loss: 1.5490 - classification_loss: 0.3191 291/500 [================>.............] - ETA: 52s - loss: 1.8699 - regression_loss: 1.5502 - classification_loss: 0.3196 292/500 [================>.............] - ETA: 52s - loss: 1.8674 - regression_loss: 1.5485 - classification_loss: 0.3188 293/500 [================>.............] - ETA: 51s - loss: 1.8634 - regression_loss: 1.5452 - classification_loss: 0.3182 294/500 [================>.............] - ETA: 51s - loss: 1.8631 - regression_loss: 1.5451 - classification_loss: 0.3180 295/500 [================>.............] - ETA: 51s - loss: 1.8653 - regression_loss: 1.5466 - classification_loss: 0.3187 296/500 [================>.............] - ETA: 51s - loss: 1.8646 - regression_loss: 1.5462 - classification_loss: 0.3184 297/500 [================>.............] - ETA: 50s - loss: 1.8643 - regression_loss: 1.5461 - classification_loss: 0.3181 298/500 [================>.............] - ETA: 50s - loss: 1.8650 - regression_loss: 1.5472 - classification_loss: 0.3179 299/500 [================>.............] - ETA: 50s - loss: 1.8638 - regression_loss: 1.5462 - classification_loss: 0.3176 300/500 [=================>............] - ETA: 50s - loss: 1.8653 - regression_loss: 1.5475 - classification_loss: 0.3178 301/500 [=================>............] - ETA: 49s - loss: 1.8632 - regression_loss: 1.5456 - classification_loss: 0.3175 302/500 [=================>............] 
- ETA: 49s - loss: 1.8629 - regression_loss: 1.5455 - classification_loss: 0.3174 303/500 [=================>............] - ETA: 49s - loss: 1.8618 - regression_loss: 1.5450 - classification_loss: 0.3169 304/500 [=================>............] - ETA: 49s - loss: 1.8612 - regression_loss: 1.5445 - classification_loss: 0.3167 305/500 [=================>............] - ETA: 48s - loss: 1.8618 - regression_loss: 1.5450 - classification_loss: 0.3168 306/500 [=================>............] - ETA: 48s - loss: 1.8641 - regression_loss: 1.5465 - classification_loss: 0.3176 307/500 [=================>............] - ETA: 48s - loss: 1.8635 - regression_loss: 1.5461 - classification_loss: 0.3174 308/500 [=================>............] - ETA: 48s - loss: 1.8641 - regression_loss: 1.5466 - classification_loss: 0.3174 309/500 [=================>............] - ETA: 47s - loss: 1.8674 - regression_loss: 1.5495 - classification_loss: 0.3179 310/500 [=================>............] - ETA: 47s - loss: 1.8685 - regression_loss: 1.5503 - classification_loss: 0.3183 311/500 [=================>............] - ETA: 47s - loss: 1.8683 - regression_loss: 1.5500 - classification_loss: 0.3183 312/500 [=================>............] - ETA: 47s - loss: 1.8681 - regression_loss: 1.5499 - classification_loss: 0.3182 313/500 [=================>............] - ETA: 46s - loss: 1.8684 - regression_loss: 1.5502 - classification_loss: 0.3182 314/500 [=================>............] - ETA: 46s - loss: 1.8692 - regression_loss: 1.5512 - classification_loss: 0.3181 315/500 [=================>............] - ETA: 46s - loss: 1.8677 - regression_loss: 1.5500 - classification_loss: 0.3177 316/500 [=================>............] - ETA: 46s - loss: 1.8662 - regression_loss: 1.5489 - classification_loss: 0.3173 317/500 [==================>...........] - ETA: 45s - loss: 1.8673 - regression_loss: 1.5497 - classification_loss: 0.3176 318/500 [==================>...........] 
- ETA: 45s - loss: 1.8690 - regression_loss: 1.5511 - classification_loss: 0.3179 319/500 [==================>...........] - ETA: 45s - loss: 1.8684 - regression_loss: 1.5504 - classification_loss: 0.3180 320/500 [==================>...........] - ETA: 45s - loss: 1.8664 - regression_loss: 1.5487 - classification_loss: 0.3177 321/500 [==================>...........] - ETA: 44s - loss: 1.8678 - regression_loss: 1.5498 - classification_loss: 0.3180 322/500 [==================>...........] - ETA: 44s - loss: 1.8701 - regression_loss: 1.5519 - classification_loss: 0.3182 323/500 [==================>...........] - ETA: 44s - loss: 1.8678 - regression_loss: 1.5500 - classification_loss: 0.3178 324/500 [==================>...........] - ETA: 44s - loss: 1.8689 - regression_loss: 1.5506 - classification_loss: 0.3183 325/500 [==================>...........] - ETA: 43s - loss: 1.8690 - regression_loss: 1.5506 - classification_loss: 0.3184 326/500 [==================>...........] - ETA: 43s - loss: 1.8675 - regression_loss: 1.5496 - classification_loss: 0.3179 327/500 [==================>...........] - ETA: 43s - loss: 1.8677 - regression_loss: 1.5500 - classification_loss: 0.3177 328/500 [==================>...........] - ETA: 43s - loss: 1.8670 - regression_loss: 1.5496 - classification_loss: 0.3174 329/500 [==================>...........] - ETA: 42s - loss: 1.8655 - regression_loss: 1.5485 - classification_loss: 0.3170 330/500 [==================>...........] - ETA: 42s - loss: 1.8672 - regression_loss: 1.5494 - classification_loss: 0.3178 331/500 [==================>...........] - ETA: 42s - loss: 1.8667 - regression_loss: 1.5492 - classification_loss: 0.3175 332/500 [==================>...........] - ETA: 42s - loss: 1.8655 - regression_loss: 1.5484 - classification_loss: 0.3171 333/500 [==================>...........] - ETA: 41s - loss: 1.8641 - regression_loss: 1.5471 - classification_loss: 0.3170 334/500 [===================>..........] 
- ETA: 41s - loss: 1.8628 - regression_loss: 1.5464 - classification_loss: 0.3165 335/500 [===================>..........] - ETA: 41s - loss: 1.8633 - regression_loss: 1.5466 - classification_loss: 0.3167 336/500 [===================>..........] - ETA: 41s - loss: 1.8633 - regression_loss: 1.5467 - classification_loss: 0.3166 337/500 [===================>..........] - ETA: 40s - loss: 1.8639 - regression_loss: 1.5473 - classification_loss: 0.3165 338/500 [===================>..........] - ETA: 40s - loss: 1.8644 - regression_loss: 1.5476 - classification_loss: 0.3168 339/500 [===================>..........] - ETA: 40s - loss: 1.8627 - regression_loss: 1.5464 - classification_loss: 0.3163 340/500 [===================>..........] - ETA: 40s - loss: 1.8613 - regression_loss: 1.5454 - classification_loss: 0.3159 341/500 [===================>..........] - ETA: 39s - loss: 1.8594 - regression_loss: 1.5440 - classification_loss: 0.3154 342/500 [===================>..........] - ETA: 39s - loss: 1.8627 - regression_loss: 1.5471 - classification_loss: 0.3156 343/500 [===================>..........] - ETA: 39s - loss: 1.8616 - regression_loss: 1.5464 - classification_loss: 0.3152 344/500 [===================>..........] - ETA: 39s - loss: 1.8636 - regression_loss: 1.5481 - classification_loss: 0.3156 345/500 [===================>..........] - ETA: 38s - loss: 1.8597 - regression_loss: 1.5448 - classification_loss: 0.3149 346/500 [===================>..........] - ETA: 38s - loss: 1.8607 - regression_loss: 1.5461 - classification_loss: 0.3146 347/500 [===================>..........] - ETA: 38s - loss: 1.8615 - regression_loss: 1.5469 - classification_loss: 0.3146 348/500 [===================>..........] - ETA: 38s - loss: 1.8624 - regression_loss: 1.5477 - classification_loss: 0.3147 349/500 [===================>..........] - ETA: 37s - loss: 1.8639 - regression_loss: 1.5487 - classification_loss: 0.3151 350/500 [====================>.........] 
- ETA: 37s - loss: 1.8645 - regression_loss: 1.5492 - classification_loss: 0.3153 351/500 [====================>.........] - ETA: 37s - loss: 1.8677 - regression_loss: 1.5518 - classification_loss: 0.3159 352/500 [====================>.........] - ETA: 37s - loss: 1.8672 - regression_loss: 1.5516 - classification_loss: 0.3156 353/500 [====================>.........] - ETA: 36s - loss: 1.8673 - regression_loss: 1.5519 - classification_loss: 0.3154 354/500 [====================>.........] - ETA: 36s - loss: 1.8662 - regression_loss: 1.5511 - classification_loss: 0.3151 355/500 [====================>.........] - ETA: 36s - loss: 1.8673 - regression_loss: 1.5519 - classification_loss: 0.3154 356/500 [====================>.........] - ETA: 36s - loss: 1.8684 - regression_loss: 1.5525 - classification_loss: 0.3158 357/500 [====================>.........] - ETA: 35s - loss: 1.8694 - regression_loss: 1.5532 - classification_loss: 0.3162 358/500 [====================>.........] - ETA: 35s - loss: 1.8692 - regression_loss: 1.5532 - classification_loss: 0.3161 359/500 [====================>.........] - ETA: 35s - loss: 1.8684 - regression_loss: 1.5525 - classification_loss: 0.3159 360/500 [====================>.........] - ETA: 35s - loss: 1.8700 - regression_loss: 1.5540 - classification_loss: 0.3159 361/500 [====================>.........] - ETA: 34s - loss: 1.8710 - regression_loss: 1.5545 - classification_loss: 0.3165 362/500 [====================>.........] - ETA: 34s - loss: 1.8677 - regression_loss: 1.5519 - classification_loss: 0.3158 363/500 [====================>.........] - ETA: 34s - loss: 1.8668 - regression_loss: 1.5515 - classification_loss: 0.3153 364/500 [====================>.........] - ETA: 34s - loss: 1.8645 - regression_loss: 1.5497 - classification_loss: 0.3148 365/500 [====================>.........] - ETA: 33s - loss: 1.8630 - regression_loss: 1.5486 - classification_loss: 0.3145 366/500 [====================>.........] 
- ETA: 33s - loss: 1.8643 - regression_loss: 1.5499 - classification_loss: 0.3144 367/500 [=====================>........] - ETA: 33s - loss: 1.8650 - regression_loss: 1.5503 - classification_loss: 0.3147 368/500 [=====================>........] - ETA: 33s - loss: 1.8654 - regression_loss: 1.5505 - classification_loss: 0.3149 369/500 [=====================>........] - ETA: 32s - loss: 1.8650 - regression_loss: 1.5501 - classification_loss: 0.3149 370/500 [=====================>........] - ETA: 32s - loss: 1.8651 - regression_loss: 1.5501 - classification_loss: 0.3150 371/500 [=====================>........] - ETA: 32s - loss: 1.8670 - regression_loss: 1.5516 - classification_loss: 0.3154 372/500 [=====================>........] - ETA: 32s - loss: 1.8683 - regression_loss: 1.5528 - classification_loss: 0.3155 373/500 [=====================>........] - ETA: 31s - loss: 1.8680 - regression_loss: 1.5526 - classification_loss: 0.3154 374/500 [=====================>........] - ETA: 31s - loss: 1.8650 - regression_loss: 1.5501 - classification_loss: 0.3148 375/500 [=====================>........] - ETA: 31s - loss: 1.8650 - regression_loss: 1.5504 - classification_loss: 0.3146 376/500 [=====================>........] - ETA: 31s - loss: 1.8642 - regression_loss: 1.5498 - classification_loss: 0.3144 377/500 [=====================>........] - ETA: 30s - loss: 1.8626 - regression_loss: 1.5486 - classification_loss: 0.3140 378/500 [=====================>........] - ETA: 30s - loss: 1.8621 - regression_loss: 1.5484 - classification_loss: 0.3138 379/500 [=====================>........] - ETA: 30s - loss: 1.8614 - regression_loss: 1.5476 - classification_loss: 0.3137 380/500 [=====================>........] - ETA: 30s - loss: 1.8630 - regression_loss: 1.5491 - classification_loss: 0.3139 381/500 [=====================>........] - ETA: 29s - loss: 1.8638 - regression_loss: 1.5497 - classification_loss: 0.3142 382/500 [=====================>........] 
- ETA: 29s - loss: 1.8642 - regression_loss: 1.5501 - classification_loss: 0.3141 383/500 [=====================>........] - ETA: 29s - loss: 1.8664 - regression_loss: 1.5521 - classification_loss: 0.3143 384/500 [======================>.......] - ETA: 29s - loss: 1.8676 - regression_loss: 1.5529 - classification_loss: 0.3147 385/500 [======================>.......] - ETA: 28s - loss: 1.8678 - regression_loss: 1.5534 - classification_loss: 0.3144 386/500 [======================>.......] - ETA: 28s - loss: 1.8668 - regression_loss: 1.5529 - classification_loss: 0.3140 387/500 [======================>.......] - ETA: 28s - loss: 1.8664 - regression_loss: 1.5524 - classification_loss: 0.3139 388/500 [======================>.......] - ETA: 28s - loss: 1.8666 - regression_loss: 1.5527 - classification_loss: 0.3139 389/500 [======================>.......] - ETA: 27s - loss: 1.8655 - regression_loss: 1.5518 - classification_loss: 0.3137 390/500 [======================>.......] - ETA: 27s - loss: 1.8639 - regression_loss: 1.5504 - classification_loss: 0.3135 391/500 [======================>.......] - ETA: 27s - loss: 1.8641 - regression_loss: 1.5507 - classification_loss: 0.3134 392/500 [======================>.......] - ETA: 27s - loss: 1.8658 - regression_loss: 1.5522 - classification_loss: 0.3137 393/500 [======================>.......] - ETA: 26s - loss: 1.8656 - regression_loss: 1.5520 - classification_loss: 0.3136 394/500 [======================>.......] - ETA: 26s - loss: 1.8658 - regression_loss: 1.5523 - classification_loss: 0.3135 395/500 [======================>.......] - ETA: 26s - loss: 1.8654 - regression_loss: 1.5520 - classification_loss: 0.3134 396/500 [======================>.......] - ETA: 26s - loss: 1.8635 - regression_loss: 1.5505 - classification_loss: 0.3130 397/500 [======================>.......] - ETA: 25s - loss: 1.8631 - regression_loss: 1.5501 - classification_loss: 0.3130 398/500 [======================>.......] 
- ETA: 25s - loss: 1.8639 - regression_loss: 1.5507 - classification_loss: 0.3132 399/500 [======================>.......] - ETA: 25s - loss: 1.8608 - regression_loss: 1.5481 - classification_loss: 0.3128 400/500 [=======================>......] - ETA: 24s - loss: 1.8614 - regression_loss: 1.5485 - classification_loss: 0.3129 401/500 [=======================>......] - ETA: 24s - loss: 1.8627 - regression_loss: 1.5495 - classification_loss: 0.3132 402/500 [=======================>......] - ETA: 24s - loss: 1.8628 - regression_loss: 1.5495 - classification_loss: 0.3133 403/500 [=======================>......] - ETA: 24s - loss: 1.8627 - regression_loss: 1.5494 - classification_loss: 0.3133 404/500 [=======================>......] - ETA: 23s - loss: 1.8624 - regression_loss: 1.5492 - classification_loss: 0.3132 405/500 [=======================>......] - ETA: 23s - loss: 1.8617 - regression_loss: 1.5484 - classification_loss: 0.3133 406/500 [=======================>......] - ETA: 23s - loss: 1.8618 - regression_loss: 1.5483 - classification_loss: 0.3135 407/500 [=======================>......] - ETA: 23s - loss: 1.8618 - regression_loss: 1.5483 - classification_loss: 0.3135 408/500 [=======================>......] - ETA: 22s - loss: 1.8596 - regression_loss: 1.5465 - classification_loss: 0.3131 409/500 [=======================>......] - ETA: 22s - loss: 1.8589 - regression_loss: 1.5461 - classification_loss: 0.3128 410/500 [=======================>......] - ETA: 22s - loss: 1.8585 - regression_loss: 1.5458 - classification_loss: 0.3127 411/500 [=======================>......] - ETA: 22s - loss: 1.8589 - regression_loss: 1.5464 - classification_loss: 0.3125 412/500 [=======================>......] - ETA: 21s - loss: 1.8614 - regression_loss: 1.5484 - classification_loss: 0.3130 413/500 [=======================>......] - ETA: 21s - loss: 1.8613 - regression_loss: 1.5483 - classification_loss: 0.3130 414/500 [=======================>......] 
- ETA: 21s - loss: 1.8591 - regression_loss: 1.5467 - classification_loss: 0.3125 415/500 [=======================>......] - ETA: 21s - loss: 1.8613 - regression_loss: 1.5480 - classification_loss: 0.3132 416/500 [=======================>......] - ETA: 21s - loss: 1.8593 - regression_loss: 1.5465 - classification_loss: 0.3127 417/500 [========================>.....] - ETA: 20s - loss: 1.8621 - regression_loss: 1.5490 - classification_loss: 0.3132 418/500 [========================>.....] - ETA: 20s - loss: 1.8629 - regression_loss: 1.5499 - classification_loss: 0.3131 419/500 [========================>.....] - ETA: 20s - loss: 1.8614 - regression_loss: 1.5486 - classification_loss: 0.3128 420/500 [========================>.....] - ETA: 20s - loss: 1.8627 - regression_loss: 1.5493 - classification_loss: 0.3135 421/500 [========================>.....] - ETA: 19s - loss: 1.8612 - regression_loss: 1.5480 - classification_loss: 0.3132 422/500 [========================>.....] - ETA: 19s - loss: 1.8606 - regression_loss: 1.5474 - classification_loss: 0.3131 423/500 [========================>.....] - ETA: 19s - loss: 1.8614 - regression_loss: 1.5480 - classification_loss: 0.3134 424/500 [========================>.....] - ETA: 19s - loss: 1.8588 - regression_loss: 1.5460 - classification_loss: 0.3128 425/500 [========================>.....] - ETA: 18s - loss: 1.8570 - regression_loss: 1.5424 - classification_loss: 0.3146 426/500 [========================>.....] - ETA: 18s - loss: 1.8570 - regression_loss: 1.5423 - classification_loss: 0.3147 427/500 [========================>.....] - ETA: 18s - loss: 1.8569 - regression_loss: 1.5422 - classification_loss: 0.3147 428/500 [========================>.....] - ETA: 18s - loss: 1.8558 - regression_loss: 1.5414 - classification_loss: 0.3144 429/500 [========================>.....] - ETA: 17s - loss: 1.8581 - regression_loss: 1.5431 - classification_loss: 0.3150 430/500 [========================>.....] 
- ETA: 17s - loss: 1.8566 - regression_loss: 1.5419 - classification_loss: 0.3147 431/500 [========================>.....] - ETA: 17s - loss: 1.8548 - regression_loss: 1.5402 - classification_loss: 0.3146 432/500 [========================>.....] - ETA: 17s - loss: 1.8561 - regression_loss: 1.5412 - classification_loss: 0.3149 433/500 [========================>.....] - ETA: 16s - loss: 1.8562 - regression_loss: 1.5412 - classification_loss: 0.3150 434/500 [=========================>....] - ETA: 16s - loss: 1.8574 - regression_loss: 1.5421 - classification_loss: 0.3153 435/500 [=========================>....] - ETA: 16s - loss: 1.8584 - regression_loss: 1.5429 - classification_loss: 0.3156 436/500 [=========================>....] - ETA: 16s - loss: 1.8573 - regression_loss: 1.5420 - classification_loss: 0.3153 437/500 [=========================>....] - ETA: 15s - loss: 1.8565 - regression_loss: 1.5414 - classification_loss: 0.3152 438/500 [=========================>....] - ETA: 15s - loss: 1.8562 - regression_loss: 1.5413 - classification_loss: 0.3149 439/500 [=========================>....] - ETA: 15s - loss: 1.8564 - regression_loss: 1.5415 - classification_loss: 0.3149 440/500 [=========================>....] - ETA: 15s - loss: 1.8569 - regression_loss: 1.5416 - classification_loss: 0.3152 441/500 [=========================>....] - ETA: 14s - loss: 1.8571 - regression_loss: 1.5419 - classification_loss: 0.3151 442/500 [=========================>....] - ETA: 14s - loss: 1.8567 - regression_loss: 1.5419 - classification_loss: 0.3149 443/500 [=========================>....] - ETA: 14s - loss: 1.8554 - regression_loss: 1.5408 - classification_loss: 0.3146 444/500 [=========================>....] - ETA: 14s - loss: 1.8560 - regression_loss: 1.5412 - classification_loss: 0.3148 445/500 [=========================>....] - ETA: 13s - loss: 1.8599 - regression_loss: 1.5443 - classification_loss: 0.3156 446/500 [=========================>....] 
- ETA: 13s - loss: 1.8595 - regression_loss: 1.5442 - classification_loss: 0.3153 447/500 [=========================>....] - ETA: 13s - loss: 1.8612 - regression_loss: 1.5452 - classification_loss: 0.3160 448/500 [=========================>....] - ETA: 13s - loss: 1.8602 - regression_loss: 1.5444 - classification_loss: 0.3157 449/500 [=========================>....] - ETA: 12s - loss: 1.8588 - regression_loss: 1.5435 - classification_loss: 0.3153 450/500 [==========================>...] - ETA: 12s - loss: 1.8599 - regression_loss: 1.5443 - classification_loss: 0.3156 451/500 [==========================>...] - ETA: 12s - loss: 1.8596 - regression_loss: 1.5440 - classification_loss: 0.3155 452/500 [==========================>...] - ETA: 11s - loss: 1.8588 - regression_loss: 1.5434 - classification_loss: 0.3154 453/500 [==========================>...] - ETA: 11s - loss: 1.8587 - regression_loss: 1.5434 - classification_loss: 0.3153 454/500 [==========================>...] - ETA: 11s - loss: 1.8587 - regression_loss: 1.5433 - classification_loss: 0.3154 455/500 [==========================>...] - ETA: 11s - loss: 1.8587 - regression_loss: 1.5433 - classification_loss: 0.3154 456/500 [==========================>...] - ETA: 10s - loss: 1.8589 - regression_loss: 1.5436 - classification_loss: 0.3154 457/500 [==========================>...] - ETA: 10s - loss: 1.8564 - regression_loss: 1.5415 - classification_loss: 0.3149 458/500 [==========================>...] - ETA: 10s - loss: 1.8590 - regression_loss: 1.5433 - classification_loss: 0.3157 459/500 [==========================>...] - ETA: 10s - loss: 1.8591 - regression_loss: 1.5433 - classification_loss: 0.3159 460/500 [==========================>...] - ETA: 9s - loss: 1.8600 - regression_loss: 1.5439 - classification_loss: 0.3161  461/500 [==========================>...] - ETA: 9s - loss: 1.8587 - regression_loss: 1.5430 - classification_loss: 0.3157 462/500 [==========================>...] 
- ETA: 9s - loss: 1.8586 - regression_loss: 1.5430 - classification_loss: 0.3156 463/500 [==========================>...] - ETA: 9s - loss: 1.8577 - regression_loss: 1.5422 - classification_loss: 0.3155 464/500 [==========================>...] - ETA: 8s - loss: 1.8568 - regression_loss: 1.5417 - classification_loss: 0.3151 465/500 [==========================>...] - ETA: 8s - loss: 1.8569 - regression_loss: 1.5418 - classification_loss: 0.3151 466/500 [==========================>...] - ETA: 8s - loss: 1.8561 - regression_loss: 1.5413 - classification_loss: 0.3148 467/500 [===========================>..] - ETA: 8s - loss: 1.8552 - regression_loss: 1.5405 - classification_loss: 0.3147 468/500 [===========================>..] - ETA: 8s - loss: 1.8559 - regression_loss: 1.5411 - classification_loss: 0.3148 469/500 [===========================>..] - ETA: 7s - loss: 1.8547 - regression_loss: 1.5402 - classification_loss: 0.3145 470/500 [===========================>..] - ETA: 7s - loss: 1.8537 - regression_loss: 1.5392 - classification_loss: 0.3144 471/500 [===========================>..] - ETA: 7s - loss: 1.8553 - regression_loss: 1.5405 - classification_loss: 0.3147 472/500 [===========================>..] - ETA: 7s - loss: 1.8543 - regression_loss: 1.5399 - classification_loss: 0.3144 473/500 [===========================>..] - ETA: 6s - loss: 1.8538 - regression_loss: 1.5395 - classification_loss: 0.3143 474/500 [===========================>..] - ETA: 6s - loss: 1.8525 - regression_loss: 1.5385 - classification_loss: 0.3141 475/500 [===========================>..] - ETA: 6s - loss: 1.8522 - regression_loss: 1.5385 - classification_loss: 0.3138 476/500 [===========================>..] - ETA: 6s - loss: 1.8527 - regression_loss: 1.5386 - classification_loss: 0.3141 477/500 [===========================>..] - ETA: 5s - loss: 1.8536 - regression_loss: 1.5393 - classification_loss: 0.3143 478/500 [===========================>..] 
- ETA: 5s - loss: 1.8532 - regression_loss: 1.5391 - classification_loss: 0.3141 479/500 [===========================>..] - ETA: 5s - loss: 1.8539 - regression_loss: 1.5398 - classification_loss: 0.3141 480/500 [===========================>..] - ETA: 5s - loss: 1.8520 - regression_loss: 1.5381 - classification_loss: 0.3138 481/500 [===========================>..] - ETA: 4s - loss: 1.8535 - regression_loss: 1.5393 - classification_loss: 0.3142 482/500 [===========================>..] - ETA: 4s - loss: 1.8535 - regression_loss: 1.5394 - classification_loss: 0.3141 483/500 [===========================>..] - ETA: 4s - loss: 1.8539 - regression_loss: 1.5397 - classification_loss: 0.3142 484/500 [============================>.] - ETA: 4s - loss: 1.8542 - regression_loss: 1.5400 - classification_loss: 0.3142 485/500 [============================>.] - ETA: 3s - loss: 1.8547 - regression_loss: 1.5406 - classification_loss: 0.3141 486/500 [============================>.] - ETA: 3s - loss: 1.8564 - regression_loss: 1.5420 - classification_loss: 0.3145 487/500 [============================>.] - ETA: 3s - loss: 1.8563 - regression_loss: 1.5420 - classification_loss: 0.3142 488/500 [============================>.] - ETA: 3s - loss: 1.8578 - regression_loss: 1.5433 - classification_loss: 0.3145 489/500 [============================>.] - ETA: 2s - loss: 1.8575 - regression_loss: 1.5431 - classification_loss: 0.3144 490/500 [============================>.] - ETA: 2s - loss: 1.8574 - regression_loss: 1.5430 - classification_loss: 0.3144 491/500 [============================>.] - ETA: 2s - loss: 1.8563 - regression_loss: 1.5420 - classification_loss: 0.3143 492/500 [============================>.] - ETA: 1s - loss: 1.8572 - regression_loss: 1.5428 - classification_loss: 0.3144 493/500 [============================>.] - ETA: 1s - loss: 1.8573 - regression_loss: 1.5429 - classification_loss: 0.3143 494/500 [============================>.] 
- ETA: 1s - loss: 1.8564 - regression_loss: 1.5423 - classification_loss: 0.3142
[per-batch progress updates for batches 495–499 of epoch 34 trimmed]
500/500 [==============================] - 125s 250ms/step - loss: 1.8523 - regression_loss: 1.5391 - classification_loss: 0.3131
326 instances of class plum with average precision: 0.7569
mAP: 0.7569
Epoch 00034: saving model to ./training/snapshots/resnet50_pascal_34.h5
Epoch 35/150
[per-batch progress updates for batches 1–328 of epoch 35 trimmed; running loss fluctuated between roughly 1.83 and 1.91, with regression_loss ≈ 1.50–1.58 and classification_loss ≈ 0.31–0.33]
329/500 [==================>...........]
- ETA: 42s - loss: 1.8653 - regression_loss: 1.5339 - classification_loss: 0.3315 330/500 [==================>...........] - ETA: 42s - loss: 1.8661 - regression_loss: 1.5346 - classification_loss: 0.3315 331/500 [==================>...........] - ETA: 42s - loss: 1.8640 - regression_loss: 1.5331 - classification_loss: 0.3309 332/500 [==================>...........] - ETA: 42s - loss: 1.8651 - regression_loss: 1.5341 - classification_loss: 0.3310 333/500 [==================>...........] - ETA: 41s - loss: 1.8662 - regression_loss: 1.5350 - classification_loss: 0.3312 334/500 [===================>..........] - ETA: 41s - loss: 1.8648 - regression_loss: 1.5339 - classification_loss: 0.3309 335/500 [===================>..........] - ETA: 41s - loss: 1.8631 - regression_loss: 1.5327 - classification_loss: 0.3303 336/500 [===================>..........] - ETA: 41s - loss: 1.8641 - regression_loss: 1.5337 - classification_loss: 0.3304 337/500 [===================>..........] - ETA: 40s - loss: 1.8654 - regression_loss: 1.5346 - classification_loss: 0.3308 338/500 [===================>..........] - ETA: 40s - loss: 1.8665 - regression_loss: 1.5358 - classification_loss: 0.3307 339/500 [===================>..........] - ETA: 40s - loss: 1.8659 - regression_loss: 1.5354 - classification_loss: 0.3305 340/500 [===================>..........] - ETA: 40s - loss: 1.8640 - regression_loss: 1.5340 - classification_loss: 0.3299 341/500 [===================>..........] - ETA: 39s - loss: 1.8665 - regression_loss: 1.5354 - classification_loss: 0.3311 342/500 [===================>..........] - ETA: 39s - loss: 1.8666 - regression_loss: 1.5359 - classification_loss: 0.3307 343/500 [===================>..........] - ETA: 39s - loss: 1.8639 - regression_loss: 1.5339 - classification_loss: 0.3300 344/500 [===================>..........] - ETA: 39s - loss: 1.8627 - regression_loss: 1.5330 - classification_loss: 0.3296 345/500 [===================>..........] 
- ETA: 38s - loss: 1.8626 - regression_loss: 1.5331 - classification_loss: 0.3295 346/500 [===================>..........] - ETA: 38s - loss: 1.8602 - regression_loss: 1.5312 - classification_loss: 0.3290 347/500 [===================>..........] - ETA: 38s - loss: 1.8563 - regression_loss: 1.5282 - classification_loss: 0.3282 348/500 [===================>..........] - ETA: 38s - loss: 1.8551 - regression_loss: 1.5274 - classification_loss: 0.3277 349/500 [===================>..........] - ETA: 37s - loss: 1.8527 - regression_loss: 1.5255 - classification_loss: 0.3272 350/500 [====================>.........] - ETA: 37s - loss: 1.8538 - regression_loss: 1.5261 - classification_loss: 0.3278 351/500 [====================>.........] - ETA: 37s - loss: 1.8536 - regression_loss: 1.5261 - classification_loss: 0.3275 352/500 [====================>.........] - ETA: 37s - loss: 1.8537 - regression_loss: 1.5264 - classification_loss: 0.3273 353/500 [====================>.........] - ETA: 36s - loss: 1.8537 - regression_loss: 1.5263 - classification_loss: 0.3274 354/500 [====================>.........] - ETA: 36s - loss: 1.8533 - regression_loss: 1.5263 - classification_loss: 0.3271 355/500 [====================>.........] - ETA: 36s - loss: 1.8523 - regression_loss: 1.5257 - classification_loss: 0.3266 356/500 [====================>.........] - ETA: 36s - loss: 1.8525 - regression_loss: 1.5258 - classification_loss: 0.3268 357/500 [====================>.........] - ETA: 35s - loss: 1.8517 - regression_loss: 1.5252 - classification_loss: 0.3265 358/500 [====================>.........] - ETA: 35s - loss: 1.8516 - regression_loss: 1.5254 - classification_loss: 0.3262 359/500 [====================>.........] - ETA: 35s - loss: 1.8505 - regression_loss: 1.5245 - classification_loss: 0.3260 360/500 [====================>.........] - ETA: 35s - loss: 1.8519 - regression_loss: 1.5258 - classification_loss: 0.3261 361/500 [====================>.........] 
- ETA: 34s - loss: 1.8547 - regression_loss: 1.5287 - classification_loss: 0.3260 362/500 [====================>.........] - ETA: 34s - loss: 1.8537 - regression_loss: 1.5281 - classification_loss: 0.3257 363/500 [====================>.........] - ETA: 34s - loss: 1.8560 - regression_loss: 1.5299 - classification_loss: 0.3261 364/500 [====================>.........] - ETA: 34s - loss: 1.8562 - regression_loss: 1.5302 - classification_loss: 0.3260 365/500 [====================>.........] - ETA: 33s - loss: 1.8574 - regression_loss: 1.5310 - classification_loss: 0.3264 366/500 [====================>.........] - ETA: 33s - loss: 1.8595 - regression_loss: 1.5327 - classification_loss: 0.3268 367/500 [=====================>........] - ETA: 33s - loss: 1.8597 - regression_loss: 1.5328 - classification_loss: 0.3269 368/500 [=====================>........] - ETA: 33s - loss: 1.8604 - regression_loss: 1.5333 - classification_loss: 0.3271 369/500 [=====================>........] - ETA: 32s - loss: 1.8623 - regression_loss: 1.5348 - classification_loss: 0.3275 370/500 [=====================>........] - ETA: 32s - loss: 1.8619 - regression_loss: 1.5345 - classification_loss: 0.3274 371/500 [=====================>........] - ETA: 32s - loss: 1.8636 - regression_loss: 1.5361 - classification_loss: 0.3275 372/500 [=====================>........] - ETA: 32s - loss: 1.8619 - regression_loss: 1.5349 - classification_loss: 0.3270 373/500 [=====================>........] - ETA: 31s - loss: 1.8623 - regression_loss: 1.5354 - classification_loss: 0.3269 374/500 [=====================>........] - ETA: 31s - loss: 1.8593 - regression_loss: 1.5329 - classification_loss: 0.3264 375/500 [=====================>........] - ETA: 31s - loss: 1.8589 - regression_loss: 1.5327 - classification_loss: 0.3261 376/500 [=====================>........] - ETA: 31s - loss: 1.8559 - regression_loss: 1.5302 - classification_loss: 0.3257 377/500 [=====================>........] 
- ETA: 30s - loss: 1.8548 - regression_loss: 1.5296 - classification_loss: 0.3252 378/500 [=====================>........] - ETA: 30s - loss: 1.8554 - regression_loss: 1.5300 - classification_loss: 0.3253 379/500 [=====================>........] - ETA: 30s - loss: 1.8552 - regression_loss: 1.5300 - classification_loss: 0.3252 380/500 [=====================>........] - ETA: 30s - loss: 1.8528 - regression_loss: 1.5281 - classification_loss: 0.3247 381/500 [=====================>........] - ETA: 29s - loss: 1.8512 - regression_loss: 1.5269 - classification_loss: 0.3242 382/500 [=====================>........] - ETA: 29s - loss: 1.8508 - regression_loss: 1.5269 - classification_loss: 0.3239 383/500 [=====================>........] - ETA: 29s - loss: 1.8508 - regression_loss: 1.5271 - classification_loss: 0.3237 384/500 [======================>.......] - ETA: 29s - loss: 1.8520 - regression_loss: 1.5282 - classification_loss: 0.3238 385/500 [======================>.......] - ETA: 28s - loss: 1.8514 - regression_loss: 1.5277 - classification_loss: 0.3237 386/500 [======================>.......] - ETA: 28s - loss: 1.8510 - regression_loss: 1.5276 - classification_loss: 0.3234 387/500 [======================>.......] - ETA: 28s - loss: 1.8504 - regression_loss: 1.5272 - classification_loss: 0.3232 388/500 [======================>.......] - ETA: 28s - loss: 1.8485 - regression_loss: 1.5257 - classification_loss: 0.3229 389/500 [======================>.......] - ETA: 27s - loss: 1.8470 - regression_loss: 1.5246 - classification_loss: 0.3223 390/500 [======================>.......] - ETA: 27s - loss: 1.8450 - regression_loss: 1.5232 - classification_loss: 0.3218 391/500 [======================>.......] - ETA: 27s - loss: 1.8448 - regression_loss: 1.5233 - classification_loss: 0.3215 392/500 [======================>.......] - ETA: 27s - loss: 1.8419 - regression_loss: 1.5209 - classification_loss: 0.3209 393/500 [======================>.......] 
- ETA: 26s - loss: 1.8413 - regression_loss: 1.5207 - classification_loss: 0.3206 394/500 [======================>.......] - ETA: 26s - loss: 1.8404 - regression_loss: 1.5201 - classification_loss: 0.3203 395/500 [======================>.......] - ETA: 26s - loss: 1.8412 - regression_loss: 1.5206 - classification_loss: 0.3206 396/500 [======================>.......] - ETA: 26s - loss: 1.8406 - regression_loss: 1.5203 - classification_loss: 0.3203 397/500 [======================>.......] - ETA: 25s - loss: 1.8459 - regression_loss: 1.5243 - classification_loss: 0.3217 398/500 [======================>.......] - ETA: 25s - loss: 1.8465 - regression_loss: 1.5247 - classification_loss: 0.3217 399/500 [======================>.......] - ETA: 25s - loss: 1.8463 - regression_loss: 1.5246 - classification_loss: 0.3216 400/500 [=======================>......] - ETA: 25s - loss: 1.8461 - regression_loss: 1.5245 - classification_loss: 0.3216 401/500 [=======================>......] - ETA: 24s - loss: 1.8461 - regression_loss: 1.5249 - classification_loss: 0.3213 402/500 [=======================>......] - ETA: 24s - loss: 1.8466 - regression_loss: 1.5254 - classification_loss: 0.3212 403/500 [=======================>......] - ETA: 24s - loss: 1.8459 - regression_loss: 1.5250 - classification_loss: 0.3209 404/500 [=======================>......] - ETA: 24s - loss: 1.8439 - regression_loss: 1.5234 - classification_loss: 0.3205 405/500 [=======================>......] - ETA: 23s - loss: 1.8443 - regression_loss: 1.5238 - classification_loss: 0.3205 406/500 [=======================>......] - ETA: 23s - loss: 1.8448 - regression_loss: 1.5243 - classification_loss: 0.3205 407/500 [=======================>......] - ETA: 23s - loss: 1.8442 - regression_loss: 1.5240 - classification_loss: 0.3202 408/500 [=======================>......] - ETA: 23s - loss: 1.8416 - regression_loss: 1.5220 - classification_loss: 0.3196 409/500 [=======================>......] 
- ETA: 22s - loss: 1.8416 - regression_loss: 1.5220 - classification_loss: 0.3196 410/500 [=======================>......] - ETA: 22s - loss: 1.8428 - regression_loss: 1.5228 - classification_loss: 0.3200 411/500 [=======================>......] - ETA: 22s - loss: 1.8434 - regression_loss: 1.5233 - classification_loss: 0.3201 412/500 [=======================>......] - ETA: 22s - loss: 1.8456 - regression_loss: 1.5249 - classification_loss: 0.3207 413/500 [=======================>......] - ETA: 21s - loss: 1.8460 - regression_loss: 1.5250 - classification_loss: 0.3210 414/500 [=======================>......] - ETA: 21s - loss: 1.8466 - regression_loss: 1.5258 - classification_loss: 0.3209 415/500 [=======================>......] - ETA: 21s - loss: 1.8450 - regression_loss: 1.5245 - classification_loss: 0.3205 416/500 [=======================>......] - ETA: 21s - loss: 1.8447 - regression_loss: 1.5244 - classification_loss: 0.3204 417/500 [========================>.....] - ETA: 20s - loss: 1.8453 - regression_loss: 1.5244 - classification_loss: 0.3209 418/500 [========================>.....] - ETA: 20s - loss: 1.8473 - regression_loss: 1.5260 - classification_loss: 0.3213 419/500 [========================>.....] - ETA: 20s - loss: 1.8469 - regression_loss: 1.5258 - classification_loss: 0.3211 420/500 [========================>.....] - ETA: 20s - loss: 1.8458 - regression_loss: 1.5251 - classification_loss: 0.3207 421/500 [========================>.....] - ETA: 19s - loss: 1.8462 - regression_loss: 1.5255 - classification_loss: 0.3207 422/500 [========================>.....] - ETA: 19s - loss: 1.8460 - regression_loss: 1.5251 - classification_loss: 0.3209 423/500 [========================>.....] - ETA: 19s - loss: 1.8434 - regression_loss: 1.5230 - classification_loss: 0.3204 424/500 [========================>.....] - ETA: 19s - loss: 1.8420 - regression_loss: 1.5218 - classification_loss: 0.3202 425/500 [========================>.....] 
- ETA: 18s - loss: 1.8430 - regression_loss: 1.5223 - classification_loss: 0.3207 426/500 [========================>.....] - ETA: 18s - loss: 1.8427 - regression_loss: 1.5221 - classification_loss: 0.3206 427/500 [========================>.....] - ETA: 18s - loss: 1.8408 - regression_loss: 1.5206 - classification_loss: 0.3202 428/500 [========================>.....] - ETA: 18s - loss: 1.8396 - regression_loss: 1.5197 - classification_loss: 0.3198 429/500 [========================>.....] - ETA: 17s - loss: 1.8383 - regression_loss: 1.5189 - classification_loss: 0.3193 430/500 [========================>.....] - ETA: 17s - loss: 1.8387 - regression_loss: 1.5194 - classification_loss: 0.3193 431/500 [========================>.....] - ETA: 17s - loss: 1.8375 - regression_loss: 1.5185 - classification_loss: 0.3190 432/500 [========================>.....] - ETA: 17s - loss: 1.8382 - regression_loss: 1.5187 - classification_loss: 0.3195 433/500 [========================>.....] - ETA: 16s - loss: 1.8372 - regression_loss: 1.5181 - classification_loss: 0.3192 434/500 [=========================>....] - ETA: 16s - loss: 1.8360 - regression_loss: 1.5171 - classification_loss: 0.3189 435/500 [=========================>....] - ETA: 16s - loss: 1.8361 - regression_loss: 1.5173 - classification_loss: 0.3189 436/500 [=========================>....] - ETA: 16s - loss: 1.8352 - regression_loss: 1.5168 - classification_loss: 0.3184 437/500 [=========================>....] - ETA: 15s - loss: 1.8329 - regression_loss: 1.5150 - classification_loss: 0.3179 438/500 [=========================>....] - ETA: 15s - loss: 1.8329 - regression_loss: 1.5150 - classification_loss: 0.3179 439/500 [=========================>....] - ETA: 15s - loss: 1.8340 - regression_loss: 1.5159 - classification_loss: 0.3181 440/500 [=========================>....] - ETA: 15s - loss: 1.8321 - regression_loss: 1.5144 - classification_loss: 0.3177 441/500 [=========================>....] 
- ETA: 14s - loss: 1.8326 - regression_loss: 1.5147 - classification_loss: 0.3179 442/500 [=========================>....] - ETA: 14s - loss: 1.8324 - regression_loss: 1.5145 - classification_loss: 0.3178 443/500 [=========================>....] - ETA: 14s - loss: 1.8344 - regression_loss: 1.5165 - classification_loss: 0.3178 444/500 [=========================>....] - ETA: 14s - loss: 1.8343 - regression_loss: 1.5167 - classification_loss: 0.3177 445/500 [=========================>....] - ETA: 13s - loss: 1.8341 - regression_loss: 1.5165 - classification_loss: 0.3176 446/500 [=========================>....] - ETA: 13s - loss: 1.8337 - regression_loss: 1.5163 - classification_loss: 0.3174 447/500 [=========================>....] - ETA: 13s - loss: 1.8347 - regression_loss: 1.5167 - classification_loss: 0.3180 448/500 [=========================>....] - ETA: 13s - loss: 1.8346 - regression_loss: 1.5168 - classification_loss: 0.3178 449/500 [=========================>....] - ETA: 12s - loss: 1.8342 - regression_loss: 1.5165 - classification_loss: 0.3177 450/500 [==========================>...] - ETA: 12s - loss: 1.8360 - regression_loss: 1.5180 - classification_loss: 0.3181 451/500 [==========================>...] - ETA: 12s - loss: 1.8348 - regression_loss: 1.5170 - classification_loss: 0.3178 452/500 [==========================>...] - ETA: 12s - loss: 1.8342 - regression_loss: 1.5164 - classification_loss: 0.3178 453/500 [==========================>...] - ETA: 11s - loss: 1.8340 - regression_loss: 1.5164 - classification_loss: 0.3176 454/500 [==========================>...] - ETA: 11s - loss: 1.8344 - regression_loss: 1.5168 - classification_loss: 0.3176 455/500 [==========================>...] - ETA: 11s - loss: 1.8336 - regression_loss: 1.5163 - classification_loss: 0.3173 456/500 [==========================>...] - ETA: 11s - loss: 1.8349 - regression_loss: 1.5173 - classification_loss: 0.3175 457/500 [==========================>...] 
- ETA: 10s - loss: 1.8358 - regression_loss: 1.5183 - classification_loss: 0.3176 458/500 [==========================>...] - ETA: 10s - loss: 1.8381 - regression_loss: 1.5200 - classification_loss: 0.3181 459/500 [==========================>...] - ETA: 10s - loss: 1.8397 - regression_loss: 1.5213 - classification_loss: 0.3184 460/500 [==========================>...] - ETA: 10s - loss: 1.8388 - regression_loss: 1.5207 - classification_loss: 0.3181 461/500 [==========================>...] - ETA: 9s - loss: 1.8389 - regression_loss: 1.5208 - classification_loss: 0.3180  462/500 [==========================>...] - ETA: 9s - loss: 1.8387 - regression_loss: 1.5208 - classification_loss: 0.3179 463/500 [==========================>...] - ETA: 9s - loss: 1.8368 - regression_loss: 1.5193 - classification_loss: 0.3175 464/500 [==========================>...] - ETA: 9s - loss: 1.8361 - regression_loss: 1.5189 - classification_loss: 0.3173 465/500 [==========================>...] - ETA: 8s - loss: 1.8353 - regression_loss: 1.5183 - classification_loss: 0.3171 466/500 [==========================>...] - ETA: 8s - loss: 1.8348 - regression_loss: 1.5177 - classification_loss: 0.3170 467/500 [===========================>..] - ETA: 8s - loss: 1.8351 - regression_loss: 1.5181 - classification_loss: 0.3170 468/500 [===========================>..] - ETA: 8s - loss: 1.8346 - regression_loss: 1.5179 - classification_loss: 0.3167 469/500 [===========================>..] - ETA: 7s - loss: 1.8386 - regression_loss: 1.5208 - classification_loss: 0.3178 470/500 [===========================>..] - ETA: 7s - loss: 1.8387 - regression_loss: 1.5209 - classification_loss: 0.3179 471/500 [===========================>..] - ETA: 7s - loss: 1.8386 - regression_loss: 1.5211 - classification_loss: 0.3176 472/500 [===========================>..] - ETA: 7s - loss: 1.8374 - regression_loss: 1.5201 - classification_loss: 0.3174 473/500 [===========================>..] 
- ETA: 6s - loss: 1.8370 - regression_loss: 1.5197 - classification_loss: 0.3173 474/500 [===========================>..] - ETA: 6s - loss: 1.8358 - regression_loss: 1.5187 - classification_loss: 0.3171 475/500 [===========================>..] - ETA: 6s - loss: 1.8356 - regression_loss: 1.5185 - classification_loss: 0.3171 476/500 [===========================>..] - ETA: 6s - loss: 1.8354 - regression_loss: 1.5183 - classification_loss: 0.3171 477/500 [===========================>..] - ETA: 5s - loss: 1.8358 - regression_loss: 1.5187 - classification_loss: 0.3170 478/500 [===========================>..] - ETA: 5s - loss: 1.8347 - regression_loss: 1.5179 - classification_loss: 0.3167 479/500 [===========================>..] - ETA: 5s - loss: 1.8341 - regression_loss: 1.5177 - classification_loss: 0.3165 480/500 [===========================>..] - ETA: 5s - loss: 1.8331 - regression_loss: 1.5170 - classification_loss: 0.3161 481/500 [===========================>..] - ETA: 4s - loss: 1.8320 - regression_loss: 1.5160 - classification_loss: 0.3160 482/500 [===========================>..] - ETA: 4s - loss: 1.8328 - regression_loss: 1.5166 - classification_loss: 0.3162 483/500 [===========================>..] - ETA: 4s - loss: 1.8329 - regression_loss: 1.5168 - classification_loss: 0.3160 484/500 [============================>.] - ETA: 4s - loss: 1.8332 - regression_loss: 1.5173 - classification_loss: 0.3159 485/500 [============================>.] - ETA: 3s - loss: 1.8322 - regression_loss: 1.5167 - classification_loss: 0.3156 486/500 [============================>.] - ETA: 3s - loss: 1.8344 - regression_loss: 1.5184 - classification_loss: 0.3159 487/500 [============================>.] - ETA: 3s - loss: 1.8356 - regression_loss: 1.5194 - classification_loss: 0.3162 488/500 [============================>.] - ETA: 3s - loss: 1.8345 - regression_loss: 1.5186 - classification_loss: 0.3159 489/500 [============================>.] 
- ETA: 2s - loss: 1.8348 - regression_loss: 1.5188 - classification_loss: 0.3160 490/500 [============================>.] - ETA: 2s - loss: 1.8350 - regression_loss: 1.5190 - classification_loss: 0.3160 491/500 [============================>.] - ETA: 2s - loss: 1.8350 - regression_loss: 1.5189 - classification_loss: 0.3161 492/500 [============================>.] - ETA: 2s - loss: 1.8355 - regression_loss: 1.5193 - classification_loss: 0.3162 493/500 [============================>.] - ETA: 1s - loss: 1.8356 - regression_loss: 1.5194 - classification_loss: 0.3162 494/500 [============================>.] - ETA: 1s - loss: 1.8356 - regression_loss: 1.5193 - classification_loss: 0.3163 495/500 [============================>.] - ETA: 1s - loss: 1.8351 - regression_loss: 1.5190 - classification_loss: 0.3161 496/500 [============================>.] - ETA: 1s - loss: 1.8359 - regression_loss: 1.5197 - classification_loss: 0.3162 497/500 [============================>.] - ETA: 0s - loss: 1.8357 - regression_loss: 1.5196 - classification_loss: 0.3161 498/500 [============================>.] - ETA: 0s - loss: 1.8347 - regression_loss: 1.5189 - classification_loss: 0.3158 499/500 [============================>.] - ETA: 0s - loss: 1.8332 - regression_loss: 1.5177 - classification_loss: 0.3155 500/500 [==============================] - 125s 250ms/step - loss: 1.8336 - regression_loss: 1.5181 - classification_loss: 0.3155 326 instances of class plum with average precision: 0.7327 mAP: 0.7327 Epoch 00035: saving model to ./training/snapshots/resnet50_pascal_35.h5 Epoch 36/150 1/500 [..............................] - ETA: 2:01 - loss: 1.6629 - regression_loss: 1.4095 - classification_loss: 0.2534 2/500 [..............................] - ETA: 2:03 - loss: 1.3511 - regression_loss: 1.1518 - classification_loss: 0.1993 3/500 [..............................] - ETA: 2:04 - loss: 1.6512 - regression_loss: 1.3994 - classification_loss: 0.2519 4/500 [..............................] 
- ETA: 2:02 - loss: 1.8030 - regression_loss: 1.5194 - classification_loss: 0.2837 5/500 [..............................] - ETA: 2:02 - loss: 1.9811 - regression_loss: 1.6347 - classification_loss: 0.3464 6/500 [..............................] - ETA: 2:02 - loss: 1.7981 - regression_loss: 1.4878 - classification_loss: 0.3103 7/500 [..............................] - ETA: 2:02 - loss: 1.8950 - regression_loss: 1.5713 - classification_loss: 0.3238 8/500 [..............................] - ETA: 2:02 - loss: 1.8883 - regression_loss: 1.5706 - classification_loss: 0.3176 9/500 [..............................] - ETA: 2:03 - loss: 1.8642 - regression_loss: 1.5581 - classification_loss: 0.3061 10/500 [..............................] - ETA: 2:02 - loss: 1.8734 - regression_loss: 1.5556 - classification_loss: 0.3178 11/500 [..............................] - ETA: 2:02 - loss: 1.8934 - regression_loss: 1.5779 - classification_loss: 0.3155 12/500 [..............................] - ETA: 2:02 - loss: 1.9033 - regression_loss: 1.5824 - classification_loss: 0.3208 13/500 [..............................] - ETA: 2:01 - loss: 1.9611 - regression_loss: 1.6206 - classification_loss: 0.3405 14/500 [..............................] - ETA: 2:01 - loss: 1.9637 - regression_loss: 1.6245 - classification_loss: 0.3392 15/500 [..............................] - ETA: 2:01 - loss: 1.9286 - regression_loss: 1.5932 - classification_loss: 0.3353 16/500 [..............................] - ETA: 2:01 - loss: 1.9114 - regression_loss: 1.5885 - classification_loss: 0.3230 17/500 [>.............................] - ETA: 2:01 - loss: 1.8987 - regression_loss: 1.5789 - classification_loss: 0.3198 18/500 [>.............................] - ETA: 2:01 - loss: 1.9038 - regression_loss: 1.5881 - classification_loss: 0.3157 19/500 [>.............................] - ETA: 2:00 - loss: 1.8960 - regression_loss: 1.5845 - classification_loss: 0.3115 20/500 [>.............................] 
- ETA: 2:00 - loss: 1.8807 - regression_loss: 1.5721 - classification_loss: 0.3086 21/500 [>.............................] - ETA: 2:00 - loss: 1.8630 - regression_loss: 1.5616 - classification_loss: 0.3014 22/500 [>.............................] - ETA: 1:59 - loss: 1.8864 - regression_loss: 1.5787 - classification_loss: 0.3077 23/500 [>.............................] - ETA: 1:59 - loss: 1.9247 - regression_loss: 1.6092 - classification_loss: 0.3154 24/500 [>.............................] - ETA: 1:59 - loss: 1.9223 - regression_loss: 1.6120 - classification_loss: 0.3103 25/500 [>.............................] - ETA: 1:58 - loss: 1.9213 - regression_loss: 1.6088 - classification_loss: 0.3125 26/500 [>.............................] - ETA: 1:58 - loss: 1.9289 - regression_loss: 1.6132 - classification_loss: 0.3157 27/500 [>.............................] - ETA: 1:58 - loss: 1.8994 - regression_loss: 1.5890 - classification_loss: 0.3104 28/500 [>.............................] - ETA: 1:58 - loss: 1.8892 - regression_loss: 1.5821 - classification_loss: 0.3070 29/500 [>.............................] - ETA: 1:57 - loss: 1.8876 - regression_loss: 1.5820 - classification_loss: 0.3056 30/500 [>.............................] - ETA: 1:57 - loss: 1.8967 - regression_loss: 1.5912 - classification_loss: 0.3055 31/500 [>.............................] - ETA: 1:57 - loss: 1.9047 - regression_loss: 1.5963 - classification_loss: 0.3084 32/500 [>.............................] - ETA: 1:57 - loss: 1.8886 - regression_loss: 1.5835 - classification_loss: 0.3050 33/500 [>.............................] - ETA: 1:57 - loss: 1.8745 - regression_loss: 1.5724 - classification_loss: 0.3021 34/500 [=>............................] - ETA: 1:56 - loss: 1.8654 - regression_loss: 1.5661 - classification_loss: 0.2994 35/500 [=>............................] - ETA: 1:56 - loss: 1.8592 - regression_loss: 1.5613 - classification_loss: 0.2979 36/500 [=>............................] 
- ETA: 1:56 - loss: 1.8657 - regression_loss: 1.5665 - classification_loss: 0.2993
 50/500 [==>...........................] - ETA: 1:53 - loss: 1.8416 - regression_loss: 1.5445 - classification_loss: 0.2971
100/500 [=====>........................] - ETA: 1:39 - loss: 1.8837 - regression_loss: 1.5238 - classification_loss: 0.3600
150/500 [========>.....................] - ETA: 1:27 - loss: 1.8707 - regression_loss: 1.5359 - classification_loss: 0.3348
200/500 [===========>..................] - ETA: 1:14 - loss: 1.8736 - regression_loss: 1.5417 - classification_loss: 0.3319
250/500 [==============>...............] - ETA: 1:02 - loss: 1.8477 - regression_loss: 1.5258 - classification_loss: 0.3219
300/500 [=================>............] - ETA: 49s - loss: 1.8476 - regression_loss: 1.5209 - classification_loss: 0.3267
350/500 [====================>.........] - ETA: 37s - loss: 1.8377 - regression_loss: 1.5145 - classification_loss: 0.3232
371/500 [=====================>........] - ETA: 32s - loss: 1.8438 - regression_loss: 1.5215 - classification_loss: 0.3223
372/500 [=====================>........]
- ETA: 31s - loss: 1.8443 - regression_loss: 1.5221 - classification_loss: 0.3222 373/500 [=====================>........] - ETA: 31s - loss: 1.8469 - regression_loss: 1.5238 - classification_loss: 0.3231 374/500 [=====================>........] - ETA: 31s - loss: 1.8458 - regression_loss: 1.5231 - classification_loss: 0.3226 375/500 [=====================>........] - ETA: 31s - loss: 1.8444 - regression_loss: 1.5222 - classification_loss: 0.3222 376/500 [=====================>........] - ETA: 30s - loss: 1.8419 - regression_loss: 1.5201 - classification_loss: 0.3218 377/500 [=====================>........] - ETA: 30s - loss: 1.8418 - regression_loss: 1.5201 - classification_loss: 0.3217 378/500 [=====================>........] - ETA: 30s - loss: 1.8436 - regression_loss: 1.5213 - classification_loss: 0.3223 379/500 [=====================>........] - ETA: 30s - loss: 1.8410 - regression_loss: 1.5194 - classification_loss: 0.3216 380/500 [=====================>........] - ETA: 29s - loss: 1.8402 - regression_loss: 1.5189 - classification_loss: 0.3213 381/500 [=====================>........] - ETA: 29s - loss: 1.8393 - regression_loss: 1.5183 - classification_loss: 0.3210 382/500 [=====================>........] - ETA: 29s - loss: 1.8391 - regression_loss: 1.5183 - classification_loss: 0.3208 383/500 [=====================>........] - ETA: 29s - loss: 1.8396 - regression_loss: 1.5190 - classification_loss: 0.3206 384/500 [======================>.......] - ETA: 28s - loss: 1.8403 - regression_loss: 1.5196 - classification_loss: 0.3207 385/500 [======================>.......] - ETA: 28s - loss: 1.8406 - regression_loss: 1.5197 - classification_loss: 0.3209 386/500 [======================>.......] - ETA: 28s - loss: 1.8390 - regression_loss: 1.5185 - classification_loss: 0.3204 387/500 [======================>.......] - ETA: 28s - loss: 1.8396 - regression_loss: 1.5188 - classification_loss: 0.3207 388/500 [======================>.......] 
- ETA: 27s - loss: 1.8384 - regression_loss: 1.5178 - classification_loss: 0.3207 389/500 [======================>.......] - ETA: 27s - loss: 1.8399 - regression_loss: 1.5192 - classification_loss: 0.3206 390/500 [======================>.......] - ETA: 27s - loss: 1.8417 - regression_loss: 1.5207 - classification_loss: 0.3210 391/500 [======================>.......] - ETA: 27s - loss: 1.8409 - regression_loss: 1.5202 - classification_loss: 0.3207 392/500 [======================>.......] - ETA: 26s - loss: 1.8432 - regression_loss: 1.5220 - classification_loss: 0.3212 393/500 [======================>.......] - ETA: 26s - loss: 1.8445 - regression_loss: 1.5227 - classification_loss: 0.3217 394/500 [======================>.......] - ETA: 26s - loss: 1.8464 - regression_loss: 1.5244 - classification_loss: 0.3220 395/500 [======================>.......] - ETA: 26s - loss: 1.8454 - regression_loss: 1.5238 - classification_loss: 0.3217 396/500 [======================>.......] - ETA: 26s - loss: 1.8460 - regression_loss: 1.5243 - classification_loss: 0.3217 397/500 [======================>.......] - ETA: 25s - loss: 1.8466 - regression_loss: 1.5249 - classification_loss: 0.3217 398/500 [======================>.......] - ETA: 25s - loss: 1.8462 - regression_loss: 1.5246 - classification_loss: 0.3216 399/500 [======================>.......] - ETA: 25s - loss: 1.8461 - regression_loss: 1.5245 - classification_loss: 0.3216 400/500 [=======================>......] - ETA: 25s - loss: 1.8456 - regression_loss: 1.5242 - classification_loss: 0.3214 401/500 [=======================>......] - ETA: 24s - loss: 1.8456 - regression_loss: 1.5242 - classification_loss: 0.3214 402/500 [=======================>......] - ETA: 24s - loss: 1.8438 - regression_loss: 1.5228 - classification_loss: 0.3210 403/500 [=======================>......] - ETA: 24s - loss: 1.8416 - regression_loss: 1.5211 - classification_loss: 0.3206 404/500 [=======================>......] 
- ETA: 24s - loss: 1.8424 - regression_loss: 1.5215 - classification_loss: 0.3209 405/500 [=======================>......] - ETA: 23s - loss: 1.8429 - regression_loss: 1.5221 - classification_loss: 0.3208 406/500 [=======================>......] - ETA: 23s - loss: 1.8434 - regression_loss: 1.5225 - classification_loss: 0.3209 407/500 [=======================>......] - ETA: 23s - loss: 1.8427 - regression_loss: 1.5220 - classification_loss: 0.3207 408/500 [=======================>......] - ETA: 23s - loss: 1.8449 - regression_loss: 1.5239 - classification_loss: 0.3210 409/500 [=======================>......] - ETA: 22s - loss: 1.8455 - regression_loss: 1.5245 - classification_loss: 0.3210 410/500 [=======================>......] - ETA: 22s - loss: 1.8446 - regression_loss: 1.5238 - classification_loss: 0.3208 411/500 [=======================>......] - ETA: 22s - loss: 1.8432 - regression_loss: 1.5224 - classification_loss: 0.3208 412/500 [=======================>......] - ETA: 22s - loss: 1.8424 - regression_loss: 1.5218 - classification_loss: 0.3205 413/500 [=======================>......] - ETA: 21s - loss: 1.8432 - regression_loss: 1.5225 - classification_loss: 0.3207 414/500 [=======================>......] - ETA: 21s - loss: 1.8468 - regression_loss: 1.5260 - classification_loss: 0.3208 415/500 [=======================>......] - ETA: 21s - loss: 1.8474 - regression_loss: 1.5267 - classification_loss: 0.3208 416/500 [=======================>......] - ETA: 21s - loss: 1.8478 - regression_loss: 1.5271 - classification_loss: 0.3207 417/500 [========================>.....] - ETA: 20s - loss: 1.8480 - regression_loss: 1.5271 - classification_loss: 0.3209 418/500 [========================>.....] - ETA: 20s - loss: 1.8476 - regression_loss: 1.5268 - classification_loss: 0.3208 419/500 [========================>.....] - ETA: 20s - loss: 1.8471 - regression_loss: 1.5266 - classification_loss: 0.3206 420/500 [========================>.....] 
- ETA: 20s - loss: 1.8474 - regression_loss: 1.5269 - classification_loss: 0.3205 421/500 [========================>.....] - ETA: 19s - loss: 1.8469 - regression_loss: 1.5267 - classification_loss: 0.3202 422/500 [========================>.....] - ETA: 19s - loss: 1.8477 - regression_loss: 1.5275 - classification_loss: 0.3202 423/500 [========================>.....] - ETA: 19s - loss: 1.8481 - regression_loss: 1.5277 - classification_loss: 0.3204 424/500 [========================>.....] - ETA: 19s - loss: 1.8466 - regression_loss: 1.5268 - classification_loss: 0.3199 425/500 [========================>.....] - ETA: 18s - loss: 1.8457 - regression_loss: 1.5260 - classification_loss: 0.3196 426/500 [========================>.....] - ETA: 18s - loss: 1.8473 - regression_loss: 1.5273 - classification_loss: 0.3200 427/500 [========================>.....] - ETA: 18s - loss: 1.8472 - regression_loss: 1.5270 - classification_loss: 0.3201 428/500 [========================>.....] - ETA: 18s - loss: 1.8493 - regression_loss: 1.5288 - classification_loss: 0.3204 429/500 [========================>.....] - ETA: 17s - loss: 1.8485 - regression_loss: 1.5280 - classification_loss: 0.3205 430/500 [========================>.....] - ETA: 17s - loss: 1.8492 - regression_loss: 1.5286 - classification_loss: 0.3206 431/500 [========================>.....] - ETA: 17s - loss: 1.8486 - regression_loss: 1.5283 - classification_loss: 0.3204 432/500 [========================>.....] - ETA: 17s - loss: 1.8492 - regression_loss: 1.5290 - classification_loss: 0.3202 433/500 [========================>.....] - ETA: 16s - loss: 1.8503 - regression_loss: 1.5299 - classification_loss: 0.3204 434/500 [=========================>....] - ETA: 16s - loss: 1.8480 - regression_loss: 1.5279 - classification_loss: 0.3201 435/500 [=========================>....] - ETA: 16s - loss: 1.8478 - regression_loss: 1.5277 - classification_loss: 0.3201 436/500 [=========================>....] 
- ETA: 16s - loss: 1.8493 - regression_loss: 1.5288 - classification_loss: 0.3205 437/500 [=========================>....] - ETA: 15s - loss: 1.8484 - regression_loss: 1.5283 - classification_loss: 0.3201 438/500 [=========================>....] - ETA: 15s - loss: 1.8494 - regression_loss: 1.5291 - classification_loss: 0.3203 439/500 [=========================>....] - ETA: 15s - loss: 1.8471 - regression_loss: 1.5273 - classification_loss: 0.3199 440/500 [=========================>....] - ETA: 15s - loss: 1.8465 - regression_loss: 1.5268 - classification_loss: 0.3197 441/500 [=========================>....] - ETA: 14s - loss: 1.8452 - regression_loss: 1.5258 - classification_loss: 0.3194 442/500 [=========================>....] - ETA: 14s - loss: 1.8443 - regression_loss: 1.5252 - classification_loss: 0.3191 443/500 [=========================>....] - ETA: 14s - loss: 1.8429 - regression_loss: 1.5243 - classification_loss: 0.3186 444/500 [=========================>....] - ETA: 14s - loss: 1.8439 - regression_loss: 1.5253 - classification_loss: 0.3186 445/500 [=========================>....] - ETA: 13s - loss: 1.8452 - regression_loss: 1.5265 - classification_loss: 0.3187 446/500 [=========================>....] - ETA: 13s - loss: 1.8446 - regression_loss: 1.5261 - classification_loss: 0.3185 447/500 [=========================>....] - ETA: 13s - loss: 1.8453 - regression_loss: 1.5267 - classification_loss: 0.3187 448/500 [=========================>....] - ETA: 13s - loss: 1.8460 - regression_loss: 1.5273 - classification_loss: 0.3188 449/500 [=========================>....] - ETA: 12s - loss: 1.8449 - regression_loss: 1.5265 - classification_loss: 0.3184 450/500 [==========================>...] - ETA: 12s - loss: 1.8457 - regression_loss: 1.5270 - classification_loss: 0.3187 451/500 [==========================>...] - ETA: 12s - loss: 1.8457 - regression_loss: 1.5269 - classification_loss: 0.3188 452/500 [==========================>...] 
- ETA: 12s - loss: 1.8460 - regression_loss: 1.5272 - classification_loss: 0.3187 453/500 [==========================>...] - ETA: 11s - loss: 1.8433 - regression_loss: 1.5250 - classification_loss: 0.3183 454/500 [==========================>...] - ETA: 11s - loss: 1.8431 - regression_loss: 1.5248 - classification_loss: 0.3183 455/500 [==========================>...] - ETA: 11s - loss: 1.8429 - regression_loss: 1.5249 - classification_loss: 0.3179 456/500 [==========================>...] - ETA: 10s - loss: 1.8399 - regression_loss: 1.5224 - classification_loss: 0.3175 457/500 [==========================>...] - ETA: 10s - loss: 1.8386 - regression_loss: 1.5214 - classification_loss: 0.3172 458/500 [==========================>...] - ETA: 10s - loss: 1.8418 - regression_loss: 1.5241 - classification_loss: 0.3178 459/500 [==========================>...] - ETA: 10s - loss: 1.8430 - regression_loss: 1.5242 - classification_loss: 0.3188 460/500 [==========================>...] - ETA: 9s - loss: 1.8451 - regression_loss: 1.5260 - classification_loss: 0.3191  461/500 [==========================>...] - ETA: 9s - loss: 1.8432 - regression_loss: 1.5245 - classification_loss: 0.3187 462/500 [==========================>...] - ETA: 9s - loss: 1.8437 - regression_loss: 1.5250 - classification_loss: 0.3188 463/500 [==========================>...] - ETA: 9s - loss: 1.8422 - regression_loss: 1.5237 - classification_loss: 0.3185 464/500 [==========================>...] - ETA: 8s - loss: 1.8413 - regression_loss: 1.5233 - classification_loss: 0.3180 465/500 [==========================>...] - ETA: 8s - loss: 1.8421 - regression_loss: 1.5242 - classification_loss: 0.3180 466/500 [==========================>...] - ETA: 8s - loss: 1.8418 - regression_loss: 1.5241 - classification_loss: 0.3177 467/500 [===========================>..] - ETA: 8s - loss: 1.8419 - regression_loss: 1.5241 - classification_loss: 0.3178 468/500 [===========================>..] 
- ETA: 7s - loss: 1.8420 - regression_loss: 1.5240 - classification_loss: 0.3180 469/500 [===========================>..] - ETA: 7s - loss: 1.8414 - regression_loss: 1.5236 - classification_loss: 0.3179 470/500 [===========================>..] - ETA: 7s - loss: 1.8417 - regression_loss: 1.5237 - classification_loss: 0.3180 471/500 [===========================>..] - ETA: 7s - loss: 1.8412 - regression_loss: 1.5235 - classification_loss: 0.3178 472/500 [===========================>..] - ETA: 6s - loss: 1.8410 - regression_loss: 1.5233 - classification_loss: 0.3177 473/500 [===========================>..] - ETA: 6s - loss: 1.8410 - regression_loss: 1.5233 - classification_loss: 0.3177 474/500 [===========================>..] - ETA: 6s - loss: 1.8393 - regression_loss: 1.5221 - classification_loss: 0.3172 475/500 [===========================>..] - ETA: 6s - loss: 1.8391 - regression_loss: 1.5221 - classification_loss: 0.3170 476/500 [===========================>..] - ETA: 5s - loss: 1.8405 - regression_loss: 1.5232 - classification_loss: 0.3173 477/500 [===========================>..] - ETA: 5s - loss: 1.8407 - regression_loss: 1.5235 - classification_loss: 0.3172 478/500 [===========================>..] - ETA: 5s - loss: 1.8410 - regression_loss: 1.5236 - classification_loss: 0.3175 479/500 [===========================>..] - ETA: 5s - loss: 1.8423 - regression_loss: 1.5247 - classification_loss: 0.3176 480/500 [===========================>..] - ETA: 4s - loss: 1.8408 - regression_loss: 1.5236 - classification_loss: 0.3172 481/500 [===========================>..] - ETA: 4s - loss: 1.8408 - regression_loss: 1.5237 - classification_loss: 0.3171 482/500 [===========================>..] - ETA: 4s - loss: 1.8409 - regression_loss: 1.5238 - classification_loss: 0.3171 483/500 [===========================>..] - ETA: 4s - loss: 1.8436 - regression_loss: 1.5207 - classification_loss: 0.3229 484/500 [============================>.] 
- ETA: 3s - loss: 1.8432 - regression_loss: 1.5205 - classification_loss: 0.3227 485/500 [============================>.] - ETA: 3s - loss: 1.8412 - regression_loss: 1.5189 - classification_loss: 0.3223 486/500 [============================>.] - ETA: 3s - loss: 1.8402 - regression_loss: 1.5182 - classification_loss: 0.3220 487/500 [============================>.] - ETA: 3s - loss: 1.8389 - regression_loss: 1.5173 - classification_loss: 0.3216 488/500 [============================>.] - ETA: 2s - loss: 1.8390 - regression_loss: 1.5175 - classification_loss: 0.3215 489/500 [============================>.] - ETA: 2s - loss: 1.8364 - regression_loss: 1.5154 - classification_loss: 0.3210 490/500 [============================>.] - ETA: 2s - loss: 1.8382 - regression_loss: 1.5167 - classification_loss: 0.3214 491/500 [============================>.] - ETA: 2s - loss: 1.8388 - regression_loss: 1.5171 - classification_loss: 0.3217 492/500 [============================>.] - ETA: 1s - loss: 1.8378 - regression_loss: 1.5164 - classification_loss: 0.3214 493/500 [============================>.] - ETA: 1s - loss: 1.8391 - regression_loss: 1.5176 - classification_loss: 0.3215 494/500 [============================>.] - ETA: 1s - loss: 1.8409 - regression_loss: 1.5191 - classification_loss: 0.3218 495/500 [============================>.] - ETA: 1s - loss: 1.8397 - regression_loss: 1.5183 - classification_loss: 0.3214 496/500 [============================>.] - ETA: 0s - loss: 1.8389 - regression_loss: 1.5177 - classification_loss: 0.3212 497/500 [============================>.] - ETA: 0s - loss: 1.8388 - regression_loss: 1.5176 - classification_loss: 0.3212 498/500 [============================>.] - ETA: 0s - loss: 1.8400 - regression_loss: 1.5187 - classification_loss: 0.3213 499/500 [============================>.] 
- ETA: 0s - loss: 1.8400 - regression_loss: 1.5189 - classification_loss: 0.3211 500/500 [==============================] - 125s 250ms/step - loss: 1.8407 - regression_loss: 1.5194 - classification_loss: 0.3212 326 instances of class plum with average precision: 0.7532 mAP: 0.7532 Epoch 00036: saving model to ./training/snapshots/resnet50_pascal_36.h5 Epoch 37/150 1/500 [..............................] - ETA: 1:53 - loss: 2.9064 - regression_loss: 2.4074 - classification_loss: 0.4991 2/500 [..............................] - ETA: 1:57 - loss: 2.3107 - regression_loss: 1.9470 - classification_loss: 0.3636 3/500 [..............................] - ETA: 1:57 - loss: 2.3239 - regression_loss: 1.9377 - classification_loss: 0.3862 4/500 [..............................] - ETA: 1:58 - loss: 2.1955 - regression_loss: 1.8285 - classification_loss: 0.3670 5/500 [..............................] - ETA: 1:59 - loss: 2.2870 - regression_loss: 1.8801 - classification_loss: 0.4070 6/500 [..............................] - ETA: 2:01 - loss: 2.3129 - regression_loss: 1.9002 - classification_loss: 0.4128 7/500 [..............................] - ETA: 2:01 - loss: 2.3002 - regression_loss: 1.8961 - classification_loss: 0.4041 8/500 [..............................] - ETA: 2:01 - loss: 2.3181 - regression_loss: 1.9167 - classification_loss: 0.4014 9/500 [..............................] - ETA: 2:01 - loss: 2.2356 - regression_loss: 1.8534 - classification_loss: 0.3822 10/500 [..............................] - ETA: 2:01 - loss: 2.2110 - regression_loss: 1.8347 - classification_loss: 0.3763 11/500 [..............................] - ETA: 2:00 - loss: 2.1478 - regression_loss: 1.7879 - classification_loss: 0.3599 12/500 [..............................] - ETA: 2:00 - loss: 2.1280 - regression_loss: 1.7702 - classification_loss: 0.3578 13/500 [..............................] - ETA: 2:00 - loss: 2.0639 - regression_loss: 1.7201 - classification_loss: 0.3438 14/500 [..............................] 
- ETA: 1:59 - loss: 2.0972 - regression_loss: 1.7418 - classification_loss: 0.3554 15/500 [..............................] - ETA: 1:59 - loss: 2.0993 - regression_loss: 1.7409 - classification_loss: 0.3584 16/500 [..............................] - ETA: 1:59 - loss: 2.0728 - regression_loss: 1.7171 - classification_loss: 0.3557 17/500 [>.............................] - ETA: 1:59 - loss: 2.0896 - regression_loss: 1.7306 - classification_loss: 0.3591 18/500 [>.............................] - ETA: 1:59 - loss: 2.1055 - regression_loss: 1.7430 - classification_loss: 0.3625 19/500 [>.............................] - ETA: 1:59 - loss: 2.1383 - regression_loss: 1.7633 - classification_loss: 0.3750 20/500 [>.............................] - ETA: 1:58 - loss: 2.1447 - regression_loss: 1.7630 - classification_loss: 0.3816 21/500 [>.............................] - ETA: 1:58 - loss: 2.1694 - regression_loss: 1.7886 - classification_loss: 0.3808 22/500 [>.............................] - ETA: 1:58 - loss: 2.1365 - regression_loss: 1.7645 - classification_loss: 0.3720 23/500 [>.............................] - ETA: 1:58 - loss: 2.0828 - regression_loss: 1.7220 - classification_loss: 0.3607 24/500 [>.............................] - ETA: 1:57 - loss: 2.0618 - regression_loss: 1.7065 - classification_loss: 0.3553 25/500 [>.............................] - ETA: 1:57 - loss: 2.0277 - regression_loss: 1.6778 - classification_loss: 0.3498 26/500 [>.............................] - ETA: 1:57 - loss: 2.0234 - regression_loss: 1.6767 - classification_loss: 0.3468 27/500 [>.............................] - ETA: 1:57 - loss: 2.0031 - regression_loss: 1.6624 - classification_loss: 0.3407 28/500 [>.............................] - ETA: 1:57 - loss: 1.9940 - regression_loss: 1.6531 - classification_loss: 0.3408 29/500 [>.............................] - ETA: 1:57 - loss: 1.9521 - regression_loss: 1.6205 - classification_loss: 0.3316 30/500 [>.............................] 
- ETA: 1:56 - loss: 1.9609 - regression_loss: 1.6313 - classification_loss: 0.3297 31/500 [>.............................] - ETA: 1:56 - loss: 1.9726 - regression_loss: 1.6406 - classification_loss: 0.3320 32/500 [>.............................] - ETA: 1:56 - loss: 1.9678 - regression_loss: 1.6339 - classification_loss: 0.3340 33/500 [>.............................] - ETA: 1:56 - loss: 1.9612 - regression_loss: 1.6285 - classification_loss: 0.3327 34/500 [=>............................] - ETA: 1:55 - loss: 1.9788 - regression_loss: 1.6408 - classification_loss: 0.3380 35/500 [=>............................] - ETA: 1:55 - loss: 1.9717 - regression_loss: 1.6354 - classification_loss: 0.3363 36/500 [=>............................] - ETA: 1:55 - loss: 1.9645 - regression_loss: 1.6310 - classification_loss: 0.3335 37/500 [=>............................] - ETA: 1:55 - loss: 1.9600 - regression_loss: 1.6292 - classification_loss: 0.3307 38/500 [=>............................] - ETA: 1:55 - loss: 1.9442 - regression_loss: 1.6165 - classification_loss: 0.3278 39/500 [=>............................] - ETA: 1:55 - loss: 1.9355 - regression_loss: 1.6093 - classification_loss: 0.3263 40/500 [=>............................] - ETA: 1:54 - loss: 1.9036 - regression_loss: 1.5830 - classification_loss: 0.3205 41/500 [=>............................] - ETA: 1:54 - loss: 1.9070 - regression_loss: 1.5852 - classification_loss: 0.3218 42/500 [=>............................] - ETA: 1:54 - loss: 1.9023 - regression_loss: 1.5823 - classification_loss: 0.3200 43/500 [=>............................] - ETA: 1:54 - loss: 1.9195 - regression_loss: 1.5951 - classification_loss: 0.3244 44/500 [=>............................] - ETA: 1:54 - loss: 1.9376 - regression_loss: 1.5589 - classification_loss: 0.3787 45/500 [=>............................] - ETA: 1:53 - loss: 1.9315 - regression_loss: 1.5560 - classification_loss: 0.3755 46/500 [=>............................] 
- ETA: 1:53 - loss: 1.9248 - regression_loss: 1.5536 - classification_loss: 0.3712 47/500 [=>............................] - ETA: 1:53 - loss: 1.9145 - regression_loss: 1.5475 - classification_loss: 0.3670 48/500 [=>............................] - ETA: 1:53 - loss: 1.9319 - regression_loss: 1.5646 - classification_loss: 0.3673 49/500 [=>............................] - ETA: 1:52 - loss: 1.9391 - regression_loss: 1.5735 - classification_loss: 0.3656 50/500 [==>...........................] - ETA: 1:52 - loss: 1.9397 - regression_loss: 1.5739 - classification_loss: 0.3658 51/500 [==>...........................] - ETA: 1:52 - loss: 1.9349 - regression_loss: 1.5717 - classification_loss: 0.3633 52/500 [==>...........................] - ETA: 1:52 - loss: 1.9400 - regression_loss: 1.5765 - classification_loss: 0.3635 53/500 [==>...........................] - ETA: 1:51 - loss: 1.9297 - regression_loss: 1.5677 - classification_loss: 0.3620 54/500 [==>...........................] - ETA: 1:51 - loss: 1.9201 - regression_loss: 1.5612 - classification_loss: 0.3589 55/500 [==>...........................] - ETA: 1:51 - loss: 1.9170 - regression_loss: 1.5603 - classification_loss: 0.3567 56/500 [==>...........................] - ETA: 1:51 - loss: 1.9102 - regression_loss: 1.5572 - classification_loss: 0.3531 57/500 [==>...........................] - ETA: 1:50 - loss: 1.9067 - regression_loss: 1.5542 - classification_loss: 0.3525 58/500 [==>...........................] - ETA: 1:50 - loss: 1.9120 - regression_loss: 1.5604 - classification_loss: 0.3516 59/500 [==>...........................] - ETA: 1:50 - loss: 1.9142 - regression_loss: 1.5628 - classification_loss: 0.3514 60/500 [==>...........................] - ETA: 1:50 - loss: 1.9107 - regression_loss: 1.5604 - classification_loss: 0.3503 61/500 [==>...........................] - ETA: 1:49 - loss: 1.9067 - regression_loss: 1.5576 - classification_loss: 0.3491 62/500 [==>...........................] 
- ETA: 1:49 - loss: 1.9223 - regression_loss: 1.5700 - classification_loss: 0.3523 63/500 [==>...........................] - ETA: 1:49 - loss: 1.9199 - regression_loss: 1.5686 - classification_loss: 0.3513 64/500 [==>...........................] - ETA: 1:49 - loss: 1.9009 - regression_loss: 1.5538 - classification_loss: 0.3471 65/500 [==>...........................] - ETA: 1:48 - loss: 1.8920 - regression_loss: 1.5477 - classification_loss: 0.3444 66/500 [==>...........................] - ETA: 1:48 - loss: 1.8915 - regression_loss: 1.5486 - classification_loss: 0.3429 67/500 [===>..........................] - ETA: 1:48 - loss: 1.8828 - regression_loss: 1.5425 - classification_loss: 0.3403 68/500 [===>..........................] - ETA: 1:48 - loss: 1.8853 - regression_loss: 1.5436 - classification_loss: 0.3416 69/500 [===>..........................] - ETA: 1:47 - loss: 1.8814 - regression_loss: 1.5413 - classification_loss: 0.3401 70/500 [===>..........................] - ETA: 1:47 - loss: 1.8816 - regression_loss: 1.5430 - classification_loss: 0.3387 71/500 [===>..........................] - ETA: 1:47 - loss: 1.8757 - regression_loss: 1.5390 - classification_loss: 0.3367 72/500 [===>..........................] - ETA: 1:47 - loss: 1.8652 - regression_loss: 1.5314 - classification_loss: 0.3337 73/500 [===>..........................] - ETA: 1:46 - loss: 1.8595 - regression_loss: 1.5266 - classification_loss: 0.3329 74/500 [===>..........................] - ETA: 1:46 - loss: 1.8557 - regression_loss: 1.5245 - classification_loss: 0.3312 75/500 [===>..........................] - ETA: 1:46 - loss: 1.8575 - regression_loss: 1.5251 - classification_loss: 0.3324 76/500 [===>..........................] - ETA: 1:46 - loss: 1.8519 - regression_loss: 1.5218 - classification_loss: 0.3302 77/500 [===>..........................] - ETA: 1:45 - loss: 1.8562 - regression_loss: 1.5268 - classification_loss: 0.3293 78/500 [===>..........................] 
- ETA: 1:45 - loss: 1.8469 - regression_loss: 1.5199 - classification_loss: 0.3270
 79/500 [===>..........................] - ETA: 1:45 - loss: 1.8445 - regression_loss: 1.5181 - classification_loss: 0.3265
[per-batch progress-bar updates for batches 80-412 condensed; sampled every 50 batches below]
100/500 [=====>........................] - ETA: 1:40 - loss: 1.8372 - regression_loss: 1.5184 - classification_loss: 0.3188
150/500 [========>.....................] - ETA: 1:27 - loss: 1.8730 - regression_loss: 1.5604 - classification_loss: 0.3126
200/500 [===========>..................] - ETA: 1:15 - loss: 1.8499 - regression_loss: 1.5446 - classification_loss: 0.3053
250/500 [==============>...............] - ETA: 1:02 - loss: 1.8392 - regression_loss: 1.5314 - classification_loss: 0.3078
300/500 [=================>............] - ETA: 50s - loss: 1.8282 - regression_loss: 1.5211 - classification_loss: 0.3072
350/500 [====================>.........] - ETA: 37s - loss: 1.8248 - regression_loss: 1.5195 - classification_loss: 0.3053
400/500 [=======================>......] - ETA: 25s - loss: 1.8187 - regression_loss: 1.5075 - classification_loss: 0.3111
413/500 [=======================>......] - ETA: 21s - loss: 1.8176 - regression_loss: 1.5071 - classification_loss: 0.3105
414/500 [=======================>......]
- ETA: 21s - loss: 1.8169 - regression_loss: 1.5066 - classification_loss: 0.3104 415/500 [=======================>......] - ETA: 21s - loss: 1.8185 - regression_loss: 1.5078 - classification_loss: 0.3107 416/500 [=======================>......] - ETA: 21s - loss: 1.8183 - regression_loss: 1.5076 - classification_loss: 0.3107 417/500 [========================>.....] - ETA: 20s - loss: 1.8184 - regression_loss: 1.5076 - classification_loss: 0.3108 418/500 [========================>.....] - ETA: 20s - loss: 1.8181 - regression_loss: 1.5073 - classification_loss: 0.3108 419/500 [========================>.....] - ETA: 20s - loss: 1.8182 - regression_loss: 1.5075 - classification_loss: 0.3107 420/500 [========================>.....] - ETA: 20s - loss: 1.8192 - regression_loss: 1.5085 - classification_loss: 0.3107 421/500 [========================>.....] - ETA: 19s - loss: 1.8190 - regression_loss: 1.5083 - classification_loss: 0.3107 422/500 [========================>.....] - ETA: 19s - loss: 1.8178 - regression_loss: 1.5074 - classification_loss: 0.3104 423/500 [========================>.....] - ETA: 19s - loss: 1.8153 - regression_loss: 1.5054 - classification_loss: 0.3099 424/500 [========================>.....] - ETA: 19s - loss: 1.8163 - regression_loss: 1.5063 - classification_loss: 0.3100 425/500 [========================>.....] - ETA: 18s - loss: 1.8176 - regression_loss: 1.5070 - classification_loss: 0.3106 426/500 [========================>.....] - ETA: 18s - loss: 1.8177 - regression_loss: 1.5067 - classification_loss: 0.3109 427/500 [========================>.....] - ETA: 18s - loss: 1.8151 - regression_loss: 1.5046 - classification_loss: 0.3105 428/500 [========================>.....] - ETA: 18s - loss: 1.8132 - regression_loss: 1.5032 - classification_loss: 0.3100 429/500 [========================>.....] - ETA: 17s - loss: 1.8134 - regression_loss: 1.5033 - classification_loss: 0.3101 430/500 [========================>.....] 
- ETA: 17s - loss: 1.8123 - regression_loss: 1.5024 - classification_loss: 0.3099 431/500 [========================>.....] - ETA: 17s - loss: 1.8119 - regression_loss: 1.5025 - classification_loss: 0.3095 432/500 [========================>.....] - ETA: 17s - loss: 1.8119 - regression_loss: 1.5027 - classification_loss: 0.3092 433/500 [========================>.....] - ETA: 16s - loss: 1.8125 - regression_loss: 1.5031 - classification_loss: 0.3094 434/500 [=========================>....] - ETA: 16s - loss: 1.8117 - regression_loss: 1.5024 - classification_loss: 0.3093 435/500 [=========================>....] - ETA: 16s - loss: 1.8112 - regression_loss: 1.5023 - classification_loss: 0.3089 436/500 [=========================>....] - ETA: 16s - loss: 1.8112 - regression_loss: 1.5022 - classification_loss: 0.3089 437/500 [=========================>....] - ETA: 15s - loss: 1.8117 - regression_loss: 1.5027 - classification_loss: 0.3090 438/500 [=========================>....] - ETA: 15s - loss: 1.8109 - regression_loss: 1.5021 - classification_loss: 0.3087 439/500 [=========================>....] - ETA: 15s - loss: 1.8085 - regression_loss: 1.5003 - classification_loss: 0.3082 440/500 [=========================>....] - ETA: 15s - loss: 1.8074 - regression_loss: 1.4994 - classification_loss: 0.3080 441/500 [=========================>....] - ETA: 14s - loss: 1.8082 - regression_loss: 1.5000 - classification_loss: 0.3082 442/500 [=========================>....] - ETA: 14s - loss: 1.8087 - regression_loss: 1.5004 - classification_loss: 0.3083 443/500 [=========================>....] - ETA: 14s - loss: 1.8089 - regression_loss: 1.5005 - classification_loss: 0.3083 444/500 [=========================>....] - ETA: 14s - loss: 1.8091 - regression_loss: 1.5007 - classification_loss: 0.3085 445/500 [=========================>....] - ETA: 13s - loss: 1.8067 - regression_loss: 1.4987 - classification_loss: 0.3080 446/500 [=========================>....] 
- ETA: 13s - loss: 1.8063 - regression_loss: 1.4984 - classification_loss: 0.3079 447/500 [=========================>....] - ETA: 13s - loss: 1.8060 - regression_loss: 1.4983 - classification_loss: 0.3077 448/500 [=========================>....] - ETA: 13s - loss: 1.8056 - regression_loss: 1.4982 - classification_loss: 0.3074 449/500 [=========================>....] - ETA: 12s - loss: 1.8063 - regression_loss: 1.4989 - classification_loss: 0.3075 450/500 [==========================>...] - ETA: 12s - loss: 1.8077 - regression_loss: 1.5001 - classification_loss: 0.3076 451/500 [==========================>...] - ETA: 12s - loss: 1.8066 - regression_loss: 1.4993 - classification_loss: 0.3072 452/500 [==========================>...] - ETA: 12s - loss: 1.8052 - regression_loss: 1.4982 - classification_loss: 0.3070 453/500 [==========================>...] - ETA: 11s - loss: 1.8057 - regression_loss: 1.4986 - classification_loss: 0.3070 454/500 [==========================>...] - ETA: 11s - loss: 1.8064 - regression_loss: 1.4991 - classification_loss: 0.3074 455/500 [==========================>...] - ETA: 11s - loss: 1.8058 - regression_loss: 1.4985 - classification_loss: 0.3073 456/500 [==========================>...] - ETA: 11s - loss: 1.8056 - regression_loss: 1.4983 - classification_loss: 0.3072 457/500 [==========================>...] - ETA: 10s - loss: 1.8042 - regression_loss: 1.4973 - classification_loss: 0.3069 458/500 [==========================>...] - ETA: 10s - loss: 1.8042 - regression_loss: 1.4975 - classification_loss: 0.3067 459/500 [==========================>...] - ETA: 10s - loss: 1.8036 - regression_loss: 1.4972 - classification_loss: 0.3063 460/500 [==========================>...] - ETA: 10s - loss: 1.8030 - regression_loss: 1.4966 - classification_loss: 0.3064 461/500 [==========================>...] - ETA: 9s - loss: 1.8041 - regression_loss: 1.4973 - classification_loss: 0.3068  462/500 [==========================>...] 
- ETA: 9s - loss: 1.8051 - regression_loss: 1.4982 - classification_loss: 0.3069 463/500 [==========================>...] - ETA: 9s - loss: 1.8070 - regression_loss: 1.4994 - classification_loss: 0.3076 464/500 [==========================>...] - ETA: 9s - loss: 1.8083 - regression_loss: 1.5002 - classification_loss: 0.3081 465/500 [==========================>...] - ETA: 8s - loss: 1.8067 - regression_loss: 1.4990 - classification_loss: 0.3078 466/500 [==========================>...] - ETA: 8s - loss: 1.8059 - regression_loss: 1.4984 - classification_loss: 0.3075 467/500 [===========================>..] - ETA: 8s - loss: 1.8073 - regression_loss: 1.4993 - classification_loss: 0.3079 468/500 [===========================>..] - ETA: 8s - loss: 1.8074 - regression_loss: 1.4994 - classification_loss: 0.3079 469/500 [===========================>..] - ETA: 7s - loss: 1.8084 - regression_loss: 1.5004 - classification_loss: 0.3080 470/500 [===========================>..] - ETA: 7s - loss: 1.8095 - regression_loss: 1.5010 - classification_loss: 0.3085 471/500 [===========================>..] - ETA: 7s - loss: 1.8107 - regression_loss: 1.5022 - classification_loss: 0.3085 472/500 [===========================>..] - ETA: 7s - loss: 1.8107 - regression_loss: 1.5023 - classification_loss: 0.3084 473/500 [===========================>..] - ETA: 6s - loss: 1.8125 - regression_loss: 1.5039 - classification_loss: 0.3086 474/500 [===========================>..] - ETA: 6s - loss: 1.8138 - regression_loss: 1.5047 - classification_loss: 0.3091 475/500 [===========================>..] - ETA: 6s - loss: 1.8159 - regression_loss: 1.5065 - classification_loss: 0.3094 476/500 [===========================>..] - ETA: 6s - loss: 1.8159 - regression_loss: 1.5066 - classification_loss: 0.3093 477/500 [===========================>..] - ETA: 5s - loss: 1.8157 - regression_loss: 1.5066 - classification_loss: 0.3090 478/500 [===========================>..] 
- ETA: 5s - loss: 1.8160 - regression_loss: 1.5070 - classification_loss: 0.3090 479/500 [===========================>..] - ETA: 5s - loss: 1.8152 - regression_loss: 1.5065 - classification_loss: 0.3088 480/500 [===========================>..] - ETA: 5s - loss: 1.8146 - regression_loss: 1.5060 - classification_loss: 0.3086 481/500 [===========================>..] - ETA: 4s - loss: 1.8157 - regression_loss: 1.5068 - classification_loss: 0.3089 482/500 [===========================>..] - ETA: 4s - loss: 1.8160 - regression_loss: 1.5066 - classification_loss: 0.3094 483/500 [===========================>..] - ETA: 4s - loss: 1.8160 - regression_loss: 1.5065 - classification_loss: 0.3095 484/500 [============================>.] - ETA: 3s - loss: 1.8172 - regression_loss: 1.5076 - classification_loss: 0.3096 485/500 [============================>.] - ETA: 3s - loss: 1.8165 - regression_loss: 1.5071 - classification_loss: 0.3094 486/500 [============================>.] - ETA: 3s - loss: 1.8159 - regression_loss: 1.5067 - classification_loss: 0.3092 487/500 [============================>.] - ETA: 3s - loss: 1.8165 - regression_loss: 1.5072 - classification_loss: 0.3094 488/500 [============================>.] - ETA: 2s - loss: 1.8181 - regression_loss: 1.5083 - classification_loss: 0.3098 489/500 [============================>.] - ETA: 2s - loss: 1.8176 - regression_loss: 1.5079 - classification_loss: 0.3097 490/500 [============================>.] - ETA: 2s - loss: 1.8174 - regression_loss: 1.5078 - classification_loss: 0.3095 491/500 [============================>.] - ETA: 2s - loss: 1.8169 - regression_loss: 1.5076 - classification_loss: 0.3093 492/500 [============================>.] - ETA: 1s - loss: 1.8179 - regression_loss: 1.5086 - classification_loss: 0.3093 493/500 [============================>.] - ETA: 1s - loss: 1.8184 - regression_loss: 1.5091 - classification_loss: 0.3094 494/500 [============================>.] 
- ETA: 1s - loss: 1.8178 - regression_loss: 1.5086 - classification_loss: 0.3092 495/500 [============================>.] - ETA: 1s - loss: 1.8171 - regression_loss: 1.5083 - classification_loss: 0.3089 496/500 [============================>.] - ETA: 0s - loss: 1.8173 - regression_loss: 1.5084 - classification_loss: 0.3088 497/500 [============================>.] - ETA: 0s - loss: 1.8157 - regression_loss: 1.5072 - classification_loss: 0.3085 498/500 [============================>.] - ETA: 0s - loss: 1.8166 - regression_loss: 1.5077 - classification_loss: 0.3088 499/500 [============================>.] - ETA: 0s - loss: 1.8162 - regression_loss: 1.5075 - classification_loss: 0.3087 500/500 [==============================] - 125s 250ms/step - loss: 1.8170 - regression_loss: 1.5082 - classification_loss: 0.3089 326 instances of class plum with average precision: 0.7558 mAP: 0.7558 Epoch 00037: saving model to ./training/snapshots/resnet50_pascal_37.h5 Epoch 38/150 1/500 [..............................] - ETA: 1:51 - loss: 2.3884 - regression_loss: 1.8047 - classification_loss: 0.5836 2/500 [..............................] - ETA: 1:54 - loss: 2.1864 - regression_loss: 1.8208 - classification_loss: 0.3656 3/500 [..............................] - ETA: 1:58 - loss: 1.8541 - regression_loss: 1.5419 - classification_loss: 0.3122 4/500 [..............................] - ETA: 2:01 - loss: 1.8410 - regression_loss: 1.5458 - classification_loss: 0.2952 5/500 [..............................] - ETA: 2:02 - loss: 2.1033 - regression_loss: 1.7740 - classification_loss: 0.3293 6/500 [..............................] - ETA: 2:01 - loss: 2.0779 - regression_loss: 1.7642 - classification_loss: 0.3137 7/500 [..............................] - ETA: 2:00 - loss: 1.9973 - regression_loss: 1.7033 - classification_loss: 0.2939 8/500 [..............................] - ETA: 2:00 - loss: 1.8775 - regression_loss: 1.6049 - classification_loss: 0.2726 9/500 [..............................] 
[... per-batch progress updates for epoch 38, batches 10-185, trimmed (loss hovered around 1.75-1.83) ...]
- ETA: 1:18 - loss: 1.8338 - regression_loss: 1.5333 - classification_loss: 0.3006 186/500 [==========>...................] - ETA: 1:18 - loss: 1.8306 - regression_loss: 1.5307 - classification_loss: 0.2999 187/500 [==========>...................] - ETA: 1:18 - loss: 1.8292 - regression_loss: 1.5298 - classification_loss: 0.2994 188/500 [==========>...................] - ETA: 1:18 - loss: 1.8309 - regression_loss: 1.5313 - classification_loss: 0.2996 189/500 [==========>...................] - ETA: 1:18 - loss: 1.8317 - regression_loss: 1.5320 - classification_loss: 0.2997 190/500 [==========>...................] - ETA: 1:17 - loss: 1.8314 - regression_loss: 1.5318 - classification_loss: 0.2996 191/500 [==========>...................] - ETA: 1:17 - loss: 1.8248 - regression_loss: 1.5261 - classification_loss: 0.2987 192/500 [==========>...................] - ETA: 1:17 - loss: 1.8177 - regression_loss: 1.5201 - classification_loss: 0.2976 193/500 [==========>...................] - ETA: 1:17 - loss: 1.8173 - regression_loss: 1.5197 - classification_loss: 0.2976 194/500 [==========>...................] - ETA: 1:16 - loss: 1.8170 - regression_loss: 1.5200 - classification_loss: 0.2970 195/500 [==========>...................] - ETA: 1:16 - loss: 1.8187 - regression_loss: 1.5214 - classification_loss: 0.2974 196/500 [==========>...................] - ETA: 1:16 - loss: 1.8166 - regression_loss: 1.5198 - classification_loss: 0.2968 197/500 [==========>...................] - ETA: 1:16 - loss: 1.8147 - regression_loss: 1.5186 - classification_loss: 0.2961 198/500 [==========>...................] - ETA: 1:15 - loss: 1.8101 - regression_loss: 1.5149 - classification_loss: 0.2953 199/500 [==========>...................] - ETA: 1:15 - loss: 1.8127 - regression_loss: 1.5172 - classification_loss: 0.2955 200/500 [===========>..................] - ETA: 1:15 - loss: 1.8141 - regression_loss: 1.5189 - classification_loss: 0.2952 201/500 [===========>..................] 
- ETA: 1:15 - loss: 1.8093 - regression_loss: 1.5147 - classification_loss: 0.2946 202/500 [===========>..................] - ETA: 1:14 - loss: 1.8078 - regression_loss: 1.5136 - classification_loss: 0.2941 203/500 [===========>..................] - ETA: 1:14 - loss: 1.8037 - regression_loss: 1.5102 - classification_loss: 0.2935 204/500 [===========>..................] - ETA: 1:14 - loss: 1.8019 - regression_loss: 1.5085 - classification_loss: 0.2935 205/500 [===========>..................] - ETA: 1:14 - loss: 1.8054 - regression_loss: 1.5110 - classification_loss: 0.2944 206/500 [===========>..................] - ETA: 1:13 - loss: 1.8047 - regression_loss: 1.5106 - classification_loss: 0.2941 207/500 [===========>..................] - ETA: 1:13 - loss: 1.8061 - regression_loss: 1.5117 - classification_loss: 0.2944 208/500 [===========>..................] - ETA: 1:13 - loss: 1.8040 - regression_loss: 1.5101 - classification_loss: 0.2940 209/500 [===========>..................] - ETA: 1:13 - loss: 1.8052 - regression_loss: 1.5112 - classification_loss: 0.2940 210/500 [===========>..................] - ETA: 1:12 - loss: 1.8024 - regression_loss: 1.5088 - classification_loss: 0.2935 211/500 [===========>..................] - ETA: 1:12 - loss: 1.7990 - regression_loss: 1.5063 - classification_loss: 0.2927 212/500 [===========>..................] - ETA: 1:12 - loss: 1.7962 - regression_loss: 1.5043 - classification_loss: 0.2919 213/500 [===========>..................] - ETA: 1:12 - loss: 1.7953 - regression_loss: 1.5038 - classification_loss: 0.2914 214/500 [===========>..................] - ETA: 1:11 - loss: 1.7925 - regression_loss: 1.5017 - classification_loss: 0.2907 215/500 [===========>..................] - ETA: 1:11 - loss: 1.7940 - regression_loss: 1.5031 - classification_loss: 0.2909 216/500 [===========>..................] - ETA: 1:11 - loss: 1.7952 - regression_loss: 1.5043 - classification_loss: 0.2909 217/500 [============>.................] 
- ETA: 1:11 - loss: 1.7959 - regression_loss: 1.5048 - classification_loss: 0.2911 218/500 [============>.................] - ETA: 1:10 - loss: 1.7952 - regression_loss: 1.5043 - classification_loss: 0.2908 219/500 [============>.................] - ETA: 1:10 - loss: 1.7927 - regression_loss: 1.5026 - classification_loss: 0.2901 220/500 [============>.................] - ETA: 1:10 - loss: 1.7892 - regression_loss: 1.4999 - classification_loss: 0.2893 221/500 [============>.................] - ETA: 1:10 - loss: 1.7898 - regression_loss: 1.5002 - classification_loss: 0.2895 222/500 [============>.................] - ETA: 1:09 - loss: 1.7905 - regression_loss: 1.5007 - classification_loss: 0.2898 223/500 [============>.................] - ETA: 1:09 - loss: 1.7893 - regression_loss: 1.5000 - classification_loss: 0.2893 224/500 [============>.................] - ETA: 1:09 - loss: 1.7877 - regression_loss: 1.4988 - classification_loss: 0.2889 225/500 [============>.................] - ETA: 1:09 - loss: 1.7871 - regression_loss: 1.4984 - classification_loss: 0.2886 226/500 [============>.................] - ETA: 1:08 - loss: 1.7820 - regression_loss: 1.4942 - classification_loss: 0.2879 227/500 [============>.................] - ETA: 1:08 - loss: 1.7821 - regression_loss: 1.4946 - classification_loss: 0.2876 228/500 [============>.................] - ETA: 1:08 - loss: 1.7825 - regression_loss: 1.4949 - classification_loss: 0.2875 229/500 [============>.................] - ETA: 1:08 - loss: 1.7817 - regression_loss: 1.4947 - classification_loss: 0.2869 230/500 [============>.................] - ETA: 1:07 - loss: 1.7838 - regression_loss: 1.4969 - classification_loss: 0.2870 231/500 [============>.................] - ETA: 1:07 - loss: 1.7855 - regression_loss: 1.4982 - classification_loss: 0.2873 232/500 [============>.................] - ETA: 1:07 - loss: 1.7853 - regression_loss: 1.4981 - classification_loss: 0.2871 233/500 [============>.................] 
- ETA: 1:07 - loss: 1.7852 - regression_loss: 1.4980 - classification_loss: 0.2873 234/500 [=============>................] - ETA: 1:06 - loss: 1.7867 - regression_loss: 1.4996 - classification_loss: 0.2871 235/500 [=============>................] - ETA: 1:06 - loss: 1.7824 - regression_loss: 1.4962 - classification_loss: 0.2862 236/500 [=============>................] - ETA: 1:06 - loss: 1.7866 - regression_loss: 1.4998 - classification_loss: 0.2868 237/500 [=============>................] - ETA: 1:06 - loss: 1.7870 - regression_loss: 1.5002 - classification_loss: 0.2868 238/500 [=============>................] - ETA: 1:05 - loss: 1.7862 - regression_loss: 1.4997 - classification_loss: 0.2865 239/500 [=============>................] - ETA: 1:05 - loss: 1.7836 - regression_loss: 1.4979 - classification_loss: 0.2857 240/500 [=============>................] - ETA: 1:05 - loss: 1.7815 - regression_loss: 1.4955 - classification_loss: 0.2860 241/500 [=============>................] - ETA: 1:05 - loss: 1.7832 - regression_loss: 1.4968 - classification_loss: 0.2864 242/500 [=============>................] - ETA: 1:04 - loss: 1.7830 - regression_loss: 1.4971 - classification_loss: 0.2859 243/500 [=============>................] - ETA: 1:04 - loss: 1.7815 - regression_loss: 1.4961 - classification_loss: 0.2855 244/500 [=============>................] - ETA: 1:04 - loss: 1.7798 - regression_loss: 1.4949 - classification_loss: 0.2849 245/500 [=============>................] - ETA: 1:04 - loss: 1.7769 - regression_loss: 1.4918 - classification_loss: 0.2851 246/500 [=============>................] - ETA: 1:03 - loss: 1.7789 - regression_loss: 1.4935 - classification_loss: 0.2853 247/500 [=============>................] - ETA: 1:03 - loss: 1.7766 - regression_loss: 1.4918 - classification_loss: 0.2848 248/500 [=============>................] - ETA: 1:03 - loss: 1.7760 - regression_loss: 1.4917 - classification_loss: 0.2844 249/500 [=============>................] 
- ETA: 1:03 - loss: 1.7781 - regression_loss: 1.4934 - classification_loss: 0.2847 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7779 - regression_loss: 1.4932 - classification_loss: 0.2847 251/500 [==============>...............] - ETA: 1:02 - loss: 1.7778 - regression_loss: 1.4933 - classification_loss: 0.2845 252/500 [==============>...............] - ETA: 1:02 - loss: 1.7765 - regression_loss: 1.4923 - classification_loss: 0.2842 253/500 [==============>...............] - ETA: 1:02 - loss: 1.7758 - regression_loss: 1.4915 - classification_loss: 0.2844 254/500 [==============>...............] - ETA: 1:01 - loss: 1.7738 - regression_loss: 1.4898 - classification_loss: 0.2840 255/500 [==============>...............] - ETA: 1:01 - loss: 1.7751 - regression_loss: 1.4907 - classification_loss: 0.2844 256/500 [==============>...............] - ETA: 1:01 - loss: 1.7733 - regression_loss: 1.4892 - classification_loss: 0.2841 257/500 [==============>...............] - ETA: 1:01 - loss: 1.7727 - regression_loss: 1.4888 - classification_loss: 0.2838 258/500 [==============>...............] - ETA: 1:00 - loss: 1.7718 - regression_loss: 1.4887 - classification_loss: 0.2831 259/500 [==============>...............] - ETA: 1:00 - loss: 1.7738 - regression_loss: 1.4897 - classification_loss: 0.2841 260/500 [==============>...............] - ETA: 1:00 - loss: 1.7713 - regression_loss: 1.4873 - classification_loss: 0.2839 261/500 [==============>...............] - ETA: 1:00 - loss: 1.7728 - regression_loss: 1.4882 - classification_loss: 0.2846 262/500 [==============>...............] - ETA: 59s - loss: 1.7735 - regression_loss: 1.4889 - classification_loss: 0.2846  263/500 [==============>...............] - ETA: 59s - loss: 1.7759 - regression_loss: 1.4908 - classification_loss: 0.2850 264/500 [==============>...............] - ETA: 59s - loss: 1.7774 - regression_loss: 1.4918 - classification_loss: 0.2856 265/500 [==============>...............] 
- ETA: 59s - loss: 1.7799 - regression_loss: 1.4940 - classification_loss: 0.2859 266/500 [==============>...............] - ETA: 58s - loss: 1.7786 - regression_loss: 1.4931 - classification_loss: 0.2854 267/500 [===============>..............] - ETA: 58s - loss: 1.7796 - regression_loss: 1.4942 - classification_loss: 0.2854 268/500 [===============>..............] - ETA: 58s - loss: 1.7784 - regression_loss: 1.4933 - classification_loss: 0.2851 269/500 [===============>..............] - ETA: 57s - loss: 1.7773 - regression_loss: 1.4924 - classification_loss: 0.2848 270/500 [===============>..............] - ETA: 57s - loss: 1.7813 - regression_loss: 1.4958 - classification_loss: 0.2855 271/500 [===============>..............] - ETA: 57s - loss: 1.7845 - regression_loss: 1.4984 - classification_loss: 0.2861 272/500 [===============>..............] - ETA: 57s - loss: 1.7835 - regression_loss: 1.4977 - classification_loss: 0.2857 273/500 [===============>..............] - ETA: 56s - loss: 1.7874 - regression_loss: 1.5017 - classification_loss: 0.2857 274/500 [===============>..............] - ETA: 56s - loss: 1.7857 - regression_loss: 1.5004 - classification_loss: 0.2853 275/500 [===============>..............] - ETA: 56s - loss: 1.7842 - regression_loss: 1.4994 - classification_loss: 0.2848 276/500 [===============>..............] - ETA: 56s - loss: 1.7845 - regression_loss: 1.4996 - classification_loss: 0.2849 277/500 [===============>..............] - ETA: 55s - loss: 1.7867 - regression_loss: 1.5011 - classification_loss: 0.2856 278/500 [===============>..............] - ETA: 55s - loss: 1.7866 - regression_loss: 1.5008 - classification_loss: 0.2858 279/500 [===============>..............] - ETA: 55s - loss: 1.7873 - regression_loss: 1.5014 - classification_loss: 0.2859 280/500 [===============>..............] - ETA: 55s - loss: 1.7868 - regression_loss: 1.5013 - classification_loss: 0.2855 281/500 [===============>..............] 
- ETA: 54s - loss: 1.7855 - regression_loss: 1.5004 - classification_loss: 0.2850 282/500 [===============>..............] - ETA: 54s - loss: 1.7863 - regression_loss: 1.5013 - classification_loss: 0.2851 283/500 [===============>..............] - ETA: 54s - loss: 1.7868 - regression_loss: 1.5017 - classification_loss: 0.2851 284/500 [================>.............] - ETA: 54s - loss: 1.7877 - regression_loss: 1.5023 - classification_loss: 0.2855 285/500 [================>.............] - ETA: 53s - loss: 1.7866 - regression_loss: 1.5015 - classification_loss: 0.2851 286/500 [================>.............] - ETA: 53s - loss: 1.7854 - regression_loss: 1.5005 - classification_loss: 0.2848 287/500 [================>.............] - ETA: 53s - loss: 1.7877 - regression_loss: 1.5020 - classification_loss: 0.2856 288/500 [================>.............] - ETA: 53s - loss: 1.7889 - regression_loss: 1.5032 - classification_loss: 0.2858 289/500 [================>.............] - ETA: 52s - loss: 1.7872 - regression_loss: 1.5017 - classification_loss: 0.2856 290/500 [================>.............] - ETA: 52s - loss: 1.7874 - regression_loss: 1.5023 - classification_loss: 0.2852 291/500 [================>.............] - ETA: 52s - loss: 1.7879 - regression_loss: 1.5022 - classification_loss: 0.2857 292/500 [================>.............] - ETA: 52s - loss: 1.7845 - regression_loss: 1.4993 - classification_loss: 0.2852 293/500 [================>.............] - ETA: 51s - loss: 1.7833 - regression_loss: 1.4980 - classification_loss: 0.2852 294/500 [================>.............] - ETA: 51s - loss: 1.7868 - regression_loss: 1.5012 - classification_loss: 0.2857 295/500 [================>.............] - ETA: 51s - loss: 1.7851 - regression_loss: 1.4999 - classification_loss: 0.2852 296/500 [================>.............] - ETA: 51s - loss: 1.7847 - regression_loss: 1.4995 - classification_loss: 0.2852 297/500 [================>.............] 
- ETA: 50s - loss: 1.7853 - regression_loss: 1.5002 - classification_loss: 0.2851 298/500 [================>.............] - ETA: 50s - loss: 1.7848 - regression_loss: 1.4999 - classification_loss: 0.2850 299/500 [================>.............] - ETA: 50s - loss: 1.7828 - regression_loss: 1.4983 - classification_loss: 0.2845 300/500 [=================>............] - ETA: 50s - loss: 1.7855 - regression_loss: 1.4999 - classification_loss: 0.2856 301/500 [=================>............] - ETA: 49s - loss: 1.7859 - regression_loss: 1.5001 - classification_loss: 0.2858 302/500 [=================>............] - ETA: 49s - loss: 1.7828 - regression_loss: 1.4977 - classification_loss: 0.2852 303/500 [=================>............] - ETA: 49s - loss: 1.7870 - regression_loss: 1.4997 - classification_loss: 0.2873 304/500 [=================>............] - ETA: 49s - loss: 1.7867 - regression_loss: 1.4996 - classification_loss: 0.2872 305/500 [=================>............] - ETA: 48s - loss: 1.7861 - regression_loss: 1.4991 - classification_loss: 0.2870 306/500 [=================>............] - ETA: 48s - loss: 1.7867 - regression_loss: 1.4996 - classification_loss: 0.2872 307/500 [=================>............] - ETA: 48s - loss: 1.7848 - regression_loss: 1.4981 - classification_loss: 0.2867 308/500 [=================>............] - ETA: 48s - loss: 1.7864 - regression_loss: 1.4995 - classification_loss: 0.2869 309/500 [=================>............] - ETA: 47s - loss: 1.7871 - regression_loss: 1.4999 - classification_loss: 0.2871 310/500 [=================>............] - ETA: 47s - loss: 1.7870 - regression_loss: 1.4999 - classification_loss: 0.2871 311/500 [=================>............] - ETA: 47s - loss: 1.7863 - regression_loss: 1.4995 - classification_loss: 0.2869 312/500 [=================>............] - ETA: 47s - loss: 1.7869 - regression_loss: 1.5000 - classification_loss: 0.2869 313/500 [=================>............] 
- ETA: 46s - loss: 1.7878 - regression_loss: 1.5007 - classification_loss: 0.2871 314/500 [=================>............] - ETA: 46s - loss: 1.7880 - regression_loss: 1.5008 - classification_loss: 0.2872 315/500 [=================>............] - ETA: 46s - loss: 1.7894 - regression_loss: 1.5016 - classification_loss: 0.2878 316/500 [=================>............] - ETA: 46s - loss: 1.7901 - regression_loss: 1.5019 - classification_loss: 0.2881 317/500 [==================>...........] - ETA: 45s - loss: 1.7894 - regression_loss: 1.5014 - classification_loss: 0.2880 318/500 [==================>...........] - ETA: 45s - loss: 1.7936 - regression_loss: 1.5039 - classification_loss: 0.2897 319/500 [==================>...........] - ETA: 45s - loss: 1.7930 - regression_loss: 1.5033 - classification_loss: 0.2898 320/500 [==================>...........] - ETA: 45s - loss: 1.7951 - regression_loss: 1.5049 - classification_loss: 0.2903 321/500 [==================>...........] - ETA: 44s - loss: 1.7947 - regression_loss: 1.5046 - classification_loss: 0.2902 322/500 [==================>...........] - ETA: 44s - loss: 1.7926 - regression_loss: 1.5028 - classification_loss: 0.2897 323/500 [==================>...........] - ETA: 44s - loss: 1.7932 - regression_loss: 1.5034 - classification_loss: 0.2899 324/500 [==================>...........] - ETA: 44s - loss: 1.7931 - regression_loss: 1.5026 - classification_loss: 0.2905 325/500 [==================>...........] - ETA: 43s - loss: 1.7921 - regression_loss: 1.5019 - classification_loss: 0.2902 326/500 [==================>...........] - ETA: 43s - loss: 1.7937 - regression_loss: 1.5029 - classification_loss: 0.2908 327/500 [==================>...........] - ETA: 43s - loss: 1.7928 - regression_loss: 1.5022 - classification_loss: 0.2905 328/500 [==================>...........] - ETA: 43s - loss: 1.7935 - regression_loss: 1.5029 - classification_loss: 0.2906 329/500 [==================>...........] 
- ETA: 42s - loss: 1.7935 - regression_loss: 1.5028 - classification_loss: 0.2907 330/500 [==================>...........] - ETA: 42s - loss: 1.7932 - regression_loss: 1.5020 - classification_loss: 0.2912 331/500 [==================>...........] - ETA: 42s - loss: 1.7921 - regression_loss: 1.5012 - classification_loss: 0.2909 332/500 [==================>...........] - ETA: 42s - loss: 1.7935 - regression_loss: 1.5021 - classification_loss: 0.2914 333/500 [==================>...........] - ETA: 41s - loss: 1.7927 - regression_loss: 1.5014 - classification_loss: 0.2913 334/500 [===================>..........] - ETA: 41s - loss: 1.7937 - regression_loss: 1.5023 - classification_loss: 0.2913 335/500 [===================>..........] - ETA: 41s - loss: 1.7941 - regression_loss: 1.5026 - classification_loss: 0.2915 336/500 [===================>..........] - ETA: 41s - loss: 1.7977 - regression_loss: 1.5045 - classification_loss: 0.2932 337/500 [===================>..........] - ETA: 40s - loss: 1.7997 - regression_loss: 1.5059 - classification_loss: 0.2938 338/500 [===================>..........] - ETA: 40s - loss: 1.8004 - regression_loss: 1.5064 - classification_loss: 0.2941 339/500 [===================>..........] - ETA: 40s - loss: 1.8002 - regression_loss: 1.5063 - classification_loss: 0.2939 340/500 [===================>..........] - ETA: 40s - loss: 1.8032 - regression_loss: 1.5086 - classification_loss: 0.2946 341/500 [===================>..........] - ETA: 39s - loss: 1.8036 - regression_loss: 1.5088 - classification_loss: 0.2948 342/500 [===================>..........] - ETA: 39s - loss: 1.8050 - regression_loss: 1.5101 - classification_loss: 0.2949 343/500 [===================>..........] - ETA: 39s - loss: 1.8043 - regression_loss: 1.5095 - classification_loss: 0.2948 344/500 [===================>..........] - ETA: 39s - loss: 1.8159 - regression_loss: 1.5051 - classification_loss: 0.3108 345/500 [===================>..........] 
- ETA: 38s - loss: 1.8189 - regression_loss: 1.5075 - classification_loss: 0.3114 346/500 [===================>..........] - ETA: 38s - loss: 1.8191 - regression_loss: 1.5077 - classification_loss: 0.3115 347/500 [===================>..........] - ETA: 38s - loss: 1.8224 - regression_loss: 1.5105 - classification_loss: 0.3119 348/500 [===================>..........] - ETA: 38s - loss: 1.8251 - regression_loss: 1.5127 - classification_loss: 0.3124 349/500 [===================>..........] - ETA: 37s - loss: 1.8242 - regression_loss: 1.5121 - classification_loss: 0.3121 350/500 [====================>.........] - ETA: 37s - loss: 1.8234 - regression_loss: 1.5118 - classification_loss: 0.3116 351/500 [====================>.........] - ETA: 37s - loss: 1.8221 - regression_loss: 1.5108 - classification_loss: 0.3113 352/500 [====================>.........] - ETA: 37s - loss: 1.8237 - regression_loss: 1.5122 - classification_loss: 0.3115 353/500 [====================>.........] - ETA: 36s - loss: 1.8231 - regression_loss: 1.5119 - classification_loss: 0.3112 354/500 [====================>.........] - ETA: 36s - loss: 1.8222 - regression_loss: 1.5111 - classification_loss: 0.3110 355/500 [====================>.........] - ETA: 36s - loss: 1.8211 - regression_loss: 1.5105 - classification_loss: 0.3106 356/500 [====================>.........] - ETA: 36s - loss: 1.8216 - regression_loss: 1.5109 - classification_loss: 0.3107 357/500 [====================>.........] - ETA: 35s - loss: 1.8209 - regression_loss: 1.5105 - classification_loss: 0.3104 358/500 [====================>.........] - ETA: 35s - loss: 1.8205 - regression_loss: 1.5100 - classification_loss: 0.3104 359/500 [====================>.........] - ETA: 35s - loss: 1.8210 - regression_loss: 1.5105 - classification_loss: 0.3105 360/500 [====================>.........] - ETA: 35s - loss: 1.8228 - regression_loss: 1.5118 - classification_loss: 0.3111 361/500 [====================>.........] 
- ETA: 34s - loss: 1.8222 - regression_loss: 1.5115 - classification_loss: 0.3107 362/500 [====================>.........] - ETA: 34s - loss: 1.8228 - regression_loss: 1.5119 - classification_loss: 0.3109 363/500 [====================>.........] - ETA: 34s - loss: 1.8228 - regression_loss: 1.5120 - classification_loss: 0.3108 364/500 [====================>.........] - ETA: 34s - loss: 1.8216 - regression_loss: 1.5111 - classification_loss: 0.3105 365/500 [====================>.........] - ETA: 33s - loss: 1.8209 - regression_loss: 1.5101 - classification_loss: 0.3108 366/500 [====================>.........] - ETA: 33s - loss: 1.8209 - regression_loss: 1.5102 - classification_loss: 0.3107 367/500 [=====================>........] - ETA: 33s - loss: 1.8199 - regression_loss: 1.5096 - classification_loss: 0.3102 368/500 [=====================>........] - ETA: 33s - loss: 1.8220 - regression_loss: 1.5108 - classification_loss: 0.3112 369/500 [=====================>........] - ETA: 32s - loss: 1.8234 - regression_loss: 1.5113 - classification_loss: 0.3121 370/500 [=====================>........] - ETA: 32s - loss: 1.8223 - regression_loss: 1.5104 - classification_loss: 0.3120 371/500 [=====================>........] - ETA: 32s - loss: 1.8211 - regression_loss: 1.5095 - classification_loss: 0.3116 372/500 [=====================>........] - ETA: 32s - loss: 1.8200 - regression_loss: 1.5090 - classification_loss: 0.3111 373/500 [=====================>........] - ETA: 31s - loss: 1.8193 - regression_loss: 1.5082 - classification_loss: 0.3111 374/500 [=====================>........] - ETA: 31s - loss: 1.8176 - regression_loss: 1.5068 - classification_loss: 0.3108 375/500 [=====================>........] - ETA: 31s - loss: 1.8187 - regression_loss: 1.5077 - classification_loss: 0.3110 376/500 [=====================>........] - ETA: 31s - loss: 1.8192 - regression_loss: 1.5081 - classification_loss: 0.3112 377/500 [=====================>........] 
- ETA: 30s - loss: 1.8192 - regression_loss: 1.5082 - classification_loss: 0.3110 378/500 [=====================>........] - ETA: 30s - loss: 1.8192 - regression_loss: 1.5082 - classification_loss: 0.3111 379/500 [=====================>........] - ETA: 30s - loss: 1.8213 - regression_loss: 1.5098 - classification_loss: 0.3115 380/500 [=====================>........] - ETA: 30s - loss: 1.8245 - regression_loss: 1.5105 - classification_loss: 0.3139 381/500 [=====================>........] - ETA: 29s - loss: 1.8231 - regression_loss: 1.5095 - classification_loss: 0.3136 382/500 [=====================>........] - ETA: 29s - loss: 1.8222 - regression_loss: 1.5089 - classification_loss: 0.3134 383/500 [=====================>........] - ETA: 29s - loss: 1.8214 - regression_loss: 1.5083 - classification_loss: 0.3131 384/500 [======================>.......] - ETA: 29s - loss: 1.8207 - regression_loss: 1.5079 - classification_loss: 0.3128 385/500 [======================>.......] - ETA: 28s - loss: 1.8214 - regression_loss: 1.5085 - classification_loss: 0.3129 386/500 [======================>.......] - ETA: 28s - loss: 1.8218 - regression_loss: 1.5091 - classification_loss: 0.3127 387/500 [======================>.......] - ETA: 28s - loss: 1.8203 - regression_loss: 1.5052 - classification_loss: 0.3150 388/500 [======================>.......] - ETA: 28s - loss: 1.8204 - regression_loss: 1.5054 - classification_loss: 0.3150 389/500 [======================>.......] - ETA: 27s - loss: 1.8202 - regression_loss: 1.5053 - classification_loss: 0.3149 390/500 [======================>.......] - ETA: 27s - loss: 1.8209 - regression_loss: 1.5059 - classification_loss: 0.3150 391/500 [======================>.......] - ETA: 27s - loss: 1.8201 - regression_loss: 1.5054 - classification_loss: 0.3147 392/500 [======================>.......] - ETA: 27s - loss: 1.8188 - regression_loss: 1.5044 - classification_loss: 0.3144 393/500 [======================>.......] 
[per-batch progress updates for steps 394-499 of epoch 38 trimmed; loss held near 1.80-1.82, regression_loss ~1.50, classification_loss ~0.31]
500/500 [==============================] - 125s 250ms/step - loss: 1.8058 - regression_loss: 1.4970 - classification_loss: 0.3089
326 instances of class plum with average precision: 0.7487
mAP: 0.7487
Epoch 00038: saving model to ./training/snapshots/resnet50_pascal_38.h5
Epoch 39/150
[per-batch progress updates for steps 1-228 of epoch 39 trimmed; loss settled near 1.78, regression_loss ~1.49, classification_loss ~0.30]
- ETA: 1:07 - loss: 1.7824 - regression_loss: 1.4860 - classification_loss: 0.2964 229/500 [============>.................] - ETA: 1:07 - loss: 1.7829 - regression_loss: 1.4865 - classification_loss: 0.2964 230/500 [============>.................] - ETA: 1:07 - loss: 1.7822 - regression_loss: 1.4801 - classification_loss: 0.3021 231/500 [============>.................] - ETA: 1:07 - loss: 1.7851 - regression_loss: 1.4826 - classification_loss: 0.3025 232/500 [============>.................] - ETA: 1:06 - loss: 1.7838 - regression_loss: 1.4817 - classification_loss: 0.3022 233/500 [============>.................] - ETA: 1:06 - loss: 1.7858 - regression_loss: 1.4830 - classification_loss: 0.3028 234/500 [=============>................] - ETA: 1:06 - loss: 1.7827 - regression_loss: 1.4806 - classification_loss: 0.3021 235/500 [=============>................] - ETA: 1:06 - loss: 1.7817 - regression_loss: 1.4798 - classification_loss: 0.3019 236/500 [=============>................] - ETA: 1:05 - loss: 1.7825 - regression_loss: 1.4805 - classification_loss: 0.3019 237/500 [=============>................] - ETA: 1:05 - loss: 1.7819 - regression_loss: 1.4806 - classification_loss: 0.3013 238/500 [=============>................] - ETA: 1:05 - loss: 1.7793 - regression_loss: 1.4783 - classification_loss: 0.3010 239/500 [=============>................] - ETA: 1:05 - loss: 1.7796 - regression_loss: 1.4789 - classification_loss: 0.3007 240/500 [=============>................] - ETA: 1:04 - loss: 1.7839 - regression_loss: 1.4818 - classification_loss: 0.3021 241/500 [=============>................] - ETA: 1:04 - loss: 1.7788 - regression_loss: 1.4775 - classification_loss: 0.3013 242/500 [=============>................] - ETA: 1:04 - loss: 1.7770 - regression_loss: 1.4761 - classification_loss: 0.3009 243/500 [=============>................] - ETA: 1:04 - loss: 1.7798 - regression_loss: 1.4781 - classification_loss: 0.3017 244/500 [=============>................] 
- ETA: 1:03 - loss: 1.7809 - regression_loss: 1.4790 - classification_loss: 0.3019 245/500 [=============>................] - ETA: 1:03 - loss: 1.7785 - regression_loss: 1.4773 - classification_loss: 0.3012 246/500 [=============>................] - ETA: 1:03 - loss: 1.7786 - regression_loss: 1.4778 - classification_loss: 0.3008 247/500 [=============>................] - ETA: 1:03 - loss: 1.7781 - regression_loss: 1.4775 - classification_loss: 0.3007 248/500 [=============>................] - ETA: 1:02 - loss: 1.7768 - regression_loss: 1.4765 - classification_loss: 0.3002 249/500 [=============>................] - ETA: 1:02 - loss: 1.7736 - regression_loss: 1.4738 - classification_loss: 0.2998 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7748 - regression_loss: 1.4749 - classification_loss: 0.2999 251/500 [==============>...............] - ETA: 1:02 - loss: 1.7726 - regression_loss: 1.4732 - classification_loss: 0.2994 252/500 [==============>...............] - ETA: 1:01 - loss: 1.7771 - regression_loss: 1.4771 - classification_loss: 0.2999 253/500 [==============>...............] - ETA: 1:01 - loss: 1.7812 - regression_loss: 1.4801 - classification_loss: 0.3011 254/500 [==============>...............] - ETA: 1:01 - loss: 1.7769 - regression_loss: 1.4763 - classification_loss: 0.3006 255/500 [==============>...............] - ETA: 1:01 - loss: 1.7754 - regression_loss: 1.4752 - classification_loss: 0.3001 256/500 [==============>...............] - ETA: 1:00 - loss: 1.7750 - regression_loss: 1.4752 - classification_loss: 0.2998 257/500 [==============>...............] - ETA: 1:00 - loss: 1.7767 - regression_loss: 1.4768 - classification_loss: 0.2999 258/500 [==============>...............] - ETA: 1:00 - loss: 1.7774 - regression_loss: 1.4777 - classification_loss: 0.2997 259/500 [==============>...............] - ETA: 1:00 - loss: 1.7781 - regression_loss: 1.4780 - classification_loss: 0.3001 260/500 [==============>...............] 
- ETA: 59s - loss: 1.7738 - regression_loss: 1.4745 - classification_loss: 0.2993  261/500 [==============>...............] - ETA: 59s - loss: 1.7744 - regression_loss: 1.4749 - classification_loss: 0.2994 262/500 [==============>...............] - ETA: 59s - loss: 1.7735 - regression_loss: 1.4743 - classification_loss: 0.2992 263/500 [==============>...............] - ETA: 59s - loss: 1.7707 - regression_loss: 1.4723 - classification_loss: 0.2984 264/500 [==============>...............] - ETA: 58s - loss: 1.7717 - regression_loss: 1.4727 - classification_loss: 0.2990 265/500 [==============>...............] - ETA: 58s - loss: 1.7722 - regression_loss: 1.4736 - classification_loss: 0.2985 266/500 [==============>...............] - ETA: 58s - loss: 1.7720 - regression_loss: 1.4734 - classification_loss: 0.2986 267/500 [===============>..............] - ETA: 58s - loss: 1.7730 - regression_loss: 1.4743 - classification_loss: 0.2987 268/500 [===============>..............] - ETA: 57s - loss: 1.7770 - regression_loss: 1.4772 - classification_loss: 0.2998 269/500 [===============>..............] - ETA: 57s - loss: 1.7760 - regression_loss: 1.4767 - classification_loss: 0.2993 270/500 [===============>..............] - ETA: 57s - loss: 1.7767 - regression_loss: 1.4773 - classification_loss: 0.2994 271/500 [===============>..............] - ETA: 57s - loss: 1.7767 - regression_loss: 1.4771 - classification_loss: 0.2995 272/500 [===============>..............] - ETA: 56s - loss: 1.7779 - regression_loss: 1.4782 - classification_loss: 0.2997 273/500 [===============>..............] - ETA: 56s - loss: 1.7769 - regression_loss: 1.4776 - classification_loss: 0.2993 274/500 [===============>..............] - ETA: 56s - loss: 1.7765 - regression_loss: 1.4773 - classification_loss: 0.2992 275/500 [===============>..............] - ETA: 56s - loss: 1.7727 - regression_loss: 1.4742 - classification_loss: 0.2985 276/500 [===============>..............] 
- ETA: 55s - loss: 1.7765 - regression_loss: 1.4780 - classification_loss: 0.2986 277/500 [===============>..............] - ETA: 55s - loss: 1.7774 - regression_loss: 1.4786 - classification_loss: 0.2988 278/500 [===============>..............] - ETA: 55s - loss: 1.7771 - regression_loss: 1.4779 - classification_loss: 0.2992 279/500 [===============>..............] - ETA: 55s - loss: 1.7769 - regression_loss: 1.4778 - classification_loss: 0.2992 280/500 [===============>..............] - ETA: 54s - loss: 1.7787 - regression_loss: 1.4787 - classification_loss: 0.3000 281/500 [===============>..............] - ETA: 54s - loss: 1.7812 - regression_loss: 1.4810 - classification_loss: 0.3002 282/500 [===============>..............] - ETA: 54s - loss: 1.7807 - regression_loss: 1.4809 - classification_loss: 0.2998 283/500 [===============>..............] - ETA: 54s - loss: 1.7812 - regression_loss: 1.4811 - classification_loss: 0.3001 284/500 [================>.............] - ETA: 53s - loss: 1.7818 - regression_loss: 1.4813 - classification_loss: 0.3005 285/500 [================>.............] - ETA: 53s - loss: 1.7821 - regression_loss: 1.4819 - classification_loss: 0.3002 286/500 [================>.............] - ETA: 53s - loss: 1.7846 - regression_loss: 1.4843 - classification_loss: 0.3003 287/500 [================>.............] - ETA: 53s - loss: 1.7836 - regression_loss: 1.4837 - classification_loss: 0.3000 288/500 [================>.............] - ETA: 52s - loss: 1.7800 - regression_loss: 1.4805 - classification_loss: 0.2995 289/500 [================>.............] - ETA: 52s - loss: 1.7787 - regression_loss: 1.4798 - classification_loss: 0.2989 290/500 [================>.............] - ETA: 52s - loss: 1.7828 - regression_loss: 1.4830 - classification_loss: 0.2998 291/500 [================>.............] - ETA: 52s - loss: 1.7822 - regression_loss: 1.4825 - classification_loss: 0.2996 292/500 [================>.............] 
- ETA: 51s - loss: 1.7839 - regression_loss: 1.4842 - classification_loss: 0.2997 293/500 [================>.............] - ETA: 51s - loss: 1.7866 - regression_loss: 1.4863 - classification_loss: 0.3003 294/500 [================>.............] - ETA: 51s - loss: 1.7875 - regression_loss: 1.4868 - classification_loss: 0.3007 295/500 [================>.............] - ETA: 51s - loss: 1.7877 - regression_loss: 1.4871 - classification_loss: 0.3006 296/500 [================>.............] - ETA: 50s - loss: 1.7904 - regression_loss: 1.4889 - classification_loss: 0.3015 297/500 [================>.............] - ETA: 50s - loss: 1.7914 - regression_loss: 1.4897 - classification_loss: 0.3017 298/500 [================>.............] - ETA: 50s - loss: 1.7892 - regression_loss: 1.4880 - classification_loss: 0.3012 299/500 [================>.............] - ETA: 50s - loss: 1.7901 - regression_loss: 1.4887 - classification_loss: 0.3015 300/500 [=================>............] - ETA: 49s - loss: 1.7898 - regression_loss: 1.4881 - classification_loss: 0.3017 301/500 [=================>............] - ETA: 49s - loss: 1.7893 - regression_loss: 1.4878 - classification_loss: 0.3015 302/500 [=================>............] - ETA: 49s - loss: 1.7882 - regression_loss: 1.4869 - classification_loss: 0.3013 303/500 [=================>............] - ETA: 49s - loss: 1.7875 - regression_loss: 1.4865 - classification_loss: 0.3010 304/500 [=================>............] - ETA: 48s - loss: 1.7862 - regression_loss: 1.4855 - classification_loss: 0.3007 305/500 [=================>............] - ETA: 48s - loss: 1.7849 - regression_loss: 1.4846 - classification_loss: 0.3003 306/500 [=================>............] - ETA: 48s - loss: 1.7851 - regression_loss: 1.4847 - classification_loss: 0.3003 307/500 [=================>............] - ETA: 48s - loss: 1.7862 - regression_loss: 1.4852 - classification_loss: 0.3010 308/500 [=================>............] 
- ETA: 47s - loss: 1.7841 - regression_loss: 1.4834 - classification_loss: 0.3007 309/500 [=================>............] - ETA: 47s - loss: 1.7845 - regression_loss: 1.4837 - classification_loss: 0.3008 310/500 [=================>............] - ETA: 47s - loss: 1.7843 - regression_loss: 1.4837 - classification_loss: 0.3006 311/500 [=================>............] - ETA: 47s - loss: 1.7846 - regression_loss: 1.4841 - classification_loss: 0.3005 312/500 [=================>............] - ETA: 46s - loss: 1.7825 - regression_loss: 1.4824 - classification_loss: 0.3001 313/500 [=================>............] - ETA: 46s - loss: 1.7847 - regression_loss: 1.4842 - classification_loss: 0.3005 314/500 [=================>............] - ETA: 46s - loss: 1.7822 - regression_loss: 1.4823 - classification_loss: 0.2999 315/500 [=================>............] - ETA: 46s - loss: 1.7790 - regression_loss: 1.4798 - classification_loss: 0.2993 316/500 [=================>............] - ETA: 45s - loss: 1.7769 - regression_loss: 1.4783 - classification_loss: 0.2986 317/500 [==================>...........] - ETA: 45s - loss: 1.7787 - regression_loss: 1.4797 - classification_loss: 0.2990 318/500 [==================>...........] - ETA: 45s - loss: 1.7810 - regression_loss: 1.4813 - classification_loss: 0.2996 319/500 [==================>...........] - ETA: 45s - loss: 1.7812 - regression_loss: 1.4816 - classification_loss: 0.2996 320/500 [==================>...........] - ETA: 44s - loss: 1.7799 - regression_loss: 1.4808 - classification_loss: 0.2991 321/500 [==================>...........] - ETA: 44s - loss: 1.7789 - regression_loss: 1.4801 - classification_loss: 0.2988 322/500 [==================>...........] - ETA: 44s - loss: 1.7790 - regression_loss: 1.4804 - classification_loss: 0.2987 323/500 [==================>...........] - ETA: 44s - loss: 1.7764 - regression_loss: 1.4783 - classification_loss: 0.2981 324/500 [==================>...........] 
- ETA: 43s - loss: 1.7786 - regression_loss: 1.4800 - classification_loss: 0.2985 325/500 [==================>...........] - ETA: 43s - loss: 1.7820 - regression_loss: 1.4835 - classification_loss: 0.2984 326/500 [==================>...........] - ETA: 43s - loss: 1.7827 - regression_loss: 1.4841 - classification_loss: 0.2985 327/500 [==================>...........] - ETA: 43s - loss: 1.7839 - regression_loss: 1.4850 - classification_loss: 0.2989 328/500 [==================>...........] - ETA: 43s - loss: 1.7853 - regression_loss: 1.4861 - classification_loss: 0.2992 329/500 [==================>...........] - ETA: 42s - loss: 1.7830 - regression_loss: 1.4844 - classification_loss: 0.2986 330/500 [==================>...........] - ETA: 42s - loss: 1.7845 - regression_loss: 1.4862 - classification_loss: 0.2983 331/500 [==================>...........] - ETA: 42s - loss: 1.7845 - regression_loss: 1.4861 - classification_loss: 0.2984 332/500 [==================>...........] - ETA: 41s - loss: 1.7852 - regression_loss: 1.4867 - classification_loss: 0.2985 333/500 [==================>...........] - ETA: 41s - loss: 1.7834 - regression_loss: 1.4854 - classification_loss: 0.2981 334/500 [===================>..........] - ETA: 41s - loss: 1.7834 - regression_loss: 1.4855 - classification_loss: 0.2980 335/500 [===================>..........] - ETA: 41s - loss: 1.7815 - regression_loss: 1.4841 - classification_loss: 0.2974 336/500 [===================>..........] - ETA: 40s - loss: 1.7844 - regression_loss: 1.4864 - classification_loss: 0.2980 337/500 [===================>..........] - ETA: 40s - loss: 1.7836 - regression_loss: 1.4860 - classification_loss: 0.2977 338/500 [===================>..........] - ETA: 40s - loss: 1.7830 - regression_loss: 1.4855 - classification_loss: 0.2975 339/500 [===================>..........] - ETA: 40s - loss: 1.7848 - regression_loss: 1.4870 - classification_loss: 0.2978 340/500 [===================>..........] 
- ETA: 39s - loss: 1.7875 - regression_loss: 1.4891 - classification_loss: 0.2985 341/500 [===================>..........] - ETA: 39s - loss: 1.7867 - regression_loss: 1.4885 - classification_loss: 0.2982 342/500 [===================>..........] - ETA: 39s - loss: 1.7857 - regression_loss: 1.4877 - classification_loss: 0.2979 343/500 [===================>..........] - ETA: 39s - loss: 1.7860 - regression_loss: 1.4880 - classification_loss: 0.2980 344/500 [===================>..........] - ETA: 38s - loss: 1.7856 - regression_loss: 1.4879 - classification_loss: 0.2977 345/500 [===================>..........] - ETA: 38s - loss: 1.7845 - regression_loss: 1.4871 - classification_loss: 0.2974 346/500 [===================>..........] - ETA: 38s - loss: 1.7848 - regression_loss: 1.4872 - classification_loss: 0.2976 347/500 [===================>..........] - ETA: 38s - loss: 1.7838 - regression_loss: 1.4867 - classification_loss: 0.2972 348/500 [===================>..........] - ETA: 37s - loss: 1.7837 - regression_loss: 1.4866 - classification_loss: 0.2971 349/500 [===================>..........] - ETA: 37s - loss: 1.7817 - regression_loss: 1.4848 - classification_loss: 0.2969 350/500 [====================>.........] - ETA: 37s - loss: 1.7816 - regression_loss: 1.4846 - classification_loss: 0.2970 351/500 [====================>.........] - ETA: 37s - loss: 1.7826 - regression_loss: 1.4854 - classification_loss: 0.2971 352/500 [====================>.........] - ETA: 36s - loss: 1.7828 - regression_loss: 1.4857 - classification_loss: 0.2971 353/500 [====================>.........] - ETA: 36s - loss: 1.7822 - regression_loss: 1.4855 - classification_loss: 0.2967 354/500 [====================>.........] - ETA: 36s - loss: 1.7842 - regression_loss: 1.4872 - classification_loss: 0.2970 355/500 [====================>.........] - ETA: 36s - loss: 1.7854 - regression_loss: 1.4880 - classification_loss: 0.2974 356/500 [====================>.........] 
- ETA: 35s - loss: 1.7885 - regression_loss: 1.4907 - classification_loss: 0.2978 357/500 [====================>.........] - ETA: 35s - loss: 1.7870 - regression_loss: 1.4896 - classification_loss: 0.2974 358/500 [====================>.........] - ETA: 35s - loss: 1.7880 - regression_loss: 1.4904 - classification_loss: 0.2976 359/500 [====================>.........] - ETA: 35s - loss: 1.7896 - regression_loss: 1.4915 - classification_loss: 0.2981 360/500 [====================>.........] - ETA: 34s - loss: 1.7895 - regression_loss: 1.4911 - classification_loss: 0.2984 361/500 [====================>.........] - ETA: 34s - loss: 1.7890 - regression_loss: 1.4905 - classification_loss: 0.2985 362/500 [====================>.........] - ETA: 34s - loss: 1.7884 - regression_loss: 1.4899 - classification_loss: 0.2985 363/500 [====================>.........] - ETA: 34s - loss: 1.7902 - regression_loss: 1.4913 - classification_loss: 0.2988 364/500 [====================>.........] - ETA: 33s - loss: 1.7906 - regression_loss: 1.4918 - classification_loss: 0.2988 365/500 [====================>.........] - ETA: 33s - loss: 1.7919 - regression_loss: 1.4931 - classification_loss: 0.2988 366/500 [====================>.........] - ETA: 33s - loss: 1.7934 - regression_loss: 1.4943 - classification_loss: 0.2991 367/500 [=====================>........] - ETA: 33s - loss: 1.7929 - regression_loss: 1.4941 - classification_loss: 0.2988 368/500 [=====================>........] - ETA: 32s - loss: 1.7961 - regression_loss: 1.4966 - classification_loss: 0.2995 369/500 [=====================>........] - ETA: 32s - loss: 1.7946 - regression_loss: 1.4955 - classification_loss: 0.2991 370/500 [=====================>........] - ETA: 32s - loss: 1.7935 - regression_loss: 1.4948 - classification_loss: 0.2987 371/500 [=====================>........] - ETA: 32s - loss: 1.7913 - regression_loss: 1.4931 - classification_loss: 0.2982 372/500 [=====================>........] 
- ETA: 31s - loss: 1.7909 - regression_loss: 1.4927 - classification_loss: 0.2982 373/500 [=====================>........] - ETA: 31s - loss: 1.7889 - regression_loss: 1.4912 - classification_loss: 0.2978 374/500 [=====================>........] - ETA: 31s - loss: 1.7882 - regression_loss: 1.4906 - classification_loss: 0.2975 375/500 [=====================>........] - ETA: 31s - loss: 1.7894 - regression_loss: 1.4920 - classification_loss: 0.2973 376/500 [=====================>........] - ETA: 30s - loss: 1.7883 - regression_loss: 1.4914 - classification_loss: 0.2969 377/500 [=====================>........] - ETA: 30s - loss: 1.7898 - regression_loss: 1.4926 - classification_loss: 0.2972 378/500 [=====================>........] - ETA: 30s - loss: 1.7896 - regression_loss: 1.4922 - classification_loss: 0.2973 379/500 [=====================>........] - ETA: 30s - loss: 1.7909 - regression_loss: 1.4932 - classification_loss: 0.2978 380/500 [=====================>........] - ETA: 29s - loss: 1.7900 - regression_loss: 1.4923 - classification_loss: 0.2977 381/500 [=====================>........] - ETA: 29s - loss: 1.7906 - regression_loss: 1.4929 - classification_loss: 0.2977 382/500 [=====================>........] - ETA: 29s - loss: 1.7900 - regression_loss: 1.4925 - classification_loss: 0.2975 383/500 [=====================>........] - ETA: 29s - loss: 1.7873 - regression_loss: 1.4902 - classification_loss: 0.2971 384/500 [======================>.......] - ETA: 28s - loss: 1.7870 - regression_loss: 1.4899 - classification_loss: 0.2970 385/500 [======================>.......] - ETA: 28s - loss: 1.7873 - regression_loss: 1.4902 - classification_loss: 0.2971 386/500 [======================>.......] - ETA: 28s - loss: 1.7870 - regression_loss: 1.4900 - classification_loss: 0.2970 387/500 [======================>.......] - ETA: 28s - loss: 1.7864 - regression_loss: 1.4896 - classification_loss: 0.2968 388/500 [======================>.......] 
- ETA: 27s - loss: 1.7878 - regression_loss: 1.4908 - classification_loss: 0.2970 389/500 [======================>.......] - ETA: 27s - loss: 1.7900 - regression_loss: 1.4932 - classification_loss: 0.2968 390/500 [======================>.......] - ETA: 27s - loss: 1.7911 - regression_loss: 1.4941 - classification_loss: 0.2970 391/500 [======================>.......] - ETA: 27s - loss: 1.7921 - regression_loss: 1.4948 - classification_loss: 0.2973 392/500 [======================>.......] - ETA: 26s - loss: 1.7913 - regression_loss: 1.4943 - classification_loss: 0.2969 393/500 [======================>.......] - ETA: 26s - loss: 1.7906 - regression_loss: 1.4938 - classification_loss: 0.2968 394/500 [======================>.......] - ETA: 26s - loss: 1.7885 - regression_loss: 1.4922 - classification_loss: 0.2963 395/500 [======================>.......] - ETA: 26s - loss: 1.7886 - regression_loss: 1.4922 - classification_loss: 0.2964 396/500 [======================>.......] - ETA: 25s - loss: 1.7883 - regression_loss: 1.4921 - classification_loss: 0.2963 397/500 [======================>.......] - ETA: 25s - loss: 1.7901 - regression_loss: 1.4938 - classification_loss: 0.2963 398/500 [======================>.......] - ETA: 25s - loss: 1.7910 - regression_loss: 1.4943 - classification_loss: 0.2968 399/500 [======================>.......] - ETA: 25s - loss: 1.7911 - regression_loss: 1.4943 - classification_loss: 0.2967 400/500 [=======================>......] - ETA: 24s - loss: 1.7892 - regression_loss: 1.4929 - classification_loss: 0.2963 401/500 [=======================>......] - ETA: 24s - loss: 1.7896 - regression_loss: 1.4929 - classification_loss: 0.2967 402/500 [=======================>......] - ETA: 24s - loss: 1.7900 - regression_loss: 1.4934 - classification_loss: 0.2966 403/500 [=======================>......] - ETA: 24s - loss: 1.7903 - regression_loss: 1.4937 - classification_loss: 0.2966 404/500 [=======================>......] 
- ETA: 23s - loss: 1.7892 - regression_loss: 1.4929 - classification_loss: 0.2963 405/500 [=======================>......] - ETA: 23s - loss: 1.7894 - regression_loss: 1.4931 - classification_loss: 0.2963 406/500 [=======================>......] - ETA: 23s - loss: 1.7912 - regression_loss: 1.4943 - classification_loss: 0.2969 407/500 [=======================>......] - ETA: 23s - loss: 1.7918 - regression_loss: 1.4949 - classification_loss: 0.2970 408/500 [=======================>......] - ETA: 22s - loss: 1.7921 - regression_loss: 1.4951 - classification_loss: 0.2970 409/500 [=======================>......] - ETA: 22s - loss: 1.7927 - regression_loss: 1.4960 - classification_loss: 0.2967 410/500 [=======================>......] - ETA: 22s - loss: 1.7911 - regression_loss: 1.4948 - classification_loss: 0.2963 411/500 [=======================>......] - ETA: 22s - loss: 1.7881 - regression_loss: 1.4923 - classification_loss: 0.2958 412/500 [=======================>......] - ETA: 21s - loss: 1.7921 - regression_loss: 1.4956 - classification_loss: 0.2965 413/500 [=======================>......] - ETA: 21s - loss: 1.7918 - regression_loss: 1.4949 - classification_loss: 0.2968 414/500 [=======================>......] - ETA: 21s - loss: 1.7946 - regression_loss: 1.4973 - classification_loss: 0.2973 415/500 [=======================>......] - ETA: 21s - loss: 1.7917 - regression_loss: 1.4950 - classification_loss: 0.2967 416/500 [=======================>......] - ETA: 20s - loss: 1.7907 - regression_loss: 1.4942 - classification_loss: 0.2965 417/500 [========================>.....] - ETA: 20s - loss: 1.7905 - regression_loss: 1.4942 - classification_loss: 0.2964 418/500 [========================>.....] - ETA: 20s - loss: 1.7896 - regression_loss: 1.4936 - classification_loss: 0.2961 419/500 [========================>.....] - ETA: 20s - loss: 1.7903 - regression_loss: 1.4942 - classification_loss: 0.2961 420/500 [========================>.....] 
- ETA: 19s - loss: 1.7882 - regression_loss: 1.4926 - classification_loss: 0.2956 421/500 [========================>.....] - ETA: 19s - loss: 1.7881 - regression_loss: 1.4924 - classification_loss: 0.2957 422/500 [========================>.....] - ETA: 19s - loss: 1.7875 - regression_loss: 1.4919 - classification_loss: 0.2956 423/500 [========================>.....] - ETA: 19s - loss: 1.7876 - regression_loss: 1.4918 - classification_loss: 0.2958 424/500 [========================>.....] - ETA: 18s - loss: 1.7873 - regression_loss: 1.4917 - classification_loss: 0.2956 425/500 [========================>.....] - ETA: 18s - loss: 1.7872 - regression_loss: 1.4916 - classification_loss: 0.2956 426/500 [========================>.....] - ETA: 18s - loss: 1.7883 - regression_loss: 1.4925 - classification_loss: 0.2958 427/500 [========================>.....] - ETA: 18s - loss: 1.7876 - regression_loss: 1.4919 - classification_loss: 0.2957 428/500 [========================>.....] - ETA: 17s - loss: 1.7874 - regression_loss: 1.4919 - classification_loss: 0.2956 429/500 [========================>.....] - ETA: 17s - loss: 1.7880 - regression_loss: 1.4925 - classification_loss: 0.2955 430/500 [========================>.....] - ETA: 17s - loss: 1.7865 - regression_loss: 1.4913 - classification_loss: 0.2952 431/500 [========================>.....] - ETA: 17s - loss: 1.7878 - regression_loss: 1.4920 - classification_loss: 0.2958 432/500 [========================>.....] - ETA: 16s - loss: 1.7879 - regression_loss: 1.4922 - classification_loss: 0.2957 433/500 [========================>.....] - ETA: 16s - loss: 1.7873 - regression_loss: 1.4919 - classification_loss: 0.2954 434/500 [=========================>....] - ETA: 16s - loss: 1.7871 - regression_loss: 1.4918 - classification_loss: 0.2952 435/500 [=========================>....] - ETA: 16s - loss: 1.7867 - regression_loss: 1.4914 - classification_loss: 0.2953 436/500 [=========================>....] 
500/500 [==============================] - 125s 250ms/step - loss: 1.7938 - regression_loss: 1.4989 - classification_loss: 0.2949
326 instances of class plum with average precision: 0.7451
mAP: 0.7451
Epoch 00039: saving model to ./training/snapshots/resnet50_pascal_39.h5
Epoch 40/150
- ETA: 56s - loss: 1.7449 - regression_loss: 1.4556 - classification_loss: 0.2893 271/500 [===============>..............] - ETA: 56s - loss: 1.7470 - regression_loss: 1.4574 - classification_loss: 0.2896 272/500 [===============>..............] - ETA: 56s - loss: 1.7462 - regression_loss: 1.4569 - classification_loss: 0.2893 273/500 [===============>..............] - ETA: 55s - loss: 1.7453 - regression_loss: 1.4562 - classification_loss: 0.2891 274/500 [===============>..............] - ETA: 55s - loss: 1.7448 - regression_loss: 1.4559 - classification_loss: 0.2889 275/500 [===============>..............] - ETA: 55s - loss: 1.7482 - regression_loss: 1.4579 - classification_loss: 0.2903 276/500 [===============>..............] - ETA: 55s - loss: 1.7500 - regression_loss: 1.4590 - classification_loss: 0.2910 277/500 [===============>..............] - ETA: 54s - loss: 1.7496 - regression_loss: 1.4585 - classification_loss: 0.2911 278/500 [===============>..............] - ETA: 54s - loss: 1.7471 - regression_loss: 1.4567 - classification_loss: 0.2905 279/500 [===============>..............] - ETA: 54s - loss: 1.7453 - regression_loss: 1.4551 - classification_loss: 0.2902 280/500 [===============>..............] - ETA: 54s - loss: 1.7431 - regression_loss: 1.4535 - classification_loss: 0.2897 281/500 [===============>..............] - ETA: 53s - loss: 1.7402 - regression_loss: 1.4512 - classification_loss: 0.2890 282/500 [===============>..............] - ETA: 53s - loss: 1.7398 - regression_loss: 1.4509 - classification_loss: 0.2889 283/500 [===============>..............] - ETA: 53s - loss: 1.7422 - regression_loss: 1.4530 - classification_loss: 0.2891 284/500 [================>.............] - ETA: 53s - loss: 1.7485 - regression_loss: 1.4586 - classification_loss: 0.2898 285/500 [================>.............] - ETA: 53s - loss: 1.7475 - regression_loss: 1.4580 - classification_loss: 0.2895 286/500 [================>.............] 
- ETA: 52s - loss: 1.7461 - regression_loss: 1.4571 - classification_loss: 0.2890 287/500 [================>.............] - ETA: 52s - loss: 1.7444 - regression_loss: 1.4559 - classification_loss: 0.2885 288/500 [================>.............] - ETA: 52s - loss: 1.7447 - regression_loss: 1.4561 - classification_loss: 0.2886 289/500 [================>.............] - ETA: 52s - loss: 1.7455 - regression_loss: 1.4569 - classification_loss: 0.2886 290/500 [================>.............] - ETA: 51s - loss: 1.7429 - regression_loss: 1.4541 - classification_loss: 0.2888 291/500 [================>.............] - ETA: 51s - loss: 1.7450 - regression_loss: 1.4556 - classification_loss: 0.2894 292/500 [================>.............] - ETA: 51s - loss: 1.7426 - regression_loss: 1.4539 - classification_loss: 0.2888 293/500 [================>.............] - ETA: 51s - loss: 1.7449 - regression_loss: 1.4556 - classification_loss: 0.2893 294/500 [================>.............] - ETA: 50s - loss: 1.7442 - regression_loss: 1.4552 - classification_loss: 0.2890 295/500 [================>.............] - ETA: 50s - loss: 1.7465 - regression_loss: 1.4566 - classification_loss: 0.2899 296/500 [================>.............] - ETA: 50s - loss: 1.7461 - regression_loss: 1.4564 - classification_loss: 0.2897 297/500 [================>.............] - ETA: 50s - loss: 1.7441 - regression_loss: 1.4549 - classification_loss: 0.2892 298/500 [================>.............] - ETA: 49s - loss: 1.7449 - regression_loss: 1.4561 - classification_loss: 0.2888 299/500 [================>.............] - ETA: 49s - loss: 1.7458 - regression_loss: 1.4568 - classification_loss: 0.2890 300/500 [=================>............] - ETA: 49s - loss: 1.7486 - regression_loss: 1.4590 - classification_loss: 0.2896 301/500 [=================>............] - ETA: 49s - loss: 1.7482 - regression_loss: 1.4583 - classification_loss: 0.2898 302/500 [=================>............] 
- ETA: 48s - loss: 1.7488 - regression_loss: 1.4590 - classification_loss: 0.2898 303/500 [=================>............] - ETA: 48s - loss: 1.7515 - regression_loss: 1.4609 - classification_loss: 0.2906 304/500 [=================>............] - ETA: 48s - loss: 1.7522 - regression_loss: 1.4614 - classification_loss: 0.2908 305/500 [=================>............] - ETA: 48s - loss: 1.7522 - regression_loss: 1.4616 - classification_loss: 0.2906 306/500 [=================>............] - ETA: 47s - loss: 1.7509 - regression_loss: 1.4605 - classification_loss: 0.2903 307/500 [=================>............] - ETA: 47s - loss: 1.7547 - regression_loss: 1.4637 - classification_loss: 0.2910 308/500 [=================>............] - ETA: 47s - loss: 1.7544 - regression_loss: 1.4636 - classification_loss: 0.2907 309/500 [=================>............] - ETA: 47s - loss: 1.7546 - regression_loss: 1.4638 - classification_loss: 0.2908 310/500 [=================>............] - ETA: 46s - loss: 1.7562 - regression_loss: 1.4651 - classification_loss: 0.2910 311/500 [=================>............] - ETA: 46s - loss: 1.7554 - regression_loss: 1.4645 - classification_loss: 0.2909 312/500 [=================>............] - ETA: 46s - loss: 1.7550 - regression_loss: 1.4642 - classification_loss: 0.2909 313/500 [=================>............] - ETA: 46s - loss: 1.7518 - regression_loss: 1.4616 - classification_loss: 0.2902 314/500 [=================>............] - ETA: 45s - loss: 1.7501 - regression_loss: 1.4603 - classification_loss: 0.2897 315/500 [=================>............] - ETA: 45s - loss: 1.7512 - regression_loss: 1.4613 - classification_loss: 0.2899 316/500 [=================>............] - ETA: 45s - loss: 1.7494 - regression_loss: 1.4595 - classification_loss: 0.2899 317/500 [==================>...........] - ETA: 45s - loss: 1.7495 - regression_loss: 1.4597 - classification_loss: 0.2898 318/500 [==================>...........] 
- ETA: 44s - loss: 1.7506 - regression_loss: 1.4606 - classification_loss: 0.2900 319/500 [==================>...........] - ETA: 44s - loss: 1.7504 - regression_loss: 1.4604 - classification_loss: 0.2900 320/500 [==================>...........] - ETA: 44s - loss: 1.7498 - regression_loss: 1.4598 - classification_loss: 0.2900 321/500 [==================>...........] - ETA: 44s - loss: 1.7517 - regression_loss: 1.4614 - classification_loss: 0.2903 322/500 [==================>...........] - ETA: 43s - loss: 1.7518 - regression_loss: 1.4617 - classification_loss: 0.2902 323/500 [==================>...........] - ETA: 43s - loss: 1.7522 - regression_loss: 1.4621 - classification_loss: 0.2900 324/500 [==================>...........] - ETA: 43s - loss: 1.7518 - regression_loss: 1.4617 - classification_loss: 0.2901 325/500 [==================>...........] - ETA: 43s - loss: 1.7509 - regression_loss: 1.4612 - classification_loss: 0.2897 326/500 [==================>...........] - ETA: 42s - loss: 1.7512 - regression_loss: 1.4613 - classification_loss: 0.2899 327/500 [==================>...........] - ETA: 42s - loss: 1.7536 - regression_loss: 1.4634 - classification_loss: 0.2902 328/500 [==================>...........] - ETA: 42s - loss: 1.7541 - regression_loss: 1.4637 - classification_loss: 0.2904 329/500 [==================>...........] - ETA: 42s - loss: 1.7521 - regression_loss: 1.4619 - classification_loss: 0.2901 330/500 [==================>...........] - ETA: 41s - loss: 1.7539 - regression_loss: 1.4641 - classification_loss: 0.2898 331/500 [==================>...........] - ETA: 41s - loss: 1.7565 - regression_loss: 1.4662 - classification_loss: 0.2903 332/500 [==================>...........] - ETA: 41s - loss: 1.7593 - regression_loss: 1.4679 - classification_loss: 0.2914 333/500 [==================>...........] - ETA: 41s - loss: 1.7582 - regression_loss: 1.4670 - classification_loss: 0.2911 334/500 [===================>..........] 
- ETA: 40s - loss: 1.7597 - regression_loss: 1.4683 - classification_loss: 0.2913 335/500 [===================>..........] - ETA: 40s - loss: 1.7619 - regression_loss: 1.4704 - classification_loss: 0.2915 336/500 [===================>..........] - ETA: 40s - loss: 1.7612 - regression_loss: 1.4701 - classification_loss: 0.2912 337/500 [===================>..........] - ETA: 40s - loss: 1.7607 - regression_loss: 1.4697 - classification_loss: 0.2911 338/500 [===================>..........] - ETA: 40s - loss: 1.7604 - regression_loss: 1.4691 - classification_loss: 0.2913 339/500 [===================>..........] - ETA: 39s - loss: 1.7636 - regression_loss: 1.4708 - classification_loss: 0.2928 340/500 [===================>..........] - ETA: 39s - loss: 1.7638 - regression_loss: 1.4712 - classification_loss: 0.2926 341/500 [===================>..........] - ETA: 39s - loss: 1.7622 - regression_loss: 1.4700 - classification_loss: 0.2922 342/500 [===================>..........] - ETA: 39s - loss: 1.7607 - regression_loss: 1.4688 - classification_loss: 0.2919 343/500 [===================>..........] - ETA: 38s - loss: 1.7618 - regression_loss: 1.4696 - classification_loss: 0.2921 344/500 [===================>..........] - ETA: 38s - loss: 1.7628 - regression_loss: 1.4699 - classification_loss: 0.2929 345/500 [===================>..........] - ETA: 38s - loss: 1.7648 - regression_loss: 1.4717 - classification_loss: 0.2931 346/500 [===================>..........] - ETA: 38s - loss: 1.7664 - regression_loss: 1.4728 - classification_loss: 0.2936 347/500 [===================>..........] - ETA: 37s - loss: 1.7667 - regression_loss: 1.4733 - classification_loss: 0.2933 348/500 [===================>..........] - ETA: 37s - loss: 1.7667 - regression_loss: 1.4733 - classification_loss: 0.2933 349/500 [===================>..........] - ETA: 37s - loss: 1.7675 - regression_loss: 1.4743 - classification_loss: 0.2932 350/500 [====================>.........] 
- ETA: 37s - loss: 1.7671 - regression_loss: 1.4739 - classification_loss: 0.2931 351/500 [====================>.........] - ETA: 36s - loss: 1.7682 - regression_loss: 1.4751 - classification_loss: 0.2932 352/500 [====================>.........] - ETA: 36s - loss: 1.7676 - regression_loss: 1.4747 - classification_loss: 0.2929 353/500 [====================>.........] - ETA: 36s - loss: 1.7657 - regression_loss: 1.4732 - classification_loss: 0.2925 354/500 [====================>.........] - ETA: 36s - loss: 1.7656 - regression_loss: 1.4733 - classification_loss: 0.2923 355/500 [====================>.........] - ETA: 35s - loss: 1.7660 - regression_loss: 1.4737 - classification_loss: 0.2924 356/500 [====================>.........] - ETA: 35s - loss: 1.7637 - regression_loss: 1.4717 - classification_loss: 0.2920 357/500 [====================>.........] - ETA: 35s - loss: 1.7654 - regression_loss: 1.4732 - classification_loss: 0.2922 358/500 [====================>.........] - ETA: 35s - loss: 1.7666 - regression_loss: 1.4743 - classification_loss: 0.2923 359/500 [====================>.........] - ETA: 34s - loss: 1.7633 - regression_loss: 1.4716 - classification_loss: 0.2917 360/500 [====================>.........] - ETA: 34s - loss: 1.7644 - regression_loss: 1.4725 - classification_loss: 0.2919 361/500 [====================>.........] - ETA: 34s - loss: 1.7611 - regression_loss: 1.4697 - classification_loss: 0.2914 362/500 [====================>.........] - ETA: 34s - loss: 1.7614 - regression_loss: 1.4698 - classification_loss: 0.2916 363/500 [====================>.........] - ETA: 33s - loss: 1.7614 - regression_loss: 1.4700 - classification_loss: 0.2915 364/500 [====================>.........] - ETA: 33s - loss: 1.7619 - regression_loss: 1.4703 - classification_loss: 0.2916 365/500 [====================>.........] - ETA: 33s - loss: 1.7616 - regression_loss: 1.4701 - classification_loss: 0.2914 366/500 [====================>.........] 
- ETA: 33s - loss: 1.7613 - regression_loss: 1.4701 - classification_loss: 0.2912 367/500 [=====================>........] - ETA: 32s - loss: 1.7603 - regression_loss: 1.4689 - classification_loss: 0.2913 368/500 [=====================>........] - ETA: 32s - loss: 1.7624 - regression_loss: 1.4703 - classification_loss: 0.2921 369/500 [=====================>........] - ETA: 32s - loss: 1.7647 - regression_loss: 1.4719 - classification_loss: 0.2928 370/500 [=====================>........] - ETA: 32s - loss: 1.7647 - regression_loss: 1.4720 - classification_loss: 0.2926 371/500 [=====================>........] - ETA: 31s - loss: 1.7649 - regression_loss: 1.4722 - classification_loss: 0.2927 372/500 [=====================>........] - ETA: 31s - loss: 1.7650 - regression_loss: 1.4725 - classification_loss: 0.2925 373/500 [=====================>........] - ETA: 31s - loss: 1.7659 - regression_loss: 1.4732 - classification_loss: 0.2927 374/500 [=====================>........] - ETA: 31s - loss: 1.7664 - regression_loss: 1.4737 - classification_loss: 0.2927 375/500 [=====================>........] - ETA: 30s - loss: 1.7679 - regression_loss: 1.4749 - classification_loss: 0.2929 376/500 [=====================>........] - ETA: 30s - loss: 1.7681 - regression_loss: 1.4751 - classification_loss: 0.2931 377/500 [=====================>........] - ETA: 30s - loss: 1.7652 - regression_loss: 1.4726 - classification_loss: 0.2926 378/500 [=====================>........] - ETA: 30s - loss: 1.7663 - regression_loss: 1.4735 - classification_loss: 0.2928 379/500 [=====================>........] - ETA: 29s - loss: 1.7644 - regression_loss: 1.4718 - classification_loss: 0.2926 380/500 [=====================>........] - ETA: 29s - loss: 1.7656 - regression_loss: 1.4727 - classification_loss: 0.2929 381/500 [=====================>........] - ETA: 29s - loss: 1.7656 - regression_loss: 1.4729 - classification_loss: 0.2927 382/500 [=====================>........] 
- ETA: 29s - loss: 1.7626 - regression_loss: 1.4704 - classification_loss: 0.2922 383/500 [=====================>........] - ETA: 28s - loss: 1.7648 - regression_loss: 1.4715 - classification_loss: 0.2933 384/500 [======================>.......] - ETA: 28s - loss: 1.7642 - regression_loss: 1.4711 - classification_loss: 0.2931 385/500 [======================>.......] - ETA: 28s - loss: 1.7656 - regression_loss: 1.4721 - classification_loss: 0.2935 386/500 [======================>.......] - ETA: 28s - loss: 1.7651 - regression_loss: 1.4716 - classification_loss: 0.2935 387/500 [======================>.......] - ETA: 27s - loss: 1.7647 - regression_loss: 1.4713 - classification_loss: 0.2934 388/500 [======================>.......] - ETA: 27s - loss: 1.7643 - regression_loss: 1.4710 - classification_loss: 0.2933 389/500 [======================>.......] - ETA: 27s - loss: 1.7651 - regression_loss: 1.4717 - classification_loss: 0.2935 390/500 [======================>.......] - ETA: 27s - loss: 1.7674 - regression_loss: 1.4734 - classification_loss: 0.2940 391/500 [======================>.......] - ETA: 26s - loss: 1.7688 - regression_loss: 1.4744 - classification_loss: 0.2944 392/500 [======================>.......] - ETA: 26s - loss: 1.7692 - regression_loss: 1.4746 - classification_loss: 0.2946 393/500 [======================>.......] - ETA: 26s - loss: 1.7699 - regression_loss: 1.4751 - classification_loss: 0.2948 394/500 [======================>.......] - ETA: 26s - loss: 1.7697 - regression_loss: 1.4750 - classification_loss: 0.2947 395/500 [======================>.......] - ETA: 25s - loss: 1.7692 - regression_loss: 1.4746 - classification_loss: 0.2946 396/500 [======================>.......] - ETA: 25s - loss: 1.7688 - regression_loss: 1.4744 - classification_loss: 0.2944 397/500 [======================>.......] - ETA: 25s - loss: 1.7667 - regression_loss: 1.4727 - classification_loss: 0.2940 398/500 [======================>.......] 
- ETA: 25s - loss: 1.7666 - regression_loss: 1.4724 - classification_loss: 0.2941 399/500 [======================>.......] - ETA: 24s - loss: 1.7674 - regression_loss: 1.4731 - classification_loss: 0.2943 400/500 [=======================>......] - ETA: 24s - loss: 1.7681 - regression_loss: 1.4736 - classification_loss: 0.2946 401/500 [=======================>......] - ETA: 24s - loss: 1.7677 - regression_loss: 1.4734 - classification_loss: 0.2943 402/500 [=======================>......] - ETA: 24s - loss: 1.7699 - regression_loss: 1.4752 - classification_loss: 0.2947 403/500 [=======================>......] - ETA: 23s - loss: 1.7701 - regression_loss: 1.4755 - classification_loss: 0.2946 404/500 [=======================>......] - ETA: 23s - loss: 1.7709 - regression_loss: 1.4760 - classification_loss: 0.2949 405/500 [=======================>......] - ETA: 23s - loss: 1.7697 - regression_loss: 1.4749 - classification_loss: 0.2948 406/500 [=======================>......] - ETA: 23s - loss: 1.7693 - regression_loss: 1.4747 - classification_loss: 0.2946 407/500 [=======================>......] - ETA: 23s - loss: 1.7690 - regression_loss: 1.4746 - classification_loss: 0.2944 408/500 [=======================>......] - ETA: 22s - loss: 1.7691 - regression_loss: 1.4746 - classification_loss: 0.2944 409/500 [=======================>......] - ETA: 22s - loss: 1.7703 - regression_loss: 1.4755 - classification_loss: 0.2948 410/500 [=======================>......] - ETA: 22s - loss: 1.7725 - regression_loss: 1.4772 - classification_loss: 0.2953 411/500 [=======================>......] - ETA: 22s - loss: 1.7728 - regression_loss: 1.4772 - classification_loss: 0.2957 412/500 [=======================>......] - ETA: 21s - loss: 1.7706 - regression_loss: 1.4753 - classification_loss: 0.2953 413/500 [=======================>......] - ETA: 21s - loss: 1.7700 - regression_loss: 1.4748 - classification_loss: 0.2951 414/500 [=======================>......] 
- ETA: 21s - loss: 1.7711 - regression_loss: 1.4758 - classification_loss: 0.2953 415/500 [=======================>......] - ETA: 21s - loss: 1.7711 - regression_loss: 1.4756 - classification_loss: 0.2955 416/500 [=======================>......] - ETA: 20s - loss: 1.7706 - regression_loss: 1.4752 - classification_loss: 0.2954 417/500 [========================>.....] - ETA: 20s - loss: 1.7694 - regression_loss: 1.4743 - classification_loss: 0.2951 418/500 [========================>.....] - ETA: 20s - loss: 1.7682 - regression_loss: 1.4734 - classification_loss: 0.2948 419/500 [========================>.....] - ETA: 20s - loss: 1.7667 - regression_loss: 1.4723 - classification_loss: 0.2944 420/500 [========================>.....] - ETA: 19s - loss: 1.7663 - regression_loss: 1.4719 - classification_loss: 0.2944 421/500 [========================>.....] - ETA: 19s - loss: 1.7672 - regression_loss: 1.4728 - classification_loss: 0.2944 422/500 [========================>.....] - ETA: 19s - loss: 1.7691 - regression_loss: 1.4741 - classification_loss: 0.2950 423/500 [========================>.....] - ETA: 19s - loss: 1.7687 - regression_loss: 1.4739 - classification_loss: 0.2948 424/500 [========================>.....] - ETA: 18s - loss: 1.7673 - regression_loss: 1.4728 - classification_loss: 0.2945 425/500 [========================>.....] - ETA: 18s - loss: 1.7645 - regression_loss: 1.4705 - classification_loss: 0.2940 426/500 [========================>.....] - ETA: 18s - loss: 1.7638 - regression_loss: 1.4701 - classification_loss: 0.2937 427/500 [========================>.....] - ETA: 18s - loss: 1.7629 - regression_loss: 1.4694 - classification_loss: 0.2935 428/500 [========================>.....] - ETA: 17s - loss: 1.7624 - regression_loss: 1.4689 - classification_loss: 0.2935 429/500 [========================>.....] - ETA: 17s - loss: 1.7615 - regression_loss: 1.4681 - classification_loss: 0.2934 430/500 [========================>.....] 
- ETA: 17s - loss: 1.7596 - regression_loss: 1.4666 - classification_loss: 0.2931 431/500 [========================>.....] - ETA: 17s - loss: 1.7591 - regression_loss: 1.4663 - classification_loss: 0.2928 432/500 [========================>.....] - ETA: 16s - loss: 1.7572 - regression_loss: 1.4648 - classification_loss: 0.2923 433/500 [========================>.....] - ETA: 16s - loss: 1.7551 - regression_loss: 1.4630 - classification_loss: 0.2921 434/500 [=========================>....] - ETA: 16s - loss: 1.7538 - regression_loss: 1.4620 - classification_loss: 0.2919 435/500 [=========================>....] - ETA: 16s - loss: 1.7530 - regression_loss: 1.4612 - classification_loss: 0.2918 436/500 [=========================>....] - ETA: 15s - loss: 1.7525 - regression_loss: 1.4609 - classification_loss: 0.2916 437/500 [=========================>....] - ETA: 15s - loss: 1.7526 - regression_loss: 1.4610 - classification_loss: 0.2916 438/500 [=========================>....] - ETA: 15s - loss: 1.7528 - regression_loss: 1.4611 - classification_loss: 0.2917 439/500 [=========================>....] - ETA: 15s - loss: 1.7534 - regression_loss: 1.4616 - classification_loss: 0.2917 440/500 [=========================>....] - ETA: 14s - loss: 1.7539 - regression_loss: 1.4623 - classification_loss: 0.2917 441/500 [=========================>....] - ETA: 14s - loss: 1.7539 - regression_loss: 1.4623 - classification_loss: 0.2916 442/500 [=========================>....] - ETA: 14s - loss: 1.7566 - regression_loss: 1.4643 - classification_loss: 0.2923 443/500 [=========================>....] - ETA: 14s - loss: 1.7566 - regression_loss: 1.4642 - classification_loss: 0.2923 444/500 [=========================>....] - ETA: 13s - loss: 1.7578 - regression_loss: 1.4653 - classification_loss: 0.2926 445/500 [=========================>....] - ETA: 13s - loss: 1.7607 - regression_loss: 1.4674 - classification_loss: 0.2934 446/500 [=========================>....] 
- ETA: 13s - loss: 1.7581 - regression_loss: 1.4653 - classification_loss: 0.2928 447/500 [=========================>....] - ETA: 13s - loss: 1.7581 - regression_loss: 1.4656 - classification_loss: 0.2925 448/500 [=========================>....] - ETA: 12s - loss: 1.7574 - regression_loss: 1.4651 - classification_loss: 0.2924 449/500 [=========================>....] - ETA: 12s - loss: 1.7580 - regression_loss: 1.4655 - classification_loss: 0.2925 450/500 [==========================>...] - ETA: 12s - loss: 1.7572 - regression_loss: 1.4648 - classification_loss: 0.2924 451/500 [==========================>...] - ETA: 12s - loss: 1.7587 - regression_loss: 1.4658 - classification_loss: 0.2929 452/500 [==========================>...] - ETA: 11s - loss: 1.7607 - regression_loss: 1.4672 - classification_loss: 0.2935 453/500 [==========================>...] - ETA: 11s - loss: 1.7639 - regression_loss: 1.4700 - classification_loss: 0.2939 454/500 [==========================>...] - ETA: 11s - loss: 1.7644 - regression_loss: 1.4703 - classification_loss: 0.2940 455/500 [==========================>...] - ETA: 11s - loss: 1.7660 - regression_loss: 1.4721 - classification_loss: 0.2939 456/500 [==========================>...] - ETA: 10s - loss: 1.7672 - regression_loss: 1.4729 - classification_loss: 0.2943 457/500 [==========================>...] - ETA: 10s - loss: 1.7668 - regression_loss: 1.4728 - classification_loss: 0.2941 458/500 [==========================>...] - ETA: 10s - loss: 1.7658 - regression_loss: 1.4719 - classification_loss: 0.2939 459/500 [==========================>...] - ETA: 10s - loss: 1.7633 - regression_loss: 1.4700 - classification_loss: 0.2934 460/500 [==========================>...] - ETA: 9s - loss: 1.7612 - regression_loss: 1.4683 - classification_loss: 0.2929  461/500 [==========================>...] - ETA: 9s - loss: 1.7610 - regression_loss: 1.4682 - classification_loss: 0.2928 462/500 [==========================>...] 
- ETA: 9s - loss: 1.7622 - regression_loss: 1.4693 - classification_loss: 0.2929 463/500 [==========================>...] - ETA: 9s - loss: 1.7632 - regression_loss: 1.4702 - classification_loss: 0.2930 464/500 [==========================>...] - ETA: 8s - loss: 1.7657 - regression_loss: 1.4722 - classification_loss: 0.2935 465/500 [==========================>...] - ETA: 8s - loss: 1.7658 - regression_loss: 1.4723 - classification_loss: 0.2936 466/500 [==========================>...] - ETA: 8s - loss: 1.7644 - regression_loss: 1.4712 - classification_loss: 0.2932 467/500 [===========================>..] - ETA: 8s - loss: 1.7658 - regression_loss: 1.4720 - classification_loss: 0.2938 468/500 [===========================>..] - ETA: 7s - loss: 1.7662 - regression_loss: 1.4724 - classification_loss: 0.2938 469/500 [===========================>..] - ETA: 7s - loss: 1.7659 - regression_loss: 1.4723 - classification_loss: 0.2937 470/500 [===========================>..] - ETA: 7s - loss: 1.7669 - regression_loss: 1.4732 - classification_loss: 0.2937 471/500 [===========================>..] - ETA: 7s - loss: 1.7661 - regression_loss: 1.4727 - classification_loss: 0.2934 472/500 [===========================>..] - ETA: 6s - loss: 1.7652 - regression_loss: 1.4720 - classification_loss: 0.2932 473/500 [===========================>..] - ETA: 6s - loss: 1.7629 - regression_loss: 1.4701 - classification_loss: 0.2928 474/500 [===========================>..] - ETA: 6s - loss: 1.7621 - regression_loss: 1.4695 - classification_loss: 0.2926 475/500 [===========================>..] - ETA: 6s - loss: 1.7614 - regression_loss: 1.4691 - classification_loss: 0.2923 476/500 [===========================>..] - ETA: 5s - loss: 1.7627 - regression_loss: 1.4700 - classification_loss: 0.2928 477/500 [===========================>..] - ETA: 5s - loss: 1.7636 - regression_loss: 1.4708 - classification_loss: 0.2929 478/500 [===========================>..] 
[per-batch progress updates for the remainder of epoch 40 (steps 479-499) elided; running loss held near 1.76]
500/500 [==============================] - 124s 248ms/step - loss: 1.7656 - regression_loss: 1.4724 - classification_loss: 0.2932
326 instances of class plum with average precision: 0.7404
mAP: 0.7404
Epoch 00040: saving model to ./training/snapshots/resnet50_pascal_40.h5
Epoch 41/150
[per-batch progress updates for epoch 41 (steps 1-313 of 500) elided; running loss fluctuated between roughly 1.74 and 1.80, with regression_loss near 1.49 and classification_loss near 0.29]
- ETA: 46s - loss: 1.7811 - regression_loss: 1.4877 - classification_loss: 0.2934 314/500 [=================>............] - ETA: 46s - loss: 1.7809 - regression_loss: 1.4877 - classification_loss: 0.2932 315/500 [=================>............] - ETA: 46s - loss: 1.7802 - regression_loss: 1.4874 - classification_loss: 0.2928 316/500 [=================>............] - ETA: 45s - loss: 1.7773 - regression_loss: 1.4851 - classification_loss: 0.2922 317/500 [==================>...........] - ETA: 45s - loss: 1.7765 - regression_loss: 1.4846 - classification_loss: 0.2919 318/500 [==================>...........] - ETA: 45s - loss: 1.7752 - regression_loss: 1.4833 - classification_loss: 0.2919 319/500 [==================>...........] - ETA: 45s - loss: 1.7755 - regression_loss: 1.4834 - classification_loss: 0.2920 320/500 [==================>...........] - ETA: 44s - loss: 1.7754 - regression_loss: 1.4834 - classification_loss: 0.2920 321/500 [==================>...........] - ETA: 44s - loss: 1.7745 - regression_loss: 1.4826 - classification_loss: 0.2919 322/500 [==================>...........] - ETA: 44s - loss: 1.7776 - regression_loss: 1.4853 - classification_loss: 0.2923 323/500 [==================>...........] - ETA: 44s - loss: 1.7769 - regression_loss: 1.4848 - classification_loss: 0.2921 324/500 [==================>...........] - ETA: 43s - loss: 1.7766 - regression_loss: 1.4845 - classification_loss: 0.2921 325/500 [==================>...........] - ETA: 43s - loss: 1.7787 - regression_loss: 1.4863 - classification_loss: 0.2925 326/500 [==================>...........] - ETA: 43s - loss: 1.7786 - regression_loss: 1.4864 - classification_loss: 0.2922 327/500 [==================>...........] - ETA: 43s - loss: 1.7775 - regression_loss: 1.4856 - classification_loss: 0.2919 328/500 [==================>...........] - ETA: 42s - loss: 1.7767 - regression_loss: 1.4852 - classification_loss: 0.2915 329/500 [==================>...........] 
- ETA: 42s - loss: 1.7789 - regression_loss: 1.4870 - classification_loss: 0.2919 330/500 [==================>...........] - ETA: 42s - loss: 1.7788 - regression_loss: 1.4870 - classification_loss: 0.2917 331/500 [==================>...........] - ETA: 42s - loss: 1.7770 - regression_loss: 1.4857 - classification_loss: 0.2913 332/500 [==================>...........] - ETA: 41s - loss: 1.7763 - regression_loss: 1.4853 - classification_loss: 0.2910 333/500 [==================>...........] - ETA: 41s - loss: 1.7754 - regression_loss: 1.4847 - classification_loss: 0.2908 334/500 [===================>..........] - ETA: 41s - loss: 1.7778 - regression_loss: 1.4864 - classification_loss: 0.2914 335/500 [===================>..........] - ETA: 41s - loss: 1.7736 - regression_loss: 1.4829 - classification_loss: 0.2906 336/500 [===================>..........] - ETA: 40s - loss: 1.7748 - regression_loss: 1.4841 - classification_loss: 0.2907 337/500 [===================>..........] - ETA: 40s - loss: 1.7770 - regression_loss: 1.4857 - classification_loss: 0.2912 338/500 [===================>..........] - ETA: 40s - loss: 1.7778 - regression_loss: 1.4863 - classification_loss: 0.2914 339/500 [===================>..........] - ETA: 40s - loss: 1.7804 - regression_loss: 1.4882 - classification_loss: 0.2922 340/500 [===================>..........] - ETA: 39s - loss: 1.7819 - regression_loss: 1.4895 - classification_loss: 0.2924 341/500 [===================>..........] - ETA: 39s - loss: 1.7808 - regression_loss: 1.4887 - classification_loss: 0.2921 342/500 [===================>..........] - ETA: 39s - loss: 1.7811 - regression_loss: 1.4889 - classification_loss: 0.2922 343/500 [===================>..........] - ETA: 39s - loss: 1.7793 - regression_loss: 1.4876 - classification_loss: 0.2917 344/500 [===================>..........] - ETA: 38s - loss: 1.7790 - regression_loss: 1.4873 - classification_loss: 0.2918 345/500 [===================>..........] 
- ETA: 38s - loss: 1.7782 - regression_loss: 1.4867 - classification_loss: 0.2915 346/500 [===================>..........] - ETA: 38s - loss: 1.7776 - regression_loss: 1.4863 - classification_loss: 0.2913 347/500 [===================>..........] - ETA: 38s - loss: 1.7803 - regression_loss: 1.4886 - classification_loss: 0.2917 348/500 [===================>..........] - ETA: 37s - loss: 1.7808 - regression_loss: 1.4889 - classification_loss: 0.2919 349/500 [===================>..........] - ETA: 37s - loss: 1.7790 - regression_loss: 1.4875 - classification_loss: 0.2915 350/500 [====================>.........] - ETA: 37s - loss: 1.7814 - regression_loss: 1.4893 - classification_loss: 0.2922 351/500 [====================>.........] - ETA: 37s - loss: 1.7823 - regression_loss: 1.4899 - classification_loss: 0.2924 352/500 [====================>.........] - ETA: 36s - loss: 1.7845 - regression_loss: 1.4914 - classification_loss: 0.2931 353/500 [====================>.........] - ETA: 36s - loss: 1.7851 - regression_loss: 1.4918 - classification_loss: 0.2933 354/500 [====================>.........] - ETA: 36s - loss: 1.7840 - regression_loss: 1.4911 - classification_loss: 0.2929 355/500 [====================>.........] - ETA: 36s - loss: 1.7846 - regression_loss: 1.4919 - classification_loss: 0.2927 356/500 [====================>.........] - ETA: 35s - loss: 1.7855 - regression_loss: 1.4929 - classification_loss: 0.2927 357/500 [====================>.........] - ETA: 35s - loss: 1.7827 - regression_loss: 1.4905 - classification_loss: 0.2923 358/500 [====================>.........] - ETA: 35s - loss: 1.7829 - regression_loss: 1.4907 - classification_loss: 0.2922 359/500 [====================>.........] - ETA: 35s - loss: 1.7833 - regression_loss: 1.4912 - classification_loss: 0.2920 360/500 [====================>.........] - ETA: 34s - loss: 1.7820 - regression_loss: 1.4904 - classification_loss: 0.2916 361/500 [====================>.........] 
- ETA: 34s - loss: 1.7829 - regression_loss: 1.4912 - classification_loss: 0.2917 362/500 [====================>.........] - ETA: 34s - loss: 1.7839 - regression_loss: 1.4918 - classification_loss: 0.2921 363/500 [====================>.........] - ETA: 34s - loss: 1.7821 - regression_loss: 1.4906 - classification_loss: 0.2915 364/500 [====================>.........] - ETA: 33s - loss: 1.7811 - regression_loss: 1.4899 - classification_loss: 0.2912 365/500 [====================>.........] - ETA: 33s - loss: 1.7805 - regression_loss: 1.4894 - classification_loss: 0.2911 366/500 [====================>.........] - ETA: 33s - loss: 1.7819 - regression_loss: 1.4904 - classification_loss: 0.2915 367/500 [=====================>........] - ETA: 33s - loss: 1.7803 - regression_loss: 1.4892 - classification_loss: 0.2911 368/500 [=====================>........] - ETA: 32s - loss: 1.7804 - regression_loss: 1.4893 - classification_loss: 0.2911 369/500 [=====================>........] - ETA: 32s - loss: 1.7781 - regression_loss: 1.4875 - classification_loss: 0.2906 370/500 [=====================>........] - ETA: 32s - loss: 1.7782 - regression_loss: 1.4877 - classification_loss: 0.2905 371/500 [=====================>........] - ETA: 32s - loss: 1.7776 - regression_loss: 1.4874 - classification_loss: 0.2902 372/500 [=====================>........] - ETA: 31s - loss: 1.7815 - regression_loss: 1.4907 - classification_loss: 0.2908 373/500 [=====================>........] - ETA: 31s - loss: 1.7844 - regression_loss: 1.4927 - classification_loss: 0.2917 374/500 [=====================>........] - ETA: 31s - loss: 1.7825 - regression_loss: 1.4912 - classification_loss: 0.2913 375/500 [=====================>........] - ETA: 31s - loss: 1.7816 - regression_loss: 1.4906 - classification_loss: 0.2910 376/500 [=====================>........] - ETA: 30s - loss: 1.7812 - regression_loss: 1.4904 - classification_loss: 0.2908 377/500 [=====================>........] 
- ETA: 30s - loss: 1.7815 - regression_loss: 1.4908 - classification_loss: 0.2907 378/500 [=====================>........] - ETA: 30s - loss: 1.7788 - regression_loss: 1.4887 - classification_loss: 0.2901 379/500 [=====================>........] - ETA: 30s - loss: 1.7766 - regression_loss: 1.4848 - classification_loss: 0.2918 380/500 [=====================>........] - ETA: 29s - loss: 1.7768 - regression_loss: 1.4852 - classification_loss: 0.2916 381/500 [=====================>........] - ETA: 29s - loss: 1.7776 - regression_loss: 1.4856 - classification_loss: 0.2920 382/500 [=====================>........] - ETA: 29s - loss: 1.7775 - regression_loss: 1.4855 - classification_loss: 0.2920 383/500 [=====================>........] - ETA: 29s - loss: 1.7780 - regression_loss: 1.4860 - classification_loss: 0.2920 384/500 [======================>.......] - ETA: 28s - loss: 1.7794 - regression_loss: 1.4870 - classification_loss: 0.2924 385/500 [======================>.......] - ETA: 28s - loss: 1.7771 - regression_loss: 1.4851 - classification_loss: 0.2920 386/500 [======================>.......] - ETA: 28s - loss: 1.7744 - regression_loss: 1.4829 - classification_loss: 0.2915 387/500 [======================>.......] - ETA: 28s - loss: 1.7745 - regression_loss: 1.4831 - classification_loss: 0.2915 388/500 [======================>.......] - ETA: 27s - loss: 1.7740 - regression_loss: 1.4825 - classification_loss: 0.2915 389/500 [======================>.......] - ETA: 27s - loss: 1.7752 - regression_loss: 1.4835 - classification_loss: 0.2917 390/500 [======================>.......] - ETA: 27s - loss: 1.7774 - regression_loss: 1.4850 - classification_loss: 0.2924 391/500 [======================>.......] - ETA: 27s - loss: 1.7747 - regression_loss: 1.4828 - classification_loss: 0.2919 392/500 [======================>.......] - ETA: 26s - loss: 1.7752 - regression_loss: 1.4832 - classification_loss: 0.2919 393/500 [======================>.......] 
- ETA: 26s - loss: 1.7757 - regression_loss: 1.4835 - classification_loss: 0.2922 394/500 [======================>.......] - ETA: 26s - loss: 1.7757 - regression_loss: 1.4836 - classification_loss: 0.2921 395/500 [======================>.......] - ETA: 26s - loss: 1.7754 - regression_loss: 1.4836 - classification_loss: 0.2918 396/500 [======================>.......] - ETA: 25s - loss: 1.7761 - regression_loss: 1.4842 - classification_loss: 0.2919 397/500 [======================>.......] - ETA: 25s - loss: 1.7758 - regression_loss: 1.4843 - classification_loss: 0.2916 398/500 [======================>.......] - ETA: 25s - loss: 1.7750 - regression_loss: 1.4837 - classification_loss: 0.2914 399/500 [======================>.......] - ETA: 25s - loss: 1.7749 - regression_loss: 1.4836 - classification_loss: 0.2913 400/500 [=======================>......] - ETA: 24s - loss: 1.7741 - regression_loss: 1.4830 - classification_loss: 0.2912 401/500 [=======================>......] - ETA: 24s - loss: 1.7738 - regression_loss: 1.4828 - classification_loss: 0.2910 402/500 [=======================>......] - ETA: 24s - loss: 1.7750 - regression_loss: 1.4836 - classification_loss: 0.2915 403/500 [=======================>......] - ETA: 24s - loss: 1.7756 - regression_loss: 1.4843 - classification_loss: 0.2913 404/500 [=======================>......] - ETA: 23s - loss: 1.7747 - regression_loss: 1.4835 - classification_loss: 0.2912 405/500 [=======================>......] - ETA: 23s - loss: 1.7749 - regression_loss: 1.4838 - classification_loss: 0.2912 406/500 [=======================>......] - ETA: 23s - loss: 1.7735 - regression_loss: 1.4827 - classification_loss: 0.2908 407/500 [=======================>......] - ETA: 23s - loss: 1.7743 - regression_loss: 1.4833 - classification_loss: 0.2911 408/500 [=======================>......] - ETA: 22s - loss: 1.7740 - regression_loss: 1.4831 - classification_loss: 0.2908 409/500 [=======================>......] 
- ETA: 22s - loss: 1.7734 - regression_loss: 1.4828 - classification_loss: 0.2906 410/500 [=======================>......] - ETA: 22s - loss: 1.7737 - regression_loss: 1.4830 - classification_loss: 0.2907 411/500 [=======================>......] - ETA: 22s - loss: 1.7744 - regression_loss: 1.4835 - classification_loss: 0.2909 412/500 [=======================>......] - ETA: 21s - loss: 1.7758 - regression_loss: 1.4846 - classification_loss: 0.2912 413/500 [=======================>......] - ETA: 21s - loss: 1.7762 - regression_loss: 1.4852 - classification_loss: 0.2910 414/500 [=======================>......] - ETA: 21s - loss: 1.7766 - regression_loss: 1.4854 - classification_loss: 0.2912 415/500 [=======================>......] - ETA: 21s - loss: 1.7780 - regression_loss: 1.4864 - classification_loss: 0.2916 416/500 [=======================>......] - ETA: 20s - loss: 1.7786 - regression_loss: 1.4870 - classification_loss: 0.2916 417/500 [========================>.....] - ETA: 20s - loss: 1.7785 - regression_loss: 1.4870 - classification_loss: 0.2915 418/500 [========================>.....] - ETA: 20s - loss: 1.7758 - regression_loss: 1.4847 - classification_loss: 0.2911 419/500 [========================>.....] - ETA: 20s - loss: 1.7752 - regression_loss: 1.4843 - classification_loss: 0.2909 420/500 [========================>.....] - ETA: 19s - loss: 1.7759 - regression_loss: 1.4850 - classification_loss: 0.2910 421/500 [========================>.....] - ETA: 19s - loss: 1.7766 - regression_loss: 1.4856 - classification_loss: 0.2909 422/500 [========================>.....] - ETA: 19s - loss: 1.7758 - regression_loss: 1.4849 - classification_loss: 0.2909 423/500 [========================>.....] - ETA: 19s - loss: 1.7765 - regression_loss: 1.4854 - classification_loss: 0.2911 424/500 [========================>.....] - ETA: 18s - loss: 1.7782 - regression_loss: 1.4871 - classification_loss: 0.2911 425/500 [========================>.....] 
- ETA: 18s - loss: 1.7779 - regression_loss: 1.4872 - classification_loss: 0.2908 426/500 [========================>.....] - ETA: 18s - loss: 1.7766 - regression_loss: 1.4862 - classification_loss: 0.2904 427/500 [========================>.....] - ETA: 18s - loss: 1.7781 - regression_loss: 1.4874 - classification_loss: 0.2907 428/500 [========================>.....] - ETA: 17s - loss: 1.7781 - regression_loss: 1.4874 - classification_loss: 0.2907 429/500 [========================>.....] - ETA: 17s - loss: 1.7795 - regression_loss: 1.4884 - classification_loss: 0.2910 430/500 [========================>.....] - ETA: 17s - loss: 1.7808 - regression_loss: 1.4899 - classification_loss: 0.2909 431/500 [========================>.....] - ETA: 17s - loss: 1.7807 - regression_loss: 1.4896 - classification_loss: 0.2911 432/500 [========================>.....] - ETA: 16s - loss: 1.7796 - regression_loss: 1.4887 - classification_loss: 0.2908 433/500 [========================>.....] - ETA: 16s - loss: 1.7781 - regression_loss: 1.4853 - classification_loss: 0.2928 434/500 [=========================>....] - ETA: 16s - loss: 1.7776 - regression_loss: 1.4851 - classification_loss: 0.2925 435/500 [=========================>....] - ETA: 16s - loss: 1.7775 - regression_loss: 1.4850 - classification_loss: 0.2925 436/500 [=========================>....] - ETA: 15s - loss: 1.7805 - regression_loss: 1.4852 - classification_loss: 0.2954 437/500 [=========================>....] - ETA: 15s - loss: 1.7786 - regression_loss: 1.4835 - classification_loss: 0.2951 438/500 [=========================>....] - ETA: 15s - loss: 1.7792 - regression_loss: 1.4842 - classification_loss: 0.2950 439/500 [=========================>....] - ETA: 15s - loss: 1.7804 - regression_loss: 1.4853 - classification_loss: 0.2951 440/500 [=========================>....] - ETA: 14s - loss: 1.7808 - regression_loss: 1.4857 - classification_loss: 0.2950 441/500 [=========================>....] 
- ETA: 14s - loss: 1.7794 - regression_loss: 1.4847 - classification_loss: 0.2947 442/500 [=========================>....] - ETA: 14s - loss: 1.7799 - regression_loss: 1.4851 - classification_loss: 0.2948 443/500 [=========================>....] - ETA: 14s - loss: 1.7797 - regression_loss: 1.4849 - classification_loss: 0.2948 444/500 [=========================>....] - ETA: 13s - loss: 1.7793 - regression_loss: 1.4846 - classification_loss: 0.2947 445/500 [=========================>....] - ETA: 13s - loss: 1.7810 - regression_loss: 1.4859 - classification_loss: 0.2951 446/500 [=========================>....] - ETA: 13s - loss: 1.7835 - regression_loss: 1.4881 - classification_loss: 0.2954 447/500 [=========================>....] - ETA: 13s - loss: 1.7838 - regression_loss: 1.4883 - classification_loss: 0.2956 448/500 [=========================>....] - ETA: 12s - loss: 1.7853 - regression_loss: 1.4890 - classification_loss: 0.2964 449/500 [=========================>....] - ETA: 12s - loss: 1.7849 - regression_loss: 1.4887 - classification_loss: 0.2961 450/500 [==========================>...] - ETA: 12s - loss: 1.7859 - regression_loss: 1.4894 - classification_loss: 0.2965 451/500 [==========================>...] - ETA: 12s - loss: 1.7867 - regression_loss: 1.4898 - classification_loss: 0.2969 452/500 [==========================>...] - ETA: 11s - loss: 1.7879 - regression_loss: 1.4908 - classification_loss: 0.2972 453/500 [==========================>...] - ETA: 11s - loss: 1.7886 - regression_loss: 1.4909 - classification_loss: 0.2977 454/500 [==========================>...] - ETA: 11s - loss: 1.7889 - regression_loss: 1.4910 - classification_loss: 0.2979 455/500 [==========================>...] - ETA: 11s - loss: 1.7876 - regression_loss: 1.4900 - classification_loss: 0.2977 456/500 [==========================>...] - ETA: 10s - loss: 1.7887 - regression_loss: 1.4909 - classification_loss: 0.2978 457/500 [==========================>...] 
- ETA: 10s - loss: 1.7872 - regression_loss: 1.4896 - classification_loss: 0.2976 458/500 [==========================>...] - ETA: 10s - loss: 1.7872 - regression_loss: 1.4897 - classification_loss: 0.2975 459/500 [==========================>...] - ETA: 10s - loss: 1.7846 - regression_loss: 1.4874 - classification_loss: 0.2972 460/500 [==========================>...] - ETA: 9s - loss: 1.7852 - regression_loss: 1.4876 - classification_loss: 0.2976  461/500 [==========================>...] - ETA: 9s - loss: 1.7853 - regression_loss: 1.4875 - classification_loss: 0.2978 462/500 [==========================>...] - ETA: 9s - loss: 1.7848 - regression_loss: 1.4872 - classification_loss: 0.2975 463/500 [==========================>...] - ETA: 9s - loss: 1.7867 - regression_loss: 1.4886 - classification_loss: 0.2981 464/500 [==========================>...] - ETA: 8s - loss: 1.7844 - regression_loss: 1.4868 - classification_loss: 0.2976 465/500 [==========================>...] - ETA: 8s - loss: 1.7825 - regression_loss: 1.4853 - classification_loss: 0.2973 466/500 [==========================>...] - ETA: 8s - loss: 1.7833 - regression_loss: 1.4859 - classification_loss: 0.2974 467/500 [===========================>..] - ETA: 8s - loss: 1.7821 - regression_loss: 1.4850 - classification_loss: 0.2971 468/500 [===========================>..] - ETA: 7s - loss: 1.7808 - regression_loss: 1.4840 - classification_loss: 0.2968 469/500 [===========================>..] - ETA: 7s - loss: 1.7800 - regression_loss: 1.4835 - classification_loss: 0.2965 470/500 [===========================>..] - ETA: 7s - loss: 1.7799 - regression_loss: 1.4833 - classification_loss: 0.2966 471/500 [===========================>..] - ETA: 7s - loss: 1.7801 - regression_loss: 1.4836 - classification_loss: 0.2964 472/500 [===========================>..] - ETA: 6s - loss: 1.7836 - regression_loss: 1.4865 - classification_loss: 0.2970 473/500 [===========================>..] 
- ETA: 6s - loss: 1.7837 - regression_loss: 1.4866 - classification_loss: 0.2971 474/500 [===========================>..] - ETA: 6s - loss: 1.7826 - regression_loss: 1.4857 - classification_loss: 0.2969 475/500 [===========================>..] - ETA: 6s - loss: 1.7817 - regression_loss: 1.4852 - classification_loss: 0.2965 476/500 [===========================>..] - ETA: 5s - loss: 1.7809 - regression_loss: 1.4846 - classification_loss: 0.2963 477/500 [===========================>..] - ETA: 5s - loss: 1.7815 - regression_loss: 1.4850 - classification_loss: 0.2964 478/500 [===========================>..] - ETA: 5s - loss: 1.7812 - regression_loss: 1.4847 - classification_loss: 0.2964 479/500 [===========================>..] - ETA: 5s - loss: 1.7800 - regression_loss: 1.4838 - classification_loss: 0.2962 480/500 [===========================>..] - ETA: 4s - loss: 1.7788 - regression_loss: 1.4829 - classification_loss: 0.2959 481/500 [===========================>..] - ETA: 4s - loss: 1.7779 - regression_loss: 1.4823 - classification_loss: 0.2956 482/500 [===========================>..] - ETA: 4s - loss: 1.7789 - regression_loss: 1.4830 - classification_loss: 0.2959 483/500 [===========================>..] - ETA: 4s - loss: 1.7771 - regression_loss: 1.4816 - classification_loss: 0.2955 484/500 [============================>.] - ETA: 3s - loss: 1.7768 - regression_loss: 1.4815 - classification_loss: 0.2954 485/500 [============================>.] - ETA: 3s - loss: 1.7752 - regression_loss: 1.4801 - classification_loss: 0.2951 486/500 [============================>.] - ETA: 3s - loss: 1.7763 - regression_loss: 1.4810 - classification_loss: 0.2952 487/500 [============================>.] - ETA: 3s - loss: 1.7757 - regression_loss: 1.4805 - classification_loss: 0.2952 488/500 [============================>.] - ETA: 2s - loss: 1.7757 - regression_loss: 1.4803 - classification_loss: 0.2953 489/500 [============================>.] 
- ETA: 2s - loss: 1.7737 - regression_loss: 1.4788 - classification_loss: 0.2949 490/500 [============================>.] - ETA: 2s - loss: 1.7721 - regression_loss: 1.4774 - classification_loss: 0.2946 491/500 [============================>.] - ETA: 2s - loss: 1.7709 - regression_loss: 1.4766 - classification_loss: 0.2943 492/500 [============================>.] - ETA: 1s - loss: 1.7712 - regression_loss: 1.4767 - classification_loss: 0.2946 493/500 [============================>.] - ETA: 1s - loss: 1.7703 - regression_loss: 1.4759 - classification_loss: 0.2944 494/500 [============================>.] - ETA: 1s - loss: 1.7702 - regression_loss: 1.4758 - classification_loss: 0.2944 495/500 [============================>.] - ETA: 1s - loss: 1.7707 - regression_loss: 1.4762 - classification_loss: 0.2945 496/500 [============================>.] - ETA: 0s - loss: 1.7683 - regression_loss: 1.4742 - classification_loss: 0.2941 497/500 [============================>.] - ETA: 0s - loss: 1.7677 - regression_loss: 1.4736 - classification_loss: 0.2941 498/500 [============================>.] - ETA: 0s - loss: 1.7680 - regression_loss: 1.4738 - classification_loss: 0.2942 499/500 [============================>.] - ETA: 0s - loss: 1.7672 - regression_loss: 1.4732 - classification_loss: 0.2940 500/500 [==============================] - 125s 249ms/step - loss: 1.7667 - regression_loss: 1.4729 - classification_loss: 0.2938 326 instances of class plum with average precision: 0.7519 mAP: 0.7519 Epoch 00041: saving model to ./training/snapshots/resnet50_pascal_41.h5 Epoch 42/150 1/500 [..............................] - ETA: 2:03 - loss: 2.1519 - regression_loss: 1.8491 - classification_loss: 0.3028 2/500 [..............................] - ETA: 2:03 - loss: 1.9383 - regression_loss: 1.6462 - classification_loss: 0.2921 3/500 [..............................] - ETA: 2:03 - loss: 1.7809 - regression_loss: 1.4989 - classification_loss: 0.2819 4/500 [..............................] 
- ETA: 1:44 - loss: 1.7736 - regression_loss: 1.4790 - classification_loss: 0.2947 85/500 [====>.........................] - ETA: 1:44 - loss: 1.7805 - regression_loss: 1.4845 - classification_loss: 0.2960 86/500 [====>.........................] - ETA: 1:43 - loss: 1.7858 - regression_loss: 1.4888 - classification_loss: 0.2970 87/500 [====>.........................] - ETA: 1:43 - loss: 1.7831 - regression_loss: 1.4872 - classification_loss: 0.2959 88/500 [====>.........................] - ETA: 1:43 - loss: 1.7939 - regression_loss: 1.4975 - classification_loss: 0.2964 89/500 [====>.........................] - ETA: 1:43 - loss: 1.7842 - regression_loss: 1.4903 - classification_loss: 0.2939 90/500 [====>.........................] - ETA: 1:42 - loss: 1.7819 - regression_loss: 1.4882 - classification_loss: 0.2937 91/500 [====>.........................] - ETA: 1:42 - loss: 1.7831 - regression_loss: 1.4887 - classification_loss: 0.2943 92/500 [====>.........................] - ETA: 1:42 - loss: 1.7759 - regression_loss: 1.4832 - classification_loss: 0.2928 93/500 [====>.........................] - ETA: 1:41 - loss: 1.7804 - regression_loss: 1.4864 - classification_loss: 0.2940 94/500 [====>.........................] - ETA: 1:41 - loss: 1.7825 - regression_loss: 1.4878 - classification_loss: 0.2947 95/500 [====>.........................] - ETA: 1:40 - loss: 1.7838 - regression_loss: 1.4886 - classification_loss: 0.2951 96/500 [====>.........................] - ETA: 1:40 - loss: 1.7825 - regression_loss: 1.4870 - classification_loss: 0.2955 97/500 [====>.........................] - ETA: 1:40 - loss: 1.7865 - regression_loss: 1.4901 - classification_loss: 0.2964 98/500 [====>.........................] - ETA: 1:40 - loss: 1.7935 - regression_loss: 1.4955 - classification_loss: 0.2979 99/500 [====>.........................] - ETA: 1:40 - loss: 1.7998 - regression_loss: 1.5009 - classification_loss: 0.2989 100/500 [=====>........................] 
- ETA: 1:39 - loss: 1.7951 - regression_loss: 1.4859 - classification_loss: 0.3092 101/500 [=====>........................] - ETA: 1:39 - loss: 1.7887 - regression_loss: 1.4811 - classification_loss: 0.3076 102/500 [=====>........................] - ETA: 1:39 - loss: 1.7845 - regression_loss: 1.4782 - classification_loss: 0.3062 103/500 [=====>........................] - ETA: 1:39 - loss: 1.7810 - regression_loss: 1.4765 - classification_loss: 0.3045 104/500 [=====>........................] - ETA: 1:38 - loss: 1.7878 - regression_loss: 1.4829 - classification_loss: 0.3049 105/500 [=====>........................] - ETA: 1:38 - loss: 1.7885 - regression_loss: 1.4836 - classification_loss: 0.3049 106/500 [=====>........................] - ETA: 1:38 - loss: 1.7861 - regression_loss: 1.4818 - classification_loss: 0.3043 107/500 [=====>........................] - ETA: 1:38 - loss: 1.7833 - regression_loss: 1.4804 - classification_loss: 0.3029 108/500 [=====>........................] - ETA: 1:37 - loss: 1.7787 - regression_loss: 1.4763 - classification_loss: 0.3024 109/500 [=====>........................] - ETA: 1:37 - loss: 1.7794 - regression_loss: 1.4774 - classification_loss: 0.3020 110/500 [=====>........................] - ETA: 1:37 - loss: 1.7789 - regression_loss: 1.4767 - classification_loss: 0.3022 111/500 [=====>........................] - ETA: 1:37 - loss: 1.7873 - regression_loss: 1.4843 - classification_loss: 0.3030 112/500 [=====>........................] - ETA: 1:37 - loss: 1.7799 - regression_loss: 1.4781 - classification_loss: 0.3018 113/500 [=====>........................] - ETA: 1:36 - loss: 1.7778 - regression_loss: 1.4764 - classification_loss: 0.3013 114/500 [=====>........................] - ETA: 1:36 - loss: 1.7786 - regression_loss: 1.4765 - classification_loss: 0.3021 115/500 [=====>........................] - ETA: 1:36 - loss: 1.7675 - regression_loss: 1.4673 - classification_loss: 0.3002 116/500 [=====>........................] 
- ETA: 1:36 - loss: 1.7716 - regression_loss: 1.4702 - classification_loss: 0.3014 117/500 [======>.......................] - ETA: 1:35 - loss: 1.7723 - regression_loss: 1.4705 - classification_loss: 0.3018 118/500 [======>.......................] - ETA: 1:35 - loss: 1.7756 - regression_loss: 1.4734 - classification_loss: 0.3023 119/500 [======>.......................] - ETA: 1:35 - loss: 1.7662 - regression_loss: 1.4660 - classification_loss: 0.3002 120/500 [======>.......................] - ETA: 1:35 - loss: 1.7652 - regression_loss: 1.4647 - classification_loss: 0.3005 121/500 [======>.......................] - ETA: 1:34 - loss: 1.7677 - regression_loss: 1.4663 - classification_loss: 0.3014 122/500 [======>.......................] - ETA: 1:34 - loss: 1.7668 - regression_loss: 1.4655 - classification_loss: 0.3013 123/500 [======>.......................] - ETA: 1:34 - loss: 1.7648 - regression_loss: 1.4649 - classification_loss: 0.2999 124/500 [======>.......................] - ETA: 1:34 - loss: 1.7662 - regression_loss: 1.4662 - classification_loss: 0.3000 125/500 [======>.......................] - ETA: 1:33 - loss: 1.7627 - regression_loss: 1.4637 - classification_loss: 0.2990 126/500 [======>.......................] - ETA: 1:33 - loss: 1.7601 - regression_loss: 1.4620 - classification_loss: 0.2981 127/500 [======>.......................] - ETA: 1:33 - loss: 1.7645 - regression_loss: 1.4658 - classification_loss: 0.2987 128/500 [======>.......................] - ETA: 1:33 - loss: 1.7662 - regression_loss: 1.4672 - classification_loss: 0.2990 129/500 [======>.......................] - ETA: 1:32 - loss: 1.7634 - regression_loss: 1.4651 - classification_loss: 0.2983 130/500 [======>.......................] - ETA: 1:32 - loss: 1.7631 - regression_loss: 1.4650 - classification_loss: 0.2981 131/500 [======>.......................] - ETA: 1:32 - loss: 1.7618 - regression_loss: 1.4646 - classification_loss: 0.2971 132/500 [======>.......................] 
- ETA: 1:32 - loss: 1.7632 - regression_loss: 1.4656 - classification_loss: 0.2976 133/500 [======>.......................] - ETA: 1:31 - loss: 1.7606 - regression_loss: 1.4637 - classification_loss: 0.2968 134/500 [=======>......................] - ETA: 1:31 - loss: 1.7577 - regression_loss: 1.4619 - classification_loss: 0.2958 135/500 [=======>......................] - ETA: 1:31 - loss: 1.7614 - regression_loss: 1.4648 - classification_loss: 0.2966 136/500 [=======>......................] - ETA: 1:31 - loss: 1.7631 - regression_loss: 1.4658 - classification_loss: 0.2972 137/500 [=======>......................] - ETA: 1:30 - loss: 1.7632 - regression_loss: 1.4665 - classification_loss: 0.2967 138/500 [=======>......................] - ETA: 1:30 - loss: 1.7624 - regression_loss: 1.4662 - classification_loss: 0.2962 139/500 [=======>......................] - ETA: 1:30 - loss: 1.7576 - regression_loss: 1.4624 - classification_loss: 0.2952 140/500 [=======>......................] - ETA: 1:30 - loss: 1.7520 - regression_loss: 1.4579 - classification_loss: 0.2942 141/500 [=======>......................] - ETA: 1:29 - loss: 1.7512 - regression_loss: 1.4578 - classification_loss: 0.2934 142/500 [=======>......................] - ETA: 1:29 - loss: 1.7500 - regression_loss: 1.4573 - classification_loss: 0.2928 143/500 [=======>......................] - ETA: 1:29 - loss: 1.7479 - regression_loss: 1.4557 - classification_loss: 0.2921 144/500 [=======>......................] - ETA: 1:29 - loss: 1.7493 - regression_loss: 1.4572 - classification_loss: 0.2921 145/500 [=======>......................] - ETA: 1:28 - loss: 1.7472 - regression_loss: 1.4555 - classification_loss: 0.2917 146/500 [=======>......................] - ETA: 1:28 - loss: 1.7467 - regression_loss: 1.4556 - classification_loss: 0.2911 147/500 [=======>......................] - ETA: 1:28 - loss: 1.7428 - regression_loss: 1.4515 - classification_loss: 0.2913 148/500 [=======>......................] 
- ETA: 1:28 - loss: 1.7435 - regression_loss: 1.4525 - classification_loss: 0.2911 149/500 [=======>......................] - ETA: 1:27 - loss: 1.7446 - regression_loss: 1.4535 - classification_loss: 0.2912 150/500 [========>.....................] - ETA: 1:27 - loss: 1.7444 - regression_loss: 1.4535 - classification_loss: 0.2909 151/500 [========>.....................] - ETA: 1:27 - loss: 1.7425 - regression_loss: 1.4523 - classification_loss: 0.2902 152/500 [========>.....................] - ETA: 1:27 - loss: 1.7423 - regression_loss: 1.4519 - classification_loss: 0.2904 153/500 [========>.....................] - ETA: 1:26 - loss: 1.7427 - regression_loss: 1.4524 - classification_loss: 0.2904 154/500 [========>.....................] - ETA: 1:26 - loss: 1.7432 - regression_loss: 1.4531 - classification_loss: 0.2900 155/500 [========>.....................] - ETA: 1:26 - loss: 1.7431 - regression_loss: 1.4530 - classification_loss: 0.2901 156/500 [========>.....................] - ETA: 1:26 - loss: 1.7395 - regression_loss: 1.4499 - classification_loss: 0.2895 157/500 [========>.....................] - ETA: 1:25 - loss: 1.7365 - regression_loss: 1.4474 - classification_loss: 0.2891 158/500 [========>.....................] - ETA: 1:25 - loss: 1.7357 - regression_loss: 1.4470 - classification_loss: 0.2887 159/500 [========>.....................] - ETA: 1:25 - loss: 1.7293 - regression_loss: 1.4418 - classification_loss: 0.2875 160/500 [========>.....................] - ETA: 1:25 - loss: 1.7339 - regression_loss: 1.4453 - classification_loss: 0.2886 161/500 [========>.....................] - ETA: 1:24 - loss: 1.7323 - regression_loss: 1.4441 - classification_loss: 0.2882 162/500 [========>.....................] - ETA: 1:24 - loss: 1.7333 - regression_loss: 1.4450 - classification_loss: 0.2883 163/500 [========>.....................] - ETA: 1:24 - loss: 1.7343 - regression_loss: 1.4463 - classification_loss: 0.2880 164/500 [========>.....................] 
- ETA: 1:24 - loss: 1.7347 - regression_loss: 1.4468 - classification_loss: 0.2879 165/500 [========>.....................] - ETA: 1:23 - loss: 1.7354 - regression_loss: 1.4476 - classification_loss: 0.2878 166/500 [========>.....................] - ETA: 1:23 - loss: 1.7298 - regression_loss: 1.4428 - classification_loss: 0.2870 167/500 [=========>....................] - ETA: 1:23 - loss: 1.7325 - regression_loss: 1.4451 - classification_loss: 0.2875 168/500 [=========>....................] - ETA: 1:23 - loss: 1.7348 - regression_loss: 1.4465 - classification_loss: 0.2883 169/500 [=========>....................] - ETA: 1:22 - loss: 1.7366 - regression_loss: 1.4475 - classification_loss: 0.2892 170/500 [=========>....................] - ETA: 1:22 - loss: 1.7296 - regression_loss: 1.4417 - classification_loss: 0.2879 171/500 [=========>....................] - ETA: 1:22 - loss: 1.7355 - regression_loss: 1.4460 - classification_loss: 0.2895 172/500 [=========>....................] - ETA: 1:22 - loss: 1.7359 - regression_loss: 1.4464 - classification_loss: 0.2896 173/500 [=========>....................] - ETA: 1:21 - loss: 1.7384 - regression_loss: 1.4480 - classification_loss: 0.2904 174/500 [=========>....................] - ETA: 1:21 - loss: 1.7356 - regression_loss: 1.4455 - classification_loss: 0.2900 175/500 [=========>....................] - ETA: 1:21 - loss: 1.7431 - regression_loss: 1.4523 - classification_loss: 0.2907 176/500 [=========>....................] - ETA: 1:21 - loss: 1.7427 - regression_loss: 1.4521 - classification_loss: 0.2906 177/500 [=========>....................] - ETA: 1:20 - loss: 1.7435 - regression_loss: 1.4527 - classification_loss: 0.2908 178/500 [=========>....................] - ETA: 1:20 - loss: 1.7456 - regression_loss: 1.4549 - classification_loss: 0.2907 179/500 [=========>....................] - ETA: 1:20 - loss: 1.7437 - regression_loss: 1.4536 - classification_loss: 0.2901 180/500 [=========>....................] 
- ETA: 1:20 - loss: 1.7446 - regression_loss: 1.4546 - classification_loss: 0.2901 181/500 [=========>....................] - ETA: 1:19 - loss: 1.7438 - regression_loss: 1.4542 - classification_loss: 0.2896 182/500 [=========>....................] - ETA: 1:19 - loss: 1.7453 - regression_loss: 1.4554 - classification_loss: 0.2899 183/500 [=========>....................] - ETA: 1:19 - loss: 1.7443 - regression_loss: 1.4547 - classification_loss: 0.2897 184/500 [==========>...................] - ETA: 1:19 - loss: 1.7407 - regression_loss: 1.4519 - classification_loss: 0.2888 185/500 [==========>...................] - ETA: 1:18 - loss: 1.7415 - regression_loss: 1.4526 - classification_loss: 0.2890 186/500 [==========>...................] - ETA: 1:18 - loss: 1.7403 - regression_loss: 1.4518 - classification_loss: 0.2885 187/500 [==========>...................] - ETA: 1:18 - loss: 1.7401 - regression_loss: 1.4521 - classification_loss: 0.2880 188/500 [==========>...................] - ETA: 1:18 - loss: 1.7433 - regression_loss: 1.4542 - classification_loss: 0.2891 189/500 [==========>...................] - ETA: 1:17 - loss: 1.7437 - regression_loss: 1.4545 - classification_loss: 0.2892 190/500 [==========>...................] - ETA: 1:17 - loss: 1.7493 - regression_loss: 1.4586 - classification_loss: 0.2907 191/500 [==========>...................] - ETA: 1:17 - loss: 1.7436 - regression_loss: 1.4541 - classification_loss: 0.2896 192/500 [==========>...................] - ETA: 1:17 - loss: 1.7434 - regression_loss: 1.4544 - classification_loss: 0.2890 193/500 [==========>...................] - ETA: 1:16 - loss: 1.7474 - regression_loss: 1.4575 - classification_loss: 0.2900 194/500 [==========>...................] - ETA: 1:16 - loss: 1.7459 - regression_loss: 1.4565 - classification_loss: 0.2895 195/500 [==========>...................] - ETA: 1:16 - loss: 1.7472 - regression_loss: 1.4575 - classification_loss: 0.2897 196/500 [==========>...................] 
- ETA: 1:16 - loss: 1.7447 - regression_loss: 1.4559 - classification_loss: 0.2887 197/500 [==========>...................] - ETA: 1:15 - loss: 1.7405 - regression_loss: 1.4527 - classification_loss: 0.2878 198/500 [==========>...................] - ETA: 1:15 - loss: 1.7407 - regression_loss: 1.4529 - classification_loss: 0.2878 199/500 [==========>...................] - ETA: 1:15 - loss: 1.7415 - regression_loss: 1.4535 - classification_loss: 0.2880 200/500 [===========>..................] - ETA: 1:15 - loss: 1.7400 - regression_loss: 1.4522 - classification_loss: 0.2878 201/500 [===========>..................] - ETA: 1:14 - loss: 1.7418 - regression_loss: 1.4536 - classification_loss: 0.2881 202/500 [===========>..................] - ETA: 1:14 - loss: 1.7411 - regression_loss: 1.4528 - classification_loss: 0.2883 203/500 [===========>..................] - ETA: 1:14 - loss: 1.7380 - regression_loss: 1.4505 - classification_loss: 0.2875 204/500 [===========>..................] - ETA: 1:14 - loss: 1.7391 - regression_loss: 1.4513 - classification_loss: 0.2878 205/500 [===========>..................] - ETA: 1:13 - loss: 1.7358 - regression_loss: 1.4489 - classification_loss: 0.2870 206/500 [===========>..................] - ETA: 1:13 - loss: 1.7372 - regression_loss: 1.4496 - classification_loss: 0.2876 207/500 [===========>..................] - ETA: 1:13 - loss: 1.7361 - regression_loss: 1.4487 - classification_loss: 0.2874 208/500 [===========>..................] - ETA: 1:13 - loss: 1.7351 - regression_loss: 1.4481 - classification_loss: 0.2870 209/500 [===========>..................] - ETA: 1:12 - loss: 1.7374 - regression_loss: 1.4496 - classification_loss: 0.2878 210/500 [===========>..................] - ETA: 1:12 - loss: 1.7348 - regression_loss: 1.4475 - classification_loss: 0.2873 211/500 [===========>..................] - ETA: 1:12 - loss: 1.7394 - regression_loss: 1.4503 - classification_loss: 0.2890 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.7407 - regression_loss: 1.4518 - classification_loss: 0.2888 213/500 [===========>..................] - ETA: 1:11 - loss: 1.7455 - regression_loss: 1.4563 - classification_loss: 0.2892 214/500 [===========>..................] - ETA: 1:11 - loss: 1.7497 - regression_loss: 1.4599 - classification_loss: 0.2898 215/500 [===========>..................] - ETA: 1:11 - loss: 1.7502 - regression_loss: 1.4599 - classification_loss: 0.2903 216/500 [===========>..................] - ETA: 1:11 - loss: 1.7537 - regression_loss: 1.4629 - classification_loss: 0.2907 217/500 [============>.................] - ETA: 1:10 - loss: 1.7508 - regression_loss: 1.4599 - classification_loss: 0.2909 218/500 [============>.................] - ETA: 1:10 - loss: 1.7520 - regression_loss: 1.4609 - classification_loss: 0.2911 219/500 [============>.................] - ETA: 1:10 - loss: 1.7523 - regression_loss: 1.4613 - classification_loss: 0.2909 220/500 [============>.................] - ETA: 1:10 - loss: 1.7535 - regression_loss: 1.4622 - classification_loss: 0.2913 221/500 [============>.................] - ETA: 1:09 - loss: 1.7551 - regression_loss: 1.4640 - classification_loss: 0.2912 222/500 [============>.................] - ETA: 1:09 - loss: 1.7551 - regression_loss: 1.4642 - classification_loss: 0.2909 223/500 [============>.................] - ETA: 1:09 - loss: 1.7606 - regression_loss: 1.4688 - classification_loss: 0.2918 224/500 [============>.................] - ETA: 1:09 - loss: 1.7615 - regression_loss: 1.4698 - classification_loss: 0.2917 225/500 [============>.................] - ETA: 1:08 - loss: 1.7610 - regression_loss: 1.4693 - classification_loss: 0.2916 226/500 [============>.................] - ETA: 1:08 - loss: 1.7586 - regression_loss: 1.4677 - classification_loss: 0.2910 227/500 [============>.................] - ETA: 1:08 - loss: 1.7588 - regression_loss: 1.4677 - classification_loss: 0.2911 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.7560 - regression_loss: 1.4658 - classification_loss: 0.2902 229/500 [============>.................] - ETA: 1:07 - loss: 1.7537 - regression_loss: 1.4641 - classification_loss: 0.2895 230/500 [============>.................] - ETA: 1:07 - loss: 1.7532 - regression_loss: 1.4642 - classification_loss: 0.2890 231/500 [============>.................] - ETA: 1:07 - loss: 1.7530 - regression_loss: 1.4639 - classification_loss: 0.2891 232/500 [============>.................] - ETA: 1:07 - loss: 1.7529 - regression_loss: 1.4642 - classification_loss: 0.2887 233/500 [============>.................] - ETA: 1:06 - loss: 1.7521 - regression_loss: 1.4632 - classification_loss: 0.2889 234/500 [=============>................] - ETA: 1:06 - loss: 1.7524 - regression_loss: 1.4635 - classification_loss: 0.2889 235/500 [=============>................] - ETA: 1:06 - loss: 1.7545 - regression_loss: 1.4651 - classification_loss: 0.2894 236/500 [=============>................] - ETA: 1:06 - loss: 1.7572 - regression_loss: 1.4669 - classification_loss: 0.2902 237/500 [=============>................] - ETA: 1:05 - loss: 1.7560 - regression_loss: 1.4661 - classification_loss: 0.2898 238/500 [=============>................] - ETA: 1:05 - loss: 1.7576 - regression_loss: 1.4672 - classification_loss: 0.2904 239/500 [=============>................] - ETA: 1:05 - loss: 1.7559 - regression_loss: 1.4658 - classification_loss: 0.2901 240/500 [=============>................] - ETA: 1:05 - loss: 1.7544 - regression_loss: 1.4650 - classification_loss: 0.2895 241/500 [=============>................] - ETA: 1:04 - loss: 1.7548 - regression_loss: 1.4652 - classification_loss: 0.2896 242/500 [=============>................] - ETA: 1:04 - loss: 1.7552 - regression_loss: 1.4660 - classification_loss: 0.2892 243/500 [=============>................] - ETA: 1:04 - loss: 1.7556 - regression_loss: 1.4663 - classification_loss: 0.2893 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.7556 - regression_loss: 1.4665 - classification_loss: 0.2891 245/500 [=============>................] - ETA: 1:03 - loss: 1.7545 - regression_loss: 1.4658 - classification_loss: 0.2887 246/500 [=============>................] - ETA: 1:03 - loss: 1.7543 - regression_loss: 1.4663 - classification_loss: 0.2881 247/500 [=============>................] - ETA: 1:03 - loss: 1.7551 - regression_loss: 1.4671 - classification_loss: 0.2880 248/500 [=============>................] - ETA: 1:03 - loss: 1.7566 - regression_loss: 1.4684 - classification_loss: 0.2882 249/500 [=============>................] - ETA: 1:02 - loss: 1.7572 - regression_loss: 1.4693 - classification_loss: 0.2879 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7535 - regression_loss: 1.4665 - classification_loss: 0.2871 251/500 [==============>...............] - ETA: 1:02 - loss: 1.7554 - regression_loss: 1.4682 - classification_loss: 0.2872 252/500 [==============>...............] - ETA: 1:02 - loss: 1.7575 - regression_loss: 1.4701 - classification_loss: 0.2875 253/500 [==============>...............] - ETA: 1:01 - loss: 1.7572 - regression_loss: 1.4698 - classification_loss: 0.2875 254/500 [==============>...............] - ETA: 1:01 - loss: 1.7604 - regression_loss: 1.4713 - classification_loss: 0.2891 255/500 [==============>...............] - ETA: 1:01 - loss: 1.7604 - regression_loss: 1.4714 - classification_loss: 0.2890 256/500 [==============>...............] - ETA: 1:01 - loss: 1.7588 - regression_loss: 1.4700 - classification_loss: 0.2887 257/500 [==============>...............] - ETA: 1:00 - loss: 1.7560 - regression_loss: 1.4679 - classification_loss: 0.2881 258/500 [==============>...............] - ETA: 1:00 - loss: 1.7537 - regression_loss: 1.4661 - classification_loss: 0.2875 259/500 [==============>...............] - ETA: 1:00 - loss: 1.7540 - regression_loss: 1.4666 - classification_loss: 0.2873 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.7567 - regression_loss: 1.4682 - classification_loss: 0.2885 261/500 [==============>...............] - ETA: 59s - loss: 1.7545 - regression_loss: 1.4665 - classification_loss: 0.2880  262/500 [==============>...............] - ETA: 59s - loss: 1.7581 - regression_loss: 1.4699 - classification_loss: 0.2882 263/500 [==============>...............] - ETA: 59s - loss: 1.7582 - regression_loss: 1.4703 - classification_loss: 0.2879 264/500 [==============>...............] - ETA: 59s - loss: 1.7596 - regression_loss: 1.4710 - classification_loss: 0.2886 265/500 [==============>...............] - ETA: 58s - loss: 1.7593 - regression_loss: 1.4708 - classification_loss: 0.2884 266/500 [==============>...............] - ETA: 58s - loss: 1.7637 - regression_loss: 1.4748 - classification_loss: 0.2890 267/500 [===============>..............] - ETA: 58s - loss: 1.7639 - regression_loss: 1.4729 - classification_loss: 0.2910 268/500 [===============>..............] - ETA: 58s - loss: 1.7644 - regression_loss: 1.4732 - classification_loss: 0.2912 269/500 [===============>..............] - ETA: 57s - loss: 1.7599 - regression_loss: 1.4694 - classification_loss: 0.2905 270/500 [===============>..............] - ETA: 57s - loss: 1.7594 - regression_loss: 1.4691 - classification_loss: 0.2903 271/500 [===============>..............] - ETA: 57s - loss: 1.7606 - regression_loss: 1.4703 - classification_loss: 0.2903 272/500 [===============>..............] - ETA: 57s - loss: 1.7601 - regression_loss: 1.4700 - classification_loss: 0.2902 273/500 [===============>..............] - ETA: 56s - loss: 1.7582 - regression_loss: 1.4686 - classification_loss: 0.2895 274/500 [===============>..............] - ETA: 56s - loss: 1.7582 - regression_loss: 1.4686 - classification_loss: 0.2896 275/500 [===============>..............] - ETA: 56s - loss: 1.7588 - regression_loss: 1.4694 - classification_loss: 0.2894 276/500 [===============>..............] 
- ETA: 55s - loss: 1.7582 - regression_loss: 1.4691 - classification_loss: 0.2891 277/500 [===============>..............] - ETA: 55s - loss: 1.7554 - regression_loss: 1.4668 - classification_loss: 0.2886 278/500 [===============>..............] - ETA: 55s - loss: 1.7558 - regression_loss: 1.4672 - classification_loss: 0.2886 279/500 [===============>..............] - ETA: 55s - loss: 1.7606 - regression_loss: 1.4712 - classification_loss: 0.2895 280/500 [===============>..............] - ETA: 54s - loss: 1.7587 - regression_loss: 1.4697 - classification_loss: 0.2890 281/500 [===============>..............] - ETA: 54s - loss: 1.7640 - regression_loss: 1.4748 - classification_loss: 0.2892 282/500 [===============>..............] - ETA: 54s - loss: 1.7644 - regression_loss: 1.4750 - classification_loss: 0.2895 283/500 [===============>..............] - ETA: 54s - loss: 1.7651 - regression_loss: 1.4754 - classification_loss: 0.2896 284/500 [================>.............] - ETA: 53s - loss: 1.7664 - regression_loss: 1.4769 - classification_loss: 0.2896 285/500 [================>.............] - ETA: 53s - loss: 1.7667 - regression_loss: 1.4769 - classification_loss: 0.2897 286/500 [================>.............] - ETA: 53s - loss: 1.7671 - regression_loss: 1.4773 - classification_loss: 0.2898 287/500 [================>.............] - ETA: 53s - loss: 1.7664 - regression_loss: 1.4771 - classification_loss: 0.2893 288/500 [================>.............] - ETA: 52s - loss: 1.7645 - regression_loss: 1.4755 - classification_loss: 0.2890 289/500 [================>.............] - ETA: 52s - loss: 1.7645 - regression_loss: 1.4759 - classification_loss: 0.2886 290/500 [================>.............] - ETA: 52s - loss: 1.7664 - regression_loss: 1.4770 - classification_loss: 0.2894 291/500 [================>.............] - ETA: 52s - loss: 1.7658 - regression_loss: 1.4765 - classification_loss: 0.2893 292/500 [================>.............] 
(per-batch progress updates for epoch 42, steps 293-499, omitted)
500/500 [==============================] - 125s 250ms/step - loss: 1.7536 - regression_loss: 1.4601 - classification_loss: 0.2936
326 instances of class plum with average precision: 0.7685
mAP: 0.7685
Epoch 00042: saving model to ./training/snapshots/resnet50_pascal_42.h5
Epoch 43/150
(per-batch progress updates for epoch 43, steps 1-126, omitted)
- ETA: 1:32 - loss: 1.7777 - regression_loss: 1.4897 - classification_loss: 0.2880 127/500 [======>.......................] - ETA: 1:32 - loss: 1.7725 - regression_loss: 1.4852 - classification_loss: 0.2873 128/500 [======>.......................] - ETA: 1:32 - loss: 1.7781 - regression_loss: 1.4896 - classification_loss: 0.2885 129/500 [======>.......................] - ETA: 1:32 - loss: 1.7871 - regression_loss: 1.4972 - classification_loss: 0.2899 130/500 [======>.......................] - ETA: 1:32 - loss: 1.7881 - regression_loss: 1.4984 - classification_loss: 0.2898 131/500 [======>.......................] - ETA: 1:31 - loss: 1.7906 - regression_loss: 1.4994 - classification_loss: 0.2912 132/500 [======>.......................] - ETA: 1:31 - loss: 1.7895 - regression_loss: 1.4981 - classification_loss: 0.2913 133/500 [======>.......................] - ETA: 1:31 - loss: 1.7860 - regression_loss: 1.4947 - classification_loss: 0.2913 134/500 [=======>......................] - ETA: 1:31 - loss: 1.7866 - regression_loss: 1.4946 - classification_loss: 0.2920 135/500 [=======>......................] - ETA: 1:30 - loss: 1.7772 - regression_loss: 1.4866 - classification_loss: 0.2907 136/500 [=======>......................] - ETA: 1:30 - loss: 1.7807 - regression_loss: 1.4890 - classification_loss: 0.2917 137/500 [=======>......................] - ETA: 1:30 - loss: 1.7770 - regression_loss: 1.4865 - classification_loss: 0.2905 138/500 [=======>......................] - ETA: 1:30 - loss: 1.7773 - regression_loss: 1.4868 - classification_loss: 0.2905 139/500 [=======>......................] - ETA: 1:29 - loss: 1.7769 - regression_loss: 1.4864 - classification_loss: 0.2905 140/500 [=======>......................] - ETA: 1:29 - loss: 1.7749 - regression_loss: 1.4838 - classification_loss: 0.2911 141/500 [=======>......................] - ETA: 1:29 - loss: 1.7746 - regression_loss: 1.4836 - classification_loss: 0.2910 142/500 [=======>......................] 
- ETA: 1:29 - loss: 1.7760 - regression_loss: 1.4850 - classification_loss: 0.2909 143/500 [=======>......................] - ETA: 1:28 - loss: 1.7777 - regression_loss: 1.4864 - classification_loss: 0.2913 144/500 [=======>......................] - ETA: 1:28 - loss: 1.7759 - regression_loss: 1.4852 - classification_loss: 0.2907 145/500 [=======>......................] - ETA: 1:28 - loss: 1.7728 - regression_loss: 1.4826 - classification_loss: 0.2902 146/500 [=======>......................] - ETA: 1:28 - loss: 1.7703 - regression_loss: 1.4809 - classification_loss: 0.2894 147/500 [=======>......................] - ETA: 1:27 - loss: 1.7695 - regression_loss: 1.4803 - classification_loss: 0.2892 148/500 [=======>......................] - ETA: 1:27 - loss: 1.7715 - regression_loss: 1.4814 - classification_loss: 0.2901 149/500 [=======>......................] - ETA: 1:27 - loss: 1.7667 - regression_loss: 1.4779 - classification_loss: 0.2888 150/500 [========>.....................] - ETA: 1:27 - loss: 1.7662 - regression_loss: 1.4779 - classification_loss: 0.2883 151/500 [========>.....................] - ETA: 1:26 - loss: 1.7694 - regression_loss: 1.4806 - classification_loss: 0.2888 152/500 [========>.....................] - ETA: 1:26 - loss: 1.7740 - regression_loss: 1.4843 - classification_loss: 0.2897 153/500 [========>.....................] - ETA: 1:26 - loss: 1.7720 - regression_loss: 1.4826 - classification_loss: 0.2894 154/500 [========>.....................] - ETA: 1:26 - loss: 1.7723 - regression_loss: 1.4829 - classification_loss: 0.2894 155/500 [========>.....................] - ETA: 1:25 - loss: 1.7747 - regression_loss: 1.4843 - classification_loss: 0.2904 156/500 [========>.....................] - ETA: 1:25 - loss: 1.7824 - regression_loss: 1.4880 - classification_loss: 0.2944 157/500 [========>.....................] - ETA: 1:25 - loss: 1.7839 - regression_loss: 1.4884 - classification_loss: 0.2954 158/500 [========>.....................] 
- ETA: 1:25 - loss: 1.7833 - regression_loss: 1.4881 - classification_loss: 0.2952 159/500 [========>.....................] - ETA: 1:24 - loss: 1.7783 - regression_loss: 1.4839 - classification_loss: 0.2944 160/500 [========>.....................] - ETA: 1:24 - loss: 1.7828 - regression_loss: 1.4885 - classification_loss: 0.2944 161/500 [========>.....................] - ETA: 1:24 - loss: 1.7842 - regression_loss: 1.4895 - classification_loss: 0.2948 162/500 [========>.....................] - ETA: 1:24 - loss: 1.7847 - regression_loss: 1.4900 - classification_loss: 0.2947 163/500 [========>.....................] - ETA: 1:23 - loss: 1.7875 - regression_loss: 1.4924 - classification_loss: 0.2951 164/500 [========>.....................] - ETA: 1:23 - loss: 1.7892 - regression_loss: 1.4935 - classification_loss: 0.2957 165/500 [========>.....................] - ETA: 1:23 - loss: 1.7889 - regression_loss: 1.4932 - classification_loss: 0.2957 166/500 [========>.....................] - ETA: 1:23 - loss: 1.7895 - regression_loss: 1.4939 - classification_loss: 0.2957 167/500 [=========>....................] - ETA: 1:22 - loss: 1.7936 - regression_loss: 1.4973 - classification_loss: 0.2963 168/500 [=========>....................] - ETA: 1:22 - loss: 1.7936 - regression_loss: 1.4972 - classification_loss: 0.2964 169/500 [=========>....................] - ETA: 1:22 - loss: 1.7930 - regression_loss: 1.4973 - classification_loss: 0.2957 170/500 [=========>....................] - ETA: 1:22 - loss: 1.7940 - regression_loss: 1.4985 - classification_loss: 0.2955 171/500 [=========>....................] - ETA: 1:21 - loss: 1.7968 - regression_loss: 1.5006 - classification_loss: 0.2962 172/500 [=========>....................] - ETA: 1:21 - loss: 1.7945 - regression_loss: 1.4984 - classification_loss: 0.2961 173/500 [=========>....................] - ETA: 1:21 - loss: 1.7912 - regression_loss: 1.4954 - classification_loss: 0.2958 174/500 [=========>....................] 
- ETA: 1:21 - loss: 1.7879 - regression_loss: 1.4930 - classification_loss: 0.2949 175/500 [=========>....................] - ETA: 1:21 - loss: 1.7869 - regression_loss: 1.4925 - classification_loss: 0.2944 176/500 [=========>....................] - ETA: 1:20 - loss: 1.7880 - regression_loss: 1.4934 - classification_loss: 0.2946 177/500 [=========>....................] - ETA: 1:20 - loss: 1.7858 - regression_loss: 1.4917 - classification_loss: 0.2941 178/500 [=========>....................] - ETA: 1:20 - loss: 1.7878 - regression_loss: 1.4937 - classification_loss: 0.2941 179/500 [=========>....................] - ETA: 1:20 - loss: 1.7902 - regression_loss: 1.4957 - classification_loss: 0.2946 180/500 [=========>....................] - ETA: 1:19 - loss: 1.7822 - regression_loss: 1.4888 - classification_loss: 0.2934 181/500 [=========>....................] - ETA: 1:19 - loss: 1.7791 - regression_loss: 1.4864 - classification_loss: 0.2926 182/500 [=========>....................] - ETA: 1:19 - loss: 1.7784 - regression_loss: 1.4862 - classification_loss: 0.2922 183/500 [=========>....................] - ETA: 1:19 - loss: 1.7771 - regression_loss: 1.4850 - classification_loss: 0.2921 184/500 [==========>...................] - ETA: 1:18 - loss: 1.7792 - regression_loss: 1.4863 - classification_loss: 0.2929 185/500 [==========>...................] - ETA: 1:18 - loss: 1.7776 - regression_loss: 1.4846 - classification_loss: 0.2930 186/500 [==========>...................] - ETA: 1:18 - loss: 1.7765 - regression_loss: 1.4839 - classification_loss: 0.2925 187/500 [==========>...................] - ETA: 1:18 - loss: 1.7749 - regression_loss: 1.4830 - classification_loss: 0.2919 188/500 [==========>...................] - ETA: 1:17 - loss: 1.7776 - regression_loss: 1.4856 - classification_loss: 0.2920 189/500 [==========>...................] - ETA: 1:17 - loss: 1.7764 - regression_loss: 1.4850 - classification_loss: 0.2914 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.7768 - regression_loss: 1.4855 - classification_loss: 0.2914 191/500 [==========>...................] - ETA: 1:17 - loss: 1.7768 - regression_loss: 1.4859 - classification_loss: 0.2909 192/500 [==========>...................] - ETA: 1:16 - loss: 1.7822 - regression_loss: 1.4900 - classification_loss: 0.2922 193/500 [==========>...................] - ETA: 1:16 - loss: 1.7786 - regression_loss: 1.4873 - classification_loss: 0.2913 194/500 [==========>...................] - ETA: 1:16 - loss: 1.7799 - regression_loss: 1.4887 - classification_loss: 0.2912 195/500 [==========>...................] - ETA: 1:16 - loss: 1.7804 - regression_loss: 1.4896 - classification_loss: 0.2909 196/500 [==========>...................] - ETA: 1:15 - loss: 1.7789 - regression_loss: 1.4887 - classification_loss: 0.2902 197/500 [==========>...................] - ETA: 1:15 - loss: 1.7827 - regression_loss: 1.4901 - classification_loss: 0.2925 198/500 [==========>...................] - ETA: 1:15 - loss: 1.7825 - regression_loss: 1.4900 - classification_loss: 0.2925 199/500 [==========>...................] - ETA: 1:15 - loss: 1.7851 - regression_loss: 1.4916 - classification_loss: 0.2935 200/500 [===========>..................] - ETA: 1:14 - loss: 1.7900 - regression_loss: 1.4959 - classification_loss: 0.2942 201/500 [===========>..................] - ETA: 1:14 - loss: 1.7902 - regression_loss: 1.4963 - classification_loss: 0.2939 202/500 [===========>..................] - ETA: 1:14 - loss: 1.7879 - regression_loss: 1.4945 - classification_loss: 0.2934 203/500 [===========>..................] - ETA: 1:14 - loss: 1.7880 - regression_loss: 1.4953 - classification_loss: 0.2927 204/500 [===========>..................] - ETA: 1:13 - loss: 1.7849 - regression_loss: 1.4930 - classification_loss: 0.2919 205/500 [===========>..................] - ETA: 1:13 - loss: 1.7850 - regression_loss: 1.4929 - classification_loss: 0.2921 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.7852 - regression_loss: 1.4929 - classification_loss: 0.2923 207/500 [===========>..................] - ETA: 1:13 - loss: 1.7850 - regression_loss: 1.4928 - classification_loss: 0.2922 208/500 [===========>..................] - ETA: 1:12 - loss: 1.7837 - regression_loss: 1.4918 - classification_loss: 0.2919 209/500 [===========>..................] - ETA: 1:12 - loss: 1.7853 - regression_loss: 1.4931 - classification_loss: 0.2922 210/500 [===========>..................] - ETA: 1:12 - loss: 1.7832 - regression_loss: 1.4916 - classification_loss: 0.2917 211/500 [===========>..................] - ETA: 1:12 - loss: 1.7819 - regression_loss: 1.4908 - classification_loss: 0.2911 212/500 [===========>..................] - ETA: 1:11 - loss: 1.7849 - regression_loss: 1.4931 - classification_loss: 0.2917 213/500 [===========>..................] - ETA: 1:11 - loss: 1.7833 - regression_loss: 1.4921 - classification_loss: 0.2912 214/500 [===========>..................] - ETA: 1:11 - loss: 1.7797 - regression_loss: 1.4893 - classification_loss: 0.2904 215/500 [===========>..................] - ETA: 1:11 - loss: 1.7756 - regression_loss: 1.4858 - classification_loss: 0.2897 216/500 [===========>..................] - ETA: 1:10 - loss: 1.7724 - regression_loss: 1.4834 - classification_loss: 0.2891 217/500 [============>.................] - ETA: 1:10 - loss: 1.7726 - regression_loss: 1.4841 - classification_loss: 0.2885 218/500 [============>.................] - ETA: 1:10 - loss: 1.7714 - regression_loss: 1.4832 - classification_loss: 0.2882 219/500 [============>.................] - ETA: 1:10 - loss: 1.7706 - regression_loss: 1.4825 - classification_loss: 0.2880 220/500 [============>.................] - ETA: 1:09 - loss: 1.7751 - regression_loss: 1.4858 - classification_loss: 0.2893 221/500 [============>.................] - ETA: 1:09 - loss: 1.7730 - regression_loss: 1.4843 - classification_loss: 0.2886 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.7766 - regression_loss: 1.4876 - classification_loss: 0.2889 223/500 [============>.................] - ETA: 1:09 - loss: 1.7769 - regression_loss: 1.4879 - classification_loss: 0.2890 224/500 [============>.................] - ETA: 1:08 - loss: 1.7753 - regression_loss: 1.4865 - classification_loss: 0.2887 225/500 [============>.................] - ETA: 1:08 - loss: 1.7733 - regression_loss: 1.4851 - classification_loss: 0.2881 226/500 [============>.................] - ETA: 1:08 - loss: 1.7733 - regression_loss: 1.4854 - classification_loss: 0.2879 227/500 [============>.................] - ETA: 1:08 - loss: 1.7727 - regression_loss: 1.4852 - classification_loss: 0.2875 228/500 [============>.................] - ETA: 1:07 - loss: 1.7730 - regression_loss: 1.4858 - classification_loss: 0.2872 229/500 [============>.................] - ETA: 1:07 - loss: 1.7765 - regression_loss: 1.4893 - classification_loss: 0.2872 230/500 [============>.................] - ETA: 1:07 - loss: 1.7768 - regression_loss: 1.4898 - classification_loss: 0.2870 231/500 [============>.................] - ETA: 1:07 - loss: 1.7742 - regression_loss: 1.4878 - classification_loss: 0.2864 232/500 [============>.................] - ETA: 1:07 - loss: 1.7784 - regression_loss: 1.4911 - classification_loss: 0.2873 233/500 [============>.................] - ETA: 1:06 - loss: 1.7789 - regression_loss: 1.4911 - classification_loss: 0.2878 234/500 [=============>................] - ETA: 1:06 - loss: 1.7782 - regression_loss: 1.4908 - classification_loss: 0.2874 235/500 [=============>................] - ETA: 1:06 - loss: 1.7825 - regression_loss: 1.4940 - classification_loss: 0.2885 236/500 [=============>................] - ETA: 1:06 - loss: 1.7801 - regression_loss: 1.4922 - classification_loss: 0.2879 237/500 [=============>................] - ETA: 1:05 - loss: 1.7809 - regression_loss: 1.4927 - classification_loss: 0.2882 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.7796 - regression_loss: 1.4917 - classification_loss: 0.2879 239/500 [=============>................] - ETA: 1:05 - loss: 1.7786 - regression_loss: 1.4910 - classification_loss: 0.2876 240/500 [=============>................] - ETA: 1:05 - loss: 1.7795 - regression_loss: 1.4923 - classification_loss: 0.2873 241/500 [=============>................] - ETA: 1:04 - loss: 1.7800 - regression_loss: 1.4927 - classification_loss: 0.2873 242/500 [=============>................] - ETA: 1:04 - loss: 1.7788 - regression_loss: 1.4919 - classification_loss: 0.2869 243/500 [=============>................] - ETA: 1:04 - loss: 1.7751 - regression_loss: 1.4890 - classification_loss: 0.2861 244/500 [=============>................] - ETA: 1:04 - loss: 1.7736 - regression_loss: 1.4879 - classification_loss: 0.2858 245/500 [=============>................] - ETA: 1:03 - loss: 1.7744 - regression_loss: 1.4881 - classification_loss: 0.2864 246/500 [=============>................] - ETA: 1:03 - loss: 1.7736 - regression_loss: 1.4874 - classification_loss: 0.2861 247/500 [=============>................] - ETA: 1:03 - loss: 1.7745 - regression_loss: 1.4883 - classification_loss: 0.2861 248/500 [=============>................] - ETA: 1:03 - loss: 1.7697 - regression_loss: 1.4842 - classification_loss: 0.2855 249/500 [=============>................] - ETA: 1:02 - loss: 1.7699 - regression_loss: 1.4843 - classification_loss: 0.2857 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7659 - regression_loss: 1.4806 - classification_loss: 0.2853 251/500 [==============>...............] - ETA: 1:02 - loss: 1.7678 - regression_loss: 1.4820 - classification_loss: 0.2858 252/500 [==============>...............] - ETA: 1:01 - loss: 1.7687 - regression_loss: 1.4827 - classification_loss: 0.2861 253/500 [==============>...............] - ETA: 1:01 - loss: 1.7677 - regression_loss: 1.4820 - classification_loss: 0.2857 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.7706 - regression_loss: 1.4842 - classification_loss: 0.2864 255/500 [==============>...............] - ETA: 1:01 - loss: 1.7681 - regression_loss: 1.4784 - classification_loss: 0.2897 256/500 [==============>...............] - ETA: 1:01 - loss: 1.7679 - regression_loss: 1.4780 - classification_loss: 0.2899 257/500 [==============>...............] - ETA: 1:00 - loss: 1.7647 - regression_loss: 1.4753 - classification_loss: 0.2895 258/500 [==============>...............] - ETA: 1:00 - loss: 1.7698 - regression_loss: 1.4797 - classification_loss: 0.2901 259/500 [==============>...............] - ETA: 1:00 - loss: 1.7648 - regression_loss: 1.4756 - classification_loss: 0.2892 260/500 [==============>...............] - ETA: 1:00 - loss: 1.7619 - regression_loss: 1.4732 - classification_loss: 0.2886 261/500 [==============>...............] - ETA: 59s - loss: 1.7613 - regression_loss: 1.4729 - classification_loss: 0.2885  262/500 [==============>...............] - ETA: 59s - loss: 1.7595 - regression_loss: 1.4714 - classification_loss: 0.2881 263/500 [==============>...............] - ETA: 59s - loss: 1.7594 - regression_loss: 1.4712 - classification_loss: 0.2882 264/500 [==============>...............] - ETA: 59s - loss: 1.7615 - regression_loss: 1.4730 - classification_loss: 0.2885 265/500 [==============>...............] - ETA: 58s - loss: 1.7613 - regression_loss: 1.4727 - classification_loss: 0.2887 266/500 [==============>...............] - ETA: 58s - loss: 1.7606 - regression_loss: 1.4724 - classification_loss: 0.2882 267/500 [===============>..............] - ETA: 58s - loss: 1.7605 - regression_loss: 1.4726 - classification_loss: 0.2879 268/500 [===============>..............] - ETA: 58s - loss: 1.7613 - regression_loss: 1.4730 - classification_loss: 0.2883 269/500 [===============>..............] - ETA: 57s - loss: 1.7605 - regression_loss: 1.4724 - classification_loss: 0.2881 270/500 [===============>..............] 
- ETA: 57s - loss: 1.7609 - regression_loss: 1.4725 - classification_loss: 0.2884 271/500 [===============>..............] - ETA: 57s - loss: 1.7599 - regression_loss: 1.4715 - classification_loss: 0.2885 272/500 [===============>..............] - ETA: 57s - loss: 1.7599 - regression_loss: 1.4712 - classification_loss: 0.2886 273/500 [===============>..............] - ETA: 56s - loss: 1.7589 - regression_loss: 1.4706 - classification_loss: 0.2883 274/500 [===============>..............] - ETA: 56s - loss: 1.7594 - regression_loss: 1.4713 - classification_loss: 0.2880 275/500 [===============>..............] - ETA: 56s - loss: 1.7598 - regression_loss: 1.4718 - classification_loss: 0.2880 276/500 [===============>..............] - ETA: 56s - loss: 1.7608 - regression_loss: 1.4728 - classification_loss: 0.2881 277/500 [===============>..............] - ETA: 55s - loss: 1.7599 - regression_loss: 1.4721 - classification_loss: 0.2878 278/500 [===============>..............] - ETA: 55s - loss: 1.7609 - regression_loss: 1.4730 - classification_loss: 0.2879 279/500 [===============>..............] - ETA: 55s - loss: 1.7610 - regression_loss: 1.4729 - classification_loss: 0.2880 280/500 [===============>..............] - ETA: 55s - loss: 1.7593 - regression_loss: 1.4717 - classification_loss: 0.2877 281/500 [===============>..............] - ETA: 54s - loss: 1.7615 - regression_loss: 1.4738 - classification_loss: 0.2877 282/500 [===============>..............] - ETA: 54s - loss: 1.7658 - regression_loss: 1.4768 - classification_loss: 0.2890 283/500 [===============>..............] - ETA: 54s - loss: 1.7688 - regression_loss: 1.4791 - classification_loss: 0.2897 284/500 [================>.............] - ETA: 54s - loss: 1.7691 - regression_loss: 1.4792 - classification_loss: 0.2899 285/500 [================>.............] - ETA: 53s - loss: 1.7657 - regression_loss: 1.4762 - classification_loss: 0.2895 286/500 [================>.............] 
- ETA: 53s - loss: 1.7618 - regression_loss: 1.4732 - classification_loss: 0.2887 287/500 [================>.............] - ETA: 53s - loss: 1.7603 - regression_loss: 1.4721 - classification_loss: 0.2882 288/500 [================>.............] - ETA: 53s - loss: 1.7621 - regression_loss: 1.4739 - classification_loss: 0.2882 289/500 [================>.............] - ETA: 52s - loss: 1.7615 - regression_loss: 1.4735 - classification_loss: 0.2880 290/500 [================>.............] - ETA: 52s - loss: 1.7605 - regression_loss: 1.4727 - classification_loss: 0.2879 291/500 [================>.............] - ETA: 52s - loss: 1.7601 - regression_loss: 1.4725 - classification_loss: 0.2877 292/500 [================>.............] - ETA: 52s - loss: 1.7586 - regression_loss: 1.4713 - classification_loss: 0.2873 293/500 [================>.............] - ETA: 51s - loss: 1.7577 - regression_loss: 1.4709 - classification_loss: 0.2868 294/500 [================>.............] - ETA: 51s - loss: 1.7606 - regression_loss: 1.4733 - classification_loss: 0.2872 295/500 [================>.............] - ETA: 51s - loss: 1.7608 - regression_loss: 1.4731 - classification_loss: 0.2877 296/500 [================>.............] - ETA: 51s - loss: 1.7611 - regression_loss: 1.4734 - classification_loss: 0.2878 297/500 [================>.............] - ETA: 50s - loss: 1.7616 - regression_loss: 1.4738 - classification_loss: 0.2878 298/500 [================>.............] - ETA: 50s - loss: 1.7601 - regression_loss: 1.4729 - classification_loss: 0.2872 299/500 [================>.............] - ETA: 50s - loss: 1.7600 - regression_loss: 1.4728 - classification_loss: 0.2872 300/500 [=================>............] - ETA: 49s - loss: 1.7612 - regression_loss: 1.4730 - classification_loss: 0.2882 301/500 [=================>............] - ETA: 49s - loss: 1.7585 - regression_loss: 1.4710 - classification_loss: 0.2875 302/500 [=================>............] 
- ETA: 49s - loss: 1.7584 - regression_loss: 1.4709 - classification_loss: 0.2875 303/500 [=================>............] - ETA: 49s - loss: 1.7594 - regression_loss: 1.4713 - classification_loss: 0.2880 304/500 [=================>............] - ETA: 48s - loss: 1.7577 - regression_loss: 1.4702 - classification_loss: 0.2876 305/500 [=================>............] - ETA: 48s - loss: 1.7567 - regression_loss: 1.4696 - classification_loss: 0.2871 306/500 [=================>............] - ETA: 48s - loss: 1.7530 - regression_loss: 1.4666 - classification_loss: 0.2864 307/500 [=================>............] - ETA: 48s - loss: 1.7504 - regression_loss: 1.4644 - classification_loss: 0.2859 308/500 [=================>............] - ETA: 47s - loss: 1.7512 - regression_loss: 1.4652 - classification_loss: 0.2860 309/500 [=================>............] - ETA: 47s - loss: 1.7507 - regression_loss: 1.4649 - classification_loss: 0.2858 310/500 [=================>............] - ETA: 47s - loss: 1.7526 - regression_loss: 1.4661 - classification_loss: 0.2865 311/500 [=================>............] - ETA: 47s - loss: 1.7536 - regression_loss: 1.4671 - classification_loss: 0.2865 312/500 [=================>............] - ETA: 46s - loss: 1.7491 - regression_loss: 1.4634 - classification_loss: 0.2857 313/500 [=================>............] - ETA: 46s - loss: 1.7480 - regression_loss: 1.4627 - classification_loss: 0.2854 314/500 [=================>............] - ETA: 46s - loss: 1.7457 - regression_loss: 1.4609 - classification_loss: 0.2848 315/500 [=================>............] - ETA: 46s - loss: 1.7501 - regression_loss: 1.4641 - classification_loss: 0.2860 316/500 [=================>............] - ETA: 45s - loss: 1.7489 - regression_loss: 1.4633 - classification_loss: 0.2856 317/500 [==================>...........] - ETA: 45s - loss: 1.7501 - regression_loss: 1.4644 - classification_loss: 0.2856 318/500 [==================>...........] 
- ETA: 45s - loss: 1.7491 - regression_loss: 1.4638 - classification_loss: 0.2853 319/500 [==================>...........] - ETA: 45s - loss: 1.7496 - regression_loss: 1.4643 - classification_loss: 0.2853 320/500 [==================>...........] - ETA: 44s - loss: 1.7518 - regression_loss: 1.4657 - classification_loss: 0.2861 321/500 [==================>...........] - ETA: 44s - loss: 1.7520 - regression_loss: 1.4660 - classification_loss: 0.2860 322/500 [==================>...........] - ETA: 44s - loss: 1.7545 - regression_loss: 1.4678 - classification_loss: 0.2867 323/500 [==================>...........] - ETA: 44s - loss: 1.7554 - regression_loss: 1.4687 - classification_loss: 0.2867 324/500 [==================>...........] - ETA: 43s - loss: 1.7556 - regression_loss: 1.4691 - classification_loss: 0.2865 325/500 [==================>...........] - ETA: 43s - loss: 1.7556 - regression_loss: 1.4690 - classification_loss: 0.2866 326/500 [==================>...........] - ETA: 43s - loss: 1.7555 - regression_loss: 1.4690 - classification_loss: 0.2865 327/500 [==================>...........] - ETA: 43s - loss: 1.7570 - regression_loss: 1.4702 - classification_loss: 0.2868 328/500 [==================>...........] - ETA: 42s - loss: 1.7540 - regression_loss: 1.4677 - classification_loss: 0.2863 329/500 [==================>...........] - ETA: 42s - loss: 1.7536 - regression_loss: 1.4675 - classification_loss: 0.2861 330/500 [==================>...........] - ETA: 42s - loss: 1.7534 - regression_loss: 1.4674 - classification_loss: 0.2859 331/500 [==================>...........] - ETA: 42s - loss: 1.7538 - regression_loss: 1.4679 - classification_loss: 0.2859 332/500 [==================>...........] - ETA: 41s - loss: 1.7553 - regression_loss: 1.4691 - classification_loss: 0.2862 333/500 [==================>...........] - ETA: 41s - loss: 1.7576 - regression_loss: 1.4709 - classification_loss: 0.2867 334/500 [===================>..........] 
[Epoch 43/150: per-step progress output (steps 335–499, running loss ≈ 1.75–1.77) omitted]
500/500 [==============================] - 125s 250ms/step - loss: 1.7570 - regression_loss: 1.4718 - classification_loss: 0.2852
326 instances of class plum with average precision: 0.7379
mAP: 0.7379
Epoch 00043: saving model to ./training/snapshots/resnet50_pascal_43.h5
Epoch 44/150
[Epoch 44/150: per-step progress output (steps 1–169, running loss ≈ 1.73–1.90) omitted]
- ETA: 1:22 - loss: 1.7904 - regression_loss: 1.4713 - classification_loss: 0.3191 170/500 [=========>....................] - ETA: 1:22 - loss: 1.7904 - regression_loss: 1.4710 - classification_loss: 0.3194 171/500 [=========>....................] - ETA: 1:22 - loss: 1.7877 - regression_loss: 1.4695 - classification_loss: 0.3183 172/500 [=========>....................] - ETA: 1:21 - loss: 1.7865 - regression_loss: 1.4688 - classification_loss: 0.3177 173/500 [=========>....................] - ETA: 1:21 - loss: 1.7836 - regression_loss: 1.4671 - classification_loss: 0.3166 174/500 [=========>....................] - ETA: 1:21 - loss: 1.7789 - regression_loss: 1.4638 - classification_loss: 0.3151 175/500 [=========>....................] - ETA: 1:21 - loss: 1.7774 - regression_loss: 1.4631 - classification_loss: 0.3143 176/500 [=========>....................] - ETA: 1:20 - loss: 1.7809 - regression_loss: 1.4675 - classification_loss: 0.3135 177/500 [=========>....................] - ETA: 1:20 - loss: 1.7757 - regression_loss: 1.4635 - classification_loss: 0.3122 178/500 [=========>....................] - ETA: 1:20 - loss: 1.7731 - regression_loss: 1.4615 - classification_loss: 0.3116 179/500 [=========>....................] - ETA: 1:20 - loss: 1.7737 - regression_loss: 1.4617 - classification_loss: 0.3120 180/500 [=========>....................] - ETA: 1:19 - loss: 1.7717 - regression_loss: 1.4601 - classification_loss: 0.3116 181/500 [=========>....................] - ETA: 1:19 - loss: 1.7715 - regression_loss: 1.4600 - classification_loss: 0.3116 182/500 [=========>....................] - ETA: 1:19 - loss: 1.7710 - regression_loss: 1.4603 - classification_loss: 0.3106 183/500 [=========>....................] - ETA: 1:19 - loss: 1.7684 - regression_loss: 1.4586 - classification_loss: 0.3098 184/500 [==========>...................] - ETA: 1:18 - loss: 1.7642 - regression_loss: 1.4553 - classification_loss: 0.3089 185/500 [==========>...................] 
- ETA: 1:18 - loss: 1.7667 - regression_loss: 1.4578 - classification_loss: 0.3089 186/500 [==========>...................] - ETA: 1:18 - loss: 1.7688 - regression_loss: 1.4597 - classification_loss: 0.3091 187/500 [==========>...................] - ETA: 1:18 - loss: 1.7666 - regression_loss: 1.4581 - classification_loss: 0.3085 188/500 [==========>...................] - ETA: 1:17 - loss: 1.7661 - regression_loss: 1.4582 - classification_loss: 0.3079 189/500 [==========>...................] - ETA: 1:17 - loss: 1.7642 - regression_loss: 1.4570 - classification_loss: 0.3072 190/500 [==========>...................] - ETA: 1:17 - loss: 1.7602 - regression_loss: 1.4542 - classification_loss: 0.3061 191/500 [==========>...................] - ETA: 1:17 - loss: 1.7592 - regression_loss: 1.4536 - classification_loss: 0.3056 192/500 [==========>...................] - ETA: 1:16 - loss: 1.7609 - regression_loss: 1.4555 - classification_loss: 0.3054 193/500 [==========>...................] - ETA: 1:16 - loss: 1.7614 - regression_loss: 1.4567 - classification_loss: 0.3047 194/500 [==========>...................] - ETA: 1:16 - loss: 1.7586 - regression_loss: 1.4547 - classification_loss: 0.3039 195/500 [==========>...................] - ETA: 1:16 - loss: 1.7578 - regression_loss: 1.4539 - classification_loss: 0.3039 196/500 [==========>...................] - ETA: 1:15 - loss: 1.7591 - regression_loss: 1.4551 - classification_loss: 0.3041 197/500 [==========>...................] - ETA: 1:15 - loss: 1.7604 - regression_loss: 1.4561 - classification_loss: 0.3043 198/500 [==========>...................] - ETA: 1:15 - loss: 1.7597 - regression_loss: 1.4559 - classification_loss: 0.3038 199/500 [==========>...................] - ETA: 1:15 - loss: 1.7587 - regression_loss: 1.4547 - classification_loss: 0.3041 200/500 [===========>..................] - ETA: 1:14 - loss: 1.7554 - regression_loss: 1.4520 - classification_loss: 0.3034 201/500 [===========>..................] 
- ETA: 1:14 - loss: 1.7546 - regression_loss: 1.4516 - classification_loss: 0.3030 202/500 [===========>..................] - ETA: 1:14 - loss: 1.7507 - regression_loss: 1.4487 - classification_loss: 0.3020 203/500 [===========>..................] - ETA: 1:14 - loss: 1.7492 - regression_loss: 1.4476 - classification_loss: 0.3016 204/500 [===========>..................] - ETA: 1:13 - loss: 1.7478 - regression_loss: 1.4468 - classification_loss: 0.3010 205/500 [===========>..................] - ETA: 1:13 - loss: 1.7475 - regression_loss: 1.4464 - classification_loss: 0.3010 206/500 [===========>..................] - ETA: 1:13 - loss: 1.7437 - regression_loss: 1.4437 - classification_loss: 0.2999 207/500 [===========>..................] - ETA: 1:13 - loss: 1.7377 - regression_loss: 1.4391 - classification_loss: 0.2986 208/500 [===========>..................] - ETA: 1:12 - loss: 1.7366 - regression_loss: 1.4385 - classification_loss: 0.2981 209/500 [===========>..................] - ETA: 1:12 - loss: 1.7382 - regression_loss: 1.4399 - classification_loss: 0.2983 210/500 [===========>..................] - ETA: 1:12 - loss: 1.7377 - regression_loss: 1.4393 - classification_loss: 0.2984 211/500 [===========>..................] - ETA: 1:12 - loss: 1.7399 - regression_loss: 1.4413 - classification_loss: 0.2987 212/500 [===========>..................] - ETA: 1:11 - loss: 1.7393 - regression_loss: 1.4410 - classification_loss: 0.2984 213/500 [===========>..................] - ETA: 1:11 - loss: 1.7412 - regression_loss: 1.4426 - classification_loss: 0.2985 214/500 [===========>..................] - ETA: 1:11 - loss: 1.7417 - regression_loss: 1.4436 - classification_loss: 0.2981 215/500 [===========>..................] - ETA: 1:11 - loss: 1.7449 - regression_loss: 1.4464 - classification_loss: 0.2985 216/500 [===========>..................] - ETA: 1:10 - loss: 1.7451 - regression_loss: 1.4470 - classification_loss: 0.2981 217/500 [============>.................] 
- ETA: 1:10 - loss: 1.7448 - regression_loss: 1.4470 - classification_loss: 0.2978 218/500 [============>.................] - ETA: 1:10 - loss: 1.7430 - regression_loss: 1.4458 - classification_loss: 0.2971 219/500 [============>.................] - ETA: 1:10 - loss: 1.7392 - regression_loss: 1.4428 - classification_loss: 0.2963 220/500 [============>.................] - ETA: 1:09 - loss: 1.7427 - regression_loss: 1.4454 - classification_loss: 0.2973 221/500 [============>.................] - ETA: 1:09 - loss: 1.7451 - regression_loss: 1.4475 - classification_loss: 0.2976 222/500 [============>.................] - ETA: 1:09 - loss: 1.7461 - regression_loss: 1.4481 - classification_loss: 0.2980 223/500 [============>.................] - ETA: 1:09 - loss: 1.7451 - regression_loss: 1.4475 - classification_loss: 0.2976 224/500 [============>.................] - ETA: 1:08 - loss: 1.7475 - regression_loss: 1.4490 - classification_loss: 0.2985 225/500 [============>.................] - ETA: 1:08 - loss: 1.7476 - regression_loss: 1.4489 - classification_loss: 0.2987 226/500 [============>.................] - ETA: 1:08 - loss: 1.7500 - regression_loss: 1.4506 - classification_loss: 0.2994 227/500 [============>.................] - ETA: 1:08 - loss: 1.7511 - regression_loss: 1.4516 - classification_loss: 0.2995 228/500 [============>.................] - ETA: 1:07 - loss: 1.7483 - regression_loss: 1.4496 - classification_loss: 0.2987 229/500 [============>.................] - ETA: 1:07 - loss: 1.7471 - regression_loss: 1.4490 - classification_loss: 0.2981 230/500 [============>.................] - ETA: 1:07 - loss: 1.7487 - regression_loss: 1.4503 - classification_loss: 0.2984 231/500 [============>.................] - ETA: 1:07 - loss: 1.7496 - regression_loss: 1.4513 - classification_loss: 0.2983 232/500 [============>.................] - ETA: 1:06 - loss: 1.7483 - regression_loss: 1.4505 - classification_loss: 0.2978 233/500 [============>.................] 
- ETA: 1:06 - loss: 1.7523 - regression_loss: 1.4531 - classification_loss: 0.2993 234/500 [=============>................] - ETA: 1:06 - loss: 1.7495 - regression_loss: 1.4511 - classification_loss: 0.2984 235/500 [=============>................] - ETA: 1:06 - loss: 1.7486 - regression_loss: 1.4504 - classification_loss: 0.2983 236/500 [=============>................] - ETA: 1:05 - loss: 1.7476 - regression_loss: 1.4497 - classification_loss: 0.2979 237/500 [=============>................] - ETA: 1:05 - loss: 1.7446 - regression_loss: 1.4436 - classification_loss: 0.3010 238/500 [=============>................] - ETA: 1:05 - loss: 1.7430 - regression_loss: 1.4424 - classification_loss: 0.3007 239/500 [=============>................] - ETA: 1:05 - loss: 1.7444 - regression_loss: 1.4437 - classification_loss: 0.3008 240/500 [=============>................] - ETA: 1:04 - loss: 1.7470 - regression_loss: 1.4460 - classification_loss: 0.3010 241/500 [=============>................] - ETA: 1:04 - loss: 1.7500 - regression_loss: 1.4482 - classification_loss: 0.3019 242/500 [=============>................] - ETA: 1:04 - loss: 1.7497 - regression_loss: 1.4481 - classification_loss: 0.3016 243/500 [=============>................] - ETA: 1:04 - loss: 1.7489 - regression_loss: 1.4474 - classification_loss: 0.3015 244/500 [=============>................] - ETA: 1:03 - loss: 1.7461 - regression_loss: 1.4455 - classification_loss: 0.3006 245/500 [=============>................] - ETA: 1:03 - loss: 1.7459 - regression_loss: 1.4455 - classification_loss: 0.3004 246/500 [=============>................] - ETA: 1:03 - loss: 1.7458 - regression_loss: 1.4451 - classification_loss: 0.3007 247/500 [=============>................] - ETA: 1:03 - loss: 1.7503 - regression_loss: 1.4495 - classification_loss: 0.3008 248/500 [=============>................] - ETA: 1:02 - loss: 1.7533 - regression_loss: 1.4515 - classification_loss: 0.3017 249/500 [=============>................] 
- ETA: 1:02 - loss: 1.7539 - regression_loss: 1.4522 - classification_loss: 0.3017 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7557 - regression_loss: 1.4533 - classification_loss: 0.3024 251/500 [==============>...............] - ETA: 1:02 - loss: 1.7533 - regression_loss: 1.4512 - classification_loss: 0.3021 252/500 [==============>...............] - ETA: 1:01 - loss: 1.7539 - regression_loss: 1.4523 - classification_loss: 0.3016 253/500 [==============>...............] - ETA: 1:01 - loss: 1.7525 - regression_loss: 1.4515 - classification_loss: 0.3010 254/500 [==============>...............] - ETA: 1:01 - loss: 1.7494 - regression_loss: 1.4489 - classification_loss: 0.3005 255/500 [==============>...............] - ETA: 1:01 - loss: 1.7504 - regression_loss: 1.4502 - classification_loss: 0.3002 256/500 [==============>...............] - ETA: 1:00 - loss: 1.7517 - regression_loss: 1.4515 - classification_loss: 0.3003 257/500 [==============>...............] - ETA: 1:00 - loss: 1.7489 - regression_loss: 1.4494 - classification_loss: 0.2994 258/500 [==============>...............] - ETA: 1:00 - loss: 1.7483 - regression_loss: 1.4492 - classification_loss: 0.2991 259/500 [==============>...............] - ETA: 1:00 - loss: 1.7491 - regression_loss: 1.4499 - classification_loss: 0.2991 260/500 [==============>...............] - ETA: 59s - loss: 1.7487 - regression_loss: 1.4499 - classification_loss: 0.2988  261/500 [==============>...............] - ETA: 59s - loss: 1.7493 - regression_loss: 1.4506 - classification_loss: 0.2987 262/500 [==============>...............] - ETA: 59s - loss: 1.7504 - regression_loss: 1.4516 - classification_loss: 0.2987 263/500 [==============>...............] - ETA: 59s - loss: 1.7503 - regression_loss: 1.4514 - classification_loss: 0.2989 264/500 [==============>...............] - ETA: 59s - loss: 1.7512 - regression_loss: 1.4522 - classification_loss: 0.2990 265/500 [==============>...............] 
- ETA: 58s - loss: 1.7527 - regression_loss: 1.4536 - classification_loss: 0.2992 266/500 [==============>...............] - ETA: 58s - loss: 1.7549 - regression_loss: 1.4554 - classification_loss: 0.2995 267/500 [===============>..............] - ETA: 58s - loss: 1.7569 - regression_loss: 1.4570 - classification_loss: 0.2999 268/500 [===============>..............] - ETA: 58s - loss: 1.7573 - regression_loss: 1.4575 - classification_loss: 0.2999 269/500 [===============>..............] - ETA: 57s - loss: 1.7570 - regression_loss: 1.4577 - classification_loss: 0.2993 270/500 [===============>..............] - ETA: 57s - loss: 1.7566 - regression_loss: 1.4576 - classification_loss: 0.2989 271/500 [===============>..............] - ETA: 57s - loss: 1.7579 - regression_loss: 1.4594 - classification_loss: 0.2986 272/500 [===============>..............] - ETA: 57s - loss: 1.7578 - regression_loss: 1.4594 - classification_loss: 0.2984 273/500 [===============>..............] - ETA: 56s - loss: 1.7564 - regression_loss: 1.4584 - classification_loss: 0.2979 274/500 [===============>..............] - ETA: 56s - loss: 1.7609 - regression_loss: 1.4619 - classification_loss: 0.2991 275/500 [===============>..............] - ETA: 56s - loss: 1.7590 - regression_loss: 1.4605 - classification_loss: 0.2985 276/500 [===============>..............] - ETA: 56s - loss: 1.7578 - regression_loss: 1.4596 - classification_loss: 0.2981 277/500 [===============>..............] - ETA: 55s - loss: 1.7558 - regression_loss: 1.4582 - classification_loss: 0.2976 278/500 [===============>..............] - ETA: 55s - loss: 1.7563 - regression_loss: 1.4584 - classification_loss: 0.2979 279/500 [===============>..............] - ETA: 55s - loss: 1.7516 - regression_loss: 1.4543 - classification_loss: 0.2972 280/500 [===============>..............] - ETA: 55s - loss: 1.7514 - regression_loss: 1.4543 - classification_loss: 0.2971 281/500 [===============>..............] 
- ETA: 54s - loss: 1.7513 - regression_loss: 1.4546 - classification_loss: 0.2968 282/500 [===============>..............] - ETA: 54s - loss: 1.7491 - regression_loss: 1.4529 - classification_loss: 0.2961 283/500 [===============>..............] - ETA: 54s - loss: 1.7467 - regression_loss: 1.4478 - classification_loss: 0.2989 284/500 [================>.............] - ETA: 54s - loss: 1.7453 - regression_loss: 1.4467 - classification_loss: 0.2985 285/500 [================>.............] - ETA: 53s - loss: 1.7501 - regression_loss: 1.4509 - classification_loss: 0.2992 286/500 [================>.............] - ETA: 53s - loss: 1.7492 - regression_loss: 1.4501 - classification_loss: 0.2991 287/500 [================>.............] - ETA: 53s - loss: 1.7462 - regression_loss: 1.4478 - classification_loss: 0.2985 288/500 [================>.............] - ETA: 53s - loss: 1.7472 - regression_loss: 1.4486 - classification_loss: 0.2986 289/500 [================>.............] - ETA: 52s - loss: 1.7446 - regression_loss: 1.4467 - classification_loss: 0.2980 290/500 [================>.............] - ETA: 52s - loss: 1.7455 - regression_loss: 1.4473 - classification_loss: 0.2983 291/500 [================>.............] - ETA: 52s - loss: 1.7473 - regression_loss: 1.4485 - classification_loss: 0.2988 292/500 [================>.............] - ETA: 52s - loss: 1.7485 - regression_loss: 1.4498 - classification_loss: 0.2987 293/500 [================>.............] - ETA: 51s - loss: 1.7474 - regression_loss: 1.4489 - classification_loss: 0.2985 294/500 [================>.............] - ETA: 51s - loss: 1.7474 - regression_loss: 1.4489 - classification_loss: 0.2985 295/500 [================>.............] - ETA: 51s - loss: 1.7486 - regression_loss: 1.4498 - classification_loss: 0.2988 296/500 [================>.............] - ETA: 51s - loss: 1.7487 - regression_loss: 1.4497 - classification_loss: 0.2989 297/500 [================>.............] 
- ETA: 50s - loss: 1.7487 - regression_loss: 1.4495 - classification_loss: 0.2992 298/500 [================>.............] - ETA: 50s - loss: 1.7491 - regression_loss: 1.4497 - classification_loss: 0.2993 299/500 [================>.............] - ETA: 50s - loss: 1.7482 - regression_loss: 1.4492 - classification_loss: 0.2990 300/500 [=================>............] - ETA: 50s - loss: 1.7451 - regression_loss: 1.4467 - classification_loss: 0.2984 301/500 [=================>............] - ETA: 49s - loss: 1.7456 - regression_loss: 1.4470 - classification_loss: 0.2986 302/500 [=================>............] - ETA: 49s - loss: 1.7469 - regression_loss: 1.4479 - classification_loss: 0.2990 303/500 [=================>............] - ETA: 49s - loss: 1.7480 - regression_loss: 1.4490 - classification_loss: 0.2990 304/500 [=================>............] - ETA: 49s - loss: 1.7525 - regression_loss: 1.4517 - classification_loss: 0.3008 305/500 [=================>............] - ETA: 48s - loss: 1.7509 - regression_loss: 1.4502 - classification_loss: 0.3006 306/500 [=================>............] - ETA: 48s - loss: 1.7532 - regression_loss: 1.4521 - classification_loss: 0.3011 307/500 [=================>............] - ETA: 48s - loss: 1.7514 - regression_loss: 1.4508 - classification_loss: 0.3006 308/500 [=================>............] - ETA: 48s - loss: 1.7510 - regression_loss: 1.4508 - classification_loss: 0.3002 309/500 [=================>............] - ETA: 47s - loss: 1.7503 - regression_loss: 1.4499 - classification_loss: 0.3003 310/500 [=================>............] - ETA: 47s - loss: 1.7486 - regression_loss: 1.4453 - classification_loss: 0.3033 311/500 [=================>............] - ETA: 47s - loss: 1.7508 - regression_loss: 1.4476 - classification_loss: 0.3032 312/500 [=================>............] - ETA: 47s - loss: 1.7501 - regression_loss: 1.4472 - classification_loss: 0.3029 313/500 [=================>............] 
- ETA: 46s - loss: 1.7538 - regression_loss: 1.4503 - classification_loss: 0.3035 314/500 [=================>............] - ETA: 46s - loss: 1.7531 - regression_loss: 1.4497 - classification_loss: 0.3033 315/500 [=================>............] - ETA: 46s - loss: 1.7555 - regression_loss: 1.4514 - classification_loss: 0.3041 316/500 [=================>............] - ETA: 46s - loss: 1.7531 - regression_loss: 1.4496 - classification_loss: 0.3035 317/500 [==================>...........] - ETA: 45s - loss: 1.7553 - regression_loss: 1.4510 - classification_loss: 0.3043 318/500 [==================>...........] - ETA: 45s - loss: 1.7524 - regression_loss: 1.4487 - classification_loss: 0.3037 319/500 [==================>...........] - ETA: 45s - loss: 1.7516 - regression_loss: 1.4480 - classification_loss: 0.3035 320/500 [==================>...........] - ETA: 45s - loss: 1.7520 - regression_loss: 1.4483 - classification_loss: 0.3038 321/500 [==================>...........] - ETA: 44s - loss: 1.7517 - regression_loss: 1.4482 - classification_loss: 0.3035 322/500 [==================>...........] - ETA: 44s - loss: 1.7502 - regression_loss: 1.4472 - classification_loss: 0.3030 323/500 [==================>...........] - ETA: 44s - loss: 1.7504 - regression_loss: 1.4476 - classification_loss: 0.3027 324/500 [==================>...........] - ETA: 44s - loss: 1.7513 - regression_loss: 1.4486 - classification_loss: 0.3027 325/500 [==================>...........] - ETA: 43s - loss: 1.7513 - regression_loss: 1.4487 - classification_loss: 0.3026 326/500 [==================>...........] - ETA: 43s - loss: 1.7509 - regression_loss: 1.4486 - classification_loss: 0.3023 327/500 [==================>...........] - ETA: 43s - loss: 1.7521 - regression_loss: 1.4498 - classification_loss: 0.3023 328/500 [==================>...........] - ETA: 43s - loss: 1.7533 - regression_loss: 1.4507 - classification_loss: 0.3026 329/500 [==================>...........] 
- ETA: 42s - loss: 1.7558 - regression_loss: 1.4529 - classification_loss: 0.3029 330/500 [==================>...........] - ETA: 42s - loss: 1.7585 - regression_loss: 1.4550 - classification_loss: 0.3034 331/500 [==================>...........] - ETA: 42s - loss: 1.7578 - regression_loss: 1.4548 - classification_loss: 0.3031 332/500 [==================>...........] - ETA: 42s - loss: 1.7583 - regression_loss: 1.4553 - classification_loss: 0.3030 333/500 [==================>...........] - ETA: 41s - loss: 1.7559 - regression_loss: 1.4535 - classification_loss: 0.3024 334/500 [===================>..........] - ETA: 41s - loss: 1.7566 - regression_loss: 1.4541 - classification_loss: 0.3025 335/500 [===================>..........] - ETA: 41s - loss: 1.7579 - regression_loss: 1.4551 - classification_loss: 0.3028 336/500 [===================>..........] - ETA: 41s - loss: 1.7554 - regression_loss: 1.4532 - classification_loss: 0.3022 337/500 [===================>..........] - ETA: 40s - loss: 1.7542 - regression_loss: 1.4524 - classification_loss: 0.3019 338/500 [===================>..........] - ETA: 40s - loss: 1.7544 - regression_loss: 1.4529 - classification_loss: 0.3015 339/500 [===================>..........] - ETA: 40s - loss: 1.7553 - regression_loss: 1.4536 - classification_loss: 0.3016 340/500 [===================>..........] - ETA: 40s - loss: 1.7587 - regression_loss: 1.4560 - classification_loss: 0.3026 341/500 [===================>..........] - ETA: 39s - loss: 1.7582 - regression_loss: 1.4559 - classification_loss: 0.3023 342/500 [===================>..........] - ETA: 39s - loss: 1.7565 - regression_loss: 1.4545 - classification_loss: 0.3020 343/500 [===================>..........] - ETA: 39s - loss: 1.7557 - regression_loss: 1.4542 - classification_loss: 0.3015 344/500 [===================>..........] - ETA: 39s - loss: 1.7528 - regression_loss: 1.4520 - classification_loss: 0.3009 345/500 [===================>..........] 
- ETA: 38s - loss: 1.7544 - regression_loss: 1.4535 - classification_loss: 0.3009 346/500 [===================>..........] - ETA: 38s - loss: 1.7553 - regression_loss: 1.4541 - classification_loss: 0.3012 347/500 [===================>..........] - ETA: 38s - loss: 1.7550 - regression_loss: 1.4542 - classification_loss: 0.3008 348/500 [===================>..........] - ETA: 38s - loss: 1.7536 - regression_loss: 1.4534 - classification_loss: 0.3003 349/500 [===================>..........] - ETA: 37s - loss: 1.7550 - regression_loss: 1.4542 - classification_loss: 0.3007 350/500 [====================>.........] - ETA: 37s - loss: 1.7559 - regression_loss: 1.4550 - classification_loss: 0.3009 351/500 [====================>.........] - ETA: 37s - loss: 1.7573 - regression_loss: 1.4560 - classification_loss: 0.3013 352/500 [====================>.........] - ETA: 37s - loss: 1.7571 - regression_loss: 1.4559 - classification_loss: 0.3012 353/500 [====================>.........] - ETA: 36s - loss: 1.7576 - regression_loss: 1.4565 - classification_loss: 0.3011 354/500 [====================>.........] - ETA: 36s - loss: 1.7568 - regression_loss: 1.4562 - classification_loss: 0.3006 355/500 [====================>.........] - ETA: 36s - loss: 1.7570 - regression_loss: 1.4565 - classification_loss: 0.3005 356/500 [====================>.........] - ETA: 36s - loss: 1.7558 - regression_loss: 1.4557 - classification_loss: 0.3001 357/500 [====================>.........] - ETA: 35s - loss: 1.7539 - regression_loss: 1.4543 - classification_loss: 0.2996 358/500 [====================>.........] - ETA: 35s - loss: 1.7534 - regression_loss: 1.4543 - classification_loss: 0.2992 359/500 [====================>.........] - ETA: 35s - loss: 1.7555 - regression_loss: 1.4562 - classification_loss: 0.2994 360/500 [====================>.........] - ETA: 35s - loss: 1.7521 - regression_loss: 1.4533 - classification_loss: 0.2988 361/500 [====================>.........] 
- ETA: 34s - loss: 1.7511 - regression_loss: 1.4526 - classification_loss: 0.2985 362/500 [====================>.........] - ETA: 34s - loss: 1.7525 - regression_loss: 1.4539 - classification_loss: 0.2987 363/500 [====================>.........] - ETA: 34s - loss: 1.7523 - regression_loss: 1.4538 - classification_loss: 0.2985 364/500 [====================>.........] - ETA: 34s - loss: 1.7496 - regression_loss: 1.4517 - classification_loss: 0.2979 365/500 [====================>.........] - ETA: 33s - loss: 1.7474 - regression_loss: 1.4499 - classification_loss: 0.2976 366/500 [====================>.........] - ETA: 33s - loss: 1.7498 - regression_loss: 1.4515 - classification_loss: 0.2983 367/500 [=====================>........] - ETA: 33s - loss: 1.7495 - regression_loss: 1.4510 - classification_loss: 0.2985 368/500 [=====================>........] - ETA: 33s - loss: 1.7487 - regression_loss: 1.4504 - classification_loss: 0.2983 369/500 [=====================>........] - ETA: 32s - loss: 1.7497 - regression_loss: 1.4517 - classification_loss: 0.2980 370/500 [=====================>........] - ETA: 32s - loss: 1.7481 - regression_loss: 1.4506 - classification_loss: 0.2975 371/500 [=====================>........] - ETA: 32s - loss: 1.7481 - regression_loss: 1.4506 - classification_loss: 0.2975 372/500 [=====================>........] - ETA: 32s - loss: 1.7500 - regression_loss: 1.4523 - classification_loss: 0.2977 373/500 [=====================>........] - ETA: 31s - loss: 1.7476 - regression_loss: 1.4503 - classification_loss: 0.2974 374/500 [=====================>........] - ETA: 31s - loss: 1.7476 - regression_loss: 1.4503 - classification_loss: 0.2973 375/500 [=====================>........] - ETA: 31s - loss: 1.7481 - regression_loss: 1.4506 - classification_loss: 0.2975 376/500 [=====================>........] - ETA: 31s - loss: 1.7470 - regression_loss: 1.4498 - classification_loss: 0.2972 377/500 [=====================>........] 
[per-batch progress lines for epoch 44, batches 378-500, elided]
500/500 [==============================] - 125s 251ms/step - loss: 1.7413 - regression_loss: 1.4463 - classification_loss: 0.2950
326 instances of class plum with average precision: 0.7804
mAP: 0.7804
Epoch 00044: saving model to ./training/snapshots/resnet50_pascal_44.h5
Epoch 45/150
[per-batch progress lines for epoch 45, batches 1-212, elided; last reported running averages: loss: 1.7397 - regression_loss: 1.4525 - classification_loss: 0.2871]
- ETA: 1:11 - loss: 1.7413 - regression_loss: 1.4537 - classification_loss: 0.2876 213/500 [===========>..................] - ETA: 1:11 - loss: 1.7386 - regression_loss: 1.4517 - classification_loss: 0.2869 214/500 [===========>..................] - ETA: 1:11 - loss: 1.7357 - regression_loss: 1.4494 - classification_loss: 0.2863 215/500 [===========>..................] - ETA: 1:11 - loss: 1.7382 - regression_loss: 1.4522 - classification_loss: 0.2860 216/500 [===========>..................] - ETA: 1:10 - loss: 1.7407 - regression_loss: 1.4541 - classification_loss: 0.2867 217/500 [============>.................] - ETA: 1:10 - loss: 1.7382 - regression_loss: 1.4523 - classification_loss: 0.2859 218/500 [============>.................] - ETA: 1:10 - loss: 1.7355 - regression_loss: 1.4504 - classification_loss: 0.2851 219/500 [============>.................] - ETA: 1:10 - loss: 1.7393 - regression_loss: 1.4532 - classification_loss: 0.2862 220/500 [============>.................] - ETA: 1:09 - loss: 1.7386 - regression_loss: 1.4528 - classification_loss: 0.2858 221/500 [============>.................] - ETA: 1:09 - loss: 1.7372 - regression_loss: 1.4517 - classification_loss: 0.2854 222/500 [============>.................] - ETA: 1:09 - loss: 1.7314 - regression_loss: 1.4470 - classification_loss: 0.2844 223/500 [============>.................] - ETA: 1:09 - loss: 1.7384 - regression_loss: 1.4517 - classification_loss: 0.2868 224/500 [============>.................] - ETA: 1:08 - loss: 1.7384 - regression_loss: 1.4513 - classification_loss: 0.2871 225/500 [============>.................] - ETA: 1:08 - loss: 1.7436 - regression_loss: 1.4554 - classification_loss: 0.2883 226/500 [============>.................] - ETA: 1:08 - loss: 1.7449 - regression_loss: 1.4564 - classification_loss: 0.2885 227/500 [============>.................] - ETA: 1:08 - loss: 1.7456 - regression_loss: 1.4566 - classification_loss: 0.2890 228/500 [============>.................] 
- ETA: 1:07 - loss: 1.7444 - regression_loss: 1.4560 - classification_loss: 0.2884 229/500 [============>.................] - ETA: 1:07 - loss: 1.7430 - regression_loss: 1.4554 - classification_loss: 0.2876 230/500 [============>.................] - ETA: 1:07 - loss: 1.7437 - regression_loss: 1.4561 - classification_loss: 0.2876 231/500 [============>.................] - ETA: 1:07 - loss: 1.7418 - regression_loss: 1.4548 - classification_loss: 0.2870 232/500 [============>.................] - ETA: 1:06 - loss: 1.7431 - regression_loss: 1.4559 - classification_loss: 0.2872 233/500 [============>.................] - ETA: 1:06 - loss: 1.7395 - regression_loss: 1.4531 - classification_loss: 0.2864 234/500 [=============>................] - ETA: 1:06 - loss: 1.7409 - regression_loss: 1.4543 - classification_loss: 0.2865 235/500 [=============>................] - ETA: 1:06 - loss: 1.7355 - regression_loss: 1.4497 - classification_loss: 0.2858 236/500 [=============>................] - ETA: 1:05 - loss: 1.7363 - regression_loss: 1.4505 - classification_loss: 0.2858 237/500 [=============>................] - ETA: 1:05 - loss: 1.7404 - regression_loss: 1.4543 - classification_loss: 0.2861 238/500 [=============>................] - ETA: 1:05 - loss: 1.7402 - regression_loss: 1.4539 - classification_loss: 0.2862 239/500 [=============>................] - ETA: 1:05 - loss: 1.7420 - regression_loss: 1.4557 - classification_loss: 0.2863 240/500 [=============>................] - ETA: 1:04 - loss: 1.7399 - regression_loss: 1.4536 - classification_loss: 0.2863 241/500 [=============>................] - ETA: 1:04 - loss: 1.7426 - regression_loss: 1.4560 - classification_loss: 0.2867 242/500 [=============>................] - ETA: 1:04 - loss: 1.7442 - regression_loss: 1.4572 - classification_loss: 0.2870 243/500 [=============>................] - ETA: 1:04 - loss: 1.7430 - regression_loss: 1.4565 - classification_loss: 0.2866 244/500 [=============>................] 
- ETA: 1:03 - loss: 1.7443 - regression_loss: 1.4572 - classification_loss: 0.2871 245/500 [=============>................] - ETA: 1:03 - loss: 1.7429 - regression_loss: 1.4561 - classification_loss: 0.2868 246/500 [=============>................] - ETA: 1:03 - loss: 1.7436 - regression_loss: 1.4568 - classification_loss: 0.2869 247/500 [=============>................] - ETA: 1:03 - loss: 1.7437 - regression_loss: 1.4566 - classification_loss: 0.2871 248/500 [=============>................] - ETA: 1:02 - loss: 1.7443 - regression_loss: 1.4571 - classification_loss: 0.2873 249/500 [=============>................] - ETA: 1:02 - loss: 1.7436 - regression_loss: 1.4566 - classification_loss: 0.2870 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7433 - regression_loss: 1.4565 - classification_loss: 0.2868 251/500 [==============>...............] - ETA: 1:02 - loss: 1.7416 - regression_loss: 1.4549 - classification_loss: 0.2867 252/500 [==============>...............] - ETA: 1:01 - loss: 1.7405 - regression_loss: 1.4542 - classification_loss: 0.2863 253/500 [==============>...............] - ETA: 1:01 - loss: 1.7388 - regression_loss: 1.4529 - classification_loss: 0.2859 254/500 [==============>...............] - ETA: 1:01 - loss: 1.7386 - regression_loss: 1.4527 - classification_loss: 0.2859 255/500 [==============>...............] - ETA: 1:01 - loss: 1.7381 - regression_loss: 1.4523 - classification_loss: 0.2858 256/500 [==============>...............] - ETA: 1:00 - loss: 1.7358 - regression_loss: 1.4501 - classification_loss: 0.2857 257/500 [==============>...............] - ETA: 1:00 - loss: 1.7345 - regression_loss: 1.4491 - classification_loss: 0.2854 258/500 [==============>...............] - ETA: 1:00 - loss: 1.7352 - regression_loss: 1.4501 - classification_loss: 0.2851 259/500 [==============>...............] - ETA: 1:00 - loss: 1.7365 - regression_loss: 1.4512 - classification_loss: 0.2854 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.7356 - regression_loss: 1.4506 - classification_loss: 0.2850 261/500 [==============>...............] - ETA: 59s - loss: 1.7317 - regression_loss: 1.4473 - classification_loss: 0.2844  262/500 [==============>...............] - ETA: 59s - loss: 1.7296 - regression_loss: 1.4457 - classification_loss: 0.2839 263/500 [==============>...............] - ETA: 59s - loss: 1.7290 - regression_loss: 1.4449 - classification_loss: 0.2841 264/500 [==============>...............] - ETA: 58s - loss: 1.7290 - regression_loss: 1.4451 - classification_loss: 0.2840 265/500 [==============>...............] - ETA: 58s - loss: 1.7292 - regression_loss: 1.4450 - classification_loss: 0.2842 266/500 [==============>...............] - ETA: 58s - loss: 1.7289 - regression_loss: 1.4447 - classification_loss: 0.2843 267/500 [===============>..............] - ETA: 58s - loss: 1.7283 - regression_loss: 1.4441 - classification_loss: 0.2842 268/500 [===============>..............] - ETA: 58s - loss: 1.7300 - regression_loss: 1.4454 - classification_loss: 0.2846 269/500 [===============>..............] - ETA: 57s - loss: 1.7301 - regression_loss: 1.4450 - classification_loss: 0.2851 270/500 [===============>..............] - ETA: 57s - loss: 1.7308 - regression_loss: 1.4460 - classification_loss: 0.2848 271/500 [===============>..............] - ETA: 57s - loss: 1.7306 - regression_loss: 1.4463 - classification_loss: 0.2843 272/500 [===============>..............] - ETA: 56s - loss: 1.7285 - regression_loss: 1.4448 - classification_loss: 0.2838 273/500 [===============>..............] - ETA: 56s - loss: 1.7244 - regression_loss: 1.4415 - classification_loss: 0.2830 274/500 [===============>..............] - ETA: 56s - loss: 1.7250 - regression_loss: 1.4414 - classification_loss: 0.2835 275/500 [===============>..............] - ETA: 56s - loss: 1.7216 - regression_loss: 1.4388 - classification_loss: 0.2828 276/500 [===============>..............] 
- ETA: 55s - loss: 1.7259 - regression_loss: 1.4423 - classification_loss: 0.2836 277/500 [===============>..............] - ETA: 55s - loss: 1.7261 - regression_loss: 1.4426 - classification_loss: 0.2835 278/500 [===============>..............] - ETA: 55s - loss: 1.7259 - regression_loss: 1.4426 - classification_loss: 0.2833 279/500 [===============>..............] - ETA: 55s - loss: 1.7282 - regression_loss: 1.4441 - classification_loss: 0.2842 280/500 [===============>..............] - ETA: 55s - loss: 1.7282 - regression_loss: 1.4445 - classification_loss: 0.2838 281/500 [===============>..............] - ETA: 54s - loss: 1.7308 - regression_loss: 1.4458 - classification_loss: 0.2849 282/500 [===============>..............] - ETA: 54s - loss: 1.7291 - regression_loss: 1.4446 - classification_loss: 0.2845 283/500 [===============>..............] - ETA: 54s - loss: 1.7250 - regression_loss: 1.4412 - classification_loss: 0.2837 284/500 [================>.............] - ETA: 54s - loss: 1.7285 - regression_loss: 1.4440 - classification_loss: 0.2845 285/500 [================>.............] - ETA: 53s - loss: 1.7316 - regression_loss: 1.4466 - classification_loss: 0.2850 286/500 [================>.............] - ETA: 53s - loss: 1.7339 - regression_loss: 1.4482 - classification_loss: 0.2857 287/500 [================>.............] - ETA: 53s - loss: 1.7339 - regression_loss: 1.4481 - classification_loss: 0.2858 288/500 [================>.............] - ETA: 52s - loss: 1.7323 - regression_loss: 1.4470 - classification_loss: 0.2853 289/500 [================>.............] - ETA: 52s - loss: 1.7323 - regression_loss: 1.4470 - classification_loss: 0.2852 290/500 [================>.............] - ETA: 52s - loss: 1.7326 - regression_loss: 1.4472 - classification_loss: 0.2854 291/500 [================>.............] - ETA: 52s - loss: 1.7345 - regression_loss: 1.4486 - classification_loss: 0.2859 292/500 [================>.............] 
- ETA: 52s - loss: 1.7371 - regression_loss: 1.4505 - classification_loss: 0.2866 293/500 [================>.............] - ETA: 51s - loss: 1.7381 - regression_loss: 1.4512 - classification_loss: 0.2868 294/500 [================>.............] - ETA: 51s - loss: 1.7379 - regression_loss: 1.4512 - classification_loss: 0.2867 295/500 [================>.............] - ETA: 51s - loss: 1.7383 - regression_loss: 1.4515 - classification_loss: 0.2867 296/500 [================>.............] - ETA: 51s - loss: 1.7386 - regression_loss: 1.4516 - classification_loss: 0.2870 297/500 [================>.............] - ETA: 50s - loss: 1.7380 - regression_loss: 1.4513 - classification_loss: 0.2868 298/500 [================>.............] - ETA: 50s - loss: 1.7395 - regression_loss: 1.4523 - classification_loss: 0.2872 299/500 [================>.............] - ETA: 50s - loss: 1.7377 - regression_loss: 1.4506 - classification_loss: 0.2871 300/500 [=================>............] - ETA: 50s - loss: 1.7339 - regression_loss: 1.4474 - classification_loss: 0.2865 301/500 [=================>............] - ETA: 49s - loss: 1.7332 - regression_loss: 1.4472 - classification_loss: 0.2860 302/500 [=================>............] - ETA: 49s - loss: 1.7316 - regression_loss: 1.4460 - classification_loss: 0.2856 303/500 [=================>............] - ETA: 49s - loss: 1.7322 - regression_loss: 1.4466 - classification_loss: 0.2856 304/500 [=================>............] - ETA: 49s - loss: 1.7311 - regression_loss: 1.4457 - classification_loss: 0.2854 305/500 [=================>............] - ETA: 48s - loss: 1.7287 - regression_loss: 1.4439 - classification_loss: 0.2848 306/500 [=================>............] - ETA: 48s - loss: 1.7286 - regression_loss: 1.4440 - classification_loss: 0.2846 307/500 [=================>............] - ETA: 48s - loss: 1.7269 - regression_loss: 1.4427 - classification_loss: 0.2842 308/500 [=================>............] 
- ETA: 48s - loss: 1.7258 - regression_loss: 1.4417 - classification_loss: 0.2841 309/500 [=================>............] - ETA: 47s - loss: 1.7275 - regression_loss: 1.4427 - classification_loss: 0.2847 310/500 [=================>............] - ETA: 47s - loss: 1.7261 - regression_loss: 1.4419 - classification_loss: 0.2843 311/500 [=================>............] - ETA: 47s - loss: 1.7236 - regression_loss: 1.4399 - classification_loss: 0.2837 312/500 [=================>............] - ETA: 47s - loss: 1.7236 - regression_loss: 1.4400 - classification_loss: 0.2836 313/500 [=================>............] - ETA: 46s - loss: 1.7249 - regression_loss: 1.4411 - classification_loss: 0.2837 314/500 [=================>............] - ETA: 46s - loss: 1.7266 - regression_loss: 1.4427 - classification_loss: 0.2839 315/500 [=================>............] - ETA: 46s - loss: 1.7297 - regression_loss: 1.4448 - classification_loss: 0.2849 316/500 [=================>............] - ETA: 46s - loss: 1.7287 - regression_loss: 1.4441 - classification_loss: 0.2846 317/500 [==================>...........] - ETA: 45s - loss: 1.7284 - regression_loss: 1.4441 - classification_loss: 0.2843 318/500 [==================>...........] - ETA: 45s - loss: 1.7258 - regression_loss: 1.4416 - classification_loss: 0.2842 319/500 [==================>...........] - ETA: 45s - loss: 1.7317 - regression_loss: 1.4459 - classification_loss: 0.2858 320/500 [==================>...........] - ETA: 45s - loss: 1.7293 - regression_loss: 1.4438 - classification_loss: 0.2855 321/500 [==================>...........] - ETA: 44s - loss: 1.7283 - regression_loss: 1.4432 - classification_loss: 0.2851 322/500 [==================>...........] - ETA: 44s - loss: 1.7292 - regression_loss: 1.4436 - classification_loss: 0.2856 323/500 [==================>...........] - ETA: 44s - loss: 1.7288 - regression_loss: 1.4433 - classification_loss: 0.2855 324/500 [==================>...........] 
- ETA: 44s - loss: 1.7277 - regression_loss: 1.4424 - classification_loss: 0.2854 325/500 [==================>...........] - ETA: 43s - loss: 1.7261 - regression_loss: 1.4411 - classification_loss: 0.2850 326/500 [==================>...........] - ETA: 43s - loss: 1.7281 - regression_loss: 1.4427 - classification_loss: 0.2854 327/500 [==================>...........] - ETA: 43s - loss: 1.7317 - regression_loss: 1.4456 - classification_loss: 0.2862 328/500 [==================>...........] - ETA: 43s - loss: 1.7322 - regression_loss: 1.4461 - classification_loss: 0.2861 329/500 [==================>...........] - ETA: 42s - loss: 1.7325 - regression_loss: 1.4464 - classification_loss: 0.2861 330/500 [==================>...........] - ETA: 42s - loss: 1.7328 - regression_loss: 1.4463 - classification_loss: 0.2865 331/500 [==================>...........] - ETA: 42s - loss: 1.7329 - regression_loss: 1.4465 - classification_loss: 0.2865 332/500 [==================>...........] - ETA: 42s - loss: 1.7317 - regression_loss: 1.4456 - classification_loss: 0.2861 333/500 [==================>...........] - ETA: 41s - loss: 1.7311 - regression_loss: 1.4453 - classification_loss: 0.2858 334/500 [===================>..........] - ETA: 41s - loss: 1.7325 - regression_loss: 1.4461 - classification_loss: 0.2864 335/500 [===================>..........] - ETA: 41s - loss: 1.7335 - regression_loss: 1.4469 - classification_loss: 0.2866 336/500 [===================>..........] - ETA: 41s - loss: 1.7340 - regression_loss: 1.4473 - classification_loss: 0.2866 337/500 [===================>..........] - ETA: 40s - loss: 1.7347 - regression_loss: 1.4479 - classification_loss: 0.2868 338/500 [===================>..........] - ETA: 40s - loss: 1.7344 - regression_loss: 1.4478 - classification_loss: 0.2867 339/500 [===================>..........] - ETA: 40s - loss: 1.7337 - regression_loss: 1.4471 - classification_loss: 0.2866 340/500 [===================>..........] 
- ETA: 40s - loss: 1.7339 - regression_loss: 1.4475 - classification_loss: 0.2864 341/500 [===================>..........] - ETA: 39s - loss: 1.7341 - regression_loss: 1.4475 - classification_loss: 0.2867 342/500 [===================>..........] - ETA: 39s - loss: 1.7335 - regression_loss: 1.4471 - classification_loss: 0.2864 343/500 [===================>..........] - ETA: 39s - loss: 1.7343 - regression_loss: 1.4482 - classification_loss: 0.2861 344/500 [===================>..........] - ETA: 39s - loss: 1.7351 - regression_loss: 1.4488 - classification_loss: 0.2862 345/500 [===================>..........] - ETA: 38s - loss: 1.7358 - regression_loss: 1.4493 - classification_loss: 0.2865 346/500 [===================>..........] - ETA: 38s - loss: 1.7382 - regression_loss: 1.4510 - classification_loss: 0.2873 347/500 [===================>..........] - ETA: 38s - loss: 1.7376 - regression_loss: 1.4505 - classification_loss: 0.2871 348/500 [===================>..........] - ETA: 38s - loss: 1.7369 - regression_loss: 1.4501 - classification_loss: 0.2868 349/500 [===================>..........] - ETA: 37s - loss: 1.7379 - regression_loss: 1.4508 - classification_loss: 0.2871 350/500 [====================>.........] - ETA: 37s - loss: 1.7373 - regression_loss: 1.4503 - classification_loss: 0.2870 351/500 [====================>.........] - ETA: 37s - loss: 1.7374 - regression_loss: 1.4504 - classification_loss: 0.2870 352/500 [====================>.........] - ETA: 37s - loss: 1.7358 - regression_loss: 1.4492 - classification_loss: 0.2866 353/500 [====================>.........] - ETA: 36s - loss: 1.7377 - regression_loss: 1.4507 - classification_loss: 0.2870 354/500 [====================>.........] - ETA: 36s - loss: 1.7373 - regression_loss: 1.4505 - classification_loss: 0.2868 355/500 [====================>.........] - ETA: 36s - loss: 1.7376 - regression_loss: 1.4506 - classification_loss: 0.2869 356/500 [====================>.........] 
- ETA: 35s - loss: 1.7386 - regression_loss: 1.4515 - classification_loss: 0.2871 357/500 [====================>.........] - ETA: 35s - loss: 1.7394 - regression_loss: 1.4524 - classification_loss: 0.2870 358/500 [====================>.........] - ETA: 35s - loss: 1.7396 - regression_loss: 1.4527 - classification_loss: 0.2869 359/500 [====================>.........] - ETA: 35s - loss: 1.7402 - regression_loss: 1.4532 - classification_loss: 0.2869 360/500 [====================>.........] - ETA: 34s - loss: 1.7397 - regression_loss: 1.4530 - classification_loss: 0.2867 361/500 [====================>.........] - ETA: 34s - loss: 1.7396 - regression_loss: 1.4529 - classification_loss: 0.2867 362/500 [====================>.........] - ETA: 34s - loss: 1.7393 - regression_loss: 1.4525 - classification_loss: 0.2868 363/500 [====================>.........] - ETA: 34s - loss: 1.7408 - regression_loss: 1.4537 - classification_loss: 0.2871 364/500 [====================>.........] - ETA: 33s - loss: 1.7415 - regression_loss: 1.4543 - classification_loss: 0.2872 365/500 [====================>.........] - ETA: 33s - loss: 1.7404 - regression_loss: 1.4535 - classification_loss: 0.2869 366/500 [====================>.........] - ETA: 33s - loss: 1.7416 - regression_loss: 1.4543 - classification_loss: 0.2874 367/500 [=====================>........] - ETA: 33s - loss: 1.7408 - regression_loss: 1.4538 - classification_loss: 0.2870 368/500 [=====================>........] - ETA: 32s - loss: 1.7430 - regression_loss: 1.4561 - classification_loss: 0.2870 369/500 [=====================>........] - ETA: 32s - loss: 1.7432 - regression_loss: 1.4563 - classification_loss: 0.2869 370/500 [=====================>........] - ETA: 32s - loss: 1.7435 - regression_loss: 1.4565 - classification_loss: 0.2870 371/500 [=====================>........] - ETA: 32s - loss: 1.7439 - regression_loss: 1.4569 - classification_loss: 0.2870 372/500 [=====================>........] 
- ETA: 31s - loss: 1.7428 - regression_loss: 1.4560 - classification_loss: 0.2868 373/500 [=====================>........] - ETA: 31s - loss: 1.7429 - regression_loss: 1.4561 - classification_loss: 0.2868 374/500 [=====================>........] - ETA: 31s - loss: 1.7400 - regression_loss: 1.4538 - classification_loss: 0.2862 375/500 [=====================>........] - ETA: 31s - loss: 1.7412 - regression_loss: 1.4551 - classification_loss: 0.2861 376/500 [=====================>........] - ETA: 30s - loss: 1.7405 - regression_loss: 1.4547 - classification_loss: 0.2859 377/500 [=====================>........] - ETA: 30s - loss: 1.7426 - regression_loss: 1.4566 - classification_loss: 0.2859 378/500 [=====================>........] - ETA: 30s - loss: 1.7419 - regression_loss: 1.4565 - classification_loss: 0.2854 379/500 [=====================>........] - ETA: 30s - loss: 1.7412 - regression_loss: 1.4560 - classification_loss: 0.2852 380/500 [=====================>........] - ETA: 29s - loss: 1.7410 - regression_loss: 1.4557 - classification_loss: 0.2853 381/500 [=====================>........] - ETA: 29s - loss: 1.7416 - regression_loss: 1.4559 - classification_loss: 0.2858 382/500 [=====================>........] - ETA: 29s - loss: 1.7422 - regression_loss: 1.4563 - classification_loss: 0.2858 383/500 [=====================>........] - ETA: 29s - loss: 1.7416 - regression_loss: 1.4560 - classification_loss: 0.2856 384/500 [======================>.......] - ETA: 28s - loss: 1.7389 - regression_loss: 1.4536 - classification_loss: 0.2853 385/500 [======================>.......] - ETA: 28s - loss: 1.7412 - regression_loss: 1.4553 - classification_loss: 0.2859 386/500 [======================>.......] - ETA: 28s - loss: 1.7413 - regression_loss: 1.4554 - classification_loss: 0.2859 387/500 [======================>.......] - ETA: 28s - loss: 1.7401 - regression_loss: 1.4546 - classification_loss: 0.2854 388/500 [======================>.......] 
- ETA: 27s - loss: 1.7400 - regression_loss: 1.4548 - classification_loss: 0.2852 389/500 [======================>.......] - ETA: 27s - loss: 1.7395 - regression_loss: 1.4545 - classification_loss: 0.2850 390/500 [======================>.......] - ETA: 27s - loss: 1.7406 - regression_loss: 1.4555 - classification_loss: 0.2851 391/500 [======================>.......] - ETA: 27s - loss: 1.7398 - regression_loss: 1.4549 - classification_loss: 0.2849 392/500 [======================>.......] - ETA: 26s - loss: 1.7397 - regression_loss: 1.4549 - classification_loss: 0.2848 393/500 [======================>.......] - ETA: 26s - loss: 1.7392 - regression_loss: 1.4544 - classification_loss: 0.2848 394/500 [======================>.......] - ETA: 26s - loss: 1.7383 - regression_loss: 1.4536 - classification_loss: 0.2846 395/500 [======================>.......] - ETA: 26s - loss: 1.7396 - regression_loss: 1.4545 - classification_loss: 0.2850 396/500 [======================>.......] - ETA: 25s - loss: 1.7414 - regression_loss: 1.4561 - classification_loss: 0.2853 397/500 [======================>.......] - ETA: 25s - loss: 1.7448 - regression_loss: 1.4587 - classification_loss: 0.2861 398/500 [======================>.......] - ETA: 25s - loss: 1.7452 - regression_loss: 1.4589 - classification_loss: 0.2863 399/500 [======================>.......] - ETA: 25s - loss: 1.7441 - regression_loss: 1.4582 - classification_loss: 0.2860 400/500 [=======================>......] - ETA: 24s - loss: 1.7436 - regression_loss: 1.4576 - classification_loss: 0.2860 401/500 [=======================>......] - ETA: 24s - loss: 1.7452 - regression_loss: 1.4587 - classification_loss: 0.2865 402/500 [=======================>......] - ETA: 24s - loss: 1.7423 - regression_loss: 1.4563 - classification_loss: 0.2859 403/500 [=======================>......] - ETA: 24s - loss: 1.7421 - regression_loss: 1.4562 - classification_loss: 0.2858 404/500 [=======================>......] 
- ETA: 23s - loss: 1.7436 - regression_loss: 1.4575 - classification_loss: 0.2861 405/500 [=======================>......] - ETA: 23s - loss: 1.7428 - regression_loss: 1.4570 - classification_loss: 0.2858 406/500 [=======================>......] - ETA: 23s - loss: 1.7422 - regression_loss: 1.4566 - classification_loss: 0.2857 407/500 [=======================>......] - ETA: 23s - loss: 1.7405 - regression_loss: 1.4553 - classification_loss: 0.2853 408/500 [=======================>......] - ETA: 22s - loss: 1.7406 - regression_loss: 1.4555 - classification_loss: 0.2851 409/500 [=======================>......] - ETA: 22s - loss: 1.7411 - regression_loss: 1.4558 - classification_loss: 0.2852 410/500 [=======================>......] - ETA: 22s - loss: 1.7395 - regression_loss: 1.4546 - classification_loss: 0.2849 411/500 [=======================>......] - ETA: 22s - loss: 1.7387 - regression_loss: 1.4540 - classification_loss: 0.2847 412/500 [=======================>......] - ETA: 21s - loss: 1.7355 - regression_loss: 1.4505 - classification_loss: 0.2850 413/500 [=======================>......] - ETA: 21s - loss: 1.7354 - regression_loss: 1.4507 - classification_loss: 0.2847 414/500 [=======================>......] - ETA: 21s - loss: 1.7349 - regression_loss: 1.4502 - classification_loss: 0.2847 415/500 [=======================>......] - ETA: 21s - loss: 1.7363 - regression_loss: 1.4511 - classification_loss: 0.2852 416/500 [=======================>......] - ETA: 20s - loss: 1.7360 - regression_loss: 1.4508 - classification_loss: 0.2851 417/500 [========================>.....] - ETA: 20s - loss: 1.7358 - regression_loss: 1.4506 - classification_loss: 0.2851 418/500 [========================>.....] - ETA: 20s - loss: 1.7347 - regression_loss: 1.4499 - classification_loss: 0.2848 419/500 [========================>.....] - ETA: 20s - loss: 1.7344 - regression_loss: 1.4497 - classification_loss: 0.2847 420/500 [========================>.....] 
- ETA: 19s - loss: 1.7350 - regression_loss: 1.4504 - classification_loss: 0.2846
500/500 [==============================] - 125s 250ms/step - loss: 1.7326 - regression_loss: 1.4477 - classification_loss: 0.2849
326 instances of class plum with average precision: 0.7813
mAP: 0.7813
Epoch 00045: saving model to ./training/snapshots/resnet50_pascal_45.h5
Epoch 46/150
254/500 [==============>...............]
- ETA: 1:09 - loss: 1.7476 - regression_loss: 1.4628 - classification_loss: 0.2848 223/500 [============>.................] - ETA: 1:09 - loss: 1.7477 - regression_loss: 1.4630 - classification_loss: 0.2847 224/500 [============>.................] - ETA: 1:09 - loss: 1.7453 - regression_loss: 1.4613 - classification_loss: 0.2841 225/500 [============>.................] - ETA: 1:08 - loss: 1.7407 - regression_loss: 1.4576 - classification_loss: 0.2831 226/500 [============>.................] - ETA: 1:08 - loss: 1.7444 - regression_loss: 1.4603 - classification_loss: 0.2842 227/500 [============>.................] - ETA: 1:08 - loss: 1.7457 - regression_loss: 1.4612 - classification_loss: 0.2844 228/500 [============>.................] - ETA: 1:08 - loss: 1.7471 - regression_loss: 1.4622 - classification_loss: 0.2849 229/500 [============>.................] - ETA: 1:07 - loss: 1.7470 - regression_loss: 1.4621 - classification_loss: 0.2849 230/500 [============>.................] - ETA: 1:07 - loss: 1.7511 - regression_loss: 1.4652 - classification_loss: 0.2859 231/500 [============>.................] - ETA: 1:07 - loss: 1.7486 - regression_loss: 1.4633 - classification_loss: 0.2853 232/500 [============>.................] - ETA: 1:07 - loss: 1.7479 - regression_loss: 1.4628 - classification_loss: 0.2851 233/500 [============>.................] - ETA: 1:06 - loss: 1.7490 - regression_loss: 1.4640 - classification_loss: 0.2850 234/500 [=============>................] - ETA: 1:06 - loss: 1.7502 - regression_loss: 1.4645 - classification_loss: 0.2857 235/500 [=============>................] - ETA: 1:06 - loss: 1.7490 - regression_loss: 1.4638 - classification_loss: 0.2852 236/500 [=============>................] - ETA: 1:06 - loss: 1.7486 - regression_loss: 1.4633 - classification_loss: 0.2853 237/500 [=============>................] - ETA: 1:05 - loss: 1.7532 - regression_loss: 1.4674 - classification_loss: 0.2858 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.7527 - regression_loss: 1.4670 - classification_loss: 0.2857 239/500 [=============>................] - ETA: 1:05 - loss: 1.7518 - regression_loss: 1.4664 - classification_loss: 0.2854 240/500 [=============>................] - ETA: 1:05 - loss: 1.7537 - regression_loss: 1.4677 - classification_loss: 0.2860 241/500 [=============>................] - ETA: 1:04 - loss: 1.7523 - regression_loss: 1.4662 - classification_loss: 0.2861 242/500 [=============>................] - ETA: 1:04 - loss: 1.7521 - regression_loss: 1.4658 - classification_loss: 0.2863 243/500 [=============>................] - ETA: 1:04 - loss: 1.7517 - regression_loss: 1.4655 - classification_loss: 0.2862 244/500 [=============>................] - ETA: 1:04 - loss: 1.7504 - regression_loss: 1.4644 - classification_loss: 0.2859 245/500 [=============>................] - ETA: 1:03 - loss: 1.7539 - regression_loss: 1.4675 - classification_loss: 0.2865 246/500 [=============>................] - ETA: 1:03 - loss: 1.7544 - regression_loss: 1.4679 - classification_loss: 0.2865 247/500 [=============>................] - ETA: 1:03 - loss: 1.7507 - regression_loss: 1.4648 - classification_loss: 0.2859 248/500 [=============>................] - ETA: 1:03 - loss: 1.7503 - regression_loss: 1.4589 - classification_loss: 0.2914 249/500 [=============>................] - ETA: 1:02 - loss: 1.7520 - regression_loss: 1.4600 - classification_loss: 0.2920 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7501 - regression_loss: 1.4585 - classification_loss: 0.2917 251/500 [==============>...............] - ETA: 1:02 - loss: 1.7481 - regression_loss: 1.4570 - classification_loss: 0.2911 252/500 [==============>...............] - ETA: 1:02 - loss: 1.7504 - regression_loss: 1.4589 - classification_loss: 0.2915 253/500 [==============>...............] - ETA: 1:01 - loss: 1.7542 - regression_loss: 1.4631 - classification_loss: 0.2911 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.7575 - regression_loss: 1.4655 - classification_loss: 0.2920 255/500 [==============>...............] - ETA: 1:01 - loss: 1.7576 - regression_loss: 1.4653 - classification_loss: 0.2922 256/500 [==============>...............] - ETA: 1:01 - loss: 1.7563 - regression_loss: 1.4645 - classification_loss: 0.2917 257/500 [==============>...............] - ETA: 1:00 - loss: 1.7522 - regression_loss: 1.4611 - classification_loss: 0.2910 258/500 [==============>...............] - ETA: 1:00 - loss: 1.7555 - regression_loss: 1.4634 - classification_loss: 0.2921 259/500 [==============>...............] - ETA: 1:00 - loss: 1.7576 - regression_loss: 1.4650 - classification_loss: 0.2926 260/500 [==============>...............] - ETA: 1:00 - loss: 1.7576 - regression_loss: 1.4648 - classification_loss: 0.2927 261/500 [==============>...............] - ETA: 59s - loss: 1.7586 - regression_loss: 1.4657 - classification_loss: 0.2929  262/500 [==============>...............] - ETA: 59s - loss: 1.7573 - regression_loss: 1.4649 - classification_loss: 0.2924 263/500 [==============>...............] - ETA: 59s - loss: 1.7585 - regression_loss: 1.4659 - classification_loss: 0.2926 264/500 [==============>...............] - ETA: 59s - loss: 1.7595 - regression_loss: 1.4668 - classification_loss: 0.2927 265/500 [==============>...............] - ETA: 58s - loss: 1.7597 - regression_loss: 1.4674 - classification_loss: 0.2924 266/500 [==============>...............] - ETA: 58s - loss: 1.7600 - regression_loss: 1.4675 - classification_loss: 0.2925 267/500 [===============>..............] - ETA: 58s - loss: 1.7592 - regression_loss: 1.4669 - classification_loss: 0.2923 268/500 [===============>..............] - ETA: 58s - loss: 1.7585 - regression_loss: 1.4666 - classification_loss: 0.2919 269/500 [===============>..............] - ETA: 57s - loss: 1.7577 - regression_loss: 1.4663 - classification_loss: 0.2914 270/500 [===============>..............] 
- ETA: 57s - loss: 1.7589 - regression_loss: 1.4671 - classification_loss: 0.2918 271/500 [===============>..............] - ETA: 57s - loss: 1.7616 - regression_loss: 1.4692 - classification_loss: 0.2924 272/500 [===============>..............] - ETA: 57s - loss: 1.7615 - regression_loss: 1.4690 - classification_loss: 0.2925 273/500 [===============>..............] - ETA: 56s - loss: 1.7632 - regression_loss: 1.4702 - classification_loss: 0.2930 274/500 [===============>..............] - ETA: 56s - loss: 1.7621 - regression_loss: 1.4695 - classification_loss: 0.2926 275/500 [===============>..............] - ETA: 56s - loss: 1.7649 - regression_loss: 1.4723 - classification_loss: 0.2926 276/500 [===============>..............] - ETA: 56s - loss: 1.7626 - regression_loss: 1.4704 - classification_loss: 0.2922 277/500 [===============>..............] - ETA: 55s - loss: 1.7603 - regression_loss: 1.4685 - classification_loss: 0.2918 278/500 [===============>..............] - ETA: 55s - loss: 1.7602 - regression_loss: 1.4685 - classification_loss: 0.2917 279/500 [===============>..............] - ETA: 55s - loss: 1.7601 - regression_loss: 1.4681 - classification_loss: 0.2920 280/500 [===============>..............] - ETA: 55s - loss: 1.7581 - regression_loss: 1.4666 - classification_loss: 0.2915 281/500 [===============>..............] - ETA: 54s - loss: 1.7588 - regression_loss: 1.4673 - classification_loss: 0.2915 282/500 [===============>..............] - ETA: 54s - loss: 1.7573 - regression_loss: 1.4661 - classification_loss: 0.2912 283/500 [===============>..............] - ETA: 54s - loss: 1.7571 - regression_loss: 1.4659 - classification_loss: 0.2912 284/500 [================>.............] - ETA: 54s - loss: 1.7570 - regression_loss: 1.4659 - classification_loss: 0.2911 285/500 [================>.............] - ETA: 53s - loss: 1.7579 - regression_loss: 1.4666 - classification_loss: 0.2913 286/500 [================>.............] 
- ETA: 53s - loss: 1.7581 - regression_loss: 1.4669 - classification_loss: 0.2912 287/500 [================>.............] - ETA: 53s - loss: 1.7561 - regression_loss: 1.4654 - classification_loss: 0.2907 288/500 [================>.............] - ETA: 53s - loss: 1.7549 - regression_loss: 1.4647 - classification_loss: 0.2902 289/500 [================>.............] - ETA: 52s - loss: 1.7544 - regression_loss: 1.4644 - classification_loss: 0.2900 290/500 [================>.............] - ETA: 52s - loss: 1.7536 - regression_loss: 1.4634 - classification_loss: 0.2901 291/500 [================>.............] - ETA: 52s - loss: 1.7513 - regression_loss: 1.4616 - classification_loss: 0.2897 292/500 [================>.............] - ETA: 52s - loss: 1.7500 - regression_loss: 1.4608 - classification_loss: 0.2892 293/500 [================>.............] - ETA: 51s - loss: 1.7518 - regression_loss: 1.4622 - classification_loss: 0.2896 294/500 [================>.............] - ETA: 51s - loss: 1.7520 - regression_loss: 1.4626 - classification_loss: 0.2895 295/500 [================>.............] - ETA: 51s - loss: 1.7506 - regression_loss: 1.4611 - classification_loss: 0.2895 296/500 [================>.............] - ETA: 51s - loss: 1.7554 - regression_loss: 1.4651 - classification_loss: 0.2903 297/500 [================>.............] - ETA: 50s - loss: 1.7551 - regression_loss: 1.4651 - classification_loss: 0.2900 298/500 [================>.............] - ETA: 50s - loss: 1.7553 - regression_loss: 1.4655 - classification_loss: 0.2898 299/500 [================>.............] - ETA: 50s - loss: 1.7541 - regression_loss: 1.4646 - classification_loss: 0.2895 300/500 [=================>............] - ETA: 50s - loss: 1.7547 - regression_loss: 1.4653 - classification_loss: 0.2894 301/500 [=================>............] - ETA: 49s - loss: 1.7553 - regression_loss: 1.4661 - classification_loss: 0.2892 302/500 [=================>............] 
- ETA: 49s - loss: 1.7615 - regression_loss: 1.4684 - classification_loss: 0.2931 303/500 [=================>............] - ETA: 49s - loss: 1.7620 - regression_loss: 1.4688 - classification_loss: 0.2932 304/500 [=================>............] - ETA: 49s - loss: 1.7619 - regression_loss: 1.4684 - classification_loss: 0.2934 305/500 [=================>............] - ETA: 48s - loss: 1.7621 - regression_loss: 1.4686 - classification_loss: 0.2935 306/500 [=================>............] - ETA: 48s - loss: 1.7587 - regression_loss: 1.4658 - classification_loss: 0.2929 307/500 [=================>............] - ETA: 48s - loss: 1.7581 - regression_loss: 1.4655 - classification_loss: 0.2926 308/500 [=================>............] - ETA: 48s - loss: 1.7585 - regression_loss: 1.4657 - classification_loss: 0.2928 309/500 [=================>............] - ETA: 47s - loss: 1.7556 - regression_loss: 1.4635 - classification_loss: 0.2921 310/500 [=================>............] - ETA: 47s - loss: 1.7566 - regression_loss: 1.4639 - classification_loss: 0.2927 311/500 [=================>............] - ETA: 47s - loss: 1.7569 - regression_loss: 1.4640 - classification_loss: 0.2928 312/500 [=================>............] - ETA: 47s - loss: 1.7594 - regression_loss: 1.4660 - classification_loss: 0.2934 313/500 [=================>............] - ETA: 46s - loss: 1.7600 - regression_loss: 1.4663 - classification_loss: 0.2937 314/500 [=================>............] - ETA: 46s - loss: 1.7610 - regression_loss: 1.4676 - classification_loss: 0.2933 315/500 [=================>............] - ETA: 46s - loss: 1.7608 - regression_loss: 1.4677 - classification_loss: 0.2931 316/500 [=================>............] - ETA: 46s - loss: 1.7613 - regression_loss: 1.4683 - classification_loss: 0.2930 317/500 [==================>...........] - ETA: 45s - loss: 1.7620 - regression_loss: 1.4691 - classification_loss: 0.2929 318/500 [==================>...........] 
- ETA: 45s - loss: 1.7607 - regression_loss: 1.4682 - classification_loss: 0.2925 319/500 [==================>...........] - ETA: 45s - loss: 1.7583 - regression_loss: 1.4663 - classification_loss: 0.2920 320/500 [==================>...........] - ETA: 45s - loss: 1.7567 - regression_loss: 1.4651 - classification_loss: 0.2916 321/500 [==================>...........] - ETA: 44s - loss: 1.7575 - regression_loss: 1.4657 - classification_loss: 0.2918 322/500 [==================>...........] - ETA: 44s - loss: 1.7580 - regression_loss: 1.4662 - classification_loss: 0.2917 323/500 [==================>...........] - ETA: 44s - loss: 1.7592 - regression_loss: 1.4672 - classification_loss: 0.2919 324/500 [==================>...........] - ETA: 44s - loss: 1.7599 - regression_loss: 1.4679 - classification_loss: 0.2920 325/500 [==================>...........] - ETA: 43s - loss: 1.7599 - regression_loss: 1.4681 - classification_loss: 0.2918 326/500 [==================>...........] - ETA: 43s - loss: 1.7591 - regression_loss: 1.4676 - classification_loss: 0.2915 327/500 [==================>...........] - ETA: 43s - loss: 1.7599 - regression_loss: 1.4684 - classification_loss: 0.2915 328/500 [==================>...........] - ETA: 43s - loss: 1.7577 - regression_loss: 1.4667 - classification_loss: 0.2910 329/500 [==================>...........] - ETA: 42s - loss: 1.7588 - regression_loss: 1.4678 - classification_loss: 0.2910 330/500 [==================>...........] - ETA: 42s - loss: 1.7609 - regression_loss: 1.4694 - classification_loss: 0.2915 331/500 [==================>...........] - ETA: 42s - loss: 1.7629 - regression_loss: 1.4711 - classification_loss: 0.2918 332/500 [==================>...........] - ETA: 42s - loss: 1.7614 - regression_loss: 1.4701 - classification_loss: 0.2913 333/500 [==================>...........] - ETA: 41s - loss: 1.7604 - regression_loss: 1.4695 - classification_loss: 0.2909 334/500 [===================>..........] 
- ETA: 41s - loss: 1.7608 - regression_loss: 1.4702 - classification_loss: 0.2906 335/500 [===================>..........] - ETA: 41s - loss: 1.7610 - regression_loss: 1.4698 - classification_loss: 0.2912 336/500 [===================>..........] - ETA: 41s - loss: 1.7606 - regression_loss: 1.4697 - classification_loss: 0.2910 337/500 [===================>..........] - ETA: 40s - loss: 1.7613 - regression_loss: 1.4702 - classification_loss: 0.2911 338/500 [===================>..........] - ETA: 40s - loss: 1.7582 - regression_loss: 1.4675 - classification_loss: 0.2907 339/500 [===================>..........] - ETA: 40s - loss: 1.7587 - regression_loss: 1.4682 - classification_loss: 0.2905 340/500 [===================>..........] - ETA: 40s - loss: 1.7565 - regression_loss: 1.4664 - classification_loss: 0.2901 341/500 [===================>..........] - ETA: 39s - loss: 1.7564 - regression_loss: 1.4664 - classification_loss: 0.2900 342/500 [===================>..........] - ETA: 39s - loss: 1.7528 - regression_loss: 1.4635 - classification_loss: 0.2893 343/500 [===================>..........] - ETA: 39s - loss: 1.7537 - regression_loss: 1.4644 - classification_loss: 0.2893 344/500 [===================>..........] - ETA: 39s - loss: 1.7540 - regression_loss: 1.4644 - classification_loss: 0.2895 345/500 [===================>..........] - ETA: 38s - loss: 1.7516 - regression_loss: 1.4625 - classification_loss: 0.2891 346/500 [===================>..........] - ETA: 38s - loss: 1.7513 - regression_loss: 1.4623 - classification_loss: 0.2890 347/500 [===================>..........] - ETA: 38s - loss: 1.7509 - regression_loss: 1.4619 - classification_loss: 0.2890 348/500 [===================>..........] - ETA: 38s - loss: 1.7482 - regression_loss: 1.4599 - classification_loss: 0.2884 349/500 [===================>..........] - ETA: 37s - loss: 1.7487 - regression_loss: 1.4603 - classification_loss: 0.2884 350/500 [====================>.........] 
- ETA: 37s - loss: 1.7489 - regression_loss: 1.4606 - classification_loss: 0.2884 351/500 [====================>.........] - ETA: 37s - loss: 1.7483 - regression_loss: 1.4602 - classification_loss: 0.2882 352/500 [====================>.........] - ETA: 37s - loss: 1.7475 - regression_loss: 1.4597 - classification_loss: 0.2879 353/500 [====================>.........] - ETA: 36s - loss: 1.7489 - regression_loss: 1.4606 - classification_loss: 0.2883 354/500 [====================>.........] - ETA: 36s - loss: 1.7475 - regression_loss: 1.4595 - classification_loss: 0.2880 355/500 [====================>.........] - ETA: 36s - loss: 1.7458 - regression_loss: 1.4582 - classification_loss: 0.2876 356/500 [====================>.........] - ETA: 36s - loss: 1.7479 - regression_loss: 1.4599 - classification_loss: 0.2880 357/500 [====================>.........] - ETA: 35s - loss: 1.7483 - regression_loss: 1.4606 - classification_loss: 0.2877 358/500 [====================>.........] - ETA: 35s - loss: 1.7474 - regression_loss: 1.4598 - classification_loss: 0.2877 359/500 [====================>.........] - ETA: 35s - loss: 1.7452 - regression_loss: 1.4580 - classification_loss: 0.2872 360/500 [====================>.........] - ETA: 35s - loss: 1.7430 - regression_loss: 1.4562 - classification_loss: 0.2868 361/500 [====================>.........] - ETA: 34s - loss: 1.7400 - regression_loss: 1.4538 - classification_loss: 0.2862 362/500 [====================>.........] - ETA: 34s - loss: 1.7395 - regression_loss: 1.4534 - classification_loss: 0.2860 363/500 [====================>.........] - ETA: 34s - loss: 1.7397 - regression_loss: 1.4537 - classification_loss: 0.2860 364/500 [====================>.........] - ETA: 34s - loss: 1.7402 - regression_loss: 1.4542 - classification_loss: 0.2860 365/500 [====================>.........] - ETA: 33s - loss: 1.7412 - regression_loss: 1.4548 - classification_loss: 0.2863 366/500 [====================>.........] 
- ETA: 33s - loss: 1.7404 - regression_loss: 1.4541 - classification_loss: 0.2863 367/500 [=====================>........] - ETA: 33s - loss: 1.7372 - regression_loss: 1.4515 - classification_loss: 0.2857 368/500 [=====================>........] - ETA: 33s - loss: 1.7355 - regression_loss: 1.4502 - classification_loss: 0.2853 369/500 [=====================>........] - ETA: 32s - loss: 1.7355 - regression_loss: 1.4501 - classification_loss: 0.2853 370/500 [=====================>........] - ETA: 32s - loss: 1.7360 - regression_loss: 1.4506 - classification_loss: 0.2854 371/500 [=====================>........] - ETA: 32s - loss: 1.7365 - regression_loss: 1.4508 - classification_loss: 0.2857 372/500 [=====================>........] - ETA: 32s - loss: 1.7361 - regression_loss: 1.4503 - classification_loss: 0.2858 373/500 [=====================>........] - ETA: 31s - loss: 1.7381 - regression_loss: 1.4518 - classification_loss: 0.2863 374/500 [=====================>........] - ETA: 31s - loss: 1.7374 - regression_loss: 1.4514 - classification_loss: 0.2861 375/500 [=====================>........] - ETA: 31s - loss: 1.7399 - regression_loss: 1.4531 - classification_loss: 0.2868 376/500 [=====================>........] - ETA: 31s - loss: 1.7392 - regression_loss: 1.4525 - classification_loss: 0.2867 377/500 [=====================>........] - ETA: 30s - loss: 1.7360 - regression_loss: 1.4498 - classification_loss: 0.2862 378/500 [=====================>........] - ETA: 30s - loss: 1.7344 - regression_loss: 1.4486 - classification_loss: 0.2857 379/500 [=====================>........] - ETA: 30s - loss: 1.7376 - regression_loss: 1.4513 - classification_loss: 0.2863 380/500 [=====================>........] - ETA: 30s - loss: 1.7382 - regression_loss: 1.4519 - classification_loss: 0.2864 381/500 [=====================>........] - ETA: 29s - loss: 1.7391 - regression_loss: 1.4525 - classification_loss: 0.2866 382/500 [=====================>........] 
- ETA: 29s - loss: 1.7388 - regression_loss: 1.4522 - classification_loss: 0.2866 383/500 [=====================>........] - ETA: 29s - loss: 1.7395 - regression_loss: 1.4525 - classification_loss: 0.2870 384/500 [======================>.......] - ETA: 29s - loss: 1.7396 - regression_loss: 1.4526 - classification_loss: 0.2870 385/500 [======================>.......] - ETA: 28s - loss: 1.7381 - regression_loss: 1.4516 - classification_loss: 0.2865 386/500 [======================>.......] - ETA: 28s - loss: 1.7369 - regression_loss: 1.4508 - classification_loss: 0.2862 387/500 [======================>.......] - ETA: 28s - loss: 1.7382 - regression_loss: 1.4519 - classification_loss: 0.2863 388/500 [======================>.......] - ETA: 28s - loss: 1.7366 - regression_loss: 1.4508 - classification_loss: 0.2858 389/500 [======================>.......] - ETA: 27s - loss: 1.7380 - regression_loss: 1.4518 - classification_loss: 0.2862 390/500 [======================>.......] - ETA: 27s - loss: 1.7389 - regression_loss: 1.4526 - classification_loss: 0.2863 391/500 [======================>.......] - ETA: 27s - loss: 1.7380 - regression_loss: 1.4520 - classification_loss: 0.2860 392/500 [======================>.......] - ETA: 27s - loss: 1.7382 - regression_loss: 1.4521 - classification_loss: 0.2861 393/500 [======================>.......] - ETA: 26s - loss: 1.7380 - regression_loss: 1.4521 - classification_loss: 0.2859 394/500 [======================>.......] - ETA: 26s - loss: 1.7388 - regression_loss: 1.4528 - classification_loss: 0.2860 395/500 [======================>.......] - ETA: 26s - loss: 1.7382 - regression_loss: 1.4525 - classification_loss: 0.2856 396/500 [======================>.......] - ETA: 26s - loss: 1.7381 - regression_loss: 1.4524 - classification_loss: 0.2857 397/500 [======================>.......] - ETA: 25s - loss: 1.7348 - regression_loss: 1.4496 - classification_loss: 0.2852 398/500 [======================>.......] 
- ETA: 25s - loss: 1.7346 - regression_loss: 1.4496 - classification_loss: 0.2850 399/500 [======================>.......] - ETA: 25s - loss: 1.7339 - regression_loss: 1.4490 - classification_loss: 0.2849 400/500 [=======================>......] - ETA: 25s - loss: 1.7361 - regression_loss: 1.4509 - classification_loss: 0.2851 401/500 [=======================>......] - ETA: 24s - loss: 1.7362 - regression_loss: 1.4510 - classification_loss: 0.2852 402/500 [=======================>......] - ETA: 24s - loss: 1.7350 - regression_loss: 1.4501 - classification_loss: 0.2850 403/500 [=======================>......] - ETA: 24s - loss: 1.7354 - regression_loss: 1.4504 - classification_loss: 0.2850 404/500 [=======================>......] - ETA: 24s - loss: 1.7348 - regression_loss: 1.4500 - classification_loss: 0.2848 405/500 [=======================>......] - ETA: 23s - loss: 1.7328 - regression_loss: 1.4483 - classification_loss: 0.2845 406/500 [=======================>......] - ETA: 23s - loss: 1.7325 - regression_loss: 1.4481 - classification_loss: 0.2844 407/500 [=======================>......] - ETA: 23s - loss: 1.7330 - regression_loss: 1.4485 - classification_loss: 0.2845 408/500 [=======================>......] - ETA: 23s - loss: 1.7322 - regression_loss: 1.4479 - classification_loss: 0.2843 409/500 [=======================>......] - ETA: 22s - loss: 1.7317 - regression_loss: 1.4473 - classification_loss: 0.2844 410/500 [=======================>......] - ETA: 22s - loss: 1.7350 - regression_loss: 1.4496 - classification_loss: 0.2854 411/500 [=======================>......] - ETA: 22s - loss: 1.7347 - regression_loss: 1.4494 - classification_loss: 0.2852 412/500 [=======================>......] - ETA: 22s - loss: 1.7348 - regression_loss: 1.4496 - classification_loss: 0.2853 413/500 [=======================>......] - ETA: 21s - loss: 1.7349 - regression_loss: 1.4495 - classification_loss: 0.2854 414/500 [=======================>......] 
- ETA: 21s - loss: 1.7333 - regression_loss: 1.4483 - classification_loss: 0.2850 415/500 [=======================>......] - ETA: 21s - loss: 1.7334 - regression_loss: 1.4486 - classification_loss: 0.2848 416/500 [=======================>......] - ETA: 21s - loss: 1.7330 - regression_loss: 1.4483 - classification_loss: 0.2847 417/500 [========================>.....] - ETA: 20s - loss: 1.7325 - regression_loss: 1.4479 - classification_loss: 0.2847 418/500 [========================>.....] - ETA: 20s - loss: 1.7315 - regression_loss: 1.4472 - classification_loss: 0.2843 419/500 [========================>.....] - ETA: 20s - loss: 1.7308 - regression_loss: 1.4466 - classification_loss: 0.2842 420/500 [========================>.....] - ETA: 20s - loss: 1.7300 - regression_loss: 1.4461 - classification_loss: 0.2839 421/500 [========================>.....] - ETA: 19s - loss: 1.7307 - regression_loss: 1.4467 - classification_loss: 0.2840 422/500 [========================>.....] - ETA: 19s - loss: 1.7304 - regression_loss: 1.4466 - classification_loss: 0.2838 423/500 [========================>.....] - ETA: 19s - loss: 1.7313 - regression_loss: 1.4472 - classification_loss: 0.2842 424/500 [========================>.....] - ETA: 19s - loss: 1.7318 - regression_loss: 1.4474 - classification_loss: 0.2844 425/500 [========================>.....] - ETA: 18s - loss: 1.7318 - regression_loss: 1.4476 - classification_loss: 0.2842 426/500 [========================>.....] - ETA: 18s - loss: 1.7316 - regression_loss: 1.4476 - classification_loss: 0.2840 427/500 [========================>.....] - ETA: 18s - loss: 1.7296 - regression_loss: 1.4458 - classification_loss: 0.2838 428/500 [========================>.....] - ETA: 18s - loss: 1.7304 - regression_loss: 1.4464 - classification_loss: 0.2840 429/500 [========================>.....] - ETA: 17s - loss: 1.7323 - regression_loss: 1.4482 - classification_loss: 0.2840 430/500 [========================>.....] 
- ETA: 17s - loss: 1.7308 - regression_loss: 1.4468 - classification_loss: 0.2839 431/500 [========================>.....] - ETA: 17s - loss: 1.7317 - regression_loss: 1.4470 - classification_loss: 0.2846 432/500 [========================>.....] - ETA: 17s - loss: 1.7321 - regression_loss: 1.4472 - classification_loss: 0.2849 433/500 [========================>.....] - ETA: 16s - loss: 1.7314 - regression_loss: 1.4470 - classification_loss: 0.2845 434/500 [=========================>....] - ETA: 16s - loss: 1.7322 - regression_loss: 1.4473 - classification_loss: 0.2848 435/500 [=========================>....] - ETA: 16s - loss: 1.7323 - regression_loss: 1.4474 - classification_loss: 0.2849 436/500 [=========================>....] - ETA: 16s - loss: 1.7315 - regression_loss: 1.4468 - classification_loss: 0.2848 437/500 [=========================>....] - ETA: 15s - loss: 1.7303 - regression_loss: 1.4459 - classification_loss: 0.2844 438/500 [=========================>....] - ETA: 15s - loss: 1.7296 - regression_loss: 1.4454 - classification_loss: 0.2842 439/500 [=========================>....] - ETA: 15s - loss: 1.7314 - regression_loss: 1.4469 - classification_loss: 0.2845 440/500 [=========================>....] - ETA: 15s - loss: 1.7309 - regression_loss: 1.4466 - classification_loss: 0.2843 441/500 [=========================>....] - ETA: 14s - loss: 1.7318 - regression_loss: 1.4472 - classification_loss: 0.2845 442/500 [=========================>....] - ETA: 14s - loss: 1.7324 - regression_loss: 1.4477 - classification_loss: 0.2847 443/500 [=========================>....] - ETA: 14s - loss: 1.7349 - regression_loss: 1.4495 - classification_loss: 0.2854 444/500 [=========================>....] - ETA: 14s - loss: 1.7349 - regression_loss: 1.4495 - classification_loss: 0.2855 445/500 [=========================>....] - ETA: 13s - loss: 1.7359 - regression_loss: 1.4503 - classification_loss: 0.2856 446/500 [=========================>....] 
- ETA: 13s - loss: 1.7349 - regression_loss: 1.4496 - classification_loss: 0.2853 447/500 [=========================>....] - ETA: 13s - loss: 1.7353 - regression_loss: 1.4498 - classification_loss: 0.2855 448/500 [=========================>....] - ETA: 13s - loss: 1.7365 - regression_loss: 1.4506 - classification_loss: 0.2859 449/500 [=========================>....] - ETA: 12s - loss: 1.7362 - regression_loss: 1.4504 - classification_loss: 0.2858 450/500 [==========================>...] - ETA: 12s - loss: 1.7365 - regression_loss: 1.4507 - classification_loss: 0.2858 451/500 [==========================>...] - ETA: 12s - loss: 1.7382 - regression_loss: 1.4523 - classification_loss: 0.2859 452/500 [==========================>...] - ETA: 12s - loss: 1.7359 - regression_loss: 1.4503 - classification_loss: 0.2855 453/500 [==========================>...] - ETA: 11s - loss: 1.7354 - regression_loss: 1.4499 - classification_loss: 0.2855 454/500 [==========================>...] - ETA: 11s - loss: 1.7355 - regression_loss: 1.4499 - classification_loss: 0.2857 455/500 [==========================>...] - ETA: 11s - loss: 1.7366 - regression_loss: 1.4507 - classification_loss: 0.2859 456/500 [==========================>...] - ETA: 11s - loss: 1.7359 - regression_loss: 1.4500 - classification_loss: 0.2859 457/500 [==========================>...] - ETA: 10s - loss: 1.7366 - regression_loss: 1.4508 - classification_loss: 0.2859 458/500 [==========================>...] - ETA: 10s - loss: 1.7371 - regression_loss: 1.4511 - classification_loss: 0.2860 459/500 [==========================>...] - ETA: 10s - loss: 1.7357 - regression_loss: 1.4500 - classification_loss: 0.2857 460/500 [==========================>...] - ETA: 10s - loss: 1.7365 - regression_loss: 1.4508 - classification_loss: 0.2858 461/500 [==========================>...] - ETA: 9s - loss: 1.7360 - regression_loss: 1.4502 - classification_loss: 0.2858  462/500 [==========================>...] 
- ETA: 9s - loss: 1.7352 - regression_loss: 1.4496 - classification_loss: 0.2856 463/500 [==========================>...] - ETA: 9s - loss: 1.7360 - regression_loss: 1.4502 - classification_loss: 0.2858 464/500 [==========================>...] - ETA: 9s - loss: 1.7350 - regression_loss: 1.4494 - classification_loss: 0.2856 465/500 [==========================>...] - ETA: 8s - loss: 1.7340 - regression_loss: 1.4485 - classification_loss: 0.2854 466/500 [==========================>...] - ETA: 8s - loss: 1.7337 - regression_loss: 1.4484 - classification_loss: 0.2852 467/500 [===========================>..] - ETA: 8s - loss: 1.7315 - regression_loss: 1.4467 - classification_loss: 0.2848 468/500 [===========================>..] - ETA: 8s - loss: 1.7311 - regression_loss: 1.4464 - classification_loss: 0.2847 469/500 [===========================>..] - ETA: 7s - loss: 1.7313 - regression_loss: 1.4464 - classification_loss: 0.2849 470/500 [===========================>..] - ETA: 7s - loss: 1.7304 - regression_loss: 1.4458 - classification_loss: 0.2846 471/500 [===========================>..] - ETA: 7s - loss: 1.7294 - regression_loss: 1.4451 - classification_loss: 0.2843 472/500 [===========================>..] - ETA: 7s - loss: 1.7287 - regression_loss: 1.4446 - classification_loss: 0.2841 473/500 [===========================>..] - ETA: 6s - loss: 1.7282 - regression_loss: 1.4444 - classification_loss: 0.2839 474/500 [===========================>..] - ETA: 6s - loss: 1.7307 - regression_loss: 1.4460 - classification_loss: 0.2847 475/500 [===========================>..] - ETA: 6s - loss: 1.7297 - regression_loss: 1.4450 - classification_loss: 0.2847 476/500 [===========================>..] - ETA: 6s - loss: 1.7276 - regression_loss: 1.4420 - classification_loss: 0.2856 477/500 [===========================>..] - ETA: 5s - loss: 1.7280 - regression_loss: 1.4425 - classification_loss: 0.2855 478/500 [===========================>..] 
- ETA: 5s - loss: 1.7276 - regression_loss: 1.4419 - classification_loss: 0.2857 479/500 [===========================>..] - ETA: 5s - loss: 1.7276 - regression_loss: 1.4420 - classification_loss: 0.2856 480/500 [===========================>..] - ETA: 5s - loss: 1.7278 - regression_loss: 1.4421 - classification_loss: 0.2856 481/500 [===========================>..] - ETA: 4s - loss: 1.7258 - regression_loss: 1.4407 - classification_loss: 0.2852 482/500 [===========================>..] - ETA: 4s - loss: 1.7256 - regression_loss: 1.4405 - classification_loss: 0.2852 483/500 [===========================>..] - ETA: 4s - loss: 1.7242 - regression_loss: 1.4393 - classification_loss: 0.2849 484/500 [============================>.] - ETA: 4s - loss: 1.7236 - regression_loss: 1.4389 - classification_loss: 0.2847 485/500 [============================>.] - ETA: 3s - loss: 1.7233 - regression_loss: 1.4388 - classification_loss: 0.2845 486/500 [============================>.] - ETA: 3s - loss: 1.7238 - regression_loss: 1.4393 - classification_loss: 0.2845 487/500 [============================>.] - ETA: 3s - loss: 1.7230 - regression_loss: 1.4386 - classification_loss: 0.2843 488/500 [============================>.] - ETA: 3s - loss: 1.7236 - regression_loss: 1.4391 - classification_loss: 0.2845 489/500 [============================>.] - ETA: 2s - loss: 1.7234 - regression_loss: 1.4391 - classification_loss: 0.2844 490/500 [============================>.] - ETA: 2s - loss: 1.7232 - regression_loss: 1.4389 - classification_loss: 0.2843 491/500 [============================>.] - ETA: 2s - loss: 1.7220 - regression_loss: 1.4381 - classification_loss: 0.2839 492/500 [============================>.] - ETA: 2s - loss: 1.7216 - regression_loss: 1.4377 - classification_loss: 0.2838 493/500 [============================>.] - ETA: 1s - loss: 1.7214 - regression_loss: 1.4376 - classification_loss: 0.2837 494/500 [============================>.] 
- ETA: 1s - loss: 1.7208 - regression_loss: 1.4373 - classification_loss: 0.2835 495/500 [============================>.] - ETA: 1s - loss: 1.7198 - regression_loss: 1.4363 - classification_loss: 0.2835 496/500 [============================>.] - ETA: 1s - loss: 1.7194 - regression_loss: 1.4357 - classification_loss: 0.2837 497/500 [============================>.] - ETA: 0s - loss: 1.7216 - regression_loss: 1.4366 - classification_loss: 0.2850 498/500 [============================>.] - ETA: 0s - loss: 1.7218 - regression_loss: 1.4367 - classification_loss: 0.2852 499/500 [============================>.] - ETA: 0s - loss: 1.7228 - regression_loss: 1.4376 - classification_loss: 0.2852 500/500 [==============================] - 125s 251ms/step - loss: 1.7227 - regression_loss: 1.4377 - classification_loss: 0.2851 326 instances of class plum with average precision: 0.7694 mAP: 0.7694 Epoch 00046: saving model to ./training/snapshots/resnet50_pascal_46.h5 Epoch 47/150 1/500 [..............................] - ETA: 2:02 - loss: 2.8177 - regression_loss: 2.4202 - classification_loss: 0.3975 2/500 [..............................] - ETA: 2:03 - loss: 2.3714 - regression_loss: 1.9899 - classification_loss: 0.3815 3/500 [..............................] - ETA: 2:04 - loss: 2.0930 - regression_loss: 1.7841 - classification_loss: 0.3089 4/500 [..............................] - ETA: 2:04 - loss: 1.9857 - regression_loss: 1.6950 - classification_loss: 0.2908 5/500 [..............................] - ETA: 2:04 - loss: 1.8974 - regression_loss: 1.6224 - classification_loss: 0.2750 6/500 [..............................] - ETA: 2:04 - loss: 1.8314 - regression_loss: 1.5788 - classification_loss: 0.2526 7/500 [..............................] - ETA: 2:03 - loss: 1.8266 - regression_loss: 1.5643 - classification_loss: 0.2623 8/500 [..............................] - ETA: 2:01 - loss: 1.8896 - regression_loss: 1.6166 - classification_loss: 0.2729 9/500 [..............................] 
- ETA: 2:01 - loss: 1.8404 - regression_loss: 1.5784 - classification_loss: 0.2619 10/500 [..............................] - ETA: 2:01 - loss: 1.7976 - regression_loss: 1.5419 - classification_loss: 0.2556 11/500 [..............................] - ETA: 2:01 - loss: 1.7827 - regression_loss: 1.5274 - classification_loss: 0.2553 12/500 [..............................] - ETA: 2:01 - loss: 1.8443 - regression_loss: 1.5860 - classification_loss: 0.2583 13/500 [..............................] - ETA: 2:01 - loss: 1.9025 - regression_loss: 1.6309 - classification_loss: 0.2715 14/500 [..............................] - ETA: 2:00 - loss: 1.8808 - regression_loss: 1.6111 - classification_loss: 0.2697 15/500 [..............................] - ETA: 2:00 - loss: 1.9098 - regression_loss: 1.6345 - classification_loss: 0.2752 16/500 [..............................] - ETA: 2:00 - loss: 1.9127 - regression_loss: 1.6326 - classification_loss: 0.2801 17/500 [>.............................] - ETA: 1:59 - loss: 1.8820 - regression_loss: 1.6084 - classification_loss: 0.2736 18/500 [>.............................] - ETA: 1:59 - loss: 1.8039 - regression_loss: 1.5395 - classification_loss: 0.2644 19/500 [>.............................] - ETA: 1:59 - loss: 1.8188 - regression_loss: 1.5470 - classification_loss: 0.2718 20/500 [>.............................] - ETA: 1:59 - loss: 1.7850 - regression_loss: 1.5217 - classification_loss: 0.2633 21/500 [>.............................] - ETA: 1:59 - loss: 1.7826 - regression_loss: 1.5208 - classification_loss: 0.2618 22/500 [>.............................] - ETA: 1:59 - loss: 1.7746 - regression_loss: 1.5154 - classification_loss: 0.2592 23/500 [>.............................] - ETA: 1:59 - loss: 1.7410 - regression_loss: 1.4784 - classification_loss: 0.2627 24/500 [>.............................] - ETA: 1:58 - loss: 1.7494 - regression_loss: 1.4794 - classification_loss: 0.2701 25/500 [>.............................] 
- ETA: 1:58 - loss: 1.7391 - regression_loss: 1.4700 - classification_loss: 0.2690 26/500 [>.............................] - ETA: 1:58 - loss: 1.7185 - regression_loss: 1.4507 - classification_loss: 0.2678 27/500 [>.............................] - ETA: 1:58 - loss: 1.7013 - regression_loss: 1.4370 - classification_loss: 0.2643 28/500 [>.............................] - ETA: 1:57 - loss: 1.7048 - regression_loss: 1.4354 - classification_loss: 0.2693 29/500 [>.............................] - ETA: 1:57 - loss: 1.7130 - regression_loss: 1.4419 - classification_loss: 0.2711 30/500 [>.............................] - ETA: 1:57 - loss: 1.7236 - regression_loss: 1.4520 - classification_loss: 0.2716 31/500 [>.............................] - ETA: 1:56 - loss: 1.7779 - regression_loss: 1.4922 - classification_loss: 0.2856 32/500 [>.............................] - ETA: 1:56 - loss: 1.7801 - regression_loss: 1.4954 - classification_loss: 0.2847 33/500 [>.............................] - ETA: 1:56 - loss: 1.7767 - regression_loss: 1.4922 - classification_loss: 0.2845 34/500 [=>............................] - ETA: 1:56 - loss: 1.7633 - regression_loss: 1.4812 - classification_loss: 0.2821 35/500 [=>............................] - ETA: 1:55 - loss: 1.7751 - regression_loss: 1.4910 - classification_loss: 0.2840 36/500 [=>............................] - ETA: 1:55 - loss: 1.7876 - regression_loss: 1.5009 - classification_loss: 0.2867 37/500 [=>............................] - ETA: 1:55 - loss: 1.7983 - regression_loss: 1.5091 - classification_loss: 0.2892 38/500 [=>............................] - ETA: 1:55 - loss: 1.7984 - regression_loss: 1.5097 - classification_loss: 0.2887 39/500 [=>............................] - ETA: 1:54 - loss: 1.7932 - regression_loss: 1.5077 - classification_loss: 0.2855 40/500 [=>............................] - ETA: 1:54 - loss: 1.7960 - regression_loss: 1.5120 - classification_loss: 0.2840 41/500 [=>............................] 
- ETA: 1:54 - loss: 1.7970 - regression_loss: 1.5128 - classification_loss: 0.2842 42/500 [=>............................] - ETA: 1:54 - loss: 1.8220 - regression_loss: 1.5328 - classification_loss: 0.2892 43/500 [=>............................] - ETA: 1:53 - loss: 1.8103 - regression_loss: 1.5247 - classification_loss: 0.2857 44/500 [=>............................] - ETA: 1:53 - loss: 1.8243 - regression_loss: 1.5348 - classification_loss: 0.2895 45/500 [=>............................] - ETA: 1:53 - loss: 1.8120 - regression_loss: 1.5266 - classification_loss: 0.2854 46/500 [=>............................] - ETA: 1:52 - loss: 1.8008 - regression_loss: 1.5176 - classification_loss: 0.2832 47/500 [=>............................] - ETA: 1:52 - loss: 1.8019 - regression_loss: 1.5159 - classification_loss: 0.2859 48/500 [=>............................] - ETA: 1:52 - loss: 1.8174 - regression_loss: 1.5268 - classification_loss: 0.2907 49/500 [=>............................] - ETA: 1:52 - loss: 1.8182 - regression_loss: 1.5284 - classification_loss: 0.2898 50/500 [==>...........................] - ETA: 1:52 - loss: 1.8332 - regression_loss: 1.5434 - classification_loss: 0.2898 51/500 [==>...........................] - ETA: 1:51 - loss: 1.8196 - regression_loss: 1.5318 - classification_loss: 0.2878 52/500 [==>...........................] - ETA: 1:51 - loss: 1.8169 - regression_loss: 1.5308 - classification_loss: 0.2861 53/500 [==>...........................] - ETA: 1:51 - loss: 1.7958 - regression_loss: 1.5115 - classification_loss: 0.2843 54/500 [==>...........................] - ETA: 1:51 - loss: 1.7909 - regression_loss: 1.5078 - classification_loss: 0.2831 55/500 [==>...........................] - ETA: 1:50 - loss: 1.7933 - regression_loss: 1.5095 - classification_loss: 0.2838 56/500 [==>...........................] - ETA: 1:50 - loss: 1.8251 - regression_loss: 1.5325 - classification_loss: 0.2926 57/500 [==>...........................] 
- ETA: 1:49 - loss: 1.8139 - regression_loss: 1.5224 - classification_loss: 0.2915 58/500 [==>...........................] - ETA: 1:49 - loss: 1.8203 - regression_loss: 1.5285 - classification_loss: 0.2917 59/500 [==>...........................] - ETA: 1:49 - loss: 1.8177 - regression_loss: 1.5257 - classification_loss: 0.2920 60/500 [==>...........................] - ETA: 1:48 - loss: 1.8194 - regression_loss: 1.5233 - classification_loss: 0.2961 61/500 [==>...........................] - ETA: 1:48 - loss: 1.8228 - regression_loss: 1.5251 - classification_loss: 0.2976 62/500 [==>...........................] - ETA: 1:48 - loss: 1.8166 - regression_loss: 1.5223 - classification_loss: 0.2944 63/500 [==>...........................] - ETA: 1:48 - loss: 1.8133 - regression_loss: 1.5201 - classification_loss: 0.2932 64/500 [==>...........................] - ETA: 1:47 - loss: 1.8148 - regression_loss: 1.5230 - classification_loss: 0.2918 65/500 [==>...........................] - ETA: 1:47 - loss: 1.8184 - regression_loss: 1.5257 - classification_loss: 0.2927 66/500 [==>...........................] - ETA: 1:47 - loss: 1.8199 - regression_loss: 1.5267 - classification_loss: 0.2932 67/500 [===>..........................] - ETA: 1:47 - loss: 1.8212 - regression_loss: 1.5278 - classification_loss: 0.2934 68/500 [===>..........................] - ETA: 1:46 - loss: 1.8160 - regression_loss: 1.5237 - classification_loss: 0.2923 69/500 [===>..........................] - ETA: 1:46 - loss: 1.8135 - regression_loss: 1.5223 - classification_loss: 0.2912 70/500 [===>..........................] - ETA: 1:46 - loss: 1.8075 - regression_loss: 1.5170 - classification_loss: 0.2905 71/500 [===>..........................] - ETA: 1:46 - loss: 1.8096 - regression_loss: 1.5202 - classification_loss: 0.2893 72/500 [===>..........................] - ETA: 1:45 - loss: 1.8120 - regression_loss: 1.5207 - classification_loss: 0.2913 73/500 [===>..........................] 
- ETA: 1:45 - loss: 1.7983 - regression_loss: 1.5100 - classification_loss: 0.2882 74/500 [===>..........................] - ETA: 1:45 - loss: 1.7986 - regression_loss: 1.5096 - classification_loss: 0.2890 75/500 [===>..........................] - ETA: 1:45 - loss: 1.8042 - regression_loss: 1.5132 - classification_loss: 0.2910 76/500 [===>..........................] - ETA: 1:44 - loss: 1.7912 - regression_loss: 1.5030 - classification_loss: 0.2882 77/500 [===>..........................] - ETA: 1:44 - loss: 1.7884 - regression_loss: 1.5012 - classification_loss: 0.2872 78/500 [===>..........................] - ETA: 1:44 - loss: 1.7862 - regression_loss: 1.4995 - classification_loss: 0.2867 79/500 [===>..........................] - ETA: 1:44 - loss: 1.7817 - regression_loss: 1.4954 - classification_loss: 0.2863 80/500 [===>..........................] - ETA: 1:43 - loss: 1.7856 - regression_loss: 1.4979 - classification_loss: 0.2877 81/500 [===>..........................] - ETA: 1:43 - loss: 1.7788 - regression_loss: 1.4918 - classification_loss: 0.2869 82/500 [===>..........................] - ETA: 1:43 - loss: 1.7762 - regression_loss: 1.4893 - classification_loss: 0.2868 83/500 [===>..........................] - ETA: 1:43 - loss: 1.7781 - regression_loss: 1.4907 - classification_loss: 0.2875 84/500 [====>.........................] - ETA: 1:43 - loss: 1.7694 - regression_loss: 1.4823 - classification_loss: 0.2871 85/500 [====>.........................] - ETA: 1:42 - loss: 1.7740 - regression_loss: 1.4860 - classification_loss: 0.2880 86/500 [====>.........................] - ETA: 1:42 - loss: 1.7718 - regression_loss: 1.4838 - classification_loss: 0.2880 87/500 [====>.........................] - ETA: 1:42 - loss: 1.7758 - regression_loss: 1.4862 - classification_loss: 0.2896 88/500 [====>.........................] - ETA: 1:42 - loss: 1.7813 - regression_loss: 1.4899 - classification_loss: 0.2913 89/500 [====>.........................] 
- ETA: 1:41 - loss: 1.7795 - regression_loss: 1.4877 - classification_loss: 0.2918 90/500 [====>.........................] - ETA: 1:41 - loss: 1.7767 - regression_loss: 1.4848 - classification_loss: 0.2919 91/500 [====>.........................] - ETA: 1:41 - loss: 1.7706 - regression_loss: 1.4801 - classification_loss: 0.2905 92/500 [====>.........................] - ETA: 1:41 - loss: 1.7801 - regression_loss: 1.4872 - classification_loss: 0.2929 93/500 [====>.........................] - ETA: 1:40 - loss: 1.7692 - regression_loss: 1.4786 - classification_loss: 0.2906 94/500 [====>.........................] - ETA: 1:40 - loss: 1.7666 - regression_loss: 1.4773 - classification_loss: 0.2893 95/500 [====>.........................] - ETA: 1:40 - loss: 1.7691 - regression_loss: 1.4794 - classification_loss: 0.2898 96/500 [====>.........................] - ETA: 1:40 - loss: 1.7710 - regression_loss: 1.4818 - classification_loss: 0.2892 97/500 [====>.........................] - ETA: 1:40 - loss: 1.7732 - regression_loss: 1.4820 - classification_loss: 0.2911 98/500 [====>.........................] - ETA: 1:39 - loss: 1.7706 - regression_loss: 1.4806 - classification_loss: 0.2901 99/500 [====>.........................] - ETA: 1:39 - loss: 1.7672 - regression_loss: 1.4770 - classification_loss: 0.2902 100/500 [=====>........................] - ETA: 1:39 - loss: 1.7563 - regression_loss: 1.4679 - classification_loss: 0.2884 101/500 [=====>........................] - ETA: 1:39 - loss: 1.7545 - regression_loss: 1.4668 - classification_loss: 0.2877 102/500 [=====>........................] - ETA: 1:38 - loss: 1.7534 - regression_loss: 1.4660 - classification_loss: 0.2874 103/500 [=====>........................] - ETA: 1:38 - loss: 1.7515 - regression_loss: 1.4651 - classification_loss: 0.2864 104/500 [=====>........................] - ETA: 1:38 - loss: 1.7586 - regression_loss: 1.4716 - classification_loss: 0.2870 105/500 [=====>........................] 
- ETA: 1:38 - loss: 1.7460 - regression_loss: 1.4614 - classification_loss: 0.2846 106/500 [=====>........................] - ETA: 1:37 - loss: 1.7349 - regression_loss: 1.4524 - classification_loss: 0.2825 107/500 [=====>........................] - ETA: 1:37 - loss: 1.7417 - regression_loss: 1.4563 - classification_loss: 0.2854 108/500 [=====>........................] - ETA: 1:37 - loss: 1.7375 - regression_loss: 1.4523 - classification_loss: 0.2852 109/500 [=====>........................] - ETA: 1:37 - loss: 1.7349 - regression_loss: 1.4501 - classification_loss: 0.2848 110/500 [=====>........................] - ETA: 1:36 - loss: 1.7331 - regression_loss: 1.4485 - classification_loss: 0.2847 111/500 [=====>........................] - ETA: 1:36 - loss: 1.7298 - regression_loss: 1.4466 - classification_loss: 0.2832 112/500 [=====>........................] - ETA: 1:36 - loss: 1.7238 - regression_loss: 1.4419 - classification_loss: 0.2820 113/500 [=====>........................] - ETA: 1:36 - loss: 1.7270 - regression_loss: 1.4444 - classification_loss: 0.2825 114/500 [=====>........................] - ETA: 1:35 - loss: 1.7219 - regression_loss: 1.4404 - classification_loss: 0.2815 115/500 [=====>........................] - ETA: 1:35 - loss: 1.7128 - regression_loss: 1.4331 - classification_loss: 0.2797 116/500 [=====>........................] - ETA: 1:35 - loss: 1.7099 - regression_loss: 1.4311 - classification_loss: 0.2788 117/500 [======>.......................] - ETA: 1:35 - loss: 1.7093 - regression_loss: 1.4306 - classification_loss: 0.2788 118/500 [======>.......................] - ETA: 1:34 - loss: 1.7091 - regression_loss: 1.4306 - classification_loss: 0.2785 119/500 [======>.......................] - ETA: 1:34 - loss: 1.7060 - regression_loss: 1.4284 - classification_loss: 0.2777 120/500 [======>.......................] - ETA: 1:34 - loss: 1.7074 - regression_loss: 1.4285 - classification_loss: 0.2789 121/500 [======>.......................] 
- ETA: 1:34 - loss: 1.7017 - regression_loss: 1.4241 - classification_loss: 0.2775 122/500 [======>.......................] - ETA: 1:33 - loss: 1.7032 - regression_loss: 1.4252 - classification_loss: 0.2780 123/500 [======>.......................] - ETA: 1:33 - loss: 1.7056 - regression_loss: 1.4286 - classification_loss: 0.2769 124/500 [======>.......................] - ETA: 1:33 - loss: 1.7013 - regression_loss: 1.4254 - classification_loss: 0.2759 125/500 [======>.......................] - ETA: 1:33 - loss: 1.7020 - regression_loss: 1.4260 - classification_loss: 0.2759 126/500 [======>.......................] - ETA: 1:32 - loss: 1.7018 - regression_loss: 1.4257 - classification_loss: 0.2761 127/500 [======>.......................] - ETA: 1:32 - loss: 1.7014 - regression_loss: 1.4248 - classification_loss: 0.2766 128/500 [======>.......................] - ETA: 1:32 - loss: 1.7009 - regression_loss: 1.4243 - classification_loss: 0.2766 129/500 [======>.......................] - ETA: 1:32 - loss: 1.7100 - regression_loss: 1.4321 - classification_loss: 0.2779 130/500 [======>.......................] - ETA: 1:32 - loss: 1.7140 - regression_loss: 1.4358 - classification_loss: 0.2782 131/500 [======>.......................] - ETA: 1:31 - loss: 1.7190 - regression_loss: 1.4395 - classification_loss: 0.2794 132/500 [======>.......................] - ETA: 1:31 - loss: 1.7120 - regression_loss: 1.4336 - classification_loss: 0.2784 133/500 [======>.......................] - ETA: 1:31 - loss: 1.7089 - regression_loss: 1.4314 - classification_loss: 0.2775 134/500 [=======>......................] - ETA: 1:31 - loss: 1.7116 - regression_loss: 1.4336 - classification_loss: 0.2780 135/500 [=======>......................] - ETA: 1:30 - loss: 1.7206 - regression_loss: 1.4408 - classification_loss: 0.2798 136/500 [=======>......................] - ETA: 1:30 - loss: 1.7222 - regression_loss: 1.4417 - classification_loss: 0.2806 137/500 [=======>......................] 
- ETA: 1:30 - loss: 1.7189 - regression_loss: 1.4395 - classification_loss: 0.2795 138/500 [=======>......................] - ETA: 1:30 - loss: 1.7202 - regression_loss: 1.4408 - classification_loss: 0.2795 139/500 [=======>......................] - ETA: 1:29 - loss: 1.7192 - regression_loss: 1.4402 - classification_loss: 0.2791 140/500 [=======>......................] - ETA: 1:29 - loss: 1.7181 - regression_loss: 1.4392 - classification_loss: 0.2790 141/500 [=======>......................] - ETA: 1:29 - loss: 1.7205 - regression_loss: 1.4411 - classification_loss: 0.2794 142/500 [=======>......................] - ETA: 1:29 - loss: 1.7162 - regression_loss: 1.4381 - classification_loss: 0.2782 143/500 [=======>......................] - ETA: 1:28 - loss: 1.7184 - regression_loss: 1.4400 - classification_loss: 0.2784 144/500 [=======>......................] - ETA: 1:28 - loss: 1.7162 - regression_loss: 1.4385 - classification_loss: 0.2778 145/500 [=======>......................] - ETA: 1:28 - loss: 1.7141 - regression_loss: 1.4369 - classification_loss: 0.2772 146/500 [=======>......................] - ETA: 1:28 - loss: 1.7131 - regression_loss: 1.4364 - classification_loss: 0.2767 147/500 [=======>......................] - ETA: 1:27 - loss: 1.7164 - regression_loss: 1.4392 - classification_loss: 0.2772 148/500 [=======>......................] - ETA: 1:27 - loss: 1.7174 - regression_loss: 1.4390 - classification_loss: 0.2784 149/500 [=======>......................] - ETA: 1:27 - loss: 1.7118 - regression_loss: 1.4341 - classification_loss: 0.2776 150/500 [========>.....................] - ETA: 1:27 - loss: 1.7127 - regression_loss: 1.4347 - classification_loss: 0.2780 151/500 [========>.....................] - ETA: 1:26 - loss: 1.7141 - regression_loss: 1.4353 - classification_loss: 0.2788 152/500 [========>.....................] - ETA: 1:26 - loss: 1.7129 - regression_loss: 1.4348 - classification_loss: 0.2781 153/500 [========>.....................] 
- ETA: 1:26 - loss: 1.7163 - regression_loss: 1.4378 - classification_loss: 0.2785 154/500 [========>.....................] - ETA: 1:26 - loss: 1.7138 - regression_loss: 1.4360 - classification_loss: 0.2778 155/500 [========>.....................] - ETA: 1:25 - loss: 1.7124 - regression_loss: 1.4349 - classification_loss: 0.2775 156/500 [========>.....................] - ETA: 1:25 - loss: 1.7110 - regression_loss: 1.4337 - classification_loss: 0.2773 157/500 [========>.....................] - ETA: 1:25 - loss: 1.7148 - regression_loss: 1.4368 - classification_loss: 0.2779 158/500 [========>.....................] - ETA: 1:25 - loss: 1.7181 - regression_loss: 1.4392 - classification_loss: 0.2789 159/500 [========>.....................] - ETA: 1:25 - loss: 1.7173 - regression_loss: 1.4384 - classification_loss: 0.2789 160/500 [========>.....................] - ETA: 1:24 - loss: 1.7174 - regression_loss: 1.4383 - classification_loss: 0.2791 161/500 [========>.....................] - ETA: 1:24 - loss: 1.7144 - regression_loss: 1.4358 - classification_loss: 0.2787 162/500 [========>.....................] - ETA: 1:24 - loss: 1.7132 - regression_loss: 1.4353 - classification_loss: 0.2778 163/500 [========>.....................] - ETA: 1:24 - loss: 1.7160 - regression_loss: 1.4380 - classification_loss: 0.2780 164/500 [========>.....................] - ETA: 1:23 - loss: 1.7173 - regression_loss: 1.4399 - classification_loss: 0.2774 165/500 [========>.....................] - ETA: 1:23 - loss: 1.7139 - regression_loss: 1.4372 - classification_loss: 0.2767 166/500 [========>.....................] - ETA: 1:23 - loss: 1.7154 - regression_loss: 1.4383 - classification_loss: 0.2771 167/500 [=========>....................] - ETA: 1:23 - loss: 1.7163 - regression_loss: 1.4391 - classification_loss: 0.2772 168/500 [=========>....................] - ETA: 1:22 - loss: 1.7132 - regression_loss: 1.4368 - classification_loss: 0.2763 169/500 [=========>....................] 
- ETA: 1:22 - loss: 1.7142 - regression_loss: 1.4377 - classification_loss: 0.2766 170/500 [=========>....................] - ETA: 1:22 - loss: 1.7180 - regression_loss: 1.4408 - classification_loss: 0.2772 171/500 [=========>....................] - ETA: 1:22 - loss: 1.7195 - regression_loss: 1.4418 - classification_loss: 0.2777 172/500 [=========>....................] - ETA: 1:21 - loss: 1.7199 - regression_loss: 1.4418 - classification_loss: 0.2781 173/500 [=========>....................] - ETA: 1:21 - loss: 1.7201 - regression_loss: 1.4422 - classification_loss: 0.2779 174/500 [=========>....................] - ETA: 1:21 - loss: 1.7197 - regression_loss: 1.4422 - classification_loss: 0.2775 175/500 [=========>....................] - ETA: 1:21 - loss: 1.7205 - regression_loss: 1.4431 - classification_loss: 0.2774 176/500 [=========>....................] - ETA: 1:20 - loss: 1.7239 - regression_loss: 1.4458 - classification_loss: 0.2781 177/500 [=========>....................] - ETA: 1:20 - loss: 1.7221 - regression_loss: 1.4448 - classification_loss: 0.2773 178/500 [=========>....................] - ETA: 1:20 - loss: 1.7263 - regression_loss: 1.4482 - classification_loss: 0.2781 179/500 [=========>....................] - ETA: 1:20 - loss: 1.7274 - regression_loss: 1.4492 - classification_loss: 0.2782 180/500 [=========>....................] - ETA: 1:19 - loss: 1.7292 - regression_loss: 1.4501 - classification_loss: 0.2791 181/500 [=========>....................] - ETA: 1:19 - loss: 1.7255 - regression_loss: 1.4472 - classification_loss: 0.2784 182/500 [=========>....................] - ETA: 1:19 - loss: 1.7240 - regression_loss: 1.4451 - classification_loss: 0.2789 183/500 [=========>....................] - ETA: 1:19 - loss: 1.7214 - regression_loss: 1.4431 - classification_loss: 0.2783 184/500 [==========>...................] - ETA: 1:18 - loss: 1.7236 - regression_loss: 1.4455 - classification_loss: 0.2781 185/500 [==========>...................] 
- ETA: 1:18 - loss: 1.7277 - regression_loss: 1.4481 - classification_loss: 0.2796 186/500 [==========>...................] - ETA: 1:18 - loss: 1.7220 - regression_loss: 1.4436 - classification_loss: 0.2783 187/500 [==========>...................] - ETA: 1:18 - loss: 1.7229 - regression_loss: 1.4446 - classification_loss: 0.2783 188/500 [==========>...................] - ETA: 1:17 - loss: 1.7198 - regression_loss: 1.4422 - classification_loss: 0.2776 189/500 [==========>...................] - ETA: 1:17 - loss: 1.7144 - regression_loss: 1.4376 - classification_loss: 0.2768 190/500 [==========>...................] - ETA: 1:17 - loss: 1.7127 - regression_loss: 1.4364 - classification_loss: 0.2763 191/500 [==========>...................] - ETA: 1:17 - loss: 1.7105 - regression_loss: 1.4348 - classification_loss: 0.2757 192/500 [==========>...................] - ETA: 1:16 - loss: 1.7100 - regression_loss: 1.4341 - classification_loss: 0.2759 193/500 [==========>...................] - ETA: 1:16 - loss: 1.7094 - regression_loss: 1.4333 - classification_loss: 0.2760 194/500 [==========>...................] - ETA: 1:16 - loss: 1.7133 - regression_loss: 1.4358 - classification_loss: 0.2775 195/500 [==========>...................] - ETA: 1:16 - loss: 1.7127 - regression_loss: 1.4359 - classification_loss: 0.2768 196/500 [==========>...................] - ETA: 1:15 - loss: 1.7126 - regression_loss: 1.4359 - classification_loss: 0.2767 197/500 [==========>...................] - ETA: 1:15 - loss: 1.7146 - regression_loss: 1.4378 - classification_loss: 0.2768 198/500 [==========>...................] - ETA: 1:15 - loss: 1.7128 - regression_loss: 1.4365 - classification_loss: 0.2763 199/500 [==========>...................] - ETA: 1:15 - loss: 1.7168 - regression_loss: 1.4404 - classification_loss: 0.2764 200/500 [===========>..................] - ETA: 1:14 - loss: 1.7158 - regression_loss: 1.4402 - classification_loss: 0.2756 201/500 [===========>..................] 
- ETA: 1:14 - loss: 1.7172 - regression_loss: 1.4410 - classification_loss: 0.2762 202/500 [===========>..................] - ETA: 1:14 - loss: 1.7165 - regression_loss: 1.4405 - classification_loss: 0.2759 203/500 [===========>..................] - ETA: 1:14 - loss: 1.7174 - regression_loss: 1.4412 - classification_loss: 0.2762 204/500 [===========>..................] - ETA: 1:13 - loss: 1.7161 - regression_loss: 1.4403 - classification_loss: 0.2758 205/500 [===========>..................] - ETA: 1:13 - loss: 1.7149 - regression_loss: 1.4395 - classification_loss: 0.2754 206/500 [===========>..................] - ETA: 1:13 - loss: 1.7137 - regression_loss: 1.4381 - classification_loss: 0.2756 207/500 [===========>..................] - ETA: 1:13 - loss: 1.7153 - regression_loss: 1.4392 - classification_loss: 0.2761 208/500 [===========>..................] - ETA: 1:12 - loss: 1.7156 - regression_loss: 1.4398 - classification_loss: 0.2758 209/500 [===========>..................] - ETA: 1:12 - loss: 1.7151 - regression_loss: 1.4398 - classification_loss: 0.2753 210/500 [===========>..................] - ETA: 1:12 - loss: 1.7134 - regression_loss: 1.4385 - classification_loss: 0.2749 211/500 [===========>..................] - ETA: 1:12 - loss: 1.7104 - regression_loss: 1.4362 - classification_loss: 0.2742 212/500 [===========>..................] - ETA: 1:11 - loss: 1.7079 - regression_loss: 1.4343 - classification_loss: 0.2737 213/500 [===========>..................] - ETA: 1:11 - loss: 1.7078 - regression_loss: 1.4341 - classification_loss: 0.2737 214/500 [===========>..................] - ETA: 1:11 - loss: 1.7067 - regression_loss: 1.4335 - classification_loss: 0.2732 215/500 [===========>..................] - ETA: 1:11 - loss: 1.7067 - regression_loss: 1.4334 - classification_loss: 0.2733 216/500 [===========>..................] - ETA: 1:10 - loss: 1.7036 - regression_loss: 1.4312 - classification_loss: 0.2725 217/500 [============>.................] 
- ETA: 1:10 - loss: 1.7090 - regression_loss: 1.4355 - classification_loss: 0.2734 218/500 [============>.................] - ETA: 1:10 - loss: 1.7065 - regression_loss: 1.4338 - classification_loss: 0.2727 219/500 [============>.................] - ETA: 1:10 - loss: 1.7078 - regression_loss: 1.4347 - classification_loss: 0.2731 220/500 [============>.................] - ETA: 1:09 - loss: 1.7033 - regression_loss: 1.4309 - classification_loss: 0.2724 221/500 [============>.................] - ETA: 1:09 - loss: 1.7027 - regression_loss: 1.4304 - classification_loss: 0.2723 222/500 [============>.................] - ETA: 1:09 - loss: 1.7032 - regression_loss: 1.4310 - classification_loss: 0.2722 223/500 [============>.................] - ETA: 1:09 - loss: 1.7044 - regression_loss: 1.4318 - classification_loss: 0.2726 224/500 [============>.................] - ETA: 1:08 - loss: 1.7025 - regression_loss: 1.4302 - classification_loss: 0.2723 225/500 [============>.................] - ETA: 1:08 - loss: 1.7033 - regression_loss: 1.4312 - classification_loss: 0.2721 226/500 [============>.................] - ETA: 1:08 - loss: 1.7033 - regression_loss: 1.4314 - classification_loss: 0.2719 227/500 [============>.................] - ETA: 1:08 - loss: 1.7044 - regression_loss: 1.4322 - classification_loss: 0.2723 228/500 [============>.................] - ETA: 1:07 - loss: 1.7043 - regression_loss: 1.4322 - classification_loss: 0.2721 229/500 [============>.................] - ETA: 1:07 - loss: 1.7000 - regression_loss: 1.4287 - classification_loss: 0.2713 230/500 [============>.................] - ETA: 1:07 - loss: 1.7006 - regression_loss: 1.4290 - classification_loss: 0.2716 231/500 [============>.................] - ETA: 1:07 - loss: 1.7043 - regression_loss: 1.4316 - classification_loss: 0.2727 232/500 [============>.................] - ETA: 1:06 - loss: 1.7031 - regression_loss: 1.4307 - classification_loss: 0.2724 233/500 [============>.................] 
- ETA: 2s - loss: 1.6903 - regression_loss: 1.4221 - classification_loss: 0.2682 490/500 [============================>.] - ETA: 2s - loss: 1.6911 - regression_loss: 1.4227 - classification_loss: 0.2684 491/500 [============================>.] - ETA: 2s - loss: 1.6910 - regression_loss: 1.4227 - classification_loss: 0.2682 492/500 [============================>.] - ETA: 1s - loss: 1.6901 - regression_loss: 1.4220 - classification_loss: 0.2681 493/500 [============================>.] - ETA: 1s - loss: 1.6891 - regression_loss: 1.4210 - classification_loss: 0.2681 494/500 [============================>.] - ETA: 1s - loss: 1.6895 - regression_loss: 1.4214 - classification_loss: 0.2681 495/500 [============================>.] - ETA: 1s - loss: 1.6898 - regression_loss: 1.4217 - classification_loss: 0.2681 496/500 [============================>.] - ETA: 0s - loss: 1.6900 - regression_loss: 1.4219 - classification_loss: 0.2681 497/500 [============================>.] - ETA: 0s - loss: 1.6914 - regression_loss: 1.4229 - classification_loss: 0.2685 498/500 [============================>.] - ETA: 0s - loss: 1.6920 - regression_loss: 1.4234 - classification_loss: 0.2686 499/500 [============================>.] - ETA: 0s - loss: 1.6908 - regression_loss: 1.4225 - classification_loss: 0.2684 500/500 [==============================] - 125s 249ms/step - loss: 1.6901 - regression_loss: 1.4219 - classification_loss: 0.2683 326 instances of class plum with average precision: 0.7775 mAP: 0.7775 Epoch 00047: saving model to ./training/snapshots/resnet50_pascal_47.h5 Epoch 48/150 1/500 [..............................] - ETA: 2:01 - loss: 1.5622 - regression_loss: 1.3238 - classification_loss: 0.2383 2/500 [..............................] - ETA: 2:04 - loss: 1.5418 - regression_loss: 1.3104 - classification_loss: 0.2313 3/500 [..............................] - ETA: 2:03 - loss: 1.6813 - regression_loss: 1.4331 - classification_loss: 0.2482 4/500 [..............................] 
- ETA: 1:48 - loss: 1.7602 - regression_loss: 1.4455 - classification_loss: 0.3146 69/500 [===>..........................] - ETA: 1:48 - loss: 1.7557 - regression_loss: 1.4415 - classification_loss: 0.3142 70/500 [===>..........................] - ETA: 1:47 - loss: 1.7469 - regression_loss: 1.4341 - classification_loss: 0.3128 71/500 [===>..........................] - ETA: 1:47 - loss: 1.7461 - regression_loss: 1.4342 - classification_loss: 0.3119 72/500 [===>..........................] - ETA: 1:47 - loss: 1.7367 - regression_loss: 1.4271 - classification_loss: 0.3097 73/500 [===>..........................] - ETA: 1:47 - loss: 1.7246 - regression_loss: 1.4163 - classification_loss: 0.3082 74/500 [===>..........................] - ETA: 1:46 - loss: 1.7242 - regression_loss: 1.4181 - classification_loss: 0.3061 75/500 [===>..........................] - ETA: 1:46 - loss: 1.7308 - regression_loss: 1.4226 - classification_loss: 0.3082 76/500 [===>..........................] - ETA: 1:46 - loss: 1.7263 - regression_loss: 1.4200 - classification_loss: 0.3063 77/500 [===>..........................] - ETA: 1:46 - loss: 1.7316 - regression_loss: 1.4241 - classification_loss: 0.3075 78/500 [===>..........................] - ETA: 1:45 - loss: 1.7299 - regression_loss: 1.4228 - classification_loss: 0.3071 79/500 [===>..........................] - ETA: 1:45 - loss: 1.7283 - regression_loss: 1.4237 - classification_loss: 0.3046 80/500 [===>..........................] - ETA: 1:45 - loss: 1.7247 - regression_loss: 1.4204 - classification_loss: 0.3044 81/500 [===>..........................] - ETA: 1:44 - loss: 1.7208 - regression_loss: 1.4180 - classification_loss: 0.3028 82/500 [===>..........................] - ETA: 1:44 - loss: 1.7269 - regression_loss: 1.4234 - classification_loss: 0.3036 83/500 [===>..........................] - ETA: 1:44 - loss: 1.7286 - regression_loss: 1.4252 - classification_loss: 0.3034 84/500 [====>.........................] 
- ETA: 1:44 - loss: 1.7258 - regression_loss: 1.4235 - classification_loss: 0.3023 85/500 [====>.........................] - ETA: 1:43 - loss: 1.7200 - regression_loss: 1.4193 - classification_loss: 0.3008 86/500 [====>.........................] - ETA: 1:43 - loss: 1.7314 - regression_loss: 1.4277 - classification_loss: 0.3037 87/500 [====>.........................] - ETA: 1:42 - loss: 1.7293 - regression_loss: 1.4260 - classification_loss: 0.3034 88/500 [====>.........................] - ETA: 1:42 - loss: 1.7411 - regression_loss: 1.4356 - classification_loss: 0.3055 89/500 [====>.........................] - ETA: 1:42 - loss: 1.7519 - regression_loss: 1.4441 - classification_loss: 0.3078 90/500 [====>.........................] - ETA: 1:42 - loss: 1.7494 - regression_loss: 1.4416 - classification_loss: 0.3078 91/500 [====>.........................] - ETA: 1:42 - loss: 1.7422 - regression_loss: 1.4359 - classification_loss: 0.3063 92/500 [====>.........................] - ETA: 1:41 - loss: 1.7441 - regression_loss: 1.4385 - classification_loss: 0.3056 93/500 [====>.........................] - ETA: 1:41 - loss: 1.7442 - regression_loss: 1.4393 - classification_loss: 0.3048 94/500 [====>.........................] - ETA: 1:41 - loss: 1.7493 - regression_loss: 1.4438 - classification_loss: 0.3055 95/500 [====>.........................] - ETA: 1:41 - loss: 1.7511 - regression_loss: 1.4454 - classification_loss: 0.3056 96/500 [====>.........................] - ETA: 1:40 - loss: 1.7545 - regression_loss: 1.4479 - classification_loss: 0.3066 97/500 [====>.........................] - ETA: 1:40 - loss: 1.7441 - regression_loss: 1.4398 - classification_loss: 0.3044 98/500 [====>.........................] - ETA: 1:40 - loss: 1.7455 - regression_loss: 1.4412 - classification_loss: 0.3043 99/500 [====>.........................] - ETA: 1:40 - loss: 1.7474 - regression_loss: 1.4430 - classification_loss: 0.3044 100/500 [=====>........................] 
- ETA: 1:39 - loss: 1.7496 - regression_loss: 1.4445 - classification_loss: 0.3051 101/500 [=====>........................] - ETA: 1:39 - loss: 1.7498 - regression_loss: 1.4449 - classification_loss: 0.3049 102/500 [=====>........................] - ETA: 1:39 - loss: 1.7481 - regression_loss: 1.4443 - classification_loss: 0.3038 103/500 [=====>........................] - ETA: 1:38 - loss: 1.7395 - regression_loss: 1.4371 - classification_loss: 0.3024 104/500 [=====>........................] - ETA: 1:38 - loss: 1.7389 - regression_loss: 1.4373 - classification_loss: 0.3016 105/500 [=====>........................] - ETA: 1:38 - loss: 1.7552 - regression_loss: 1.4496 - classification_loss: 0.3056 106/500 [=====>........................] - ETA: 1:38 - loss: 1.7606 - regression_loss: 1.4539 - classification_loss: 0.3068 107/500 [=====>........................] - ETA: 1:37 - loss: 1.7619 - regression_loss: 1.4559 - classification_loss: 0.3060 108/500 [=====>........................] - ETA: 1:37 - loss: 1.7666 - regression_loss: 1.4602 - classification_loss: 0.3064 109/500 [=====>........................] - ETA: 1:37 - loss: 1.7607 - regression_loss: 1.4555 - classification_loss: 0.3052 110/500 [=====>........................] - ETA: 1:37 - loss: 1.7597 - regression_loss: 1.4550 - classification_loss: 0.3048 111/500 [=====>........................] - ETA: 1:36 - loss: 1.7555 - regression_loss: 1.4521 - classification_loss: 0.3034 112/500 [=====>........................] - ETA: 1:36 - loss: 1.7540 - regression_loss: 1.4508 - classification_loss: 0.3033 113/500 [=====>........................] - ETA: 1:36 - loss: 1.7676 - regression_loss: 1.4570 - classification_loss: 0.3106 114/500 [=====>........................] - ETA: 1:36 - loss: 1.7689 - regression_loss: 1.4582 - classification_loss: 0.3107 115/500 [=====>........................] - ETA: 1:35 - loss: 1.7638 - regression_loss: 1.4547 - classification_loss: 0.3092 116/500 [=====>........................] 
- ETA: 1:35 - loss: 1.7620 - regression_loss: 1.4540 - classification_loss: 0.3079 117/500 [======>.......................] - ETA: 1:35 - loss: 1.7565 - regression_loss: 1.4498 - classification_loss: 0.3067 118/500 [======>.......................] - ETA: 1:35 - loss: 1.7555 - regression_loss: 1.4491 - classification_loss: 0.3064 119/500 [======>.......................] - ETA: 1:35 - loss: 1.7573 - regression_loss: 1.4512 - classification_loss: 0.3061 120/500 [======>.......................] - ETA: 1:34 - loss: 1.7545 - regression_loss: 1.4496 - classification_loss: 0.3050 121/500 [======>.......................] - ETA: 1:34 - loss: 1.7535 - regression_loss: 1.4498 - classification_loss: 0.3038 122/500 [======>.......................] - ETA: 1:34 - loss: 1.7505 - regression_loss: 1.4469 - classification_loss: 0.3036 123/500 [======>.......................] - ETA: 1:34 - loss: 1.7596 - regression_loss: 1.4549 - classification_loss: 0.3047 124/500 [======>.......................] - ETA: 1:33 - loss: 1.7657 - regression_loss: 1.4604 - classification_loss: 0.3053 125/500 [======>.......................] - ETA: 1:33 - loss: 1.7659 - regression_loss: 1.4607 - classification_loss: 0.3052 126/500 [======>.......................] - ETA: 1:33 - loss: 1.7659 - regression_loss: 1.4607 - classification_loss: 0.3052 127/500 [======>.......................] - ETA: 1:33 - loss: 1.7654 - regression_loss: 1.4612 - classification_loss: 0.3042 128/500 [======>.......................] - ETA: 1:32 - loss: 1.7669 - regression_loss: 1.4623 - classification_loss: 0.3046 129/500 [======>.......................] - ETA: 1:32 - loss: 1.7634 - regression_loss: 1.4600 - classification_loss: 0.3034 130/500 [======>.......................] - ETA: 1:32 - loss: 1.7633 - regression_loss: 1.4600 - classification_loss: 0.3033 131/500 [======>.......................] - ETA: 1:32 - loss: 1.7627 - regression_loss: 1.4603 - classification_loss: 0.3023 132/500 [======>.......................] 
- ETA: 1:31 - loss: 1.7656 - regression_loss: 1.4625 - classification_loss: 0.3031 133/500 [======>.......................] - ETA: 1:31 - loss: 1.7635 - regression_loss: 1.4608 - classification_loss: 0.3027 134/500 [=======>......................] - ETA: 1:31 - loss: 1.7590 - regression_loss: 1.4575 - classification_loss: 0.3016 135/500 [=======>......................] - ETA: 1:31 - loss: 1.7568 - regression_loss: 1.4562 - classification_loss: 0.3006 136/500 [=======>......................] - ETA: 1:30 - loss: 1.7535 - regression_loss: 1.4538 - classification_loss: 0.2997 137/500 [=======>......................] - ETA: 1:30 - loss: 1.7515 - regression_loss: 1.4526 - classification_loss: 0.2989 138/500 [=======>......................] - ETA: 1:30 - loss: 1.7469 - regression_loss: 1.4487 - classification_loss: 0.2983 139/500 [=======>......................] - ETA: 1:30 - loss: 1.7442 - regression_loss: 1.4466 - classification_loss: 0.2977 140/500 [=======>......................] - ETA: 1:29 - loss: 1.7425 - regression_loss: 1.4453 - classification_loss: 0.2971 141/500 [=======>......................] - ETA: 1:29 - loss: 1.7404 - regression_loss: 1.4435 - classification_loss: 0.2969 142/500 [=======>......................] - ETA: 1:29 - loss: 1.7362 - regression_loss: 1.4406 - classification_loss: 0.2956 143/500 [=======>......................] - ETA: 1:29 - loss: 1.7420 - regression_loss: 1.4454 - classification_loss: 0.2966 144/500 [=======>......................] - ETA: 1:28 - loss: 1.7448 - regression_loss: 1.4473 - classification_loss: 0.2975 145/500 [=======>......................] - ETA: 1:28 - loss: 1.7478 - regression_loss: 1.4496 - classification_loss: 0.2982 146/500 [=======>......................] - ETA: 1:28 - loss: 1.7408 - regression_loss: 1.4441 - classification_loss: 0.2966 147/500 [=======>......................] - ETA: 1:28 - loss: 1.7446 - regression_loss: 1.4470 - classification_loss: 0.2976 148/500 [=======>......................] 
- ETA: 1:27 - loss: 1.7446 - regression_loss: 1.4479 - classification_loss: 0.2967 149/500 [=======>......................] - ETA: 1:27 - loss: 1.7435 - regression_loss: 1.4474 - classification_loss: 0.2961 150/500 [========>.....................] - ETA: 1:27 - loss: 1.7446 - regression_loss: 1.4490 - classification_loss: 0.2956 151/500 [========>.....................] - ETA: 1:27 - loss: 1.7461 - regression_loss: 1.4508 - classification_loss: 0.2953 152/500 [========>.....................] - ETA: 1:26 - loss: 1.7439 - regression_loss: 1.4495 - classification_loss: 0.2945 153/500 [========>.....................] - ETA: 1:26 - loss: 1.7414 - regression_loss: 1.4474 - classification_loss: 0.2940 154/500 [========>.....................] - ETA: 1:26 - loss: 1.7399 - regression_loss: 1.4457 - classification_loss: 0.2942 155/500 [========>.....................] - ETA: 1:26 - loss: 1.7327 - regression_loss: 1.4398 - classification_loss: 0.2929 156/500 [========>.....................] - ETA: 1:25 - loss: 1.7370 - regression_loss: 1.4434 - classification_loss: 0.2936 157/500 [========>.....................] - ETA: 1:25 - loss: 1.7405 - regression_loss: 1.4463 - classification_loss: 0.2942 158/500 [========>.....................] - ETA: 1:25 - loss: 1.7438 - regression_loss: 1.4491 - classification_loss: 0.2947 159/500 [========>.....................] - ETA: 1:25 - loss: 1.7432 - regression_loss: 1.4493 - classification_loss: 0.2939 160/500 [========>.....................] - ETA: 1:24 - loss: 1.7442 - regression_loss: 1.4504 - classification_loss: 0.2938 161/500 [========>.....................] - ETA: 1:24 - loss: 1.7430 - regression_loss: 1.4496 - classification_loss: 0.2934 162/500 [========>.....................] - ETA: 1:24 - loss: 1.7448 - regression_loss: 1.4507 - classification_loss: 0.2941 163/500 [========>.....................] - ETA: 1:24 - loss: 1.7456 - regression_loss: 1.4517 - classification_loss: 0.2939 164/500 [========>.....................] 
- ETA: 1:23 - loss: 1.7506 - regression_loss: 1.4560 - classification_loss: 0.2946 165/500 [========>.....................] - ETA: 1:23 - loss: 1.7506 - regression_loss: 1.4559 - classification_loss: 0.2947 166/500 [========>.....................] - ETA: 1:23 - loss: 1.7510 - regression_loss: 1.4564 - classification_loss: 0.2946 167/500 [=========>....................] - ETA: 1:23 - loss: 1.7522 - regression_loss: 1.4574 - classification_loss: 0.2948 168/500 [=========>....................] - ETA: 1:22 - loss: 1.7512 - regression_loss: 1.4564 - classification_loss: 0.2949 169/500 [=========>....................] - ETA: 1:22 - loss: 1.7530 - regression_loss: 1.4579 - classification_loss: 0.2950 170/500 [=========>....................] - ETA: 1:22 - loss: 1.7466 - regression_loss: 1.4527 - classification_loss: 0.2939 171/500 [=========>....................] - ETA: 1:22 - loss: 1.7430 - regression_loss: 1.4501 - classification_loss: 0.2930 172/500 [=========>....................] - ETA: 1:21 - loss: 1.7366 - regression_loss: 1.4449 - classification_loss: 0.2917 173/500 [=========>....................] - ETA: 1:21 - loss: 1.7368 - regression_loss: 1.4453 - classification_loss: 0.2915 174/500 [=========>....................] - ETA: 1:21 - loss: 1.7355 - regression_loss: 1.4444 - classification_loss: 0.2910 175/500 [=========>....................] - ETA: 1:21 - loss: 1.7362 - regression_loss: 1.4454 - classification_loss: 0.2908 176/500 [=========>....................] - ETA: 1:20 - loss: 1.7329 - regression_loss: 1.4426 - classification_loss: 0.2903 177/500 [=========>....................] - ETA: 1:20 - loss: 1.7330 - regression_loss: 1.4434 - classification_loss: 0.2896 178/500 [=========>....................] - ETA: 1:20 - loss: 1.7345 - regression_loss: 1.4443 - classification_loss: 0.2901 179/500 [=========>....................] - ETA: 1:20 - loss: 1.7332 - regression_loss: 1.4431 - classification_loss: 0.2901 180/500 [=========>....................] 
- ETA: 1:19 - loss: 1.7282 - regression_loss: 1.4394 - classification_loss: 0.2888 181/500 [=========>....................] - ETA: 1:19 - loss: 1.7259 - regression_loss: 1.4377 - classification_loss: 0.2883 182/500 [=========>....................] - ETA: 1:19 - loss: 1.7254 - regression_loss: 1.4373 - classification_loss: 0.2880 183/500 [=========>....................] - ETA: 1:19 - loss: 1.7195 - regression_loss: 1.4324 - classification_loss: 0.2871 184/500 [==========>...................] - ETA: 1:18 - loss: 1.7176 - regression_loss: 1.4306 - classification_loss: 0.2871 185/500 [==========>...................] - ETA: 1:18 - loss: 1.7209 - regression_loss: 1.4330 - classification_loss: 0.2879 186/500 [==========>...................] - ETA: 1:18 - loss: 1.7179 - regression_loss: 1.4307 - classification_loss: 0.2872 187/500 [==========>...................] - ETA: 1:18 - loss: 1.7174 - regression_loss: 1.4305 - classification_loss: 0.2870 188/500 [==========>...................] - ETA: 1:17 - loss: 1.7192 - regression_loss: 1.4317 - classification_loss: 0.2875 189/500 [==========>...................] - ETA: 1:17 - loss: 1.7170 - regression_loss: 1.4302 - classification_loss: 0.2867 190/500 [==========>...................] - ETA: 1:17 - loss: 1.7146 - regression_loss: 1.4284 - classification_loss: 0.2862 191/500 [==========>...................] - ETA: 1:17 - loss: 1.7138 - regression_loss: 1.4281 - classification_loss: 0.2857 192/500 [==========>...................] - ETA: 1:16 - loss: 1.7106 - regression_loss: 1.4255 - classification_loss: 0.2851 193/500 [==========>...................] - ETA: 1:16 - loss: 1.7138 - regression_loss: 1.4275 - classification_loss: 0.2863 194/500 [==========>...................] - ETA: 1:16 - loss: 1.7142 - regression_loss: 1.4277 - classification_loss: 0.2864 195/500 [==========>...................] - ETA: 1:16 - loss: 1.7118 - regression_loss: 1.4261 - classification_loss: 0.2857 196/500 [==========>...................] 
- ETA: 1:15 - loss: 1.7164 - regression_loss: 1.4313 - classification_loss: 0.2852 197/500 [==========>...................] - ETA: 1:15 - loss: 1.7169 - regression_loss: 1.4314 - classification_loss: 0.2855 198/500 [==========>...................] - ETA: 1:15 - loss: 1.7213 - regression_loss: 1.4347 - classification_loss: 0.2866 199/500 [==========>...................] - ETA: 1:15 - loss: 1.7224 - regression_loss: 1.4359 - classification_loss: 0.2865 200/500 [===========>..................] - ETA: 1:15 - loss: 1.7205 - regression_loss: 1.4343 - classification_loss: 0.2862 201/500 [===========>..................] - ETA: 1:14 - loss: 1.7221 - regression_loss: 1.4356 - classification_loss: 0.2865 202/500 [===========>..................] - ETA: 1:14 - loss: 1.7228 - regression_loss: 1.4362 - classification_loss: 0.2867 203/500 [===========>..................] - ETA: 1:14 - loss: 1.7226 - regression_loss: 1.4358 - classification_loss: 0.2868 204/500 [===========>..................] - ETA: 1:14 - loss: 1.7233 - regression_loss: 1.4365 - classification_loss: 0.2868 205/500 [===========>..................] - ETA: 1:13 - loss: 1.7226 - regression_loss: 1.4361 - classification_loss: 0.2864 206/500 [===========>..................] - ETA: 1:13 - loss: 1.7239 - regression_loss: 1.4375 - classification_loss: 0.2864 207/500 [===========>..................] - ETA: 1:13 - loss: 1.7251 - regression_loss: 1.4392 - classification_loss: 0.2859 208/500 [===========>..................] - ETA: 1:13 - loss: 1.7254 - regression_loss: 1.4398 - classification_loss: 0.2857 209/500 [===========>..................] - ETA: 1:12 - loss: 1.7262 - regression_loss: 1.4406 - classification_loss: 0.2856 210/500 [===========>..................] - ETA: 1:12 - loss: 1.7254 - regression_loss: 1.4406 - classification_loss: 0.2848 211/500 [===========>..................] - ETA: 1:12 - loss: 1.7242 - regression_loss: 1.4392 - classification_loss: 0.2850 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.7206 - regression_loss: 1.4361 - classification_loss: 0.2845 213/500 [===========>..................] - ETA: 1:11 - loss: 1.7213 - regression_loss: 1.4294 - classification_loss: 0.2919 214/500 [===========>..................] - ETA: 1:11 - loss: 1.7195 - regression_loss: 1.4281 - classification_loss: 0.2914 215/500 [===========>..................] - ETA: 1:11 - loss: 1.7173 - regression_loss: 1.4265 - classification_loss: 0.2908 216/500 [===========>..................] - ETA: 1:11 - loss: 1.7112 - regression_loss: 1.4199 - classification_loss: 0.2913 217/500 [============>.................] - ETA: 1:10 - loss: 1.7069 - regression_loss: 1.4166 - classification_loss: 0.2903 218/500 [============>.................] - ETA: 1:10 - loss: 1.7049 - regression_loss: 1.4151 - classification_loss: 0.2897 219/500 [============>.................] - ETA: 1:10 - loss: 1.7025 - regression_loss: 1.4132 - classification_loss: 0.2893 220/500 [============>.................] - ETA: 1:10 - loss: 1.7008 - regression_loss: 1.4118 - classification_loss: 0.2890 221/500 [============>.................] - ETA: 1:09 - loss: 1.7038 - regression_loss: 1.4136 - classification_loss: 0.2902 222/500 [============>.................] - ETA: 1:09 - loss: 1.7035 - regression_loss: 1.4130 - classification_loss: 0.2905 223/500 [============>.................] - ETA: 1:09 - loss: 1.6988 - regression_loss: 1.4093 - classification_loss: 0.2895 224/500 [============>.................] - ETA: 1:09 - loss: 1.7017 - regression_loss: 1.4110 - classification_loss: 0.2907 225/500 [============>.................] - ETA: 1:08 - loss: 1.6977 - regression_loss: 1.4079 - classification_loss: 0.2898 226/500 [============>.................] - ETA: 1:08 - loss: 1.6989 - regression_loss: 1.4093 - classification_loss: 0.2896 227/500 [============>.................] - ETA: 1:08 - loss: 1.6971 - regression_loss: 1.4080 - classification_loss: 0.2891 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.6994 - regression_loss: 1.4096 - classification_loss: 0.2898 229/500 [============>.................] - ETA: 1:07 - loss: 1.6981 - regression_loss: 1.4085 - classification_loss: 0.2897 230/500 [============>.................] - ETA: 1:07 - loss: 1.6985 - regression_loss: 1.4093 - classification_loss: 0.2892 231/500 [============>.................] - ETA: 1:07 - loss: 1.6983 - regression_loss: 1.4090 - classification_loss: 0.2892 232/500 [============>.................] - ETA: 1:07 - loss: 1.6987 - regression_loss: 1.4094 - classification_loss: 0.2893 233/500 [============>.................] - ETA: 1:06 - loss: 1.6977 - regression_loss: 1.4083 - classification_loss: 0.2894 234/500 [=============>................] - ETA: 1:06 - loss: 1.6985 - regression_loss: 1.4093 - classification_loss: 0.2892 235/500 [=============>................] - ETA: 1:06 - loss: 1.6991 - regression_loss: 1.4096 - classification_loss: 0.2895 236/500 [=============>................] - ETA: 1:06 - loss: 1.6966 - regression_loss: 1.4076 - classification_loss: 0.2890 237/500 [=============>................] - ETA: 1:05 - loss: 1.6941 - regression_loss: 1.4058 - classification_loss: 0.2883 238/500 [=============>................] - ETA: 1:05 - loss: 1.6987 - regression_loss: 1.4104 - classification_loss: 0.2883 239/500 [=============>................] - ETA: 1:05 - loss: 1.7002 - regression_loss: 1.4116 - classification_loss: 0.2886 240/500 [=============>................] - ETA: 1:05 - loss: 1.7014 - regression_loss: 1.4128 - classification_loss: 0.2886 241/500 [=============>................] - ETA: 1:04 - loss: 1.7028 - regression_loss: 1.4135 - classification_loss: 0.2893 242/500 [=============>................] - ETA: 1:04 - loss: 1.7037 - regression_loss: 1.4138 - classification_loss: 0.2899 243/500 [=============>................] - ETA: 1:04 - loss: 1.7067 - regression_loss: 1.4166 - classification_loss: 0.2901 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.7064 - regression_loss: 1.4164 - classification_loss: 0.2900 245/500 [=============>................] - ETA: 1:03 - loss: 1.7063 - regression_loss: 1.4158 - classification_loss: 0.2905 246/500 [=============>................] - ETA: 1:03 - loss: 1.7066 - regression_loss: 1.4159 - classification_loss: 0.2907 247/500 [=============>................] - ETA: 1:03 - loss: 1.7059 - regression_loss: 1.4156 - classification_loss: 0.2903 248/500 [=============>................] - ETA: 1:03 - loss: 1.7061 - regression_loss: 1.4160 - classification_loss: 0.2901 249/500 [=============>................] - ETA: 1:02 - loss: 1.7017 - regression_loss: 1.4126 - classification_loss: 0.2892 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7031 - regression_loss: 1.4138 - classification_loss: 0.2893 251/500 [==============>...............] - ETA: 1:02 - loss: 1.6981 - regression_loss: 1.4097 - classification_loss: 0.2884 252/500 [==============>...............] - ETA: 1:02 - loss: 1.6986 - regression_loss: 1.4101 - classification_loss: 0.2885 253/500 [==============>...............] - ETA: 1:01 - loss: 1.6995 - regression_loss: 1.4108 - classification_loss: 0.2886 254/500 [==============>...............] - ETA: 1:01 - loss: 1.6967 - regression_loss: 1.4087 - classification_loss: 0.2880 255/500 [==============>...............] - ETA: 1:01 - loss: 1.6980 - regression_loss: 1.4096 - classification_loss: 0.2884 256/500 [==============>...............] - ETA: 1:01 - loss: 1.6975 - regression_loss: 1.4094 - classification_loss: 0.2882 257/500 [==============>...............] - ETA: 1:00 - loss: 1.6957 - regression_loss: 1.4082 - classification_loss: 0.2875 258/500 [==============>...............] - ETA: 1:00 - loss: 1.7019 - regression_loss: 1.4140 - classification_loss: 0.2879 259/500 [==============>...............] - ETA: 1:00 - loss: 1.7008 - regression_loss: 1.4134 - classification_loss: 0.2874 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.7007 - regression_loss: 1.4136 - classification_loss: 0.2871 261/500 [==============>...............] - ETA: 59s - loss: 1.6994 - regression_loss: 1.4127 - classification_loss: 0.2868  262/500 [==============>...............] - ETA: 59s - loss: 1.6966 - regression_loss: 1.4103 - classification_loss: 0.2863 263/500 [==============>...............] - ETA: 59s - loss: 1.6997 - regression_loss: 1.4123 - classification_loss: 0.2875 264/500 [==============>...............] - ETA: 58s - loss: 1.7031 - regression_loss: 1.4148 - classification_loss: 0.2882 265/500 [==============>...............] - ETA: 58s - loss: 1.7029 - regression_loss: 1.4149 - classification_loss: 0.2880 266/500 [==============>...............] - ETA: 58s - loss: 1.7053 - regression_loss: 1.4169 - classification_loss: 0.2883 267/500 [===============>..............] - ETA: 58s - loss: 1.7063 - regression_loss: 1.4175 - classification_loss: 0.2887 268/500 [===============>..............] - ETA: 57s - loss: 1.7077 - regression_loss: 1.4188 - classification_loss: 0.2889 269/500 [===============>..............] - ETA: 57s - loss: 1.7092 - regression_loss: 1.4203 - classification_loss: 0.2890 270/500 [===============>..............] - ETA: 57s - loss: 1.7089 - regression_loss: 1.4203 - classification_loss: 0.2886 271/500 [===============>..............] - ETA: 57s - loss: 1.7088 - regression_loss: 1.4202 - classification_loss: 0.2886 272/500 [===============>..............] - ETA: 56s - loss: 1.7075 - regression_loss: 1.4194 - classification_loss: 0.2881 273/500 [===============>..............] - ETA: 56s - loss: 1.7084 - regression_loss: 1.4201 - classification_loss: 0.2883 274/500 [===============>..............] - ETA: 56s - loss: 1.7095 - regression_loss: 1.4216 - classification_loss: 0.2880 275/500 [===============>..............] - ETA: 56s - loss: 1.7058 - regression_loss: 1.4186 - classification_loss: 0.2871 276/500 [===============>..............] 
500/500 [==============================] - 125s 251ms/step - loss: 1.6897 - regression_loss: 1.4123 - classification_loss: 0.2774
326 instances of class plum with average precision: 0.7795
mAP: 0.7795
Epoch 00048: saving model to ./training/snapshots/resnet50_pascal_48.h5
Epoch 49/150
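On each progress line the reported `loss` is the sum of the `regression_loss` and `classification_loss` terms (for the epoch-48 summary: 1.4123 + 0.2774 = 1.6897, up to display rounding). A minimal sketch of pulling these fields out of a log line for plotting or sanity checks; `parse_losses` and `LOSS_RE` are hypothetical helpers, not part of keras-retinanet:

```python
import re

# Match the three loss fields printed on each Keras progress-bar line.
LOSS_RE = re.compile(
    r"loss: (?P<loss>\d+\.\d+) - "
    r"regression_loss: (?P<reg>\d+\.\d+) - "
    r"classification_loss: (?P<cls>\d+\.\d+)"
)

def parse_losses(line):
    """Return (total, regression, classification) losses from one log line."""
    m = LOSS_RE.search(line)
    if m is None:
        raise ValueError("no loss fields found in line")
    return tuple(float(m.group(k)) for k in ("loss", "reg", "cls"))

# Final line of epoch 48 from the log above:
line = ("500/500 [==============================] - 125s 251ms/step - "
        "loss: 1.6897 - regression_loss: 1.4123 - classification_loss: 0.2774")
total, reg, cls_ = parse_losses(line)
# The total is the sum of the two components (within printout rounding).
assert abs(total - (reg + cls_)) < 5e-4
```

Running the same parser over every progress line gives the loss trajectory within an epoch without rerunning training.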
- ETA: 1:37 - loss: 1.7014 - regression_loss: 1.4145 - classification_loss: 0.2870 111/500 [=====>........................] - ETA: 1:37 - loss: 1.7058 - regression_loss: 1.4182 - classification_loss: 0.2875 112/500 [=====>........................] - ETA: 1:37 - loss: 1.7088 - regression_loss: 1.4215 - classification_loss: 0.2873 113/500 [=====>........................] - ETA: 1:36 - loss: 1.7030 - regression_loss: 1.4176 - classification_loss: 0.2854 114/500 [=====>........................] - ETA: 1:36 - loss: 1.7025 - regression_loss: 1.4173 - classification_loss: 0.2852 115/500 [=====>........................] - ETA: 1:36 - loss: 1.6977 - regression_loss: 1.4141 - classification_loss: 0.2835 116/500 [=====>........................] - ETA: 1:35 - loss: 1.6975 - regression_loss: 1.4144 - classification_loss: 0.2831 117/500 [======>.......................] - ETA: 1:35 - loss: 1.6969 - regression_loss: 1.4141 - classification_loss: 0.2828 118/500 [======>.......................] - ETA: 1:35 - loss: 1.6918 - regression_loss: 1.4103 - classification_loss: 0.2815 119/500 [======>.......................] - ETA: 1:35 - loss: 1.6877 - regression_loss: 1.4070 - classification_loss: 0.2807 120/500 [======>.......................] - ETA: 1:34 - loss: 1.6909 - regression_loss: 1.4084 - classification_loss: 0.2825 121/500 [======>.......................] - ETA: 1:34 - loss: 1.6902 - regression_loss: 1.4080 - classification_loss: 0.2821 122/500 [======>.......................] - ETA: 1:34 - loss: 1.6882 - regression_loss: 1.4065 - classification_loss: 0.2817 123/500 [======>.......................] - ETA: 1:34 - loss: 1.6939 - regression_loss: 1.4122 - classification_loss: 0.2817 124/500 [======>.......................] - ETA: 1:33 - loss: 1.7000 - regression_loss: 1.4160 - classification_loss: 0.2840 125/500 [======>.......................] - ETA: 1:33 - loss: 1.6924 - regression_loss: 1.4101 - classification_loss: 0.2823 126/500 [======>.......................] 
- ETA: 1:33 - loss: 1.6863 - regression_loss: 1.4048 - classification_loss: 0.2814 127/500 [======>.......................] - ETA: 1:33 - loss: 1.6837 - regression_loss: 1.4031 - classification_loss: 0.2805 128/500 [======>.......................] - ETA: 1:32 - loss: 1.6881 - regression_loss: 1.4061 - classification_loss: 0.2820 129/500 [======>.......................] - ETA: 1:32 - loss: 1.6869 - regression_loss: 1.4050 - classification_loss: 0.2818 130/500 [======>.......................] - ETA: 1:32 - loss: 1.6826 - regression_loss: 1.4016 - classification_loss: 0.2810 131/500 [======>.......................] - ETA: 1:32 - loss: 1.6821 - regression_loss: 1.4011 - classification_loss: 0.2810 132/500 [======>.......................] - ETA: 1:31 - loss: 1.6825 - regression_loss: 1.4018 - classification_loss: 0.2807 133/500 [======>.......................] - ETA: 1:31 - loss: 1.6844 - regression_loss: 1.4039 - classification_loss: 0.2805 134/500 [=======>......................] - ETA: 1:31 - loss: 1.6856 - regression_loss: 1.4062 - classification_loss: 0.2794 135/500 [=======>......................] - ETA: 1:31 - loss: 1.6891 - regression_loss: 1.4088 - classification_loss: 0.2803 136/500 [=======>......................] - ETA: 1:30 - loss: 1.6878 - regression_loss: 1.4081 - classification_loss: 0.2797 137/500 [=======>......................] - ETA: 1:30 - loss: 1.6866 - regression_loss: 1.4073 - classification_loss: 0.2793 138/500 [=======>......................] - ETA: 1:30 - loss: 1.6883 - regression_loss: 1.4095 - classification_loss: 0.2789 139/500 [=======>......................] - ETA: 1:30 - loss: 1.6885 - regression_loss: 1.4090 - classification_loss: 0.2795 140/500 [=======>......................] - ETA: 1:29 - loss: 1.6858 - regression_loss: 1.4075 - classification_loss: 0.2784 141/500 [=======>......................] - ETA: 1:29 - loss: 1.6881 - regression_loss: 1.4097 - classification_loss: 0.2784 142/500 [=======>......................] 
- ETA: 1:29 - loss: 1.6839 - regression_loss: 1.4066 - classification_loss: 0.2773 143/500 [=======>......................] - ETA: 1:29 - loss: 1.6816 - regression_loss: 1.4051 - classification_loss: 0.2764 144/500 [=======>......................] - ETA: 1:28 - loss: 1.6900 - regression_loss: 1.4129 - classification_loss: 0.2771 145/500 [=======>......................] - ETA: 1:28 - loss: 1.6903 - regression_loss: 1.4128 - classification_loss: 0.2775 146/500 [=======>......................] - ETA: 1:28 - loss: 1.6890 - regression_loss: 1.4119 - classification_loss: 0.2770 147/500 [=======>......................] - ETA: 1:28 - loss: 1.6832 - regression_loss: 1.4074 - classification_loss: 0.2758 148/500 [=======>......................] - ETA: 1:27 - loss: 1.6960 - regression_loss: 1.4173 - classification_loss: 0.2787 149/500 [=======>......................] - ETA: 1:27 - loss: 1.6921 - regression_loss: 1.4141 - classification_loss: 0.2779 150/500 [========>.....................] - ETA: 1:27 - loss: 1.6982 - regression_loss: 1.4190 - classification_loss: 0.2792 151/500 [========>.....................] - ETA: 1:27 - loss: 1.7041 - regression_loss: 1.4250 - classification_loss: 0.2792 152/500 [========>.....................] - ETA: 1:27 - loss: 1.7038 - regression_loss: 1.4248 - classification_loss: 0.2789 153/500 [========>.....................] - ETA: 1:26 - loss: 1.7064 - regression_loss: 1.4270 - classification_loss: 0.2794 154/500 [========>.....................] - ETA: 1:26 - loss: 1.7088 - regression_loss: 1.4295 - classification_loss: 0.2792 155/500 [========>.....................] - ETA: 1:26 - loss: 1.7117 - regression_loss: 1.4316 - classification_loss: 0.2802 156/500 [========>.....................] - ETA: 1:26 - loss: 1.7133 - regression_loss: 1.4330 - classification_loss: 0.2803 157/500 [========>.....................] - ETA: 1:25 - loss: 1.7132 - regression_loss: 1.4333 - classification_loss: 0.2799 158/500 [========>.....................] 
- ETA: 1:25 - loss: 1.7151 - regression_loss: 1.4358 - classification_loss: 0.2792 159/500 [========>.....................] - ETA: 1:25 - loss: 1.7164 - regression_loss: 1.4374 - classification_loss: 0.2790 160/500 [========>.....................] - ETA: 1:25 - loss: 1.7149 - regression_loss: 1.4361 - classification_loss: 0.2787 161/500 [========>.....................] - ETA: 1:24 - loss: 1.7137 - regression_loss: 1.4349 - classification_loss: 0.2788 162/500 [========>.....................] - ETA: 1:24 - loss: 1.7123 - regression_loss: 1.4343 - classification_loss: 0.2780 163/500 [========>.....................] - ETA: 1:24 - loss: 1.7105 - regression_loss: 1.4329 - classification_loss: 0.2776 164/500 [========>.....................] - ETA: 1:24 - loss: 1.7093 - regression_loss: 1.4324 - classification_loss: 0.2769 165/500 [========>.....................] - ETA: 1:23 - loss: 1.7031 - regression_loss: 1.4272 - classification_loss: 0.2759 166/500 [========>.....................] - ETA: 1:23 - loss: 1.6963 - regression_loss: 1.4217 - classification_loss: 0.2745 167/500 [=========>....................] - ETA: 1:23 - loss: 1.6976 - regression_loss: 1.4229 - classification_loss: 0.2747 168/500 [=========>....................] - ETA: 1:23 - loss: 1.6991 - regression_loss: 1.4144 - classification_loss: 0.2846 169/500 [=========>....................] - ETA: 1:22 - loss: 1.6962 - regression_loss: 1.4122 - classification_loss: 0.2840 170/500 [=========>....................] - ETA: 1:22 - loss: 1.7020 - regression_loss: 1.4166 - classification_loss: 0.2855 171/500 [=========>....................] - ETA: 1:22 - loss: 1.7008 - regression_loss: 1.4160 - classification_loss: 0.2848 172/500 [=========>....................] - ETA: 1:22 - loss: 1.6983 - regression_loss: 1.4140 - classification_loss: 0.2843 173/500 [=========>....................] - ETA: 1:21 - loss: 1.7009 - regression_loss: 1.4158 - classification_loss: 0.2852 174/500 [=========>....................] 
- ETA: 1:21 - loss: 1.7001 - regression_loss: 1.4151 - classification_loss: 0.2850 175/500 [=========>....................] - ETA: 1:21 - loss: 1.7052 - regression_loss: 1.4188 - classification_loss: 0.2864 176/500 [=========>....................] - ETA: 1:21 - loss: 1.7101 - regression_loss: 1.4226 - classification_loss: 0.2875 177/500 [=========>....................] - ETA: 1:20 - loss: 1.7152 - regression_loss: 1.4269 - classification_loss: 0.2883 178/500 [=========>....................] - ETA: 1:20 - loss: 1.7162 - regression_loss: 1.4275 - classification_loss: 0.2887 179/500 [=========>....................] - ETA: 1:20 - loss: 1.7133 - regression_loss: 1.4253 - classification_loss: 0.2880 180/500 [=========>....................] - ETA: 1:20 - loss: 1.7118 - regression_loss: 1.4245 - classification_loss: 0.2873 181/500 [=========>....................] - ETA: 1:19 - loss: 1.7091 - regression_loss: 1.4227 - classification_loss: 0.2863 182/500 [=========>....................] - ETA: 1:19 - loss: 1.7113 - regression_loss: 1.4248 - classification_loss: 0.2865 183/500 [=========>....................] - ETA: 1:19 - loss: 1.7088 - regression_loss: 1.4229 - classification_loss: 0.2859 184/500 [==========>...................] - ETA: 1:19 - loss: 1.7086 - regression_loss: 1.4226 - classification_loss: 0.2860 185/500 [==========>...................] - ETA: 1:18 - loss: 1.7100 - regression_loss: 1.4235 - classification_loss: 0.2865 186/500 [==========>...................] - ETA: 1:18 - loss: 1.7087 - regression_loss: 1.4227 - classification_loss: 0.2860 187/500 [==========>...................] - ETA: 1:18 - loss: 1.7086 - regression_loss: 1.4228 - classification_loss: 0.2858 188/500 [==========>...................] - ETA: 1:18 - loss: 1.7083 - regression_loss: 1.4230 - classification_loss: 0.2853 189/500 [==========>...................] - ETA: 1:17 - loss: 1.7064 - regression_loss: 1.4214 - classification_loss: 0.2850 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.7085 - regression_loss: 1.4228 - classification_loss: 0.2856 191/500 [==========>...................] - ETA: 1:17 - loss: 1.7102 - regression_loss: 1.4245 - classification_loss: 0.2857 192/500 [==========>...................] - ETA: 1:17 - loss: 1.7108 - regression_loss: 1.4249 - classification_loss: 0.2859 193/500 [==========>...................] - ETA: 1:16 - loss: 1.7043 - regression_loss: 1.4195 - classification_loss: 0.2847 194/500 [==========>...................] - ETA: 1:16 - loss: 1.7050 - regression_loss: 1.4197 - classification_loss: 0.2853 195/500 [==========>...................] - ETA: 1:16 - loss: 1.7043 - regression_loss: 1.4190 - classification_loss: 0.2853 196/500 [==========>...................] - ETA: 1:16 - loss: 1.6998 - regression_loss: 1.4156 - classification_loss: 0.2841 197/500 [==========>...................] - ETA: 1:15 - loss: 1.6962 - regression_loss: 1.4128 - classification_loss: 0.2834 198/500 [==========>...................] - ETA: 1:15 - loss: 1.7000 - regression_loss: 1.4150 - classification_loss: 0.2851 199/500 [==========>...................] - ETA: 1:15 - loss: 1.6940 - regression_loss: 1.4079 - classification_loss: 0.2861 200/500 [===========>..................] - ETA: 1:15 - loss: 1.6973 - regression_loss: 1.4098 - classification_loss: 0.2876 201/500 [===========>..................] - ETA: 1:14 - loss: 1.6962 - regression_loss: 1.4089 - classification_loss: 0.2873 202/500 [===========>..................] - ETA: 1:14 - loss: 1.6943 - regression_loss: 1.4079 - classification_loss: 0.2864 203/500 [===========>..................] - ETA: 1:14 - loss: 1.6963 - regression_loss: 1.4091 - classification_loss: 0.2872 204/500 [===========>..................] - ETA: 1:14 - loss: 1.6984 - regression_loss: 1.4111 - classification_loss: 0.2874 205/500 [===========>..................] - ETA: 1:13 - loss: 1.7028 - regression_loss: 1.4143 - classification_loss: 0.2885 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.6985 - regression_loss: 1.4109 - classification_loss: 0.2876 207/500 [===========>..................] - ETA: 1:13 - loss: 1.6997 - regression_loss: 1.4121 - classification_loss: 0.2877 208/500 [===========>..................] - ETA: 1:13 - loss: 1.7053 - regression_loss: 1.4166 - classification_loss: 0.2887 209/500 [===========>..................] - ETA: 1:12 - loss: 1.7062 - regression_loss: 1.4174 - classification_loss: 0.2889 210/500 [===========>..................] - ETA: 1:12 - loss: 1.7096 - regression_loss: 1.4205 - classification_loss: 0.2891 211/500 [===========>..................] - ETA: 1:12 - loss: 1.7058 - regression_loss: 1.4178 - classification_loss: 0.2880 212/500 [===========>..................] - ETA: 1:12 - loss: 1.7037 - regression_loss: 1.4164 - classification_loss: 0.2874 213/500 [===========>..................] - ETA: 1:11 - loss: 1.7008 - regression_loss: 1.4142 - classification_loss: 0.2866 214/500 [===========>..................] - ETA: 1:11 - loss: 1.7006 - regression_loss: 1.4145 - classification_loss: 0.2860 215/500 [===========>..................] - ETA: 1:11 - loss: 1.6999 - regression_loss: 1.4143 - classification_loss: 0.2857 216/500 [===========>..................] - ETA: 1:11 - loss: 1.7020 - regression_loss: 1.4160 - classification_loss: 0.2860 217/500 [============>.................] - ETA: 1:10 - loss: 1.7052 - regression_loss: 1.4186 - classification_loss: 0.2866 218/500 [============>.................] - ETA: 1:10 - loss: 1.7081 - regression_loss: 1.4207 - classification_loss: 0.2873 219/500 [============>.................] - ETA: 1:10 - loss: 1.7077 - regression_loss: 1.4205 - classification_loss: 0.2872 220/500 [============>.................] - ETA: 1:10 - loss: 1.7070 - regression_loss: 1.4205 - classification_loss: 0.2865 221/500 [============>.................] - ETA: 1:09 - loss: 1.7081 - regression_loss: 1.4209 - classification_loss: 0.2872 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.7082 - regression_loss: 1.4207 - classification_loss: 0.2874 223/500 [============>.................] - ETA: 1:09 - loss: 1.7080 - regression_loss: 1.4204 - classification_loss: 0.2876 224/500 [============>.................] - ETA: 1:09 - loss: 1.7102 - regression_loss: 1.4222 - classification_loss: 0.2880 225/500 [============>.................] - ETA: 1:08 - loss: 1.7084 - regression_loss: 1.4210 - classification_loss: 0.2873 226/500 [============>.................] - ETA: 1:08 - loss: 1.7078 - regression_loss: 1.4208 - classification_loss: 0.2870 227/500 [============>.................] - ETA: 1:08 - loss: 1.7062 - regression_loss: 1.4198 - classification_loss: 0.2863 228/500 [============>.................] - ETA: 1:08 - loss: 1.7075 - regression_loss: 1.4204 - classification_loss: 0.2871 229/500 [============>.................] - ETA: 1:07 - loss: 1.7033 - regression_loss: 1.4171 - classification_loss: 0.2862 230/500 [============>.................] - ETA: 1:07 - loss: 1.7025 - regression_loss: 1.4167 - classification_loss: 0.2858 231/500 [============>.................] - ETA: 1:07 - loss: 1.7041 - regression_loss: 1.4180 - classification_loss: 0.2861 232/500 [============>.................] - ETA: 1:07 - loss: 1.6997 - regression_loss: 1.4144 - classification_loss: 0.2853 233/500 [============>.................] - ETA: 1:06 - loss: 1.7010 - regression_loss: 1.4158 - classification_loss: 0.2852 234/500 [=============>................] - ETA: 1:06 - loss: 1.7023 - regression_loss: 1.4166 - classification_loss: 0.2857 235/500 [=============>................] - ETA: 1:06 - loss: 1.6996 - regression_loss: 1.4147 - classification_loss: 0.2849 236/500 [=============>................] - ETA: 1:06 - loss: 1.6994 - regression_loss: 1.4145 - classification_loss: 0.2849 237/500 [=============>................] - ETA: 1:05 - loss: 1.6983 - regression_loss: 1.4141 - classification_loss: 0.2843 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.7001 - regression_loss: 1.4155 - classification_loss: 0.2846 239/500 [=============>................] - ETA: 1:05 - loss: 1.7000 - regression_loss: 1.4153 - classification_loss: 0.2848 240/500 [=============>................] - ETA: 1:05 - loss: 1.7004 - regression_loss: 1.4153 - classification_loss: 0.2851 241/500 [=============>................] - ETA: 1:04 - loss: 1.7052 - regression_loss: 1.4200 - classification_loss: 0.2852 242/500 [=============>................] - ETA: 1:04 - loss: 1.7066 - regression_loss: 1.4209 - classification_loss: 0.2857 243/500 [=============>................] - ETA: 1:04 - loss: 1.7070 - regression_loss: 1.4215 - classification_loss: 0.2856 244/500 [=============>................] - ETA: 1:04 - loss: 1.7097 - regression_loss: 1.4231 - classification_loss: 0.2866 245/500 [=============>................] - ETA: 1:03 - loss: 1.7080 - regression_loss: 1.4218 - classification_loss: 0.2862 246/500 [=============>................] - ETA: 1:03 - loss: 1.7076 - regression_loss: 1.4217 - classification_loss: 0.2860 247/500 [=============>................] - ETA: 1:03 - loss: 1.7093 - regression_loss: 1.4229 - classification_loss: 0.2864 248/500 [=============>................] - ETA: 1:03 - loss: 1.7106 - regression_loss: 1.4239 - classification_loss: 0.2867 249/500 [=============>................] - ETA: 1:02 - loss: 1.7097 - regression_loss: 1.4231 - classification_loss: 0.2867 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7116 - regression_loss: 1.4244 - classification_loss: 0.2872 251/500 [==============>...............] - ETA: 1:02 - loss: 1.7131 - regression_loss: 1.4260 - classification_loss: 0.2871 252/500 [==============>...............] - ETA: 1:02 - loss: 1.7172 - regression_loss: 1.4290 - classification_loss: 0.2882 253/500 [==============>...............] - ETA: 1:01 - loss: 1.7158 - regression_loss: 1.4280 - classification_loss: 0.2878 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.7165 - regression_loss: 1.4287 - classification_loss: 0.2878 255/500 [==============>...............] - ETA: 1:01 - loss: 1.7143 - regression_loss: 1.4269 - classification_loss: 0.2874 256/500 [==============>...............] - ETA: 1:01 - loss: 1.7149 - regression_loss: 1.4272 - classification_loss: 0.2877 257/500 [==============>...............] - ETA: 1:00 - loss: 1.7161 - regression_loss: 1.4282 - classification_loss: 0.2879 258/500 [==============>...............] - ETA: 1:00 - loss: 1.7151 - regression_loss: 1.4277 - classification_loss: 0.2873 259/500 [==============>...............] - ETA: 1:00 - loss: 1.7164 - regression_loss: 1.4288 - classification_loss: 0.2876 260/500 [==============>...............] - ETA: 1:00 - loss: 1.7170 - regression_loss: 1.4296 - classification_loss: 0.2874 261/500 [==============>...............] - ETA: 59s - loss: 1.7165 - regression_loss: 1.4293 - classification_loss: 0.2872  262/500 [==============>...............] - ETA: 59s - loss: 1.7179 - regression_loss: 1.4305 - classification_loss: 0.2874 263/500 [==============>...............] - ETA: 59s - loss: 1.7200 - regression_loss: 1.4324 - classification_loss: 0.2876 264/500 [==============>...............] - ETA: 59s - loss: 1.7189 - regression_loss: 1.4316 - classification_loss: 0.2873 265/500 [==============>...............] - ETA: 58s - loss: 1.7188 - regression_loss: 1.4318 - classification_loss: 0.2869 266/500 [==============>...............] - ETA: 58s - loss: 1.7197 - regression_loss: 1.4325 - classification_loss: 0.2872 267/500 [===============>..............] - ETA: 58s - loss: 1.7190 - regression_loss: 1.4321 - classification_loss: 0.2869 268/500 [===============>..............] - ETA: 58s - loss: 1.7170 - regression_loss: 1.4305 - classification_loss: 0.2864 269/500 [===============>..............] - ETA: 57s - loss: 1.7157 - regression_loss: 1.4295 - classification_loss: 0.2862 270/500 [===============>..............] 
- ETA: 57s - loss: 1.7170 - regression_loss: 1.4307 - classification_loss: 0.2863 271/500 [===============>..............] - ETA: 57s - loss: 1.7176 - regression_loss: 1.4313 - classification_loss: 0.2863 272/500 [===============>..............] - ETA: 57s - loss: 1.7174 - regression_loss: 1.4310 - classification_loss: 0.2864 273/500 [===============>..............] - ETA: 56s - loss: 1.7170 - regression_loss: 1.4308 - classification_loss: 0.2862 274/500 [===============>..............] - ETA: 56s - loss: 1.7192 - regression_loss: 1.4327 - classification_loss: 0.2866 275/500 [===============>..............] - ETA: 56s - loss: 1.7166 - regression_loss: 1.4307 - classification_loss: 0.2859 276/500 [===============>..............] - ETA: 56s - loss: 1.7158 - regression_loss: 1.4302 - classification_loss: 0.2856 277/500 [===============>..............] - ETA: 55s - loss: 1.7170 - regression_loss: 1.4312 - classification_loss: 0.2858 278/500 [===============>..............] - ETA: 55s - loss: 1.7170 - regression_loss: 1.4312 - classification_loss: 0.2858 279/500 [===============>..............] - ETA: 55s - loss: 1.7180 - regression_loss: 1.4317 - classification_loss: 0.2863 280/500 [===============>..............] - ETA: 55s - loss: 1.7174 - regression_loss: 1.4304 - classification_loss: 0.2870 281/500 [===============>..............] - ETA: 54s - loss: 1.7163 - regression_loss: 1.4295 - classification_loss: 0.2867 282/500 [===============>..............] - ETA: 54s - loss: 1.7181 - regression_loss: 1.4311 - classification_loss: 0.2871 283/500 [===============>..............] - ETA: 54s - loss: 1.7199 - regression_loss: 1.4324 - classification_loss: 0.2874 284/500 [================>.............] - ETA: 54s - loss: 1.7215 - regression_loss: 1.4341 - classification_loss: 0.2875 285/500 [================>.............] - ETA: 53s - loss: 1.7232 - regression_loss: 1.4353 - classification_loss: 0.2879 286/500 [================>.............] 
- ETA: 53s - loss: 1.7228 - regression_loss: 1.4349 - classification_loss: 0.2879 287/500 [================>.............] - ETA: 53s - loss: 1.7218 - regression_loss: 1.4341 - classification_loss: 0.2877 288/500 [================>.............] - ETA: 53s - loss: 1.7225 - regression_loss: 1.4347 - classification_loss: 0.2878 289/500 [================>.............] - ETA: 52s - loss: 1.7245 - regression_loss: 1.4362 - classification_loss: 0.2883 290/500 [================>.............] - ETA: 52s - loss: 1.7231 - regression_loss: 1.4350 - classification_loss: 0.2880 291/500 [================>.............] - ETA: 52s - loss: 1.7225 - regression_loss: 1.4346 - classification_loss: 0.2879 292/500 [================>.............] - ETA: 52s - loss: 1.7204 - regression_loss: 1.4332 - classification_loss: 0.2873 293/500 [================>.............] - ETA: 51s - loss: 1.7198 - regression_loss: 1.4323 - classification_loss: 0.2875 294/500 [================>.............] - ETA: 51s - loss: 1.7179 - regression_loss: 1.4308 - classification_loss: 0.2871 295/500 [================>.............] - ETA: 51s - loss: 1.7177 - regression_loss: 1.4306 - classification_loss: 0.2870 296/500 [================>.............] - ETA: 51s - loss: 1.7156 - regression_loss: 1.4258 - classification_loss: 0.2898 297/500 [================>.............] - ETA: 50s - loss: 1.7145 - regression_loss: 1.4250 - classification_loss: 0.2896 298/500 [================>.............] - ETA: 50s - loss: 1.7178 - regression_loss: 1.4277 - classification_loss: 0.2901 299/500 [================>.............] - ETA: 50s - loss: 1.7186 - regression_loss: 1.4283 - classification_loss: 0.2902 300/500 [=================>............] - ETA: 50s - loss: 1.7165 - regression_loss: 1.4265 - classification_loss: 0.2899 301/500 [=================>............] - ETA: 49s - loss: 1.7158 - regression_loss: 1.4261 - classification_loss: 0.2897 302/500 [=================>............] 
- ETA: 49s - loss: 1.7158 - regression_loss: 1.4259 - classification_loss: 0.2899 303/500 [=================>............] - ETA: 49s - loss: 1.7160 - regression_loss: 1.4262 - classification_loss: 0.2898 304/500 [=================>............] - ETA: 49s - loss: 1.7172 - regression_loss: 1.4268 - classification_loss: 0.2903 305/500 [=================>............] - ETA: 48s - loss: 1.7199 - regression_loss: 1.4286 - classification_loss: 0.2913 306/500 [=================>............] - ETA: 48s - loss: 1.7200 - regression_loss: 1.4286 - classification_loss: 0.2914 307/500 [=================>............] - ETA: 48s - loss: 1.7200 - regression_loss: 1.4289 - classification_loss: 0.2911 308/500 [=================>............] - ETA: 48s - loss: 1.7193 - regression_loss: 1.4286 - classification_loss: 0.2907 309/500 [=================>............] - ETA: 47s - loss: 1.7192 - regression_loss: 1.4286 - classification_loss: 0.2906 310/500 [=================>............] - ETA: 47s - loss: 1.7189 - regression_loss: 1.4284 - classification_loss: 0.2905 311/500 [=================>............] - ETA: 47s - loss: 1.7180 - regression_loss: 1.4273 - classification_loss: 0.2906 312/500 [=================>............] - ETA: 47s - loss: 1.7174 - regression_loss: 1.4269 - classification_loss: 0.2906 313/500 [=================>............] - ETA: 46s - loss: 1.7181 - regression_loss: 1.4279 - classification_loss: 0.2902 314/500 [=================>............] - ETA: 46s - loss: 1.7203 - regression_loss: 1.4295 - classification_loss: 0.2908 315/500 [=================>............] - ETA: 46s - loss: 1.7202 - regression_loss: 1.4294 - classification_loss: 0.2908 316/500 [=================>............] - ETA: 46s - loss: 1.7223 - regression_loss: 1.4313 - classification_loss: 0.2911 317/500 [==================>...........] - ETA: 45s - loss: 1.7185 - regression_loss: 1.4282 - classification_loss: 0.2903 318/500 [==================>...........] 
500/500 [==============================] - 125s 250ms/step - loss: 1.7094 - regression_loss: 1.4165 - classification_loss: 0.2929
326 instances of class plum with average precision: 0.7744
mAP: 0.7744
Epoch 00049: saving model to ./training/snapshots/resnet50_pascal_49.h5
Epoch 50/150
- ETA: 1:26 - loss: 1.6871 - regression_loss: 1.4105 - classification_loss: 0.2766 154/500 [========>.....................] - ETA: 1:26 - loss: 1.6879 - regression_loss: 1.4115 - classification_loss: 0.2765 155/500 [========>.....................] - ETA: 1:26 - loss: 1.6897 - regression_loss: 1.4133 - classification_loss: 0.2764 156/500 [========>.....................] - ETA: 1:26 - loss: 1.6887 - regression_loss: 1.4127 - classification_loss: 0.2760 157/500 [========>.....................] - ETA: 1:25 - loss: 1.6885 - regression_loss: 1.4128 - classification_loss: 0.2757 158/500 [========>.....................] - ETA: 1:25 - loss: 1.6897 - regression_loss: 1.4140 - classification_loss: 0.2757 159/500 [========>.....................] - ETA: 1:25 - loss: 1.6854 - regression_loss: 1.4109 - classification_loss: 0.2745 160/500 [========>.....................] - ETA: 1:25 - loss: 1.6883 - regression_loss: 1.4134 - classification_loss: 0.2749 161/500 [========>.....................] - ETA: 1:24 - loss: 1.6857 - regression_loss: 1.4115 - classification_loss: 0.2742 162/500 [========>.....................] - ETA: 1:24 - loss: 1.6790 - regression_loss: 1.4057 - classification_loss: 0.2732 163/500 [========>.....................] - ETA: 1:24 - loss: 1.6755 - regression_loss: 1.4028 - classification_loss: 0.2727 164/500 [========>.....................] - ETA: 1:24 - loss: 1.6734 - regression_loss: 1.4015 - classification_loss: 0.2719 165/500 [========>.....................] - ETA: 1:23 - loss: 1.6714 - regression_loss: 1.4002 - classification_loss: 0.2712 166/500 [========>.....................] - ETA: 1:23 - loss: 1.6698 - regression_loss: 1.3989 - classification_loss: 0.2709 167/500 [=========>....................] - ETA: 1:23 - loss: 1.6707 - regression_loss: 1.3996 - classification_loss: 0.2711 168/500 [=========>....................] - ETA: 1:23 - loss: 1.6719 - regression_loss: 1.4007 - classification_loss: 0.2712 169/500 [=========>....................] 
- ETA: 1:22 - loss: 1.6756 - regression_loss: 1.4035 - classification_loss: 0.2721 170/500 [=========>....................] - ETA: 1:22 - loss: 1.6727 - regression_loss: 1.4013 - classification_loss: 0.2714 171/500 [=========>....................] - ETA: 1:22 - loss: 1.6717 - regression_loss: 1.4009 - classification_loss: 0.2708 172/500 [=========>....................] - ETA: 1:22 - loss: 1.6710 - regression_loss: 1.4002 - classification_loss: 0.2708 173/500 [=========>....................] - ETA: 1:21 - loss: 1.6706 - regression_loss: 1.3999 - classification_loss: 0.2707 174/500 [=========>....................] - ETA: 1:21 - loss: 1.6702 - regression_loss: 1.3999 - classification_loss: 0.2703 175/500 [=========>....................] - ETA: 1:21 - loss: 1.6712 - regression_loss: 1.4011 - classification_loss: 0.2701 176/500 [=========>....................] - ETA: 1:21 - loss: 1.6682 - regression_loss: 1.3989 - classification_loss: 0.2692 177/500 [=========>....................] - ETA: 1:20 - loss: 1.6651 - regression_loss: 1.3967 - classification_loss: 0.2684 178/500 [=========>....................] - ETA: 1:20 - loss: 1.6637 - regression_loss: 1.3954 - classification_loss: 0.2684 179/500 [=========>....................] - ETA: 1:20 - loss: 1.6676 - regression_loss: 1.3981 - classification_loss: 0.2695 180/500 [=========>....................] - ETA: 1:20 - loss: 1.6614 - regression_loss: 1.3930 - classification_loss: 0.2684 181/500 [=========>....................] - ETA: 1:19 - loss: 1.6603 - regression_loss: 1.3925 - classification_loss: 0.2678 182/500 [=========>....................] - ETA: 1:19 - loss: 1.6587 - regression_loss: 1.3910 - classification_loss: 0.2677 183/500 [=========>....................] - ETA: 1:19 - loss: 1.6559 - regression_loss: 1.3888 - classification_loss: 0.2671 184/500 [==========>...................] - ETA: 1:19 - loss: 1.6584 - regression_loss: 1.3908 - classification_loss: 0.2676 185/500 [==========>...................] 
- ETA: 1:18 - loss: 1.6573 - regression_loss: 1.3901 - classification_loss: 0.2672 186/500 [==========>...................] - ETA: 1:18 - loss: 1.6587 - regression_loss: 1.3916 - classification_loss: 0.2671 187/500 [==========>...................] - ETA: 1:18 - loss: 1.6621 - regression_loss: 1.3944 - classification_loss: 0.2677 188/500 [==========>...................] - ETA: 1:18 - loss: 1.6619 - regression_loss: 1.3941 - classification_loss: 0.2678 189/500 [==========>...................] - ETA: 1:17 - loss: 1.6597 - regression_loss: 1.3927 - classification_loss: 0.2670 190/500 [==========>...................] - ETA: 1:17 - loss: 1.6608 - regression_loss: 1.3936 - classification_loss: 0.2672 191/500 [==========>...................] - ETA: 1:17 - loss: 1.6607 - regression_loss: 1.3940 - classification_loss: 0.2667 192/500 [==========>...................] - ETA: 1:17 - loss: 1.6634 - regression_loss: 1.3960 - classification_loss: 0.2674 193/500 [==========>...................] - ETA: 1:16 - loss: 1.6640 - regression_loss: 1.3967 - classification_loss: 0.2673 194/500 [==========>...................] - ETA: 1:16 - loss: 1.6657 - regression_loss: 1.3977 - classification_loss: 0.2680 195/500 [==========>...................] - ETA: 1:16 - loss: 1.6657 - regression_loss: 1.3977 - classification_loss: 0.2680 196/500 [==========>...................] - ETA: 1:16 - loss: 1.7004 - regression_loss: 1.3905 - classification_loss: 0.3098 197/500 [==========>...................] - ETA: 1:15 - loss: 1.6989 - regression_loss: 1.3895 - classification_loss: 0.3093 198/500 [==========>...................] - ETA: 1:15 - loss: 1.6978 - regression_loss: 1.3890 - classification_loss: 0.3088 199/500 [==========>...................] - ETA: 1:15 - loss: 1.6994 - regression_loss: 1.3906 - classification_loss: 0.3088 200/500 [===========>..................] - ETA: 1:15 - loss: 1.7037 - regression_loss: 1.3938 - classification_loss: 0.3098 201/500 [===========>..................] 
- ETA: 1:14 - loss: 1.7015 - regression_loss: 1.3921 - classification_loss: 0.3094 202/500 [===========>..................] - ETA: 1:14 - loss: 1.6955 - regression_loss: 1.3871 - classification_loss: 0.3083 203/500 [===========>..................] - ETA: 1:14 - loss: 1.6911 - regression_loss: 1.3838 - classification_loss: 0.3073 204/500 [===========>..................] - ETA: 1:14 - loss: 1.6894 - regression_loss: 1.3827 - classification_loss: 0.3067 205/500 [===========>..................] - ETA: 1:13 - loss: 1.6886 - regression_loss: 1.3822 - classification_loss: 0.3064 206/500 [===========>..................] - ETA: 1:13 - loss: 1.6890 - regression_loss: 1.3823 - classification_loss: 0.3067 207/500 [===========>..................] - ETA: 1:13 - loss: 1.6900 - regression_loss: 1.3831 - classification_loss: 0.3070 208/500 [===========>..................] - ETA: 1:13 - loss: 1.6908 - regression_loss: 1.3839 - classification_loss: 0.3068 209/500 [===========>..................] - ETA: 1:12 - loss: 1.6899 - regression_loss: 1.3836 - classification_loss: 0.3062 210/500 [===========>..................] - ETA: 1:12 - loss: 1.6876 - regression_loss: 1.3822 - classification_loss: 0.3054 211/500 [===========>..................] - ETA: 1:12 - loss: 1.6861 - regression_loss: 1.3813 - classification_loss: 0.3049 212/500 [===========>..................] - ETA: 1:12 - loss: 1.6805 - regression_loss: 1.3768 - classification_loss: 0.3037 213/500 [===========>..................] - ETA: 1:11 - loss: 1.6792 - regression_loss: 1.3704 - classification_loss: 0.3088 214/500 [===========>..................] - ETA: 1:11 - loss: 1.6812 - regression_loss: 1.3720 - classification_loss: 0.3091 215/500 [===========>..................] - ETA: 1:11 - loss: 1.6800 - regression_loss: 1.3715 - classification_loss: 0.3085 216/500 [===========>..................] - ETA: 1:11 - loss: 1.6789 - regression_loss: 1.3711 - classification_loss: 0.3078 217/500 [============>.................] 
- ETA: 1:10 - loss: 1.6800 - regression_loss: 1.3718 - classification_loss: 0.3082 218/500 [============>.................] - ETA: 1:10 - loss: 1.6789 - regression_loss: 1.3712 - classification_loss: 0.3077 219/500 [============>.................] - ETA: 1:10 - loss: 1.6773 - regression_loss: 1.3703 - classification_loss: 0.3070 220/500 [============>.................] - ETA: 1:10 - loss: 1.6771 - regression_loss: 1.3698 - classification_loss: 0.3073 221/500 [============>.................] - ETA: 1:09 - loss: 1.6784 - regression_loss: 1.3713 - classification_loss: 0.3070 222/500 [============>.................] - ETA: 1:09 - loss: 1.6769 - regression_loss: 1.3701 - classification_loss: 0.3067 223/500 [============>.................] - ETA: 1:09 - loss: 1.6803 - regression_loss: 1.3733 - classification_loss: 0.3071 224/500 [============>.................] - ETA: 1:09 - loss: 1.6804 - regression_loss: 1.3727 - classification_loss: 0.3077 225/500 [============>.................] - ETA: 1:08 - loss: 1.6801 - regression_loss: 1.3723 - classification_loss: 0.3078 226/500 [============>.................] - ETA: 1:08 - loss: 1.6779 - regression_loss: 1.3702 - classification_loss: 0.3077 227/500 [============>.................] - ETA: 1:08 - loss: 1.6776 - regression_loss: 1.3703 - classification_loss: 0.3073 228/500 [============>.................] - ETA: 1:08 - loss: 1.6785 - regression_loss: 1.3711 - classification_loss: 0.3074 229/500 [============>.................] - ETA: 1:07 - loss: 1.6788 - regression_loss: 1.3718 - classification_loss: 0.3069 230/500 [============>.................] - ETA: 1:07 - loss: 1.6786 - regression_loss: 1.3716 - classification_loss: 0.3070 231/500 [============>.................] - ETA: 1:07 - loss: 1.6746 - regression_loss: 1.3687 - classification_loss: 0.3059 232/500 [============>.................] - ETA: 1:07 - loss: 1.6750 - regression_loss: 1.3693 - classification_loss: 0.3057 233/500 [============>.................] 
- ETA: 1:06 - loss: 1.6750 - regression_loss: 1.3698 - classification_loss: 0.3051 234/500 [=============>................] - ETA: 1:06 - loss: 1.6756 - regression_loss: 1.3704 - classification_loss: 0.3052 235/500 [=============>................] - ETA: 1:06 - loss: 1.6774 - regression_loss: 1.3719 - classification_loss: 0.3055 236/500 [=============>................] - ETA: 1:06 - loss: 1.6761 - regression_loss: 1.3711 - classification_loss: 0.3050 237/500 [=============>................] - ETA: 1:05 - loss: 1.6763 - regression_loss: 1.3721 - classification_loss: 0.3042 238/500 [=============>................] - ETA: 1:05 - loss: 1.6749 - regression_loss: 1.3713 - classification_loss: 0.3037 239/500 [=============>................] - ETA: 1:05 - loss: 1.6756 - regression_loss: 1.3723 - classification_loss: 0.3033 240/500 [=============>................] - ETA: 1:05 - loss: 1.6751 - regression_loss: 1.3724 - classification_loss: 0.3027 241/500 [=============>................] - ETA: 1:04 - loss: 1.6781 - regression_loss: 1.3745 - classification_loss: 0.3036 242/500 [=============>................] - ETA: 1:04 - loss: 1.6762 - regression_loss: 1.3731 - classification_loss: 0.3031 243/500 [=============>................] - ETA: 1:04 - loss: 1.6789 - regression_loss: 1.3747 - classification_loss: 0.3042 244/500 [=============>................] - ETA: 1:04 - loss: 1.6790 - regression_loss: 1.3751 - classification_loss: 0.3040 245/500 [=============>................] - ETA: 1:03 - loss: 1.6789 - regression_loss: 1.3756 - classification_loss: 0.3033 246/500 [=============>................] - ETA: 1:03 - loss: 1.6746 - regression_loss: 1.3722 - classification_loss: 0.3024 247/500 [=============>................] - ETA: 1:03 - loss: 1.6734 - regression_loss: 1.3715 - classification_loss: 0.3019 248/500 [=============>................] - ETA: 1:03 - loss: 1.6739 - regression_loss: 1.3720 - classification_loss: 0.3019 249/500 [=============>................] 
- ETA: 1:02 - loss: 1.6754 - regression_loss: 1.3735 - classification_loss: 0.3019 250/500 [==============>...............] - ETA: 1:02 - loss: 1.6774 - regression_loss: 1.3748 - classification_loss: 0.3026 251/500 [==============>...............] - ETA: 1:02 - loss: 1.6752 - regression_loss: 1.3734 - classification_loss: 0.3019 252/500 [==============>...............] - ETA: 1:02 - loss: 1.6747 - regression_loss: 1.3730 - classification_loss: 0.3017 253/500 [==============>...............] - ETA: 1:01 - loss: 1.6821 - regression_loss: 1.3791 - classification_loss: 0.3030 254/500 [==============>...............] - ETA: 1:01 - loss: 1.6809 - regression_loss: 1.3779 - classification_loss: 0.3030 255/500 [==============>...............] - ETA: 1:01 - loss: 1.6818 - regression_loss: 1.3790 - classification_loss: 0.3029 256/500 [==============>...............] - ETA: 1:01 - loss: 1.6826 - regression_loss: 1.3796 - classification_loss: 0.3030 257/500 [==============>...............] - ETA: 1:00 - loss: 1.6835 - regression_loss: 1.3804 - classification_loss: 0.3031 258/500 [==============>...............] - ETA: 1:00 - loss: 1.6827 - regression_loss: 1.3797 - classification_loss: 0.3029 259/500 [==============>...............] - ETA: 1:00 - loss: 1.6857 - regression_loss: 1.3821 - classification_loss: 0.3036 260/500 [==============>...............] - ETA: 1:00 - loss: 1.6806 - regression_loss: 1.3780 - classification_loss: 0.3026 261/500 [==============>...............] - ETA: 59s - loss: 1.6781 - regression_loss: 1.3758 - classification_loss: 0.3023  262/500 [==============>...............] - ETA: 59s - loss: 1.6781 - regression_loss: 1.3759 - classification_loss: 0.3022 263/500 [==============>...............] - ETA: 59s - loss: 1.6783 - regression_loss: 1.3762 - classification_loss: 0.3021 264/500 [==============>...............] - ETA: 59s - loss: 1.6794 - regression_loss: 1.3773 - classification_loss: 0.3021 265/500 [==============>...............] 
- ETA: 58s - loss: 1.6787 - regression_loss: 1.3771 - classification_loss: 0.3016 266/500 [==============>...............] - ETA: 58s - loss: 1.6792 - regression_loss: 1.3780 - classification_loss: 0.3011 267/500 [===============>..............] - ETA: 58s - loss: 1.6782 - regression_loss: 1.3774 - classification_loss: 0.3008 268/500 [===============>..............] - ETA: 58s - loss: 1.6790 - regression_loss: 1.3784 - classification_loss: 0.3006 269/500 [===============>..............] - ETA: 57s - loss: 1.6778 - regression_loss: 1.3778 - classification_loss: 0.2999 270/500 [===============>..............] - ETA: 57s - loss: 1.6768 - regression_loss: 1.3771 - classification_loss: 0.2997 271/500 [===============>..............] - ETA: 57s - loss: 1.6742 - regression_loss: 1.3752 - classification_loss: 0.2989 272/500 [===============>..............] - ETA: 57s - loss: 1.6747 - regression_loss: 1.3757 - classification_loss: 0.2990 273/500 [===============>..............] - ETA: 56s - loss: 1.6738 - regression_loss: 1.3751 - classification_loss: 0.2987 274/500 [===============>..............] - ETA: 56s - loss: 1.6769 - regression_loss: 1.3773 - classification_loss: 0.2996 275/500 [===============>..............] - ETA: 56s - loss: 1.6785 - regression_loss: 1.3787 - classification_loss: 0.2998 276/500 [===============>..............] - ETA: 56s - loss: 1.6796 - regression_loss: 1.3794 - classification_loss: 0.3002 277/500 [===============>..............] - ETA: 55s - loss: 1.6799 - regression_loss: 1.3800 - classification_loss: 0.2999 278/500 [===============>..............] - ETA: 55s - loss: 1.6840 - regression_loss: 1.3832 - classification_loss: 0.3008 279/500 [===============>..............] - ETA: 55s - loss: 1.6836 - regression_loss: 1.3829 - classification_loss: 0.3007 280/500 [===============>..............] - ETA: 55s - loss: 1.6818 - regression_loss: 1.3817 - classification_loss: 0.3001 281/500 [===============>..............] 
- ETA: 54s - loss: 1.6804 - regression_loss: 1.3808 - classification_loss: 0.2996 282/500 [===============>..............] - ETA: 54s - loss: 1.6830 - regression_loss: 1.3834 - classification_loss: 0.2997 283/500 [===============>..............] - ETA: 54s - loss: 1.6827 - regression_loss: 1.3833 - classification_loss: 0.2995 284/500 [================>.............] - ETA: 54s - loss: 1.6844 - regression_loss: 1.3848 - classification_loss: 0.2996 285/500 [================>.............] - ETA: 53s - loss: 1.6898 - regression_loss: 1.3896 - classification_loss: 0.3002 286/500 [================>.............] - ETA: 53s - loss: 1.6888 - regression_loss: 1.3889 - classification_loss: 0.2998 287/500 [================>.............] - ETA: 53s - loss: 1.6900 - regression_loss: 1.3898 - classification_loss: 0.3002 288/500 [================>.............] - ETA: 53s - loss: 1.6908 - regression_loss: 1.3907 - classification_loss: 0.3001 289/500 [================>.............] - ETA: 52s - loss: 1.6938 - regression_loss: 1.3937 - classification_loss: 0.3001 290/500 [================>.............] - ETA: 52s - loss: 1.6932 - regression_loss: 1.3932 - classification_loss: 0.3000 291/500 [================>.............] - ETA: 52s - loss: 1.6939 - regression_loss: 1.3940 - classification_loss: 0.3000 292/500 [================>.............] - ETA: 52s - loss: 1.6934 - regression_loss: 1.3936 - classification_loss: 0.2998 293/500 [================>.............] - ETA: 51s - loss: 1.6916 - regression_loss: 1.3924 - classification_loss: 0.2993 294/500 [================>.............] - ETA: 51s - loss: 1.6944 - regression_loss: 1.3947 - classification_loss: 0.2997 295/500 [================>.............] - ETA: 51s - loss: 1.6929 - regression_loss: 1.3937 - classification_loss: 0.2992 296/500 [================>.............] - ETA: 51s - loss: 1.6915 - regression_loss: 1.3928 - classification_loss: 0.2987 297/500 [================>.............] 
- ETA: 50s - loss: 1.6907 - regression_loss: 1.3923 - classification_loss: 0.2985 298/500 [================>.............] - ETA: 50s - loss: 1.6910 - regression_loss: 1.3927 - classification_loss: 0.2984 299/500 [================>.............] - ETA: 50s - loss: 1.6922 - regression_loss: 1.3935 - classification_loss: 0.2987 300/500 [=================>............] - ETA: 50s - loss: 1.6938 - regression_loss: 1.3948 - classification_loss: 0.2990 301/500 [=================>............] - ETA: 49s - loss: 1.6900 - regression_loss: 1.3916 - classification_loss: 0.2984 302/500 [=================>............] - ETA: 49s - loss: 1.6872 - regression_loss: 1.3894 - classification_loss: 0.2977 303/500 [=================>............] - ETA: 49s - loss: 1.6877 - regression_loss: 1.3902 - classification_loss: 0.2975 304/500 [=================>............] - ETA: 49s - loss: 1.6862 - regression_loss: 1.3891 - classification_loss: 0.2971 305/500 [=================>............] - ETA: 48s - loss: 1.6847 - regression_loss: 1.3879 - classification_loss: 0.2968 306/500 [=================>............] - ETA: 48s - loss: 1.6840 - regression_loss: 1.3874 - classification_loss: 0.2966 307/500 [=================>............] - ETA: 48s - loss: 1.6837 - regression_loss: 1.3872 - classification_loss: 0.2965 308/500 [=================>............] - ETA: 48s - loss: 1.6826 - regression_loss: 1.3864 - classification_loss: 0.2961 309/500 [=================>............] - ETA: 47s - loss: 1.6816 - regression_loss: 1.3854 - classification_loss: 0.2961 310/500 [=================>............] - ETA: 47s - loss: 1.6841 - regression_loss: 1.3877 - classification_loss: 0.2964 311/500 [=================>............] - ETA: 47s - loss: 1.6816 - regression_loss: 1.3856 - classification_loss: 0.2959 312/500 [=================>............] - ETA: 47s - loss: 1.6789 - regression_loss: 1.3835 - classification_loss: 0.2954 313/500 [=================>............] 
- ETA: 46s - loss: 1.6768 - regression_loss: 1.3816 - classification_loss: 0.2953 314/500 [=================>............] - ETA: 46s - loss: 1.6785 - regression_loss: 1.3830 - classification_loss: 0.2955 315/500 [=================>............] - ETA: 46s - loss: 1.6771 - regression_loss: 1.3820 - classification_loss: 0.2951 316/500 [=================>............] - ETA: 46s - loss: 1.6769 - regression_loss: 1.3819 - classification_loss: 0.2950 317/500 [==================>...........] - ETA: 45s - loss: 1.6755 - regression_loss: 1.3810 - classification_loss: 0.2945 318/500 [==================>...........] - ETA: 45s - loss: 1.6738 - regression_loss: 1.3797 - classification_loss: 0.2941 319/500 [==================>...........] - ETA: 45s - loss: 1.6722 - regression_loss: 1.3786 - classification_loss: 0.2937 320/500 [==================>...........] - ETA: 45s - loss: 1.6716 - regression_loss: 1.3784 - classification_loss: 0.2932 321/500 [==================>...........] - ETA: 44s - loss: 1.6713 - regression_loss: 1.3785 - classification_loss: 0.2928 322/500 [==================>...........] - ETA: 44s - loss: 1.6700 - regression_loss: 1.3777 - classification_loss: 0.2924 323/500 [==================>...........] - ETA: 44s - loss: 1.6689 - regression_loss: 1.3769 - classification_loss: 0.2920 324/500 [==================>...........] - ETA: 44s - loss: 1.6698 - regression_loss: 1.3779 - classification_loss: 0.2918 325/500 [==================>...........] - ETA: 43s - loss: 1.6703 - regression_loss: 1.3786 - classification_loss: 0.2917 326/500 [==================>...........] - ETA: 43s - loss: 1.6684 - regression_loss: 1.3772 - classification_loss: 0.2912 327/500 [==================>...........] - ETA: 43s - loss: 1.6696 - regression_loss: 1.3779 - classification_loss: 0.2917 328/500 [==================>...........] - ETA: 43s - loss: 1.6695 - regression_loss: 1.3779 - classification_loss: 0.2916 329/500 [==================>...........] 
- ETA: 42s - loss: 1.6729 - regression_loss: 1.3804 - classification_loss: 0.2925 330/500 [==================>...........] - ETA: 42s - loss: 1.6755 - regression_loss: 1.3823 - classification_loss: 0.2932 331/500 [==================>...........] - ETA: 42s - loss: 1.6757 - regression_loss: 1.3826 - classification_loss: 0.2931 332/500 [==================>...........] - ETA: 42s - loss: 1.6762 - regression_loss: 1.3833 - classification_loss: 0.2930 333/500 [==================>...........] - ETA: 41s - loss: 1.6792 - regression_loss: 1.3853 - classification_loss: 0.2939 334/500 [===================>..........] - ETA: 41s - loss: 1.6802 - regression_loss: 1.3859 - classification_loss: 0.2942 335/500 [===================>..........] - ETA: 41s - loss: 1.6795 - regression_loss: 1.3856 - classification_loss: 0.2939 336/500 [===================>..........] - ETA: 41s - loss: 1.6798 - regression_loss: 1.3861 - classification_loss: 0.2937 337/500 [===================>..........] - ETA: 40s - loss: 1.6793 - regression_loss: 1.3857 - classification_loss: 0.2936 338/500 [===================>..........] - ETA: 40s - loss: 1.6785 - regression_loss: 1.3851 - classification_loss: 0.2935 339/500 [===================>..........] - ETA: 40s - loss: 1.6799 - regression_loss: 1.3860 - classification_loss: 0.2939 340/500 [===================>..........] - ETA: 40s - loss: 1.6815 - regression_loss: 1.3872 - classification_loss: 0.2942 341/500 [===================>..........] - ETA: 39s - loss: 1.6805 - regression_loss: 1.3865 - classification_loss: 0.2940 342/500 [===================>..........] - ETA: 39s - loss: 1.6781 - regression_loss: 1.3848 - classification_loss: 0.2934 343/500 [===================>..........] - ETA: 39s - loss: 1.6770 - regression_loss: 1.3841 - classification_loss: 0.2929 344/500 [===================>..........] - ETA: 39s - loss: 1.6762 - regression_loss: 1.3835 - classification_loss: 0.2926 345/500 [===================>..........] 
- ETA: 38s - loss: 1.6754 - regression_loss: 1.3828 - classification_loss: 0.2926 346/500 [===================>..........] - ETA: 38s - loss: 1.6787 - regression_loss: 1.3835 - classification_loss: 0.2952 347/500 [===================>..........] - ETA: 38s - loss: 1.6787 - regression_loss: 1.3834 - classification_loss: 0.2953 348/500 [===================>..........] - ETA: 38s - loss: 1.6778 - regression_loss: 1.3827 - classification_loss: 0.2951 349/500 [===================>..........] - ETA: 37s - loss: 1.6747 - regression_loss: 1.3803 - classification_loss: 0.2945 350/500 [====================>.........] - ETA: 37s - loss: 1.6742 - regression_loss: 1.3803 - classification_loss: 0.2939 351/500 [====================>.........] - ETA: 37s - loss: 1.6735 - regression_loss: 1.3799 - classification_loss: 0.2936 352/500 [====================>.........] - ETA: 37s - loss: 1.6745 - regression_loss: 1.3808 - classification_loss: 0.2938 353/500 [====================>.........] - ETA: 36s - loss: 1.6723 - regression_loss: 1.3791 - classification_loss: 0.2932 354/500 [====================>.........] - ETA: 36s - loss: 1.6732 - regression_loss: 1.3800 - classification_loss: 0.2932 355/500 [====================>.........] - ETA: 36s - loss: 1.6741 - regression_loss: 1.3808 - classification_loss: 0.2933 356/500 [====================>.........] - ETA: 36s - loss: 1.6740 - regression_loss: 1.3807 - classification_loss: 0.2932 357/500 [====================>.........] - ETA: 35s - loss: 1.6745 - regression_loss: 1.3812 - classification_loss: 0.2932 358/500 [====================>.........] - ETA: 35s - loss: 1.6749 - regression_loss: 1.3820 - classification_loss: 0.2929 359/500 [====================>.........] - ETA: 35s - loss: 1.6753 - regression_loss: 1.3826 - classification_loss: 0.2927 360/500 [====================>.........] - ETA: 35s - loss: 1.6748 - regression_loss: 1.3824 - classification_loss: 0.2924 361/500 [====================>.........] 
[per-batch progress updates for epoch 50 (steps 362-499) trimmed]
500/500 [==============================] - 125s 250ms/step - loss: 1.6681 - regression_loss: 1.3784 - classification_loss: 0.2898
326 instances of class plum with average precision: 0.7857
mAP: 0.7857
Epoch 00050: saving model to ./training/snapshots/resnet50_pascal_50.h5
Epoch 51/150
[per-batch progress updates for epoch 51 (steps 1-196) trimmed; running loss spiked to ~3.42 at step 5, then settled to ~1.73 by step 196]
- ETA: 1:16 - loss: 1.7300 - regression_loss: 1.4132 - classification_loss: 0.3168 197/500 [==========>...................] - ETA: 1:16 - loss: 1.7242 - regression_loss: 1.4086 - classification_loss: 0.3155 198/500 [==========>...................] - ETA: 1:16 - loss: 1.7282 - regression_loss: 1.4116 - classification_loss: 0.3165 199/500 [==========>...................] - ETA: 1:15 - loss: 1.7282 - regression_loss: 1.4120 - classification_loss: 0.3162 200/500 [===========>..................] - ETA: 1:15 - loss: 1.7281 - regression_loss: 1.4124 - classification_loss: 0.3158 201/500 [===========>..................] - ETA: 1:15 - loss: 1.7261 - regression_loss: 1.4111 - classification_loss: 0.3150 202/500 [===========>..................] - ETA: 1:14 - loss: 1.7208 - regression_loss: 1.4041 - classification_loss: 0.3167 203/500 [===========>..................] - ETA: 1:14 - loss: 1.7221 - regression_loss: 1.4050 - classification_loss: 0.3172 204/500 [===========>..................] - ETA: 1:14 - loss: 1.7233 - regression_loss: 1.4064 - classification_loss: 0.3169 205/500 [===========>..................] - ETA: 1:14 - loss: 1.7224 - regression_loss: 1.4060 - classification_loss: 0.3164 206/500 [===========>..................] - ETA: 1:13 - loss: 1.7237 - regression_loss: 1.4071 - classification_loss: 0.3166 207/500 [===========>..................] - ETA: 1:13 - loss: 1.7207 - regression_loss: 1.4051 - classification_loss: 0.3156 208/500 [===========>..................] - ETA: 1:13 - loss: 1.7222 - regression_loss: 1.4062 - classification_loss: 0.3160 209/500 [===========>..................] - ETA: 1:13 - loss: 1.7213 - regression_loss: 1.4057 - classification_loss: 0.3155 210/500 [===========>..................] - ETA: 1:12 - loss: 1.7197 - regression_loss: 1.4047 - classification_loss: 0.3151 211/500 [===========>..................] - ETA: 1:12 - loss: 1.7207 - regression_loss: 1.4058 - classification_loss: 0.3148 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.7217 - regression_loss: 1.4075 - classification_loss: 0.3142 213/500 [===========>..................] - ETA: 1:12 - loss: 1.7174 - regression_loss: 1.4044 - classification_loss: 0.3130 214/500 [===========>..................] - ETA: 1:11 - loss: 1.7161 - regression_loss: 1.4040 - classification_loss: 0.3121 215/500 [===========>..................] - ETA: 1:11 - loss: 1.7154 - regression_loss: 1.4039 - classification_loss: 0.3115 216/500 [===========>..................] - ETA: 1:11 - loss: 1.7149 - regression_loss: 1.4035 - classification_loss: 0.3114 217/500 [============>.................] - ETA: 1:11 - loss: 1.7146 - regression_loss: 1.4038 - classification_loss: 0.3108 218/500 [============>.................] - ETA: 1:10 - loss: 1.7122 - regression_loss: 1.4024 - classification_loss: 0.3098 219/500 [============>.................] - ETA: 1:10 - loss: 1.7130 - regression_loss: 1.4030 - classification_loss: 0.3100 220/500 [============>.................] - ETA: 1:10 - loss: 1.7140 - regression_loss: 1.4045 - classification_loss: 0.3095 221/500 [============>.................] - ETA: 1:10 - loss: 1.7112 - regression_loss: 1.4026 - classification_loss: 0.3086 222/500 [============>.................] - ETA: 1:09 - loss: 1.7119 - regression_loss: 1.4037 - classification_loss: 0.3082 223/500 [============>.................] - ETA: 1:09 - loss: 1.7131 - regression_loss: 1.4044 - classification_loss: 0.3087 224/500 [============>.................] - ETA: 1:09 - loss: 1.7090 - regression_loss: 1.4014 - classification_loss: 0.3076 225/500 [============>.................] - ETA: 1:09 - loss: 1.7110 - regression_loss: 1.4030 - classification_loss: 0.3080 226/500 [============>.................] - ETA: 1:08 - loss: 1.7101 - regression_loss: 1.4026 - classification_loss: 0.3075 227/500 [============>.................] - ETA: 1:08 - loss: 1.7158 - regression_loss: 1.4072 - classification_loss: 0.3085 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.7135 - regression_loss: 1.4056 - classification_loss: 0.3079 229/500 [============>.................] - ETA: 1:08 - loss: 1.7152 - regression_loss: 1.4068 - classification_loss: 0.3084 230/500 [============>.................] - ETA: 1:07 - loss: 1.7180 - regression_loss: 1.4089 - classification_loss: 0.3090 231/500 [============>.................] - ETA: 1:07 - loss: 1.7175 - regression_loss: 1.4089 - classification_loss: 0.3086 232/500 [============>.................] - ETA: 1:07 - loss: 1.7170 - regression_loss: 1.4089 - classification_loss: 0.3081 233/500 [============>.................] - ETA: 1:07 - loss: 1.7181 - regression_loss: 1.4099 - classification_loss: 0.3082 234/500 [=============>................] - ETA: 1:06 - loss: 1.7163 - regression_loss: 1.4086 - classification_loss: 0.3077 235/500 [=============>................] - ETA: 1:06 - loss: 1.7163 - regression_loss: 1.4090 - classification_loss: 0.3073 236/500 [=============>................] - ETA: 1:06 - loss: 1.7172 - regression_loss: 1.4099 - classification_loss: 0.3073 237/500 [=============>................] - ETA: 1:06 - loss: 1.7165 - regression_loss: 1.4090 - classification_loss: 0.3075 238/500 [=============>................] - ETA: 1:05 - loss: 1.7174 - regression_loss: 1.4099 - classification_loss: 0.3075 239/500 [=============>................] - ETA: 1:05 - loss: 1.7152 - regression_loss: 1.4086 - classification_loss: 0.3066 240/500 [=============>................] - ETA: 1:05 - loss: 1.7164 - regression_loss: 1.4096 - classification_loss: 0.3068 241/500 [=============>................] - ETA: 1:05 - loss: 1.7172 - regression_loss: 1.4104 - classification_loss: 0.3068 242/500 [=============>................] - ETA: 1:04 - loss: 1.7150 - regression_loss: 1.4091 - classification_loss: 0.3058 243/500 [=============>................] - ETA: 1:04 - loss: 1.7160 - regression_loss: 1.4100 - classification_loss: 0.3060 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.7179 - regression_loss: 1.4118 - classification_loss: 0.3060 245/500 [=============>................] - ETA: 1:04 - loss: 1.7175 - regression_loss: 1.4114 - classification_loss: 0.3061 246/500 [=============>................] - ETA: 1:03 - loss: 1.7184 - regression_loss: 1.4121 - classification_loss: 0.3063 247/500 [=============>................] - ETA: 1:03 - loss: 1.7177 - regression_loss: 1.4118 - classification_loss: 0.3059 248/500 [=============>................] - ETA: 1:03 - loss: 1.7195 - regression_loss: 1.4130 - classification_loss: 0.3064 249/500 [=============>................] - ETA: 1:03 - loss: 1.7156 - regression_loss: 1.4090 - classification_loss: 0.3066 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7159 - regression_loss: 1.4095 - classification_loss: 0.3064 251/500 [==============>...............] - ETA: 1:02 - loss: 1.7161 - regression_loss: 1.4097 - classification_loss: 0.3063 252/500 [==============>...............] - ETA: 1:02 - loss: 1.7151 - regression_loss: 1.4091 - classification_loss: 0.3061 253/500 [==============>...............] - ETA: 1:02 - loss: 1.7168 - regression_loss: 1.4106 - classification_loss: 0.3063 254/500 [==============>...............] - ETA: 1:01 - loss: 1.7176 - regression_loss: 1.4111 - classification_loss: 0.3065 255/500 [==============>...............] - ETA: 1:01 - loss: 1.7207 - regression_loss: 1.4140 - classification_loss: 0.3067 256/500 [==============>...............] - ETA: 1:01 - loss: 1.7189 - regression_loss: 1.4127 - classification_loss: 0.3061 257/500 [==============>...............] - ETA: 1:01 - loss: 1.7189 - regression_loss: 1.4126 - classification_loss: 0.3062 258/500 [==============>...............] - ETA: 1:00 - loss: 1.7193 - regression_loss: 1.4128 - classification_loss: 0.3064 259/500 [==============>...............] - ETA: 1:00 - loss: 1.7210 - regression_loss: 1.4146 - classification_loss: 0.3064 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.7192 - regression_loss: 1.4132 - classification_loss: 0.3061 261/500 [==============>...............] - ETA: 1:00 - loss: 1.7174 - regression_loss: 1.4114 - classification_loss: 0.3060 262/500 [==============>...............] - ETA: 59s - loss: 1.7184 - regression_loss: 1.4122 - classification_loss: 0.3062  263/500 [==============>...............] - ETA: 59s - loss: 1.7211 - regression_loss: 1.4138 - classification_loss: 0.3073 264/500 [==============>...............] - ETA: 59s - loss: 1.7195 - regression_loss: 1.4125 - classification_loss: 0.3069 265/500 [==============>...............] - ETA: 59s - loss: 1.7183 - regression_loss: 1.4115 - classification_loss: 0.3068 266/500 [==============>...............] - ETA: 58s - loss: 1.7168 - regression_loss: 1.4107 - classification_loss: 0.3061 267/500 [===============>..............] - ETA: 58s - loss: 1.7151 - regression_loss: 1.4093 - classification_loss: 0.3058 268/500 [===============>..............] - ETA: 58s - loss: 1.7134 - regression_loss: 1.4080 - classification_loss: 0.3054 269/500 [===============>..............] - ETA: 58s - loss: 1.7091 - regression_loss: 1.4044 - classification_loss: 0.3047 270/500 [===============>..............] - ETA: 57s - loss: 1.7105 - regression_loss: 1.4059 - classification_loss: 0.3047 271/500 [===============>..............] - ETA: 57s - loss: 1.7119 - regression_loss: 1.4069 - classification_loss: 0.3050 272/500 [===============>..............] - ETA: 57s - loss: 1.7143 - regression_loss: 1.4092 - classification_loss: 0.3051 273/500 [===============>..............] - ETA: 57s - loss: 1.7102 - regression_loss: 1.4062 - classification_loss: 0.3041 274/500 [===============>..............] - ETA: 56s - loss: 1.7076 - regression_loss: 1.4043 - classification_loss: 0.3033 275/500 [===============>..............] - ETA: 56s - loss: 1.7062 - regression_loss: 1.4034 - classification_loss: 0.3028 276/500 [===============>..............] 
- ETA: 56s - loss: 1.7083 - regression_loss: 1.4054 - classification_loss: 0.3029 277/500 [===============>..............] - ETA: 56s - loss: 1.7083 - regression_loss: 1.4055 - classification_loss: 0.3028 278/500 [===============>..............] - ETA: 55s - loss: 1.7067 - regression_loss: 1.4046 - classification_loss: 0.3022 279/500 [===============>..............] - ETA: 55s - loss: 1.7077 - regression_loss: 1.4055 - classification_loss: 0.3022 280/500 [===============>..............] - ETA: 55s - loss: 1.7113 - regression_loss: 1.4086 - classification_loss: 0.3027 281/500 [===============>..............] - ETA: 55s - loss: 1.7107 - regression_loss: 1.4082 - classification_loss: 0.3025 282/500 [===============>..............] - ETA: 54s - loss: 1.7110 - regression_loss: 1.4088 - classification_loss: 0.3023 283/500 [===============>..............] - ETA: 54s - loss: 1.7107 - regression_loss: 1.4085 - classification_loss: 0.3022 284/500 [================>.............] - ETA: 54s - loss: 1.7106 - regression_loss: 1.4086 - classification_loss: 0.3020 285/500 [================>.............] - ETA: 54s - loss: 1.7130 - regression_loss: 1.4107 - classification_loss: 0.3023 286/500 [================>.............] - ETA: 53s - loss: 1.7122 - regression_loss: 1.4105 - classification_loss: 0.3017 287/500 [================>.............] - ETA: 53s - loss: 1.7129 - regression_loss: 1.4111 - classification_loss: 0.3017 288/500 [================>.............] - ETA: 53s - loss: 1.7137 - regression_loss: 1.4119 - classification_loss: 0.3018 289/500 [================>.............] - ETA: 53s - loss: 1.7138 - regression_loss: 1.4123 - classification_loss: 0.3015 290/500 [================>.............] - ETA: 52s - loss: 1.7128 - regression_loss: 1.4116 - classification_loss: 0.3012 291/500 [================>.............] - ETA: 52s - loss: 1.7131 - regression_loss: 1.4120 - classification_loss: 0.3011 292/500 [================>.............] 
- ETA: 52s - loss: 1.7135 - regression_loss: 1.4124 - classification_loss: 0.3011 293/500 [================>.............] - ETA: 52s - loss: 1.7137 - regression_loss: 1.4128 - classification_loss: 0.3009 294/500 [================>.............] - ETA: 51s - loss: 1.7116 - regression_loss: 1.4114 - classification_loss: 0.3002 295/500 [================>.............] - ETA: 51s - loss: 1.7117 - regression_loss: 1.4113 - classification_loss: 0.3005 296/500 [================>.............] - ETA: 51s - loss: 1.7130 - regression_loss: 1.4122 - classification_loss: 0.3008 297/500 [================>.............] - ETA: 51s - loss: 1.7108 - regression_loss: 1.4106 - classification_loss: 0.3002 298/500 [================>.............] - ETA: 50s - loss: 1.7103 - regression_loss: 1.4106 - classification_loss: 0.2998 299/500 [================>.............] - ETA: 50s - loss: 1.7106 - regression_loss: 1.4107 - classification_loss: 0.2999 300/500 [=================>............] - ETA: 50s - loss: 1.7092 - regression_loss: 1.4095 - classification_loss: 0.2997 301/500 [=================>............] - ETA: 50s - loss: 1.7083 - regression_loss: 1.4090 - classification_loss: 0.2992 302/500 [=================>............] - ETA: 49s - loss: 1.7074 - regression_loss: 1.4084 - classification_loss: 0.2990 303/500 [=================>............] - ETA: 49s - loss: 1.7078 - regression_loss: 1.4089 - classification_loss: 0.2989 304/500 [=================>............] - ETA: 49s - loss: 1.7087 - regression_loss: 1.4096 - classification_loss: 0.2991 305/500 [=================>............] - ETA: 49s - loss: 1.7075 - regression_loss: 1.4088 - classification_loss: 0.2986 306/500 [=================>............] - ETA: 48s - loss: 1.7088 - regression_loss: 1.4103 - classification_loss: 0.2986 307/500 [=================>............] - ETA: 48s - loss: 1.7076 - regression_loss: 1.4095 - classification_loss: 0.2982 308/500 [=================>............] 
- ETA: 48s - loss: 1.7077 - regression_loss: 1.4094 - classification_loss: 0.2983 309/500 [=================>............] - ETA: 48s - loss: 1.7082 - regression_loss: 1.4096 - classification_loss: 0.2986 310/500 [=================>............] - ETA: 47s - loss: 1.7062 - regression_loss: 1.4081 - classification_loss: 0.2981 311/500 [=================>............] - ETA: 47s - loss: 1.7071 - regression_loss: 1.4090 - classification_loss: 0.2981 312/500 [=================>............] - ETA: 47s - loss: 1.7061 - regression_loss: 1.4084 - classification_loss: 0.2977 313/500 [=================>............] - ETA: 47s - loss: 1.7065 - regression_loss: 1.4086 - classification_loss: 0.2979 314/500 [=================>............] - ETA: 46s - loss: 1.7060 - regression_loss: 1.4083 - classification_loss: 0.2977 315/500 [=================>............] - ETA: 46s - loss: 1.7066 - regression_loss: 1.4086 - classification_loss: 0.2980 316/500 [=================>............] - ETA: 46s - loss: 1.7055 - regression_loss: 1.4079 - classification_loss: 0.2976 317/500 [==================>...........] - ETA: 46s - loss: 1.7052 - regression_loss: 1.4080 - classification_loss: 0.2973 318/500 [==================>...........] - ETA: 45s - loss: 1.7053 - regression_loss: 1.4080 - classification_loss: 0.2972 319/500 [==================>...........] - ETA: 45s - loss: 1.7082 - regression_loss: 1.4104 - classification_loss: 0.2978 320/500 [==================>...........] - ETA: 45s - loss: 1.7074 - regression_loss: 1.4098 - classification_loss: 0.2976 321/500 [==================>...........] - ETA: 44s - loss: 1.7073 - regression_loss: 1.4099 - classification_loss: 0.2974 322/500 [==================>...........] - ETA: 44s - loss: 1.7085 - regression_loss: 1.4107 - classification_loss: 0.2978 323/500 [==================>...........] - ETA: 44s - loss: 1.7061 - regression_loss: 1.4089 - classification_loss: 0.2972 324/500 [==================>...........] 
- ETA: 44s - loss: 1.7035 - regression_loss: 1.4069 - classification_loss: 0.2965 325/500 [==================>...........] - ETA: 43s - loss: 1.7027 - regression_loss: 1.4063 - classification_loss: 0.2964 326/500 [==================>...........] - ETA: 43s - loss: 1.7014 - regression_loss: 1.4054 - classification_loss: 0.2960 327/500 [==================>...........] - ETA: 43s - loss: 1.7049 - regression_loss: 1.4084 - classification_loss: 0.2965 328/500 [==================>...........] - ETA: 43s - loss: 1.7056 - regression_loss: 1.4090 - classification_loss: 0.2966 329/500 [==================>...........] - ETA: 42s - loss: 1.7076 - regression_loss: 1.4105 - classification_loss: 0.2971 330/500 [==================>...........] - ETA: 42s - loss: 1.7070 - regression_loss: 1.4101 - classification_loss: 0.2969 331/500 [==================>...........] - ETA: 42s - loss: 1.7059 - regression_loss: 1.4095 - classification_loss: 0.2965 332/500 [==================>...........] - ETA: 42s - loss: 1.7047 - regression_loss: 1.4084 - classification_loss: 0.2962 333/500 [==================>...........] - ETA: 41s - loss: 1.7030 - regression_loss: 1.4072 - classification_loss: 0.2958 334/500 [===================>..........] - ETA: 41s - loss: 1.7038 - regression_loss: 1.4080 - classification_loss: 0.2957 335/500 [===================>..........] - ETA: 41s - loss: 1.7052 - regression_loss: 1.4094 - classification_loss: 0.2958 336/500 [===================>..........] - ETA: 41s - loss: 1.7023 - regression_loss: 1.4071 - classification_loss: 0.2951 337/500 [===================>..........] - ETA: 40s - loss: 1.7007 - regression_loss: 1.4060 - classification_loss: 0.2946 338/500 [===================>..........] - ETA: 40s - loss: 1.7012 - regression_loss: 1.4066 - classification_loss: 0.2946 339/500 [===================>..........] - ETA: 40s - loss: 1.7034 - regression_loss: 1.4084 - classification_loss: 0.2950 340/500 [===================>..........] 
- ETA: 40s - loss: 1.7033 - regression_loss: 1.4085 - classification_loss: 0.2948 341/500 [===================>..........] - ETA: 39s - loss: 1.7009 - regression_loss: 1.4066 - classification_loss: 0.2943 342/500 [===================>..........] - ETA: 39s - loss: 1.7024 - regression_loss: 1.4073 - classification_loss: 0.2951 343/500 [===================>..........] - ETA: 39s - loss: 1.7028 - regression_loss: 1.4076 - classification_loss: 0.2952 344/500 [===================>..........] - ETA: 39s - loss: 1.7036 - regression_loss: 1.4081 - classification_loss: 0.2955 345/500 [===================>..........] - ETA: 38s - loss: 1.7040 - regression_loss: 1.4087 - classification_loss: 0.2953 346/500 [===================>..........] - ETA: 38s - loss: 1.7047 - regression_loss: 1.4096 - classification_loss: 0.2951 347/500 [===================>..........] - ETA: 38s - loss: 1.7037 - regression_loss: 1.4089 - classification_loss: 0.2948 348/500 [===================>..........] - ETA: 38s - loss: 1.7034 - regression_loss: 1.4089 - classification_loss: 0.2945 349/500 [===================>..........] - ETA: 37s - loss: 1.7038 - regression_loss: 1.4091 - classification_loss: 0.2947 350/500 [====================>.........] - ETA: 37s - loss: 1.7032 - regression_loss: 1.4087 - classification_loss: 0.2944 351/500 [====================>.........] - ETA: 37s - loss: 1.7026 - regression_loss: 1.4083 - classification_loss: 0.2944 352/500 [====================>.........] - ETA: 37s - loss: 1.7040 - regression_loss: 1.4093 - classification_loss: 0.2947 353/500 [====================>.........] - ETA: 36s - loss: 1.7037 - regression_loss: 1.4094 - classification_loss: 0.2943 354/500 [====================>.........] - ETA: 36s - loss: 1.7051 - regression_loss: 1.4103 - classification_loss: 0.2948 355/500 [====================>.........] - ETA: 36s - loss: 1.7052 - regression_loss: 1.4104 - classification_loss: 0.2948 356/500 [====================>.........] 
- ETA: 36s - loss: 1.7065 - regression_loss: 1.4116 - classification_loss: 0.2948 357/500 [====================>.........] - ETA: 35s - loss: 1.7065 - regression_loss: 1.4116 - classification_loss: 0.2950 358/500 [====================>.........] - ETA: 35s - loss: 1.7070 - regression_loss: 1.4120 - classification_loss: 0.2950 359/500 [====================>.........] - ETA: 35s - loss: 1.7107 - regression_loss: 1.4129 - classification_loss: 0.2978 360/500 [====================>.........] - ETA: 35s - loss: 1.7116 - regression_loss: 1.4134 - classification_loss: 0.2982 361/500 [====================>.........] - ETA: 34s - loss: 1.7114 - regression_loss: 1.4133 - classification_loss: 0.2981 362/500 [====================>.........] - ETA: 34s - loss: 1.7112 - regression_loss: 1.4134 - classification_loss: 0.2978 363/500 [====================>.........] - ETA: 34s - loss: 1.7113 - regression_loss: 1.4134 - classification_loss: 0.2979 364/500 [====================>.........] - ETA: 34s - loss: 1.7110 - regression_loss: 1.4133 - classification_loss: 0.2977 365/500 [====================>.........] - ETA: 33s - loss: 1.7113 - regression_loss: 1.4135 - classification_loss: 0.2978 366/500 [====================>.........] - ETA: 33s - loss: 1.7116 - regression_loss: 1.4138 - classification_loss: 0.2978 367/500 [=====================>........] - ETA: 33s - loss: 1.7107 - regression_loss: 1.4132 - classification_loss: 0.2974 368/500 [=====================>........] - ETA: 33s - loss: 1.7113 - regression_loss: 1.4140 - classification_loss: 0.2973 369/500 [=====================>........] - ETA: 32s - loss: 1.7107 - regression_loss: 1.4136 - classification_loss: 0.2971 370/500 [=====================>........] - ETA: 32s - loss: 1.7112 - regression_loss: 1.4144 - classification_loss: 0.2968 371/500 [=====================>........] - ETA: 32s - loss: 1.7109 - regression_loss: 1.4144 - classification_loss: 0.2965 372/500 [=====================>........] 
- ETA: 32s - loss: 1.7085 - regression_loss: 1.4126 - classification_loss: 0.2959 373/500 [=====================>........] - ETA: 31s - loss: 1.7104 - regression_loss: 1.4141 - classification_loss: 0.2963 374/500 [=====================>........] - ETA: 31s - loss: 1.7087 - regression_loss: 1.4128 - classification_loss: 0.2959 375/500 [=====================>........] - ETA: 31s - loss: 1.7076 - regression_loss: 1.4120 - classification_loss: 0.2956 376/500 [=====================>........] - ETA: 31s - loss: 1.7078 - regression_loss: 1.4120 - classification_loss: 0.2957 377/500 [=====================>........] - ETA: 30s - loss: 1.7070 - regression_loss: 1.4115 - classification_loss: 0.2956 378/500 [=====================>........] - ETA: 30s - loss: 1.7057 - regression_loss: 1.4105 - classification_loss: 0.2952 379/500 [=====================>........] - ETA: 30s - loss: 1.7058 - regression_loss: 1.4105 - classification_loss: 0.2952 380/500 [=====================>........] - ETA: 30s - loss: 1.7059 - regression_loss: 1.4105 - classification_loss: 0.2954 381/500 [=====================>........] - ETA: 29s - loss: 1.7066 - regression_loss: 1.4106 - classification_loss: 0.2960 382/500 [=====================>........] - ETA: 29s - loss: 1.7069 - regression_loss: 1.4108 - classification_loss: 0.2961 383/500 [=====================>........] - ETA: 29s - loss: 1.7071 - regression_loss: 1.4110 - classification_loss: 0.2961 384/500 [======================>.......] - ETA: 29s - loss: 1.7071 - regression_loss: 1.4113 - classification_loss: 0.2959 385/500 [======================>.......] - ETA: 28s - loss: 1.7062 - regression_loss: 1.4106 - classification_loss: 0.2956 386/500 [======================>.......] - ETA: 28s - loss: 1.7048 - regression_loss: 1.4095 - classification_loss: 0.2953 387/500 [======================>.......] - ETA: 28s - loss: 1.7069 - regression_loss: 1.4111 - classification_loss: 0.2959 388/500 [======================>.......] 
- ETA: 28s - loss: 1.7074 - regression_loss: 1.4115 - classification_loss: 0.2958 389/500 [======================>.......] - ETA: 27s - loss: 1.7079 - regression_loss: 1.4120 - classification_loss: 0.2959 390/500 [======================>.......] - ETA: 27s - loss: 1.7065 - regression_loss: 1.4107 - classification_loss: 0.2958 391/500 [======================>.......] - ETA: 27s - loss: 1.7060 - regression_loss: 1.4104 - classification_loss: 0.2956 392/500 [======================>.......] - ETA: 27s - loss: 1.7026 - regression_loss: 1.4076 - classification_loss: 0.2950 393/500 [======================>.......] - ETA: 26s - loss: 1.7029 - regression_loss: 1.4078 - classification_loss: 0.2952 394/500 [======================>.......] - ETA: 26s - loss: 1.7019 - regression_loss: 1.4069 - classification_loss: 0.2950 395/500 [======================>.......] - ETA: 26s - loss: 1.7019 - regression_loss: 1.4071 - classification_loss: 0.2948 396/500 [======================>.......] - ETA: 26s - loss: 1.7017 - regression_loss: 1.4070 - classification_loss: 0.2948 397/500 [======================>.......] - ETA: 25s - loss: 1.7018 - regression_loss: 1.4065 - classification_loss: 0.2953 398/500 [======================>.......] - ETA: 25s - loss: 1.7031 - regression_loss: 1.4075 - classification_loss: 0.2956 399/500 [======================>.......] - ETA: 25s - loss: 1.7023 - regression_loss: 1.4068 - classification_loss: 0.2955 400/500 [=======================>......] - ETA: 25s - loss: 1.7014 - regression_loss: 1.4062 - classification_loss: 0.2953 401/500 [=======================>......] - ETA: 24s - loss: 1.7008 - regression_loss: 1.4057 - classification_loss: 0.2951 402/500 [=======================>......] - ETA: 24s - loss: 1.7006 - regression_loss: 1.4057 - classification_loss: 0.2948 403/500 [=======================>......] - ETA: 24s - loss: 1.7014 - regression_loss: 1.4064 - classification_loss: 0.2950 404/500 [=======================>......] 
- ETA: 24s - loss: 1.7002 - regression_loss: 1.4056 - classification_loss: 0.2946
500/500 [==============================] - 126s 251ms/step - loss: 1.7113 - regression_loss: 1.4159 - classification_loss: 0.2954
326 instances of class plum with average precision: 0.7772
mAP: 0.7772
Epoch 00051: saving model to ./training/snapshots/resnet50_pascal_51.h5
Epoch 52/150
237/500 [=============>................] - ETA: 1:05 - loss: 1.6529 - regression_loss: 1.3936 - classification_loss: 0.2593 238/500 [=============>................]
- ETA: 1:05 - loss: 1.6535 - regression_loss: 1.3940 - classification_loss: 0.2595 239/500 [=============>................] - ETA: 1:04 - loss: 1.6534 - regression_loss: 1.3940 - classification_loss: 0.2594 240/500 [=============>................] - ETA: 1:04 - loss: 1.6530 - regression_loss: 1.3936 - classification_loss: 0.2594 241/500 [=============>................] - ETA: 1:04 - loss: 1.6522 - regression_loss: 1.3932 - classification_loss: 0.2591 242/500 [=============>................] - ETA: 1:04 - loss: 1.6525 - regression_loss: 1.3931 - classification_loss: 0.2593 243/500 [=============>................] - ETA: 1:03 - loss: 1.6494 - regression_loss: 1.3895 - classification_loss: 0.2599 244/500 [=============>................] - ETA: 1:03 - loss: 1.6481 - regression_loss: 1.3877 - classification_loss: 0.2604 245/500 [=============>................] - ETA: 1:03 - loss: 1.6466 - regression_loss: 1.3868 - classification_loss: 0.2598 246/500 [=============>................] - ETA: 1:03 - loss: 1.6463 - regression_loss: 1.3866 - classification_loss: 0.2596 247/500 [=============>................] - ETA: 1:02 - loss: 1.6445 - regression_loss: 1.3850 - classification_loss: 0.2595 248/500 [=============>................] - ETA: 1:02 - loss: 1.6455 - regression_loss: 1.3858 - classification_loss: 0.2596 249/500 [=============>................] - ETA: 1:02 - loss: 1.6456 - regression_loss: 1.3858 - classification_loss: 0.2598 250/500 [==============>...............] - ETA: 1:02 - loss: 1.6496 - regression_loss: 1.3880 - classification_loss: 0.2616 251/500 [==============>...............] - ETA: 1:01 - loss: 1.6516 - regression_loss: 1.3893 - classification_loss: 0.2623 252/500 [==============>...............] - ETA: 1:01 - loss: 1.6481 - regression_loss: 1.3865 - classification_loss: 0.2616 253/500 [==============>...............] - ETA: 1:01 - loss: 1.6489 - regression_loss: 1.3872 - classification_loss: 0.2617 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.6458 - regression_loss: 1.3847 - classification_loss: 0.2610 255/500 [==============>...............] - ETA: 1:00 - loss: 1.6463 - regression_loss: 1.3852 - classification_loss: 0.2611 256/500 [==============>...............] - ETA: 1:00 - loss: 1.6494 - regression_loss: 1.3886 - classification_loss: 0.2608 257/500 [==============>...............] - ETA: 1:00 - loss: 1.6452 - regression_loss: 1.3849 - classification_loss: 0.2602 258/500 [==============>...............] - ETA: 1:00 - loss: 1.6431 - regression_loss: 1.3832 - classification_loss: 0.2598 259/500 [==============>...............] - ETA: 59s - loss: 1.6412 - regression_loss: 1.3815 - classification_loss: 0.2597  260/500 [==============>...............] - ETA: 59s - loss: 1.6401 - regression_loss: 1.3805 - classification_loss: 0.2596 261/500 [==============>...............] - ETA: 59s - loss: 1.6404 - regression_loss: 1.3805 - classification_loss: 0.2599 262/500 [==============>...............] - ETA: 59s - loss: 1.6396 - regression_loss: 1.3799 - classification_loss: 0.2596 263/500 [==============>...............] - ETA: 58s - loss: 1.6404 - regression_loss: 1.3804 - classification_loss: 0.2600 264/500 [==============>...............] - ETA: 58s - loss: 1.6422 - regression_loss: 1.3820 - classification_loss: 0.2602 265/500 [==============>...............] - ETA: 58s - loss: 1.6394 - regression_loss: 1.3798 - classification_loss: 0.2596 266/500 [==============>...............] - ETA: 58s - loss: 1.6404 - regression_loss: 1.3808 - classification_loss: 0.2595 267/500 [===============>..............] - ETA: 58s - loss: 1.6401 - regression_loss: 1.3808 - classification_loss: 0.2593 268/500 [===============>..............] - ETA: 57s - loss: 1.6418 - regression_loss: 1.3822 - classification_loss: 0.2597 269/500 [===============>..............] - ETA: 57s - loss: 1.6427 - regression_loss: 1.3830 - classification_loss: 0.2596 270/500 [===============>..............] 
- ETA: 57s - loss: 1.6417 - regression_loss: 1.3827 - classification_loss: 0.2590 271/500 [===============>..............] - ETA: 57s - loss: 1.6423 - regression_loss: 1.3833 - classification_loss: 0.2591 272/500 [===============>..............] - ETA: 56s - loss: 1.6438 - regression_loss: 1.3844 - classification_loss: 0.2594 273/500 [===============>..............] - ETA: 56s - loss: 1.6439 - regression_loss: 1.3844 - classification_loss: 0.2595 274/500 [===============>..............] - ETA: 56s - loss: 1.6422 - regression_loss: 1.3831 - classification_loss: 0.2592 275/500 [===============>..............] - ETA: 55s - loss: 1.6433 - regression_loss: 1.3837 - classification_loss: 0.2596 276/500 [===============>..............] - ETA: 55s - loss: 1.6450 - regression_loss: 1.3851 - classification_loss: 0.2599 277/500 [===============>..............] - ETA: 55s - loss: 1.6448 - regression_loss: 1.3851 - classification_loss: 0.2597 278/500 [===============>..............] - ETA: 55s - loss: 1.6467 - regression_loss: 1.3867 - classification_loss: 0.2600 279/500 [===============>..............] - ETA: 55s - loss: 1.6471 - regression_loss: 1.3873 - classification_loss: 0.2598 280/500 [===============>..............] - ETA: 54s - loss: 1.6456 - regression_loss: 1.3862 - classification_loss: 0.2594 281/500 [===============>..............] - ETA: 54s - loss: 1.6474 - regression_loss: 1.3877 - classification_loss: 0.2597 282/500 [===============>..............] - ETA: 54s - loss: 1.6472 - regression_loss: 1.3877 - classification_loss: 0.2595 283/500 [===============>..............] - ETA: 54s - loss: 1.6497 - regression_loss: 1.3894 - classification_loss: 0.2603 284/500 [================>.............] - ETA: 53s - loss: 1.6500 - regression_loss: 1.3898 - classification_loss: 0.2602 285/500 [================>.............] - ETA: 53s - loss: 1.6519 - regression_loss: 1.3908 - classification_loss: 0.2611 286/500 [================>.............] 
- ETA: 53s - loss: 1.6506 - regression_loss: 1.3900 - classification_loss: 0.2606 287/500 [================>.............] - ETA: 53s - loss: 1.6489 - regression_loss: 1.3887 - classification_loss: 0.2602 288/500 [================>.............] - ETA: 52s - loss: 1.6502 - regression_loss: 1.3899 - classification_loss: 0.2603 289/500 [================>.............] - ETA: 52s - loss: 1.6512 - regression_loss: 1.3906 - classification_loss: 0.2606 290/500 [================>.............] - ETA: 52s - loss: 1.6538 - regression_loss: 1.3927 - classification_loss: 0.2611 291/500 [================>.............] - ETA: 52s - loss: 1.6557 - regression_loss: 1.3939 - classification_loss: 0.2619 292/500 [================>.............] - ETA: 51s - loss: 1.6547 - regression_loss: 1.3928 - classification_loss: 0.2619 293/500 [================>.............] - ETA: 51s - loss: 1.6543 - regression_loss: 1.3927 - classification_loss: 0.2616 294/500 [================>.............] - ETA: 51s - loss: 1.6525 - regression_loss: 1.3912 - classification_loss: 0.2612 295/500 [================>.............] - ETA: 51s - loss: 1.6541 - regression_loss: 1.3926 - classification_loss: 0.2615 296/500 [================>.............] - ETA: 50s - loss: 1.6537 - regression_loss: 1.3921 - classification_loss: 0.2616 297/500 [================>.............] - ETA: 50s - loss: 1.6566 - regression_loss: 1.3945 - classification_loss: 0.2621 298/500 [================>.............] - ETA: 50s - loss: 1.6570 - regression_loss: 1.3948 - classification_loss: 0.2622 299/500 [================>.............] - ETA: 50s - loss: 1.6591 - regression_loss: 1.3965 - classification_loss: 0.2626 300/500 [=================>............] - ETA: 49s - loss: 1.6584 - regression_loss: 1.3961 - classification_loss: 0.2622 301/500 [=================>............] - ETA: 49s - loss: 1.6587 - regression_loss: 1.3966 - classification_loss: 0.2621 302/500 [=================>............] 
- ETA: 49s - loss: 1.6617 - regression_loss: 1.3988 - classification_loss: 0.2629 303/500 [=================>............] - ETA: 49s - loss: 1.6616 - regression_loss: 1.3987 - classification_loss: 0.2629 304/500 [=================>............] - ETA: 48s - loss: 1.6608 - regression_loss: 1.3981 - classification_loss: 0.2627 305/500 [=================>............] - ETA: 48s - loss: 1.6656 - regression_loss: 1.4020 - classification_loss: 0.2635 306/500 [=================>............] - ETA: 48s - loss: 1.6680 - regression_loss: 1.4041 - classification_loss: 0.2639 307/500 [=================>............] - ETA: 48s - loss: 1.6668 - regression_loss: 1.4032 - classification_loss: 0.2636 308/500 [=================>............] - ETA: 47s - loss: 1.6683 - regression_loss: 1.4043 - classification_loss: 0.2640 309/500 [=================>............] - ETA: 47s - loss: 1.6684 - regression_loss: 1.4044 - classification_loss: 0.2640 310/500 [=================>............] - ETA: 47s - loss: 1.6660 - regression_loss: 1.4026 - classification_loss: 0.2635 311/500 [=================>............] - ETA: 47s - loss: 1.6644 - regression_loss: 1.4011 - classification_loss: 0.2633 312/500 [=================>............] - ETA: 46s - loss: 1.6659 - regression_loss: 1.4022 - classification_loss: 0.2636 313/500 [=================>............] - ETA: 46s - loss: 1.6654 - regression_loss: 1.4019 - classification_loss: 0.2634 314/500 [=================>............] - ETA: 46s - loss: 1.6636 - regression_loss: 1.4004 - classification_loss: 0.2631 315/500 [=================>............] - ETA: 46s - loss: 1.6625 - regression_loss: 1.3994 - classification_loss: 0.2631 316/500 [=================>............] - ETA: 45s - loss: 1.6644 - regression_loss: 1.4010 - classification_loss: 0.2634 317/500 [==================>...........] - ETA: 45s - loss: 1.6656 - regression_loss: 1.4017 - classification_loss: 0.2639 318/500 [==================>...........] 
- ETA: 45s - loss: 1.6646 - regression_loss: 1.4010 - classification_loss: 0.2636 319/500 [==================>...........] - ETA: 45s - loss: 1.6657 - regression_loss: 1.4019 - classification_loss: 0.2638 320/500 [==================>...........] - ETA: 44s - loss: 1.6653 - regression_loss: 1.4017 - classification_loss: 0.2636 321/500 [==================>...........] - ETA: 44s - loss: 1.6649 - regression_loss: 1.4014 - classification_loss: 0.2635 322/500 [==================>...........] - ETA: 44s - loss: 1.6674 - regression_loss: 1.4031 - classification_loss: 0.2643 323/500 [==================>...........] - ETA: 44s - loss: 1.6670 - regression_loss: 1.4029 - classification_loss: 0.2641 324/500 [==================>...........] - ETA: 43s - loss: 1.6680 - regression_loss: 1.4038 - classification_loss: 0.2642 325/500 [==================>...........] - ETA: 43s - loss: 1.6692 - regression_loss: 1.4047 - classification_loss: 0.2645 326/500 [==================>...........] - ETA: 43s - loss: 1.6682 - regression_loss: 1.4038 - classification_loss: 0.2644 327/500 [==================>...........] - ETA: 43s - loss: 1.6725 - regression_loss: 1.4074 - classification_loss: 0.2651 328/500 [==================>...........] - ETA: 42s - loss: 1.6724 - regression_loss: 1.4071 - classification_loss: 0.2654 329/500 [==================>...........] - ETA: 42s - loss: 1.6724 - regression_loss: 1.4069 - classification_loss: 0.2654 330/500 [==================>...........] - ETA: 42s - loss: 1.6708 - regression_loss: 1.4057 - classification_loss: 0.2651 331/500 [==================>...........] - ETA: 42s - loss: 1.6716 - regression_loss: 1.4063 - classification_loss: 0.2653 332/500 [==================>...........] - ETA: 41s - loss: 1.6699 - regression_loss: 1.4050 - classification_loss: 0.2649 333/500 [==================>...........] - ETA: 41s - loss: 1.6689 - regression_loss: 1.4042 - classification_loss: 0.2647 334/500 [===================>..........] 
- ETA: 41s - loss: 1.6699 - regression_loss: 1.4048 - classification_loss: 0.2651 335/500 [===================>..........] - ETA: 41s - loss: 1.6692 - regression_loss: 1.4043 - classification_loss: 0.2649 336/500 [===================>..........] - ETA: 40s - loss: 1.6708 - regression_loss: 1.4056 - classification_loss: 0.2652 337/500 [===================>..........] - ETA: 40s - loss: 1.6719 - regression_loss: 1.4065 - classification_loss: 0.2654 338/500 [===================>..........] - ETA: 40s - loss: 1.6735 - regression_loss: 1.4079 - classification_loss: 0.2656 339/500 [===================>..........] - ETA: 40s - loss: 1.6755 - regression_loss: 1.4098 - classification_loss: 0.2658 340/500 [===================>..........] - ETA: 39s - loss: 1.6771 - regression_loss: 1.4110 - classification_loss: 0.2661 341/500 [===================>..........] - ETA: 39s - loss: 1.6769 - regression_loss: 1.4107 - classification_loss: 0.2662 342/500 [===================>..........] - ETA: 39s - loss: 1.6767 - regression_loss: 1.4107 - classification_loss: 0.2661 343/500 [===================>..........] - ETA: 39s - loss: 1.6730 - regression_loss: 1.4076 - classification_loss: 0.2655 344/500 [===================>..........] - ETA: 38s - loss: 1.6748 - regression_loss: 1.4088 - classification_loss: 0.2660 345/500 [===================>..........] - ETA: 38s - loss: 1.6758 - regression_loss: 1.4097 - classification_loss: 0.2662 346/500 [===================>..........] - ETA: 38s - loss: 1.6767 - regression_loss: 1.4103 - classification_loss: 0.2663 347/500 [===================>..........] - ETA: 38s - loss: 1.6766 - regression_loss: 1.4103 - classification_loss: 0.2663 348/500 [===================>..........] - ETA: 37s - loss: 1.6762 - regression_loss: 1.4100 - classification_loss: 0.2663 349/500 [===================>..........] - ETA: 37s - loss: 1.6754 - regression_loss: 1.4093 - classification_loss: 0.2660 350/500 [====================>.........] 
- ETA: 37s - loss: 1.6749 - regression_loss: 1.4091 - classification_loss: 0.2658 351/500 [====================>.........] - ETA: 37s - loss: 1.6783 - regression_loss: 1.4117 - classification_loss: 0.2667 352/500 [====================>.........] - ETA: 36s - loss: 1.6782 - regression_loss: 1.4115 - classification_loss: 0.2667 353/500 [====================>.........] - ETA: 36s - loss: 1.6790 - regression_loss: 1.4123 - classification_loss: 0.2667 354/500 [====================>.........] - ETA: 36s - loss: 1.6777 - regression_loss: 1.4114 - classification_loss: 0.2663 355/500 [====================>.........] - ETA: 36s - loss: 1.6771 - regression_loss: 1.4108 - classification_loss: 0.2663 356/500 [====================>.........] - ETA: 35s - loss: 1.6769 - regression_loss: 1.4106 - classification_loss: 0.2663 357/500 [====================>.........] - ETA: 35s - loss: 1.6769 - regression_loss: 1.4108 - classification_loss: 0.2661 358/500 [====================>.........] - ETA: 35s - loss: 1.6793 - regression_loss: 1.4128 - classification_loss: 0.2665 359/500 [====================>.........] - ETA: 35s - loss: 1.6784 - regression_loss: 1.4121 - classification_loss: 0.2663 360/500 [====================>.........] - ETA: 34s - loss: 1.6758 - regression_loss: 1.4099 - classification_loss: 0.2659 361/500 [====================>.........] - ETA: 34s - loss: 1.6760 - regression_loss: 1.4102 - classification_loss: 0.2658 362/500 [====================>.........] - ETA: 34s - loss: 1.6748 - regression_loss: 1.4093 - classification_loss: 0.2655 363/500 [====================>.........] - ETA: 34s - loss: 1.6740 - regression_loss: 1.4085 - classification_loss: 0.2654 364/500 [====================>.........] - ETA: 33s - loss: 1.6723 - regression_loss: 1.4067 - classification_loss: 0.2656 365/500 [====================>.........] - ETA: 33s - loss: 1.6714 - regression_loss: 1.4061 - classification_loss: 0.2654 366/500 [====================>.........] 
- ETA: 33s - loss: 1.6708 - regression_loss: 1.4056 - classification_loss: 0.2652 367/500 [=====================>........] - ETA: 33s - loss: 1.6704 - regression_loss: 1.4051 - classification_loss: 0.2653 368/500 [=====================>........] - ETA: 32s - loss: 1.6686 - regression_loss: 1.4037 - classification_loss: 0.2649 369/500 [=====================>........] - ETA: 32s - loss: 1.6669 - regression_loss: 1.4023 - classification_loss: 0.2646 370/500 [=====================>........] - ETA: 32s - loss: 1.6673 - regression_loss: 1.4027 - classification_loss: 0.2646 371/500 [=====================>........] - ETA: 32s - loss: 1.6682 - regression_loss: 1.4034 - classification_loss: 0.2648 372/500 [=====================>........] - ETA: 31s - loss: 1.6682 - regression_loss: 1.4035 - classification_loss: 0.2647 373/500 [=====================>........] - ETA: 31s - loss: 1.6672 - regression_loss: 1.4028 - classification_loss: 0.2644 374/500 [=====================>........] - ETA: 31s - loss: 1.6680 - regression_loss: 1.4039 - classification_loss: 0.2641 375/500 [=====================>........] - ETA: 31s - loss: 1.6684 - regression_loss: 1.4042 - classification_loss: 0.2642 376/500 [=====================>........] - ETA: 30s - loss: 1.6652 - regression_loss: 1.4016 - classification_loss: 0.2636 377/500 [=====================>........] - ETA: 30s - loss: 1.6665 - regression_loss: 1.4031 - classification_loss: 0.2635 378/500 [=====================>........] - ETA: 30s - loss: 1.6657 - regression_loss: 1.4024 - classification_loss: 0.2633 379/500 [=====================>........] - ETA: 30s - loss: 1.6656 - regression_loss: 1.4021 - classification_loss: 0.2635 380/500 [=====================>........] - ETA: 29s - loss: 1.6655 - regression_loss: 1.4020 - classification_loss: 0.2635 381/500 [=====================>........] - ETA: 29s - loss: 1.6652 - regression_loss: 1.4016 - classification_loss: 0.2636 382/500 [=====================>........] 
- ETA: 29s - loss: 1.6666 - regression_loss: 1.4027 - classification_loss: 0.2639 383/500 [=====================>........] - ETA: 29s - loss: 1.6649 - regression_loss: 1.4014 - classification_loss: 0.2636 384/500 [======================>.......] - ETA: 28s - loss: 1.6658 - regression_loss: 1.4020 - classification_loss: 0.2638 385/500 [======================>.......] - ETA: 28s - loss: 1.6649 - regression_loss: 1.4010 - classification_loss: 0.2640 386/500 [======================>.......] - ETA: 28s - loss: 1.6645 - regression_loss: 1.4008 - classification_loss: 0.2637 387/500 [======================>.......] - ETA: 28s - loss: 1.6663 - regression_loss: 1.4025 - classification_loss: 0.2638 388/500 [======================>.......] - ETA: 27s - loss: 1.6650 - regression_loss: 1.4013 - classification_loss: 0.2637 389/500 [======================>.......] - ETA: 27s - loss: 1.6651 - regression_loss: 1.4013 - classification_loss: 0.2639 390/500 [======================>.......] - ETA: 27s - loss: 1.6648 - regression_loss: 1.4010 - classification_loss: 0.2638 391/500 [======================>.......] - ETA: 27s - loss: 1.6664 - regression_loss: 1.4021 - classification_loss: 0.2643 392/500 [======================>.......] - ETA: 26s - loss: 1.6676 - regression_loss: 1.4030 - classification_loss: 0.2646 393/500 [======================>.......] - ETA: 26s - loss: 1.6655 - regression_loss: 1.4013 - classification_loss: 0.2642 394/500 [======================>.......] - ETA: 26s - loss: 1.6647 - regression_loss: 1.4007 - classification_loss: 0.2639 395/500 [======================>.......] - ETA: 26s - loss: 1.6635 - regression_loss: 1.3997 - classification_loss: 0.2638 396/500 [======================>.......] - ETA: 25s - loss: 1.6624 - regression_loss: 1.3990 - classification_loss: 0.2635 397/500 [======================>.......] - ETA: 25s - loss: 1.6618 - regression_loss: 1.3985 - classification_loss: 0.2632 398/500 [======================>.......] 
- ETA: 25s - loss: 1.6630 - regression_loss: 1.3995 - classification_loss: 0.2635 399/500 [======================>.......] - ETA: 25s - loss: 1.6649 - regression_loss: 1.4008 - classification_loss: 0.2641 400/500 [=======================>......] - ETA: 24s - loss: 1.6657 - regression_loss: 1.4014 - classification_loss: 0.2643 401/500 [=======================>......] - ETA: 24s - loss: 1.6660 - regression_loss: 1.4017 - classification_loss: 0.2643 402/500 [=======================>......] - ETA: 24s - loss: 1.6659 - regression_loss: 1.4017 - classification_loss: 0.2642 403/500 [=======================>......] - ETA: 24s - loss: 1.6660 - regression_loss: 1.4020 - classification_loss: 0.2640 404/500 [=======================>......] - ETA: 23s - loss: 1.6659 - regression_loss: 1.4018 - classification_loss: 0.2641 405/500 [=======================>......] - ETA: 23s - loss: 1.6664 - regression_loss: 1.4021 - classification_loss: 0.2643 406/500 [=======================>......] - ETA: 23s - loss: 1.6638 - regression_loss: 1.4000 - classification_loss: 0.2638 407/500 [=======================>......] - ETA: 23s - loss: 1.6620 - regression_loss: 1.3986 - classification_loss: 0.2634 408/500 [=======================>......] - ETA: 22s - loss: 1.6619 - regression_loss: 1.3982 - classification_loss: 0.2637 409/500 [=======================>......] - ETA: 22s - loss: 1.6617 - regression_loss: 1.3980 - classification_loss: 0.2637 410/500 [=======================>......] - ETA: 22s - loss: 1.6608 - regression_loss: 1.3974 - classification_loss: 0.2634 411/500 [=======================>......] - ETA: 22s - loss: 1.6594 - regression_loss: 1.3962 - classification_loss: 0.2632 412/500 [=======================>......] - ETA: 21s - loss: 1.6606 - regression_loss: 1.3972 - classification_loss: 0.2634 413/500 [=======================>......] - ETA: 21s - loss: 1.6605 - regression_loss: 1.3970 - classification_loss: 0.2635 414/500 [=======================>......] 
- ETA: 21s - loss: 1.6605 - regression_loss: 1.3970 - classification_loss: 0.2635 415/500 [=======================>......] - ETA: 21s - loss: 1.6626 - regression_loss: 1.3984 - classification_loss: 0.2642 416/500 [=======================>......] - ETA: 20s - loss: 1.6633 - regression_loss: 1.3990 - classification_loss: 0.2644 417/500 [========================>.....] - ETA: 20s - loss: 1.6624 - regression_loss: 1.3984 - classification_loss: 0.2641 418/500 [========================>.....] - ETA: 20s - loss: 1.6623 - regression_loss: 1.3984 - classification_loss: 0.2639 419/500 [========================>.....] - ETA: 20s - loss: 1.6620 - regression_loss: 1.3983 - classification_loss: 0.2637 420/500 [========================>.....] - ETA: 19s - loss: 1.6624 - regression_loss: 1.3986 - classification_loss: 0.2637 421/500 [========================>.....] - ETA: 19s - loss: 1.6615 - regression_loss: 1.3981 - classification_loss: 0.2635 422/500 [========================>.....] - ETA: 19s - loss: 1.6621 - regression_loss: 1.3986 - classification_loss: 0.2634 423/500 [========================>.....] - ETA: 19s - loss: 1.6635 - regression_loss: 1.3996 - classification_loss: 0.2639 424/500 [========================>.....] - ETA: 18s - loss: 1.6636 - regression_loss: 1.3996 - classification_loss: 0.2640 425/500 [========================>.....] - ETA: 18s - loss: 1.6634 - regression_loss: 1.3993 - classification_loss: 0.2641 426/500 [========================>.....] - ETA: 18s - loss: 1.6620 - regression_loss: 1.3982 - classification_loss: 0.2638 427/500 [========================>.....] - ETA: 18s - loss: 1.6625 - regression_loss: 1.3986 - classification_loss: 0.2639 428/500 [========================>.....] - ETA: 17s - loss: 1.6639 - regression_loss: 1.3998 - classification_loss: 0.2641 429/500 [========================>.....] - ETA: 17s - loss: 1.6645 - regression_loss: 1.4003 - classification_loss: 0.2642 430/500 [========================>.....] 
- ETA: 17s - loss: 1.6653 - regression_loss: 1.4011 - classification_loss: 0.2642 431/500 [========================>.....] - ETA: 17s - loss: 1.6647 - regression_loss: 1.4008 - classification_loss: 0.2639 432/500 [========================>.....] - ETA: 16s - loss: 1.6643 - regression_loss: 1.4004 - classification_loss: 0.2639 433/500 [========================>.....] - ETA: 16s - loss: 1.6639 - regression_loss: 1.4003 - classification_loss: 0.2636 434/500 [=========================>....] - ETA: 16s - loss: 1.6639 - regression_loss: 1.4003 - classification_loss: 0.2636 435/500 [=========================>....] - ETA: 16s - loss: 1.6632 - regression_loss: 1.3997 - classification_loss: 0.2635 436/500 [=========================>....] - ETA: 15s - loss: 1.6632 - regression_loss: 1.3998 - classification_loss: 0.2634 437/500 [=========================>....] - ETA: 15s - loss: 1.6656 - regression_loss: 1.4016 - classification_loss: 0.2640 438/500 [=========================>....] - ETA: 15s - loss: 1.6640 - regression_loss: 1.4004 - classification_loss: 0.2636 439/500 [=========================>....] - ETA: 15s - loss: 1.6635 - regression_loss: 1.4000 - classification_loss: 0.2635 440/500 [=========================>....] - ETA: 14s - loss: 1.6632 - regression_loss: 1.3999 - classification_loss: 0.2633 441/500 [=========================>....] - ETA: 14s - loss: 1.6638 - regression_loss: 1.4003 - classification_loss: 0.2635 442/500 [=========================>....] - ETA: 14s - loss: 1.6622 - regression_loss: 1.3987 - classification_loss: 0.2635 443/500 [=========================>....] - ETA: 14s - loss: 1.6623 - regression_loss: 1.3986 - classification_loss: 0.2637 444/500 [=========================>....] - ETA: 13s - loss: 1.6620 - regression_loss: 1.3984 - classification_loss: 0.2636 445/500 [=========================>....] - ETA: 13s - loss: 1.6633 - regression_loss: 1.3996 - classification_loss: 0.2636 446/500 [=========================>....] 
- ETA: 13s - loss: 1.6627 - regression_loss: 1.3992 - classification_loss: 0.2634
[... per-batch progress frames for steps 447-499 of epoch 52 omitted; loss held near 1.66 (regression_loss ~1.40, classification_loss ~0.26) ...]
500/500 [==============================] - 125s 249ms/step - loss: 1.6647 - regression_loss: 1.4010 - classification_loss: 0.2637
326 instances of class plum with average precision: 0.7816
mAP: 0.7816
Epoch 00052: saving model to ./training/snapshots/resnet50_pascal_52.h5
Epoch 53/150
[... per-batch progress frames for steps 1-280 of epoch 53 omitted; loss settled near 1.68 (regression_loss ~1.40, classification_loss ~0.28) ...]
281/500 [===============>..............]
- ETA: 54s - loss: 1.6769 - regression_loss: 1.3958 - classification_loss: 0.2811 282/500 [===============>..............] - ETA: 54s - loss: 1.6767 - regression_loss: 1.3958 - classification_loss: 0.2809 283/500 [===============>..............] - ETA: 54s - loss: 1.6738 - regression_loss: 1.3935 - classification_loss: 0.2803 284/500 [================>.............] - ETA: 53s - loss: 1.6744 - regression_loss: 1.3939 - classification_loss: 0.2806 285/500 [================>.............] - ETA: 53s - loss: 1.6745 - regression_loss: 1.3940 - classification_loss: 0.2805 286/500 [================>.............] - ETA: 53s - loss: 1.6763 - regression_loss: 1.3953 - classification_loss: 0.2810 287/500 [================>.............] - ETA: 53s - loss: 1.6749 - regression_loss: 1.3942 - classification_loss: 0.2807 288/500 [================>.............] - ETA: 52s - loss: 1.6734 - regression_loss: 1.3931 - classification_loss: 0.2803 289/500 [================>.............] - ETA: 52s - loss: 1.6744 - regression_loss: 1.3939 - classification_loss: 0.2805 290/500 [================>.............] - ETA: 52s - loss: 1.6766 - regression_loss: 1.3959 - classification_loss: 0.2807 291/500 [================>.............] - ETA: 52s - loss: 1.6796 - regression_loss: 1.3983 - classification_loss: 0.2813 292/500 [================>.............] - ETA: 51s - loss: 1.6787 - regression_loss: 1.3979 - classification_loss: 0.2809 293/500 [================>.............] - ETA: 51s - loss: 1.6814 - regression_loss: 1.4002 - classification_loss: 0.2812 294/500 [================>.............] - ETA: 51s - loss: 1.6813 - regression_loss: 1.4002 - classification_loss: 0.2811 295/500 [================>.............] - ETA: 51s - loss: 1.6793 - regression_loss: 1.3987 - classification_loss: 0.2806 296/500 [================>.............] - ETA: 50s - loss: 1.6782 - regression_loss: 1.3979 - classification_loss: 0.2803 297/500 [================>.............] 
- ETA: 50s - loss: 1.6774 - regression_loss: 1.3975 - classification_loss: 0.2799 298/500 [================>.............] - ETA: 50s - loss: 1.6762 - regression_loss: 1.3968 - classification_loss: 0.2794 299/500 [================>.............] - ETA: 50s - loss: 1.6762 - regression_loss: 1.3966 - classification_loss: 0.2795 300/500 [=================>............] - ETA: 49s - loss: 1.6793 - regression_loss: 1.3993 - classification_loss: 0.2800 301/500 [=================>............] - ETA: 49s - loss: 1.6789 - regression_loss: 1.3987 - classification_loss: 0.2802 302/500 [=================>............] - ETA: 49s - loss: 1.6765 - regression_loss: 1.3969 - classification_loss: 0.2796 303/500 [=================>............] - ETA: 49s - loss: 1.6780 - regression_loss: 1.3984 - classification_loss: 0.2797 304/500 [=================>............] - ETA: 48s - loss: 1.6783 - regression_loss: 1.3987 - classification_loss: 0.2797 305/500 [=================>............] - ETA: 48s - loss: 1.6791 - regression_loss: 1.3992 - classification_loss: 0.2799 306/500 [=================>............] - ETA: 48s - loss: 1.6762 - regression_loss: 1.3970 - classification_loss: 0.2792 307/500 [=================>............] - ETA: 48s - loss: 1.6768 - regression_loss: 1.3973 - classification_loss: 0.2795 308/500 [=================>............] - ETA: 47s - loss: 1.6745 - regression_loss: 1.3952 - classification_loss: 0.2793 309/500 [=================>............] - ETA: 47s - loss: 1.6730 - regression_loss: 1.3941 - classification_loss: 0.2789 310/500 [=================>............] - ETA: 47s - loss: 1.6696 - regression_loss: 1.3913 - classification_loss: 0.2783 311/500 [=================>............] - ETA: 47s - loss: 1.6704 - regression_loss: 1.3918 - classification_loss: 0.2786 312/500 [=================>............] - ETA: 46s - loss: 1.6709 - regression_loss: 1.3927 - classification_loss: 0.2782 313/500 [=================>............] 
- ETA: 46s - loss: 1.6746 - regression_loss: 1.3954 - classification_loss: 0.2792 314/500 [=================>............] - ETA: 46s - loss: 1.6739 - regression_loss: 1.3950 - classification_loss: 0.2788 315/500 [=================>............] - ETA: 46s - loss: 1.6747 - regression_loss: 1.3957 - classification_loss: 0.2790 316/500 [=================>............] - ETA: 45s - loss: 1.6725 - regression_loss: 1.3941 - classification_loss: 0.2784 317/500 [==================>...........] - ETA: 45s - loss: 1.6755 - regression_loss: 1.3965 - classification_loss: 0.2790 318/500 [==================>...........] - ETA: 45s - loss: 1.6768 - regression_loss: 1.3975 - classification_loss: 0.2793 319/500 [==================>...........] - ETA: 45s - loss: 1.6775 - regression_loss: 1.3981 - classification_loss: 0.2794 320/500 [==================>...........] - ETA: 44s - loss: 1.6789 - regression_loss: 1.3987 - classification_loss: 0.2803 321/500 [==================>...........] - ETA: 44s - loss: 1.6796 - regression_loss: 1.3992 - classification_loss: 0.2804 322/500 [==================>...........] - ETA: 44s - loss: 1.6772 - regression_loss: 1.3974 - classification_loss: 0.2798 323/500 [==================>...........] - ETA: 44s - loss: 1.6760 - regression_loss: 1.3964 - classification_loss: 0.2795 324/500 [==================>...........] - ETA: 43s - loss: 1.6771 - regression_loss: 1.3974 - classification_loss: 0.2797 325/500 [==================>...........] - ETA: 43s - loss: 1.6776 - regression_loss: 1.3979 - classification_loss: 0.2797 326/500 [==================>...........] - ETA: 43s - loss: 1.6786 - regression_loss: 1.3988 - classification_loss: 0.2798 327/500 [==================>...........] - ETA: 43s - loss: 1.6784 - regression_loss: 1.3988 - classification_loss: 0.2795 328/500 [==================>...........] - ETA: 42s - loss: 1.6785 - regression_loss: 1.3987 - classification_loss: 0.2798 329/500 [==================>...........] 
- ETA: 42s - loss: 1.6760 - regression_loss: 1.3967 - classification_loss: 0.2793 330/500 [==================>...........] - ETA: 42s - loss: 1.6754 - regression_loss: 1.3963 - classification_loss: 0.2791 331/500 [==================>...........] - ETA: 42s - loss: 1.6772 - regression_loss: 1.3981 - classification_loss: 0.2791 332/500 [==================>...........] - ETA: 41s - loss: 1.6761 - regression_loss: 1.3972 - classification_loss: 0.2789 333/500 [==================>...........] - ETA: 41s - loss: 1.6754 - regression_loss: 1.3968 - classification_loss: 0.2786 334/500 [===================>..........] - ETA: 41s - loss: 1.6741 - regression_loss: 1.3959 - classification_loss: 0.2782 335/500 [===================>..........] - ETA: 41s - loss: 1.6731 - regression_loss: 1.3950 - classification_loss: 0.2781 336/500 [===================>..........] - ETA: 40s - loss: 1.6733 - regression_loss: 1.3950 - classification_loss: 0.2782 337/500 [===================>..........] - ETA: 40s - loss: 1.6718 - regression_loss: 1.3940 - classification_loss: 0.2778 338/500 [===================>..........] - ETA: 40s - loss: 1.6710 - regression_loss: 1.3934 - classification_loss: 0.2776 339/500 [===================>..........] - ETA: 40s - loss: 1.6704 - regression_loss: 1.3927 - classification_loss: 0.2776 340/500 [===================>..........] - ETA: 39s - loss: 1.6705 - regression_loss: 1.3931 - classification_loss: 0.2774 341/500 [===================>..........] - ETA: 39s - loss: 1.6717 - regression_loss: 1.3942 - classification_loss: 0.2775 342/500 [===================>..........] - ETA: 39s - loss: 1.6715 - regression_loss: 1.3941 - classification_loss: 0.2774 343/500 [===================>..........] - ETA: 39s - loss: 1.6720 - regression_loss: 1.3946 - classification_loss: 0.2774 344/500 [===================>..........] - ETA: 38s - loss: 1.6706 - regression_loss: 1.3932 - classification_loss: 0.2774 345/500 [===================>..........] 
- ETA: 38s - loss: 1.6721 - regression_loss: 1.3943 - classification_loss: 0.2778 346/500 [===================>..........] - ETA: 38s - loss: 1.6697 - regression_loss: 1.3924 - classification_loss: 0.2773 347/500 [===================>..........] - ETA: 38s - loss: 1.6687 - regression_loss: 1.3917 - classification_loss: 0.2771 348/500 [===================>..........] - ETA: 37s - loss: 1.6683 - regression_loss: 1.3915 - classification_loss: 0.2768 349/500 [===================>..........] - ETA: 37s - loss: 1.6676 - regression_loss: 1.3909 - classification_loss: 0.2767 350/500 [====================>.........] - ETA: 37s - loss: 1.6658 - regression_loss: 1.3894 - classification_loss: 0.2763 351/500 [====================>.........] - ETA: 37s - loss: 1.6664 - regression_loss: 1.3900 - classification_loss: 0.2764 352/500 [====================>.........] - ETA: 36s - loss: 1.6686 - regression_loss: 1.3914 - classification_loss: 0.2773 353/500 [====================>.........] - ETA: 36s - loss: 1.6676 - regression_loss: 1.3903 - classification_loss: 0.2773 354/500 [====================>.........] - ETA: 36s - loss: 1.6679 - regression_loss: 1.3908 - classification_loss: 0.2772 355/500 [====================>.........] - ETA: 36s - loss: 1.6658 - regression_loss: 1.3891 - classification_loss: 0.2767 356/500 [====================>.........] - ETA: 35s - loss: 1.6651 - regression_loss: 1.3886 - classification_loss: 0.2766 357/500 [====================>.........] - ETA: 35s - loss: 1.6655 - regression_loss: 1.3885 - classification_loss: 0.2770 358/500 [====================>.........] - ETA: 35s - loss: 1.6666 - regression_loss: 1.3893 - classification_loss: 0.2773 359/500 [====================>.........] - ETA: 35s - loss: 1.6669 - regression_loss: 1.3897 - classification_loss: 0.2772 360/500 [====================>.........] - ETA: 34s - loss: 1.6673 - regression_loss: 1.3899 - classification_loss: 0.2773 361/500 [====================>.........] 
- ETA: 34s - loss: 1.6703 - regression_loss: 1.3921 - classification_loss: 0.2782 362/500 [====================>.........] - ETA: 34s - loss: 1.6692 - regression_loss: 1.3914 - classification_loss: 0.2778 363/500 [====================>.........] - ETA: 34s - loss: 1.6716 - regression_loss: 1.3934 - classification_loss: 0.2782 364/500 [====================>.........] - ETA: 33s - loss: 1.6720 - regression_loss: 1.3939 - classification_loss: 0.2781 365/500 [====================>.........] - ETA: 33s - loss: 1.6714 - regression_loss: 1.3934 - classification_loss: 0.2780 366/500 [====================>.........] - ETA: 33s - loss: 1.6735 - regression_loss: 1.3950 - classification_loss: 0.2785 367/500 [=====================>........] - ETA: 33s - loss: 1.6728 - regression_loss: 1.3945 - classification_loss: 0.2784 368/500 [=====================>........] - ETA: 32s - loss: 1.6739 - regression_loss: 1.3954 - classification_loss: 0.2785 369/500 [=====================>........] - ETA: 32s - loss: 1.6735 - regression_loss: 1.3951 - classification_loss: 0.2785 370/500 [=====================>........] - ETA: 32s - loss: 1.6731 - regression_loss: 1.3947 - classification_loss: 0.2783 371/500 [=====================>........] - ETA: 32s - loss: 1.6714 - regression_loss: 1.3934 - classification_loss: 0.2780 372/500 [=====================>........] - ETA: 31s - loss: 1.6706 - regression_loss: 1.3925 - classification_loss: 0.2780 373/500 [=====================>........] - ETA: 31s - loss: 1.6691 - regression_loss: 1.3913 - classification_loss: 0.2778 374/500 [=====================>........] - ETA: 31s - loss: 1.6693 - regression_loss: 1.3916 - classification_loss: 0.2776 375/500 [=====================>........] - ETA: 31s - loss: 1.6686 - regression_loss: 1.3913 - classification_loss: 0.2774 376/500 [=====================>........] - ETA: 30s - loss: 1.6685 - regression_loss: 1.3913 - classification_loss: 0.2772 377/500 [=====================>........] 
- ETA: 30s - loss: 1.6691 - regression_loss: 1.3921 - classification_loss: 0.2770 378/500 [=====================>........] - ETA: 30s - loss: 1.6677 - regression_loss: 1.3908 - classification_loss: 0.2769 379/500 [=====================>........] - ETA: 30s - loss: 1.6679 - regression_loss: 1.3910 - classification_loss: 0.2769 380/500 [=====================>........] - ETA: 29s - loss: 1.6661 - regression_loss: 1.3896 - classification_loss: 0.2765 381/500 [=====================>........] - ETA: 29s - loss: 1.6655 - regression_loss: 1.3890 - classification_loss: 0.2766 382/500 [=====================>........] - ETA: 29s - loss: 1.6639 - regression_loss: 1.3877 - classification_loss: 0.2762 383/500 [=====================>........] - ETA: 29s - loss: 1.6646 - regression_loss: 1.3882 - classification_loss: 0.2764 384/500 [======================>.......] - ETA: 28s - loss: 1.6644 - regression_loss: 1.3881 - classification_loss: 0.2763 385/500 [======================>.......] - ETA: 28s - loss: 1.6646 - regression_loss: 1.3883 - classification_loss: 0.2763 386/500 [======================>.......] - ETA: 28s - loss: 1.6651 - regression_loss: 1.3888 - classification_loss: 0.2763 387/500 [======================>.......] - ETA: 28s - loss: 1.6637 - regression_loss: 1.3877 - classification_loss: 0.2761 388/500 [======================>.......] - ETA: 27s - loss: 1.6634 - regression_loss: 1.3874 - classification_loss: 0.2760 389/500 [======================>.......] - ETA: 27s - loss: 1.6634 - regression_loss: 1.3876 - classification_loss: 0.2759 390/500 [======================>.......] - ETA: 27s - loss: 1.6636 - regression_loss: 1.3876 - classification_loss: 0.2759 391/500 [======================>.......] - ETA: 27s - loss: 1.6632 - regression_loss: 1.3875 - classification_loss: 0.2757 392/500 [======================>.......] - ETA: 26s - loss: 1.6623 - regression_loss: 1.3866 - classification_loss: 0.2757 393/500 [======================>.......] 
- ETA: 26s - loss: 1.6615 - regression_loss: 1.3861 - classification_loss: 0.2755 394/500 [======================>.......] - ETA: 26s - loss: 1.6631 - regression_loss: 1.3876 - classification_loss: 0.2756 395/500 [======================>.......] - ETA: 26s - loss: 1.6634 - regression_loss: 1.3879 - classification_loss: 0.2754 396/500 [======================>.......] - ETA: 25s - loss: 1.6610 - regression_loss: 1.3860 - classification_loss: 0.2749 397/500 [======================>.......] - ETA: 25s - loss: 1.6605 - regression_loss: 1.3859 - classification_loss: 0.2746 398/500 [======================>.......] - ETA: 25s - loss: 1.6608 - regression_loss: 1.3862 - classification_loss: 0.2746 399/500 [======================>.......] - ETA: 25s - loss: 1.6621 - regression_loss: 1.3871 - classification_loss: 0.2749 400/500 [=======================>......] - ETA: 24s - loss: 1.6611 - regression_loss: 1.3862 - classification_loss: 0.2749 401/500 [=======================>......] - ETA: 24s - loss: 1.6609 - regression_loss: 1.3862 - classification_loss: 0.2747 402/500 [=======================>......] - ETA: 24s - loss: 1.6618 - regression_loss: 1.3870 - classification_loss: 0.2748 403/500 [=======================>......] - ETA: 24s - loss: 1.6602 - regression_loss: 1.3858 - classification_loss: 0.2744 404/500 [=======================>......] - ETA: 23s - loss: 1.6590 - regression_loss: 1.3850 - classification_loss: 0.2740 405/500 [=======================>......] - ETA: 23s - loss: 1.6597 - regression_loss: 1.3857 - classification_loss: 0.2740 406/500 [=======================>......] - ETA: 23s - loss: 1.6595 - regression_loss: 1.3857 - classification_loss: 0.2738 407/500 [=======================>......] - ETA: 23s - loss: 1.6587 - regression_loss: 1.3851 - classification_loss: 0.2736 408/500 [=======================>......] - ETA: 22s - loss: 1.6594 - regression_loss: 1.3859 - classification_loss: 0.2735 409/500 [=======================>......] 
- ETA: 22s - loss: 1.6587 - regression_loss: 1.3852 - classification_loss: 0.2735 410/500 [=======================>......] - ETA: 22s - loss: 1.6594 - regression_loss: 1.3858 - classification_loss: 0.2735 411/500 [=======================>......] - ETA: 22s - loss: 1.6605 - regression_loss: 1.3868 - classification_loss: 0.2737 412/500 [=======================>......] - ETA: 21s - loss: 1.6605 - regression_loss: 1.3867 - classification_loss: 0.2738 413/500 [=======================>......] - ETA: 21s - loss: 1.6592 - regression_loss: 1.3857 - classification_loss: 0.2735 414/500 [=======================>......] - ETA: 21s - loss: 1.6594 - regression_loss: 1.3863 - classification_loss: 0.2731 415/500 [=======================>......] - ETA: 21s - loss: 1.6595 - regression_loss: 1.3865 - classification_loss: 0.2730 416/500 [=======================>......] - ETA: 20s - loss: 1.6603 - regression_loss: 1.3872 - classification_loss: 0.2731 417/500 [========================>.....] - ETA: 20s - loss: 1.6606 - regression_loss: 1.3875 - classification_loss: 0.2731 418/500 [========================>.....] - ETA: 20s - loss: 1.6599 - regression_loss: 1.3870 - classification_loss: 0.2729 419/500 [========================>.....] - ETA: 20s - loss: 1.6606 - regression_loss: 1.3875 - classification_loss: 0.2731 420/500 [========================>.....] - ETA: 19s - loss: 1.6583 - regression_loss: 1.3856 - classification_loss: 0.2727 421/500 [========================>.....] - ETA: 19s - loss: 1.6593 - regression_loss: 1.3865 - classification_loss: 0.2728 422/500 [========================>.....] - ETA: 19s - loss: 1.6585 - regression_loss: 1.3860 - classification_loss: 0.2725 423/500 [========================>.....] - ETA: 19s - loss: 1.6583 - regression_loss: 1.3856 - classification_loss: 0.2728 424/500 [========================>.....] - ETA: 18s - loss: 1.6576 - regression_loss: 1.3850 - classification_loss: 0.2726 425/500 [========================>.....] 
- ETA: 18s - loss: 1.6567 - regression_loss: 1.3844 - classification_loss: 0.2724 426/500 [========================>.....] - ETA: 18s - loss: 1.6581 - regression_loss: 1.3853 - classification_loss: 0.2728 427/500 [========================>.....] - ETA: 18s - loss: 1.6576 - regression_loss: 1.3851 - classification_loss: 0.2725 428/500 [========================>.....] - ETA: 17s - loss: 1.6598 - regression_loss: 1.3871 - classification_loss: 0.2727 429/500 [========================>.....] - ETA: 17s - loss: 1.6596 - regression_loss: 1.3870 - classification_loss: 0.2726 430/500 [========================>.....] - ETA: 17s - loss: 1.6582 - regression_loss: 1.3859 - classification_loss: 0.2723 431/500 [========================>.....] - ETA: 17s - loss: 1.6587 - regression_loss: 1.3862 - classification_loss: 0.2725 432/500 [========================>.....] - ETA: 16s - loss: 1.6593 - regression_loss: 1.3863 - classification_loss: 0.2729 433/500 [========================>.....] - ETA: 16s - loss: 1.6600 - regression_loss: 1.3871 - classification_loss: 0.2729 434/500 [=========================>....] - ETA: 16s - loss: 1.6594 - regression_loss: 1.3868 - classification_loss: 0.2727 435/500 [=========================>....] - ETA: 16s - loss: 1.6591 - regression_loss: 1.3866 - classification_loss: 0.2724 436/500 [=========================>....] - ETA: 15s - loss: 1.6592 - regression_loss: 1.3867 - classification_loss: 0.2725 437/500 [=========================>....] - ETA: 15s - loss: 1.6596 - regression_loss: 1.3869 - classification_loss: 0.2728 438/500 [=========================>....] - ETA: 15s - loss: 1.6585 - regression_loss: 1.3861 - classification_loss: 0.2724 439/500 [=========================>....] - ETA: 15s - loss: 1.6591 - regression_loss: 1.3866 - classification_loss: 0.2725 440/500 [=========================>....] - ETA: 14s - loss: 1.6579 - regression_loss: 1.3857 - classification_loss: 0.2722 441/500 [=========================>....] 
- ETA: 14s - loss: 1.6580 - regression_loss: 1.3858 - classification_loss: 0.2723 442/500 [=========================>....] - ETA: 14s - loss: 1.6590 - regression_loss: 1.3865 - classification_loss: 0.2725 443/500 [=========================>....] - ETA: 14s - loss: 1.6591 - regression_loss: 1.3865 - classification_loss: 0.2726 444/500 [=========================>....] - ETA: 13s - loss: 1.6585 - regression_loss: 1.3861 - classification_loss: 0.2724 445/500 [=========================>....] - ETA: 13s - loss: 1.6586 - regression_loss: 1.3863 - classification_loss: 0.2724 446/500 [=========================>....] - ETA: 13s - loss: 1.6589 - regression_loss: 1.3865 - classification_loss: 0.2724 447/500 [=========================>....] - ETA: 13s - loss: 1.6599 - regression_loss: 1.3875 - classification_loss: 0.2723 448/500 [=========================>....] - ETA: 12s - loss: 1.6586 - regression_loss: 1.3866 - classification_loss: 0.2721 449/500 [=========================>....] - ETA: 12s - loss: 1.6577 - regression_loss: 1.3859 - classification_loss: 0.2718 450/500 [==========================>...] - ETA: 12s - loss: 1.6566 - regression_loss: 1.3851 - classification_loss: 0.2715 451/500 [==========================>...] - ETA: 12s - loss: 1.6579 - regression_loss: 1.3860 - classification_loss: 0.2719 452/500 [==========================>...] - ETA: 11s - loss: 1.6554 - regression_loss: 1.3839 - classification_loss: 0.2714 453/500 [==========================>...] - ETA: 11s - loss: 1.6549 - regression_loss: 1.3837 - classification_loss: 0.2712 454/500 [==========================>...] - ETA: 11s - loss: 1.6544 - regression_loss: 1.3834 - classification_loss: 0.2709 455/500 [==========================>...] - ETA: 11s - loss: 1.6554 - regression_loss: 1.3842 - classification_loss: 0.2712 456/500 [==========================>...] - ETA: 10s - loss: 1.6551 - regression_loss: 1.3840 - classification_loss: 0.2711 457/500 [==========================>...] 
- ETA: 10s - loss: 1.6568 - regression_loss: 1.3852 - classification_loss: 0.2716 458/500 [==========================>...] - ETA: 10s - loss: 1.6570 - regression_loss: 1.3822 - classification_loss: 0.2748 459/500 [==========================>...] - ETA: 10s - loss: 1.6567 - regression_loss: 1.3821 - classification_loss: 0.2746 460/500 [==========================>...] - ETA: 9s - loss: 1.6580 - regression_loss: 1.3833 - classification_loss: 0.2747  461/500 [==========================>...] - ETA: 9s - loss: 1.6567 - regression_loss: 1.3824 - classification_loss: 0.2743 462/500 [==========================>...] - ETA: 9s - loss: 1.6591 - regression_loss: 1.3843 - classification_loss: 0.2748 463/500 [==========================>...] - ETA: 9s - loss: 1.6582 - regression_loss: 1.3836 - classification_loss: 0.2746 464/500 [==========================>...] - ETA: 8s - loss: 1.6585 - regression_loss: 1.3837 - classification_loss: 0.2748 465/500 [==========================>...] - ETA: 8s - loss: 1.6591 - regression_loss: 1.3844 - classification_loss: 0.2747 466/500 [==========================>...] - ETA: 8s - loss: 1.6603 - regression_loss: 1.3855 - classification_loss: 0.2748 467/500 [===========================>..] - ETA: 8s - loss: 1.6604 - regression_loss: 1.3855 - classification_loss: 0.2748 468/500 [===========================>..] - ETA: 7s - loss: 1.6643 - regression_loss: 1.3884 - classification_loss: 0.2759 469/500 [===========================>..] - ETA: 7s - loss: 1.6633 - regression_loss: 1.3877 - classification_loss: 0.2756 470/500 [===========================>..] - ETA: 7s - loss: 1.6632 - regression_loss: 1.3878 - classification_loss: 0.2754 471/500 [===========================>..] - ETA: 7s - loss: 1.6628 - regression_loss: 1.3873 - classification_loss: 0.2755 472/500 [===========================>..] - ETA: 6s - loss: 1.6634 - regression_loss: 1.3879 - classification_loss: 0.2755 473/500 [===========================>..] 
- ETA: 6s - loss: 1.6630 - regression_loss: 1.3875 - classification_loss: 0.2754 474/500 [===========================>..] - ETA: 6s - loss: 1.6627 - regression_loss: 1.3871 - classification_loss: 0.2756 475/500 [===========================>..] - ETA: 6s - loss: 1.6623 - regression_loss: 1.3868 - classification_loss: 0.2755 476/500 [===========================>..] - ETA: 5s - loss: 1.6632 - regression_loss: 1.3878 - classification_loss: 0.2754 477/500 [===========================>..] - ETA: 5s - loss: 1.6647 - regression_loss: 1.3888 - classification_loss: 0.2759 478/500 [===========================>..] - ETA: 5s - loss: 1.6653 - regression_loss: 1.3893 - classification_loss: 0.2760 479/500 [===========================>..] - ETA: 5s - loss: 1.6646 - regression_loss: 1.3889 - classification_loss: 0.2757 480/500 [===========================>..] - ETA: 4s - loss: 1.6646 - regression_loss: 1.3888 - classification_loss: 0.2758 481/500 [===========================>..] - ETA: 4s - loss: 1.6639 - regression_loss: 1.3884 - classification_loss: 0.2755 482/500 [===========================>..] - ETA: 4s - loss: 1.6649 - regression_loss: 1.3894 - classification_loss: 0.2755 483/500 [===========================>..] - ETA: 4s - loss: 1.6653 - regression_loss: 1.3896 - classification_loss: 0.2756 484/500 [============================>.] - ETA: 3s - loss: 1.6651 - regression_loss: 1.3896 - classification_loss: 0.2755 485/500 [============================>.] - ETA: 3s - loss: 1.6665 - regression_loss: 1.3908 - classification_loss: 0.2757 486/500 [============================>.] - ETA: 3s - loss: 1.6662 - regression_loss: 1.3907 - classification_loss: 0.2755 487/500 [============================>.] - ETA: 3s - loss: 1.6657 - regression_loss: 1.3905 - classification_loss: 0.2752 488/500 [============================>.] - ETA: 2s - loss: 1.6643 - regression_loss: 1.3894 - classification_loss: 0.2749 489/500 [============================>.] 
500/500 [==============================] - 125s 249ms/step - loss: 1.6620 - regression_loss: 1.3876 - classification_loss: 0.2743
326 instances of class plum with average precision: 0.7818
mAP: 0.7818
Epoch 00053: saving model to ./training/snapshots/resnet50_pascal_53.h5
Epoch 54/150
[per-batch progress-bar updates elided; last reported update before truncation: 323/500 - ETA: 44s - loss: 1.6635 - regression_loss: 1.3987 - classification_loss: 0.2648]
- ETA: 43s - loss: 1.6620 - regression_loss: 1.3973 - classification_loss: 0.2647 325/500 [==================>...........] - ETA: 43s - loss: 1.6613 - regression_loss: 1.3970 - classification_loss: 0.2643 326/500 [==================>...........] - ETA: 43s - loss: 1.6631 - regression_loss: 1.3985 - classification_loss: 0.2646 327/500 [==================>...........] - ETA: 43s - loss: 1.6646 - regression_loss: 1.3998 - classification_loss: 0.2648 328/500 [==================>...........] - ETA: 42s - loss: 1.6631 - regression_loss: 1.3988 - classification_loss: 0.2643 329/500 [==================>...........] - ETA: 42s - loss: 1.6614 - regression_loss: 1.3971 - classification_loss: 0.2643 330/500 [==================>...........] - ETA: 42s - loss: 1.6611 - regression_loss: 1.3968 - classification_loss: 0.2642 331/500 [==================>...........] - ETA: 42s - loss: 1.6597 - regression_loss: 1.3956 - classification_loss: 0.2641 332/500 [==================>...........] - ETA: 41s - loss: 1.6622 - regression_loss: 1.3976 - classification_loss: 0.2646 333/500 [==================>...........] - ETA: 41s - loss: 1.6612 - regression_loss: 1.3970 - classification_loss: 0.2642 334/500 [===================>..........] - ETA: 41s - loss: 1.6616 - regression_loss: 1.3973 - classification_loss: 0.2643 335/500 [===================>..........] - ETA: 41s - loss: 1.6603 - regression_loss: 1.3963 - classification_loss: 0.2640 336/500 [===================>..........] - ETA: 40s - loss: 1.6614 - regression_loss: 1.3971 - classification_loss: 0.2642 337/500 [===================>..........] - ETA: 40s - loss: 1.6614 - regression_loss: 1.3974 - classification_loss: 0.2640 338/500 [===================>..........] - ETA: 40s - loss: 1.6611 - regression_loss: 1.3970 - classification_loss: 0.2641 339/500 [===================>..........] - ETA: 40s - loss: 1.6595 - regression_loss: 1.3958 - classification_loss: 0.2637 340/500 [===================>..........] 
- ETA: 39s - loss: 1.6605 - regression_loss: 1.3964 - classification_loss: 0.2641 341/500 [===================>..........] - ETA: 39s - loss: 1.6599 - regression_loss: 1.3962 - classification_loss: 0.2637 342/500 [===================>..........] - ETA: 39s - loss: 1.6593 - regression_loss: 1.3956 - classification_loss: 0.2637 343/500 [===================>..........] - ETA: 39s - loss: 1.6574 - regression_loss: 1.3942 - classification_loss: 0.2633 344/500 [===================>..........] - ETA: 38s - loss: 1.6575 - regression_loss: 1.3942 - classification_loss: 0.2633 345/500 [===================>..........] - ETA: 38s - loss: 1.6557 - regression_loss: 1.3926 - classification_loss: 0.2631 346/500 [===================>..........] - ETA: 38s - loss: 1.6562 - regression_loss: 1.3931 - classification_loss: 0.2631 347/500 [===================>..........] - ETA: 38s - loss: 1.6581 - regression_loss: 1.3947 - classification_loss: 0.2634 348/500 [===================>..........] - ETA: 37s - loss: 1.6574 - regression_loss: 1.3942 - classification_loss: 0.2632 349/500 [===================>..........] - ETA: 37s - loss: 1.6581 - regression_loss: 1.3947 - classification_loss: 0.2634 350/500 [====================>.........] - ETA: 37s - loss: 1.6572 - regression_loss: 1.3939 - classification_loss: 0.2633 351/500 [====================>.........] - ETA: 37s - loss: 1.6562 - regression_loss: 1.3934 - classification_loss: 0.2629 352/500 [====================>.........] - ETA: 36s - loss: 1.6558 - regression_loss: 1.3930 - classification_loss: 0.2628 353/500 [====================>.........] - ETA: 36s - loss: 1.6557 - regression_loss: 1.3932 - classification_loss: 0.2625 354/500 [====================>.........] - ETA: 36s - loss: 1.6567 - regression_loss: 1.3943 - classification_loss: 0.2624 355/500 [====================>.........] - ETA: 36s - loss: 1.6541 - regression_loss: 1.3923 - classification_loss: 0.2618 356/500 [====================>.........] 
- ETA: 35s - loss: 1.6530 - regression_loss: 1.3914 - classification_loss: 0.2616 357/500 [====================>.........] - ETA: 35s - loss: 1.6528 - regression_loss: 1.3911 - classification_loss: 0.2617 358/500 [====================>.........] - ETA: 35s - loss: 1.6510 - regression_loss: 1.3897 - classification_loss: 0.2613 359/500 [====================>.........] - ETA: 35s - loss: 1.6492 - regression_loss: 1.3883 - classification_loss: 0.2609 360/500 [====================>.........] - ETA: 34s - loss: 1.6513 - regression_loss: 1.3897 - classification_loss: 0.2615 361/500 [====================>.........] - ETA: 34s - loss: 1.6509 - regression_loss: 1.3894 - classification_loss: 0.2614 362/500 [====================>.........] - ETA: 34s - loss: 1.6521 - regression_loss: 1.3903 - classification_loss: 0.2618 363/500 [====================>.........] - ETA: 34s - loss: 1.6520 - regression_loss: 1.3904 - classification_loss: 0.2617 364/500 [====================>.........] - ETA: 33s - loss: 1.6527 - regression_loss: 1.3910 - classification_loss: 0.2617 365/500 [====================>.........] - ETA: 33s - loss: 1.6551 - regression_loss: 1.3926 - classification_loss: 0.2625 366/500 [====================>.........] - ETA: 33s - loss: 1.6558 - regression_loss: 1.3932 - classification_loss: 0.2626 367/500 [=====================>........] - ETA: 33s - loss: 1.6572 - regression_loss: 1.3940 - classification_loss: 0.2632 368/500 [=====================>........] - ETA: 32s - loss: 1.6598 - regression_loss: 1.3958 - classification_loss: 0.2640 369/500 [=====================>........] - ETA: 32s - loss: 1.6608 - regression_loss: 1.3965 - classification_loss: 0.2642 370/500 [=====================>........] - ETA: 32s - loss: 1.6604 - regression_loss: 1.3963 - classification_loss: 0.2641 371/500 [=====================>........] - ETA: 32s - loss: 1.6629 - regression_loss: 1.3982 - classification_loss: 0.2647 372/500 [=====================>........] 
- ETA: 31s - loss: 1.6624 - regression_loss: 1.3978 - classification_loss: 0.2645 373/500 [=====================>........] - ETA: 31s - loss: 1.6619 - regression_loss: 1.3978 - classification_loss: 0.2641 374/500 [=====================>........] - ETA: 31s - loss: 1.6635 - regression_loss: 1.3990 - classification_loss: 0.2645 375/500 [=====================>........] - ETA: 31s - loss: 1.6615 - regression_loss: 1.3966 - classification_loss: 0.2649 376/500 [=====================>........] - ETA: 30s - loss: 1.6630 - regression_loss: 1.3979 - classification_loss: 0.2651 377/500 [=====================>........] - ETA: 30s - loss: 1.6641 - regression_loss: 1.3989 - classification_loss: 0.2652 378/500 [=====================>........] - ETA: 30s - loss: 1.6667 - regression_loss: 1.4010 - classification_loss: 0.2656 379/500 [=====================>........] - ETA: 30s - loss: 1.6664 - regression_loss: 1.4009 - classification_loss: 0.2655 380/500 [=====================>........] - ETA: 29s - loss: 1.6658 - regression_loss: 1.4005 - classification_loss: 0.2653 381/500 [=====================>........] - ETA: 29s - loss: 1.6631 - regression_loss: 1.3983 - classification_loss: 0.2649 382/500 [=====================>........] - ETA: 29s - loss: 1.6631 - regression_loss: 1.3985 - classification_loss: 0.2646 383/500 [=====================>........] - ETA: 29s - loss: 1.6626 - regression_loss: 1.3982 - classification_loss: 0.2644 384/500 [======================>.......] - ETA: 28s - loss: 1.6620 - regression_loss: 1.3975 - classification_loss: 0.2645 385/500 [======================>.......] - ETA: 28s - loss: 1.6639 - regression_loss: 1.3990 - classification_loss: 0.2650 386/500 [======================>.......] - ETA: 28s - loss: 1.6611 - regression_loss: 1.3965 - classification_loss: 0.2646 387/500 [======================>.......] - ETA: 28s - loss: 1.6616 - regression_loss: 1.3968 - classification_loss: 0.2648 388/500 [======================>.......] 
- ETA: 27s - loss: 1.6616 - regression_loss: 1.3966 - classification_loss: 0.2650 389/500 [======================>.......] - ETA: 27s - loss: 1.6639 - regression_loss: 1.3985 - classification_loss: 0.2654 390/500 [======================>.......] - ETA: 27s - loss: 1.6649 - regression_loss: 1.3993 - classification_loss: 0.2655 391/500 [======================>.......] - ETA: 27s - loss: 1.6649 - regression_loss: 1.3994 - classification_loss: 0.2654 392/500 [======================>.......] - ETA: 26s - loss: 1.6640 - regression_loss: 1.3987 - classification_loss: 0.2653 393/500 [======================>.......] - ETA: 26s - loss: 1.6654 - regression_loss: 1.4000 - classification_loss: 0.2654 394/500 [======================>.......] - ETA: 26s - loss: 1.6644 - regression_loss: 1.3991 - classification_loss: 0.2653 395/500 [======================>.......] - ETA: 26s - loss: 1.6652 - regression_loss: 1.3995 - classification_loss: 0.2656 396/500 [======================>.......] - ETA: 25s - loss: 1.6655 - regression_loss: 1.3998 - classification_loss: 0.2657 397/500 [======================>.......] - ETA: 25s - loss: 1.6629 - regression_loss: 1.3977 - classification_loss: 0.2652 398/500 [======================>.......] - ETA: 25s - loss: 1.6612 - regression_loss: 1.3962 - classification_loss: 0.2650 399/500 [======================>.......] - ETA: 25s - loss: 1.6612 - regression_loss: 1.3965 - classification_loss: 0.2647 400/500 [=======================>......] - ETA: 24s - loss: 1.6631 - regression_loss: 1.3980 - classification_loss: 0.2651 401/500 [=======================>......] - ETA: 24s - loss: 1.6616 - regression_loss: 1.3968 - classification_loss: 0.2647 402/500 [=======================>......] - ETA: 24s - loss: 1.6602 - regression_loss: 1.3958 - classification_loss: 0.2644 403/500 [=======================>......] - ETA: 24s - loss: 1.6587 - regression_loss: 1.3945 - classification_loss: 0.2642 404/500 [=======================>......] 
- ETA: 23s - loss: 1.6599 - regression_loss: 1.3953 - classification_loss: 0.2645 405/500 [=======================>......] - ETA: 23s - loss: 1.6596 - regression_loss: 1.3952 - classification_loss: 0.2644 406/500 [=======================>......] - ETA: 23s - loss: 1.6607 - regression_loss: 1.3956 - classification_loss: 0.2651 407/500 [=======================>......] - ETA: 23s - loss: 1.6604 - regression_loss: 1.3954 - classification_loss: 0.2650 408/500 [=======================>......] - ETA: 22s - loss: 1.6592 - regression_loss: 1.3945 - classification_loss: 0.2647 409/500 [=======================>......] - ETA: 22s - loss: 1.6565 - regression_loss: 1.3922 - classification_loss: 0.2643 410/500 [=======================>......] - ETA: 22s - loss: 1.6540 - regression_loss: 1.3901 - classification_loss: 0.2639 411/500 [=======================>......] - ETA: 22s - loss: 1.6553 - regression_loss: 1.3912 - classification_loss: 0.2641 412/500 [=======================>......] - ETA: 21s - loss: 1.6563 - regression_loss: 1.3919 - classification_loss: 0.2644 413/500 [=======================>......] - ETA: 21s - loss: 1.6585 - regression_loss: 1.3935 - classification_loss: 0.2650 414/500 [=======================>......] - ETA: 21s - loss: 1.6595 - regression_loss: 1.3948 - classification_loss: 0.2647 415/500 [=======================>......] - ETA: 21s - loss: 1.6586 - regression_loss: 1.3943 - classification_loss: 0.2644 416/500 [=======================>......] - ETA: 20s - loss: 1.6595 - regression_loss: 1.3948 - classification_loss: 0.2647 417/500 [========================>.....] - ETA: 20s - loss: 1.6589 - regression_loss: 1.3944 - classification_loss: 0.2645 418/500 [========================>.....] - ETA: 20s - loss: 1.6591 - regression_loss: 1.3945 - classification_loss: 0.2647 419/500 [========================>.....] - ETA: 20s - loss: 1.6574 - regression_loss: 1.3929 - classification_loss: 0.2646 420/500 [========================>.....] 
- ETA: 19s - loss: 1.6554 - regression_loss: 1.3896 - classification_loss: 0.2658 421/500 [========================>.....] - ETA: 19s - loss: 1.6549 - regression_loss: 1.3892 - classification_loss: 0.2657 422/500 [========================>.....] - ETA: 19s - loss: 1.6552 - regression_loss: 1.3896 - classification_loss: 0.2656 423/500 [========================>.....] - ETA: 19s - loss: 1.6551 - regression_loss: 1.3896 - classification_loss: 0.2655 424/500 [========================>.....] - ETA: 18s - loss: 1.6542 - regression_loss: 1.3890 - classification_loss: 0.2653 425/500 [========================>.....] - ETA: 18s - loss: 1.6536 - regression_loss: 1.3886 - classification_loss: 0.2650 426/500 [========================>.....] - ETA: 18s - loss: 1.6556 - regression_loss: 1.3904 - classification_loss: 0.2652 427/500 [========================>.....] - ETA: 18s - loss: 1.6577 - regression_loss: 1.3918 - classification_loss: 0.2659 428/500 [========================>.....] - ETA: 17s - loss: 1.6568 - regression_loss: 1.3911 - classification_loss: 0.2657 429/500 [========================>.....] - ETA: 17s - loss: 1.6575 - regression_loss: 1.3918 - classification_loss: 0.2657 430/500 [========================>.....] - ETA: 17s - loss: 1.6573 - regression_loss: 1.3917 - classification_loss: 0.2656 431/500 [========================>.....] - ETA: 17s - loss: 1.6567 - regression_loss: 1.3914 - classification_loss: 0.2653 432/500 [========================>.....] - ETA: 16s - loss: 1.6563 - regression_loss: 1.3909 - classification_loss: 0.2653 433/500 [========================>.....] - ETA: 16s - loss: 1.6560 - regression_loss: 1.3908 - classification_loss: 0.2652 434/500 [=========================>....] - ETA: 16s - loss: 1.6560 - regression_loss: 1.3908 - classification_loss: 0.2652 435/500 [=========================>....] - ETA: 16s - loss: 1.6559 - regression_loss: 1.3906 - classification_loss: 0.2653 436/500 [=========================>....] 
- ETA: 15s - loss: 1.6548 - regression_loss: 1.3898 - classification_loss: 0.2650 437/500 [=========================>....] - ETA: 15s - loss: 1.6549 - regression_loss: 1.3900 - classification_loss: 0.2649 438/500 [=========================>....] - ETA: 15s - loss: 1.6547 - regression_loss: 1.3900 - classification_loss: 0.2647 439/500 [=========================>....] - ETA: 15s - loss: 1.6558 - regression_loss: 1.3908 - classification_loss: 0.2650 440/500 [=========================>....] - ETA: 14s - loss: 1.6559 - regression_loss: 1.3909 - classification_loss: 0.2650 441/500 [=========================>....] - ETA: 14s - loss: 1.6568 - regression_loss: 1.3915 - classification_loss: 0.2652 442/500 [=========================>....] - ETA: 14s - loss: 1.6582 - regression_loss: 1.3924 - classification_loss: 0.2657 443/500 [=========================>....] - ETA: 14s - loss: 1.6584 - regression_loss: 1.3925 - classification_loss: 0.2660 444/500 [=========================>....] - ETA: 13s - loss: 1.6577 - regression_loss: 1.3920 - classification_loss: 0.2657 445/500 [=========================>....] - ETA: 13s - loss: 1.6577 - regression_loss: 1.3921 - classification_loss: 0.2656 446/500 [=========================>....] - ETA: 13s - loss: 1.6573 - regression_loss: 1.3919 - classification_loss: 0.2654 447/500 [=========================>....] - ETA: 13s - loss: 1.6578 - regression_loss: 1.3923 - classification_loss: 0.2656 448/500 [=========================>....] - ETA: 12s - loss: 1.6565 - regression_loss: 1.3911 - classification_loss: 0.2654 449/500 [=========================>....] - ETA: 12s - loss: 1.6571 - regression_loss: 1.3917 - classification_loss: 0.2655 450/500 [==========================>...] - ETA: 12s - loss: 1.6579 - regression_loss: 1.3923 - classification_loss: 0.2655 451/500 [==========================>...] - ETA: 12s - loss: 1.6573 - regression_loss: 1.3920 - classification_loss: 0.2654 452/500 [==========================>...] 
- ETA: 11s - loss: 1.6578 - regression_loss: 1.3923 - classification_loss: 0.2656 453/500 [==========================>...] - ETA: 11s - loss: 1.6582 - regression_loss: 1.3926 - classification_loss: 0.2656 454/500 [==========================>...] - ETA: 11s - loss: 1.6599 - regression_loss: 1.3938 - classification_loss: 0.2661 455/500 [==========================>...] - ETA: 11s - loss: 1.6594 - regression_loss: 1.3935 - classification_loss: 0.2659 456/500 [==========================>...] - ETA: 10s - loss: 1.6579 - regression_loss: 1.3922 - classification_loss: 0.2657 457/500 [==========================>...] - ETA: 10s - loss: 1.6610 - regression_loss: 1.3948 - classification_loss: 0.2662 458/500 [==========================>...] - ETA: 10s - loss: 1.6597 - regression_loss: 1.3938 - classification_loss: 0.2659 459/500 [==========================>...] - ETA: 10s - loss: 1.6577 - regression_loss: 1.3922 - classification_loss: 0.2655 460/500 [==========================>...] - ETA: 9s - loss: 1.6578 - regression_loss: 1.3922 - classification_loss: 0.2655  461/500 [==========================>...] - ETA: 9s - loss: 1.6570 - regression_loss: 1.3918 - classification_loss: 0.2652 462/500 [==========================>...] - ETA: 9s - loss: 1.6555 - regression_loss: 1.3906 - classification_loss: 0.2649 463/500 [==========================>...] - ETA: 9s - loss: 1.6572 - regression_loss: 1.3919 - classification_loss: 0.2653 464/500 [==========================>...] - ETA: 8s - loss: 1.6575 - regression_loss: 1.3922 - classification_loss: 0.2652 465/500 [==========================>...] - ETA: 8s - loss: 1.6594 - regression_loss: 1.3938 - classification_loss: 0.2656 466/500 [==========================>...] - ETA: 8s - loss: 1.6598 - regression_loss: 1.3941 - classification_loss: 0.2656 467/500 [===========================>..] - ETA: 8s - loss: 1.6587 - regression_loss: 1.3933 - classification_loss: 0.2654 468/500 [===========================>..] 
- ETA: 7s - loss: 1.6563 - regression_loss: 1.3913 - classification_loss: 0.2650 469/500 [===========================>..] - ETA: 7s - loss: 1.6562 - regression_loss: 1.3913 - classification_loss: 0.2649 470/500 [===========================>..] - ETA: 7s - loss: 1.6565 - regression_loss: 1.3917 - classification_loss: 0.2648 471/500 [===========================>..] - ETA: 7s - loss: 1.6552 - regression_loss: 1.3907 - classification_loss: 0.2645 472/500 [===========================>..] - ETA: 6s - loss: 1.6535 - regression_loss: 1.3892 - classification_loss: 0.2642 473/500 [===========================>..] - ETA: 6s - loss: 1.6543 - regression_loss: 1.3900 - classification_loss: 0.2642 474/500 [===========================>..] - ETA: 6s - loss: 1.6553 - regression_loss: 1.3909 - classification_loss: 0.2644 475/500 [===========================>..] - ETA: 6s - loss: 1.6544 - regression_loss: 1.3903 - classification_loss: 0.2642 476/500 [===========================>..] - ETA: 5s - loss: 1.6534 - regression_loss: 1.3894 - classification_loss: 0.2640 477/500 [===========================>..] - ETA: 5s - loss: 1.6542 - regression_loss: 1.3899 - classification_loss: 0.2643 478/500 [===========================>..] - ETA: 5s - loss: 1.6533 - regression_loss: 1.3893 - classification_loss: 0.2640 479/500 [===========================>..] - ETA: 5s - loss: 1.6538 - regression_loss: 1.3895 - classification_loss: 0.2643 480/500 [===========================>..] - ETA: 4s - loss: 1.6535 - regression_loss: 1.3893 - classification_loss: 0.2641 481/500 [===========================>..] - ETA: 4s - loss: 1.6538 - regression_loss: 1.3896 - classification_loss: 0.2642 482/500 [===========================>..] - ETA: 4s - loss: 1.6536 - regression_loss: 1.3893 - classification_loss: 0.2643 483/500 [===========================>..] - ETA: 4s - loss: 1.6527 - regression_loss: 1.3885 - classification_loss: 0.2642 484/500 [============================>.] 
- ETA: 3s - loss: 1.6517 - regression_loss: 1.3879 - classification_loss: 0.2638 485/500 [============================>.] - ETA: 3s - loss: 1.6507 - regression_loss: 1.3871 - classification_loss: 0.2636 486/500 [============================>.] - ETA: 3s - loss: 1.6501 - regression_loss: 1.3866 - classification_loss: 0.2635 487/500 [============================>.] - ETA: 3s - loss: 1.6487 - regression_loss: 1.3854 - classification_loss: 0.2633 488/500 [============================>.] - ETA: 2s - loss: 1.6473 - regression_loss: 1.3841 - classification_loss: 0.2631 489/500 [============================>.] - ETA: 2s - loss: 1.6464 - regression_loss: 1.3835 - classification_loss: 0.2628 490/500 [============================>.] - ETA: 2s - loss: 1.6469 - regression_loss: 1.3840 - classification_loss: 0.2629 491/500 [============================>.] - ETA: 2s - loss: 1.6471 - regression_loss: 1.3841 - classification_loss: 0.2629 492/500 [============================>.] - ETA: 1s - loss: 1.6470 - regression_loss: 1.3842 - classification_loss: 0.2628 493/500 [============================>.] - ETA: 1s - loss: 1.6465 - regression_loss: 1.3839 - classification_loss: 0.2627 494/500 [============================>.] - ETA: 1s - loss: 1.6491 - regression_loss: 1.3862 - classification_loss: 0.2629 495/500 [============================>.] - ETA: 1s - loss: 1.6490 - regression_loss: 1.3862 - classification_loss: 0.2629 496/500 [============================>.] - ETA: 0s - loss: 1.6500 - regression_loss: 1.3871 - classification_loss: 0.2629 497/500 [============================>.] - ETA: 0s - loss: 1.6491 - regression_loss: 1.3864 - classification_loss: 0.2628 498/500 [============================>.] - ETA: 0s - loss: 1.6499 - regression_loss: 1.3870 - classification_loss: 0.2629 499/500 [============================>.] 
500/500 [==============================] - 125s 249ms/step - loss: 1.6510 - regression_loss: 1.3878 - classification_loss: 0.2632
326 instances of class plum with average precision: 0.7780
mAP: 0.7780
Epoch 00054: saving model to ./training/snapshots/resnet50_pascal_54.h5
Epoch 55/150
- ETA: 1:44 - loss: 1.6342 - regression_loss: 1.3711 - classification_loss: 0.2631 95/500 [====>.........................] - ETA: 1:44 - loss: 1.6436 - regression_loss: 1.3783 - classification_loss: 0.2654 96/500 [====>.........................] - ETA: 1:43 - loss: 1.6484 - regression_loss: 1.3815 - classification_loss: 0.2669 97/500 [====>.........................] - ETA: 1:43 - loss: 1.6622 - regression_loss: 1.3906 - classification_loss: 0.2717 98/500 [====>.........................] - ETA: 1:43 - loss: 1.6694 - regression_loss: 1.3946 - classification_loss: 0.2749 99/500 [====>.........................] - ETA: 1:42 - loss: 1.6739 - regression_loss: 1.3970 - classification_loss: 0.2768 100/500 [=====>........................] - ETA: 1:42 - loss: 1.6682 - regression_loss: 1.3922 - classification_loss: 0.2760 101/500 [=====>........................] - ETA: 1:42 - loss: 1.6677 - regression_loss: 1.3915 - classification_loss: 0.2762 102/500 [=====>........................] - ETA: 1:42 - loss: 1.6611 - regression_loss: 1.3863 - classification_loss: 0.2748 103/500 [=====>........................] - ETA: 1:41 - loss: 1.6537 - regression_loss: 1.3804 - classification_loss: 0.2733 104/500 [=====>........................] - ETA: 1:41 - loss: 1.6559 - regression_loss: 1.3818 - classification_loss: 0.2741 105/500 [=====>........................] - ETA: 1:40 - loss: 1.6533 - regression_loss: 1.3798 - classification_loss: 0.2736 106/500 [=====>........................] - ETA: 1:40 - loss: 1.6543 - regression_loss: 1.3800 - classification_loss: 0.2743 107/500 [=====>........................] - ETA: 1:40 - loss: 1.6563 - regression_loss: 1.3818 - classification_loss: 0.2746 108/500 [=====>........................] - ETA: 1:39 - loss: 1.6550 - regression_loss: 1.3807 - classification_loss: 0.2743 109/500 [=====>........................] - ETA: 1:39 - loss: 1.6577 - regression_loss: 1.3824 - classification_loss: 0.2752 110/500 [=====>........................] 
- ETA: 1:39 - loss: 1.6650 - regression_loss: 1.3893 - classification_loss: 0.2757 111/500 [=====>........................] - ETA: 1:39 - loss: 1.6688 - regression_loss: 1.3938 - classification_loss: 0.2750 112/500 [=====>........................] - ETA: 1:38 - loss: 1.6673 - regression_loss: 1.3933 - classification_loss: 0.2740 113/500 [=====>........................] - ETA: 1:38 - loss: 1.6660 - regression_loss: 1.3921 - classification_loss: 0.2739 114/500 [=====>........................] - ETA: 1:38 - loss: 1.6635 - regression_loss: 1.3911 - classification_loss: 0.2724 115/500 [=====>........................] - ETA: 1:37 - loss: 1.6638 - regression_loss: 1.3917 - classification_loss: 0.2721 116/500 [=====>........................] - ETA: 1:37 - loss: 1.6701 - regression_loss: 1.3965 - classification_loss: 0.2736 117/500 [======>.......................] - ETA: 1:37 - loss: 1.6673 - regression_loss: 1.3945 - classification_loss: 0.2728 118/500 [======>.......................] - ETA: 1:37 - loss: 1.6677 - regression_loss: 1.3948 - classification_loss: 0.2729 119/500 [======>.......................] - ETA: 1:36 - loss: 1.6672 - regression_loss: 1.3942 - classification_loss: 0.2730 120/500 [======>.......................] - ETA: 1:36 - loss: 1.6624 - regression_loss: 1.3906 - classification_loss: 0.2718 121/500 [======>.......................] - ETA: 1:36 - loss: 1.6641 - regression_loss: 1.3920 - classification_loss: 0.2720 122/500 [======>.......................] - ETA: 1:36 - loss: 1.6673 - regression_loss: 1.3941 - classification_loss: 0.2733 123/500 [======>.......................] - ETA: 1:35 - loss: 1.6635 - regression_loss: 1.3911 - classification_loss: 0.2724 124/500 [======>.......................] - ETA: 1:35 - loss: 1.6554 - regression_loss: 1.3848 - classification_loss: 0.2706 125/500 [======>.......................] - ETA: 1:35 - loss: 1.6526 - regression_loss: 1.3825 - classification_loss: 0.2701 126/500 [======>.......................] 
- ETA: 1:35 - loss: 1.6573 - regression_loss: 1.3857 - classification_loss: 0.2716 127/500 [======>.......................] - ETA: 1:34 - loss: 1.6572 - regression_loss: 1.3857 - classification_loss: 0.2715 128/500 [======>.......................] - ETA: 1:34 - loss: 1.6569 - regression_loss: 1.3852 - classification_loss: 0.2717 129/500 [======>.......................] - ETA: 1:34 - loss: 1.6619 - regression_loss: 1.3896 - classification_loss: 0.2722 130/500 [======>.......................] - ETA: 1:33 - loss: 1.6596 - regression_loss: 1.3885 - classification_loss: 0.2711 131/500 [======>.......................] - ETA: 1:33 - loss: 1.6528 - regression_loss: 1.3830 - classification_loss: 0.2698 132/500 [======>.......................] - ETA: 1:33 - loss: 1.6559 - regression_loss: 1.3851 - classification_loss: 0.2708 133/500 [======>.......................] - ETA: 1:33 - loss: 1.6617 - regression_loss: 1.3893 - classification_loss: 0.2724 134/500 [=======>......................] - ETA: 1:32 - loss: 1.6623 - regression_loss: 1.3905 - classification_loss: 0.2719 135/500 [=======>......................] - ETA: 1:32 - loss: 1.6618 - regression_loss: 1.3890 - classification_loss: 0.2728 136/500 [=======>......................] - ETA: 1:32 - loss: 1.6664 - regression_loss: 1.3925 - classification_loss: 0.2739 137/500 [=======>......................] - ETA: 1:32 - loss: 1.6718 - regression_loss: 1.3979 - classification_loss: 0.2739 138/500 [=======>......................] - ETA: 1:31 - loss: 1.6657 - regression_loss: 1.3931 - classification_loss: 0.2726 139/500 [=======>......................] - ETA: 1:31 - loss: 1.6670 - regression_loss: 1.3943 - classification_loss: 0.2727 140/500 [=======>......................] - ETA: 1:31 - loss: 1.6642 - regression_loss: 1.3920 - classification_loss: 0.2722 141/500 [=======>......................] - ETA: 1:31 - loss: 1.6683 - regression_loss: 1.3945 - classification_loss: 0.2738 142/500 [=======>......................] 
- ETA: 1:30 - loss: 1.6642 - regression_loss: 1.3911 - classification_loss: 0.2731 143/500 [=======>......................] - ETA: 1:30 - loss: 1.6612 - regression_loss: 1.3890 - classification_loss: 0.2722 144/500 [=======>......................] - ETA: 1:30 - loss: 1.6595 - regression_loss: 1.3882 - classification_loss: 0.2713 145/500 [=======>......................] - ETA: 1:30 - loss: 1.6587 - regression_loss: 1.3872 - classification_loss: 0.2715 146/500 [=======>......................] - ETA: 1:29 - loss: 1.6588 - regression_loss: 1.3873 - classification_loss: 0.2714 147/500 [=======>......................] - ETA: 1:29 - loss: 1.6541 - regression_loss: 1.3834 - classification_loss: 0.2707 148/500 [=======>......................] - ETA: 1:29 - loss: 1.6536 - regression_loss: 1.3832 - classification_loss: 0.2704 149/500 [=======>......................] - ETA: 1:29 - loss: 1.6513 - regression_loss: 1.3813 - classification_loss: 0.2700 150/500 [========>.....................] - ETA: 1:28 - loss: 1.6517 - regression_loss: 1.3818 - classification_loss: 0.2700 151/500 [========>.....................] - ETA: 1:28 - loss: 1.6442 - regression_loss: 1.3755 - classification_loss: 0.2686 152/500 [========>.....................] - ETA: 1:28 - loss: 1.6474 - regression_loss: 1.3785 - classification_loss: 0.2689 153/500 [========>.....................] - ETA: 1:28 - loss: 1.6464 - regression_loss: 1.3781 - classification_loss: 0.2683 154/500 [========>.....................] - ETA: 1:27 - loss: 1.6416 - regression_loss: 1.3744 - classification_loss: 0.2672 155/500 [========>.....................] - ETA: 1:27 - loss: 1.6416 - regression_loss: 1.3742 - classification_loss: 0.2674 156/500 [========>.....................] - ETA: 1:27 - loss: 1.6410 - regression_loss: 1.3742 - classification_loss: 0.2668 157/500 [========>.....................] - ETA: 1:27 - loss: 1.6431 - regression_loss: 1.3762 - classification_loss: 0.2669 158/500 [========>.....................] 
- ETA: 1:26 - loss: 1.6398 - regression_loss: 1.3737 - classification_loss: 0.2661 159/500 [========>.....................] - ETA: 1:26 - loss: 1.6373 - regression_loss: 1.3712 - classification_loss: 0.2661 160/500 [========>.....................] - ETA: 1:26 - loss: 1.6421 - regression_loss: 1.3756 - classification_loss: 0.2665 161/500 [========>.....................] - ETA: 1:26 - loss: 1.6422 - regression_loss: 1.3759 - classification_loss: 0.2663 162/500 [========>.....................] - ETA: 1:25 - loss: 1.6426 - regression_loss: 1.3764 - classification_loss: 0.2663 163/500 [========>.....................] - ETA: 1:25 - loss: 1.6423 - regression_loss: 1.3764 - classification_loss: 0.2658 164/500 [========>.....................] - ETA: 1:25 - loss: 1.6436 - regression_loss: 1.3780 - classification_loss: 0.2656 165/500 [========>.....................] - ETA: 1:25 - loss: 1.6454 - regression_loss: 1.3795 - classification_loss: 0.2659 166/500 [========>.....................] - ETA: 1:24 - loss: 1.6473 - regression_loss: 1.3806 - classification_loss: 0.2667 167/500 [=========>....................] - ETA: 1:24 - loss: 1.6494 - regression_loss: 1.3819 - classification_loss: 0.2675 168/500 [=========>....................] - ETA: 1:24 - loss: 1.6517 - regression_loss: 1.3831 - classification_loss: 0.2686 169/500 [=========>....................] - ETA: 1:23 - loss: 1.6566 - regression_loss: 1.3871 - classification_loss: 0.2694 170/500 [=========>....................] - ETA: 1:23 - loss: 1.6541 - regression_loss: 1.3853 - classification_loss: 0.2688 171/500 [=========>....................] - ETA: 1:23 - loss: 1.6515 - regression_loss: 1.3829 - classification_loss: 0.2686 172/500 [=========>....................] - ETA: 1:23 - loss: 1.6541 - regression_loss: 1.3848 - classification_loss: 0.2693 173/500 [=========>....................] - ETA: 1:22 - loss: 1.6633 - regression_loss: 1.3925 - classification_loss: 0.2709 174/500 [=========>....................] 
- ETA: 1:22 - loss: 1.6664 - regression_loss: 1.3951 - classification_loss: 0.2714 175/500 [=========>....................] - ETA: 1:22 - loss: 1.6632 - regression_loss: 1.3925 - classification_loss: 0.2706 176/500 [=========>....................] - ETA: 1:22 - loss: 1.6648 - regression_loss: 1.3939 - classification_loss: 0.2709 177/500 [=========>....................] - ETA: 1:21 - loss: 1.6624 - regression_loss: 1.3922 - classification_loss: 0.2702 178/500 [=========>....................] - ETA: 1:21 - loss: 1.6649 - regression_loss: 1.3944 - classification_loss: 0.2705 179/500 [=========>....................] - ETA: 1:21 - loss: 1.6678 - regression_loss: 1.3967 - classification_loss: 0.2711 180/500 [=========>....................] - ETA: 1:21 - loss: 1.6671 - regression_loss: 1.3963 - classification_loss: 0.2707 181/500 [=========>....................] - ETA: 1:20 - loss: 1.6652 - regression_loss: 1.3950 - classification_loss: 0.2702 182/500 [=========>....................] - ETA: 1:20 - loss: 1.6725 - regression_loss: 1.4008 - classification_loss: 0.2716 183/500 [=========>....................] - ETA: 1:20 - loss: 1.6718 - regression_loss: 1.4005 - classification_loss: 0.2713 184/500 [==========>...................] - ETA: 1:20 - loss: 1.6727 - regression_loss: 1.4019 - classification_loss: 0.2708 185/500 [==========>...................] - ETA: 1:19 - loss: 1.6749 - regression_loss: 1.4033 - classification_loss: 0.2716 186/500 [==========>...................] - ETA: 1:19 - loss: 1.6747 - regression_loss: 1.4032 - classification_loss: 0.2715 187/500 [==========>...................] - ETA: 1:19 - loss: 1.6739 - regression_loss: 1.4026 - classification_loss: 0.2713 188/500 [==========>...................] - ETA: 1:19 - loss: 1.6743 - regression_loss: 1.4030 - classification_loss: 0.2713 189/500 [==========>...................] - ETA: 1:18 - loss: 1.6741 - regression_loss: 1.4029 - classification_loss: 0.2712 190/500 [==========>...................] 
- ETA: 1:18 - loss: 1.6730 - regression_loss: 1.4016 - classification_loss: 0.2714 191/500 [==========>...................] - ETA: 1:18 - loss: 1.6753 - regression_loss: 1.4035 - classification_loss: 0.2718 192/500 [==========>...................] - ETA: 1:18 - loss: 1.6771 - regression_loss: 1.4048 - classification_loss: 0.2724 193/500 [==========>...................] - ETA: 1:17 - loss: 1.6753 - regression_loss: 1.4034 - classification_loss: 0.2720 194/500 [==========>...................] - ETA: 1:17 - loss: 1.6740 - regression_loss: 1.4025 - classification_loss: 0.2715 195/500 [==========>...................] - ETA: 1:17 - loss: 1.6765 - regression_loss: 1.4047 - classification_loss: 0.2718 196/500 [==========>...................] - ETA: 1:17 - loss: 1.6777 - regression_loss: 1.4059 - classification_loss: 0.2718 197/500 [==========>...................] - ETA: 1:16 - loss: 1.6781 - regression_loss: 1.4062 - classification_loss: 0.2719 198/500 [==========>...................] - ETA: 1:16 - loss: 1.6782 - regression_loss: 1.4060 - classification_loss: 0.2722 199/500 [==========>...................] - ETA: 1:16 - loss: 1.6816 - regression_loss: 1.4081 - classification_loss: 0.2735 200/500 [===========>..................] - ETA: 1:16 - loss: 1.6826 - regression_loss: 1.4093 - classification_loss: 0.2732 201/500 [===========>..................] - ETA: 1:15 - loss: 1.6838 - regression_loss: 1.4102 - classification_loss: 0.2736 202/500 [===========>..................] - ETA: 1:15 - loss: 1.6852 - regression_loss: 1.4116 - classification_loss: 0.2736 203/500 [===========>..................] - ETA: 1:15 - loss: 1.6844 - regression_loss: 1.4111 - classification_loss: 0.2733 204/500 [===========>..................] - ETA: 1:14 - loss: 1.6862 - regression_loss: 1.4122 - classification_loss: 0.2740 205/500 [===========>..................] - ETA: 1:14 - loss: 1.6875 - regression_loss: 1.4132 - classification_loss: 0.2743 206/500 [===========>..................] 
- ETA: 1:14 - loss: 1.6842 - regression_loss: 1.4106 - classification_loss: 0.2735 207/500 [===========>..................] - ETA: 1:14 - loss: 1.6835 - regression_loss: 1.4100 - classification_loss: 0.2734 208/500 [===========>..................] - ETA: 1:13 - loss: 1.6827 - regression_loss: 1.4096 - classification_loss: 0.2731 209/500 [===========>..................] - ETA: 1:13 - loss: 1.6821 - regression_loss: 1.4091 - classification_loss: 0.2730 210/500 [===========>..................] - ETA: 1:13 - loss: 1.6793 - regression_loss: 1.4069 - classification_loss: 0.2724 211/500 [===========>..................] - ETA: 1:13 - loss: 1.6788 - regression_loss: 1.4069 - classification_loss: 0.2719 212/500 [===========>..................] - ETA: 1:12 - loss: 1.6753 - regression_loss: 1.4040 - classification_loss: 0.2713 213/500 [===========>..................] - ETA: 1:12 - loss: 1.6780 - regression_loss: 1.4065 - classification_loss: 0.2715 214/500 [===========>..................] - ETA: 1:12 - loss: 1.6763 - regression_loss: 1.4053 - classification_loss: 0.2709 215/500 [===========>..................] - ETA: 1:12 - loss: 1.6759 - regression_loss: 1.4053 - classification_loss: 0.2706 216/500 [===========>..................] - ETA: 1:11 - loss: 1.6768 - regression_loss: 1.4055 - classification_loss: 0.2713 217/500 [============>.................] - ETA: 1:11 - loss: 1.6781 - regression_loss: 1.4064 - classification_loss: 0.2717 218/500 [============>.................] - ETA: 1:11 - loss: 1.6804 - regression_loss: 1.4084 - classification_loss: 0.2720 219/500 [============>.................] - ETA: 1:11 - loss: 1.6761 - regression_loss: 1.4046 - classification_loss: 0.2715 220/500 [============>.................] - ETA: 1:10 - loss: 1.6805 - regression_loss: 1.4078 - classification_loss: 0.2727 221/500 [============>.................] - ETA: 1:10 - loss: 1.6785 - regression_loss: 1.4056 - classification_loss: 0.2728 222/500 [============>.................] 
- ETA: 1:10 - loss: 1.6770 - regression_loss: 1.4038 - classification_loss: 0.2731 223/500 [============>.................] - ETA: 1:10 - loss: 1.6731 - regression_loss: 1.4008 - classification_loss: 0.2723 224/500 [============>.................] - ETA: 1:09 - loss: 1.6719 - regression_loss: 1.4000 - classification_loss: 0.2719 225/500 [============>.................] - ETA: 1:09 - loss: 1.6705 - regression_loss: 1.3990 - classification_loss: 0.2716 226/500 [============>.................] - ETA: 1:09 - loss: 1.6717 - regression_loss: 1.4000 - classification_loss: 0.2717 227/500 [============>.................] - ETA: 1:09 - loss: 1.6698 - regression_loss: 1.3986 - classification_loss: 0.2712 228/500 [============>.................] - ETA: 1:08 - loss: 1.6651 - regression_loss: 1.3945 - classification_loss: 0.2707 229/500 [============>.................] - ETA: 1:08 - loss: 1.6622 - regression_loss: 1.3923 - classification_loss: 0.2700 230/500 [============>.................] - ETA: 1:08 - loss: 1.6631 - regression_loss: 1.3927 - classification_loss: 0.2704 231/500 [============>.................] - ETA: 1:08 - loss: 1.6652 - regression_loss: 1.3943 - classification_loss: 0.2709 232/500 [============>.................] - ETA: 1:07 - loss: 1.6647 - regression_loss: 1.3940 - classification_loss: 0.2707 233/500 [============>.................] - ETA: 1:07 - loss: 1.6639 - regression_loss: 1.3935 - classification_loss: 0.2704 234/500 [=============>................] - ETA: 1:07 - loss: 1.6612 - regression_loss: 1.3914 - classification_loss: 0.2698 235/500 [=============>................] - ETA: 1:07 - loss: 1.6608 - regression_loss: 1.3914 - classification_loss: 0.2694 236/500 [=============>................] - ETA: 1:06 - loss: 1.6614 - regression_loss: 1.3913 - classification_loss: 0.2701 237/500 [=============>................] - ETA: 1:06 - loss: 1.6625 - regression_loss: 1.3921 - classification_loss: 0.2703 238/500 [=============>................] 
- ETA: 1:06 - loss: 1.6618 - regression_loss: 1.3919 - classification_loss: 0.2699 239/500 [=============>................] - ETA: 1:05 - loss: 1.6621 - regression_loss: 1.3925 - classification_loss: 0.2696 240/500 [=============>................] - ETA: 1:05 - loss: 1.6620 - regression_loss: 1.3925 - classification_loss: 0.2695 241/500 [=============>................] - ETA: 1:05 - loss: 1.6637 - regression_loss: 1.3938 - classification_loss: 0.2699 242/500 [=============>................] - ETA: 1:05 - loss: 1.6639 - regression_loss: 1.3939 - classification_loss: 0.2700 243/500 [=============>................] - ETA: 1:04 - loss: 1.6640 - regression_loss: 1.3942 - classification_loss: 0.2698 244/500 [=============>................] - ETA: 1:04 - loss: 1.6619 - regression_loss: 1.3926 - classification_loss: 0.2693 245/500 [=============>................] - ETA: 1:04 - loss: 1.6632 - regression_loss: 1.3939 - classification_loss: 0.2693 246/500 [=============>................] - ETA: 1:04 - loss: 1.6611 - regression_loss: 1.3925 - classification_loss: 0.2685 247/500 [=============>................] - ETA: 1:03 - loss: 1.6592 - regression_loss: 1.3912 - classification_loss: 0.2680 248/500 [=============>................] - ETA: 1:03 - loss: 1.6551 - regression_loss: 1.3878 - classification_loss: 0.2673 249/500 [=============>................] - ETA: 1:03 - loss: 1.6532 - regression_loss: 1.3864 - classification_loss: 0.2669 250/500 [==============>...............] - ETA: 1:03 - loss: 1.6509 - regression_loss: 1.3846 - classification_loss: 0.2663 251/500 [==============>...............] - ETA: 1:02 - loss: 1.6501 - regression_loss: 1.3836 - classification_loss: 0.2665 252/500 [==============>...............] - ETA: 1:02 - loss: 1.6513 - regression_loss: 1.3853 - classification_loss: 0.2659 253/500 [==============>...............] - ETA: 1:02 - loss: 1.6505 - regression_loss: 1.3848 - classification_loss: 0.2658 254/500 [==============>...............] 
- ETA: 1:02 - loss: 1.6530 - regression_loss: 1.3871 - classification_loss: 0.2659 255/500 [==============>...............] - ETA: 1:01 - loss: 1.6518 - regression_loss: 1.3865 - classification_loss: 0.2654 256/500 [==============>...............] - ETA: 1:01 - loss: 1.6508 - regression_loss: 1.3856 - classification_loss: 0.2652 257/500 [==============>...............] - ETA: 1:01 - loss: 1.6509 - regression_loss: 1.3853 - classification_loss: 0.2656 258/500 [==============>...............] - ETA: 1:01 - loss: 1.6506 - regression_loss: 1.3854 - classification_loss: 0.2652 259/500 [==============>...............] - ETA: 1:00 - loss: 1.6494 - regression_loss: 1.3843 - classification_loss: 0.2650 260/500 [==============>...............] - ETA: 1:00 - loss: 1.6493 - regression_loss: 1.3842 - classification_loss: 0.2651 261/500 [==============>...............] - ETA: 1:00 - loss: 1.6511 - regression_loss: 1.3856 - classification_loss: 0.2655 262/500 [==============>...............] - ETA: 1:00 - loss: 1.6523 - regression_loss: 1.3866 - classification_loss: 0.2657 263/500 [==============>...............] - ETA: 59s - loss: 1.6519 - regression_loss: 1.3865 - classification_loss: 0.2654  264/500 [==============>...............] - ETA: 59s - loss: 1.6529 - regression_loss: 1.3872 - classification_loss: 0.2656 265/500 [==============>...............] - ETA: 59s - loss: 1.6543 - regression_loss: 1.3883 - classification_loss: 0.2660 266/500 [==============>...............] - ETA: 59s - loss: 1.6580 - regression_loss: 1.3910 - classification_loss: 0.2670 267/500 [===============>..............] - ETA: 58s - loss: 1.6556 - regression_loss: 1.3892 - classification_loss: 0.2665 268/500 [===============>..............] - ETA: 58s - loss: 1.6529 - regression_loss: 1.3869 - classification_loss: 0.2660 269/500 [===============>..............] - ETA: 58s - loss: 1.6516 - regression_loss: 1.3858 - classification_loss: 0.2658 270/500 [===============>..............] 
- ETA: 58s - loss: 1.6529 - regression_loss: 1.3864 - classification_loss: 0.2665 271/500 [===============>..............] - ETA: 57s - loss: 1.6533 - regression_loss: 1.3863 - classification_loss: 0.2670 272/500 [===============>..............] - ETA: 57s - loss: 1.6520 - regression_loss: 1.3851 - classification_loss: 0.2668 273/500 [===============>..............] - ETA: 57s - loss: 1.6496 - regression_loss: 1.3833 - classification_loss: 0.2663 274/500 [===============>..............] - ETA: 57s - loss: 1.6528 - regression_loss: 1.3857 - classification_loss: 0.2671 275/500 [===============>..............] - ETA: 56s - loss: 1.6546 - regression_loss: 1.3866 - classification_loss: 0.2680 276/500 [===============>..............] - ETA: 56s - loss: 1.6576 - regression_loss: 1.3889 - classification_loss: 0.2687 277/500 [===============>..............] - ETA: 56s - loss: 1.6572 - regression_loss: 1.3890 - classification_loss: 0.2682 278/500 [===============>..............] - ETA: 56s - loss: 1.6587 - regression_loss: 1.3900 - classification_loss: 0.2687 279/500 [===============>..............] - ETA: 55s - loss: 1.6557 - regression_loss: 1.3875 - classification_loss: 0.2681 280/500 [===============>..............] - ETA: 55s - loss: 1.6556 - regression_loss: 1.3878 - classification_loss: 0.2678 281/500 [===============>..............] - ETA: 55s - loss: 1.6542 - regression_loss: 1.3866 - classification_loss: 0.2675 282/500 [===============>..............] - ETA: 55s - loss: 1.6531 - regression_loss: 1.3861 - classification_loss: 0.2670 283/500 [===============>..............] - ETA: 54s - loss: 1.6522 - regression_loss: 1.3855 - classification_loss: 0.2668 284/500 [================>.............] - ETA: 54s - loss: 1.6540 - regression_loss: 1.3870 - classification_loss: 0.2670 285/500 [================>.............] - ETA: 54s - loss: 1.6549 - regression_loss: 1.3877 - classification_loss: 0.2672 286/500 [================>.............] 
- ETA: 53s - loss: 1.6547 - regression_loss: 1.3876 - classification_loss: 0.2671 287/500 [================>.............] - ETA: 53s - loss: 1.6595 - regression_loss: 1.3916 - classification_loss: 0.2679 288/500 [================>.............] - ETA: 53s - loss: 1.6599 - regression_loss: 1.3921 - classification_loss: 0.2678 289/500 [================>.............] - ETA: 53s - loss: 1.6573 - regression_loss: 1.3901 - classification_loss: 0.2672 290/500 [================>.............] - ETA: 52s - loss: 1.6564 - regression_loss: 1.3892 - classification_loss: 0.2672 291/500 [================>.............] - ETA: 52s - loss: 1.6539 - regression_loss: 1.3870 - classification_loss: 0.2669 292/500 [================>.............] - ETA: 52s - loss: 1.6533 - regression_loss: 1.3867 - classification_loss: 0.2666 293/500 [================>.............] - ETA: 52s - loss: 1.6532 - regression_loss: 1.3871 - classification_loss: 0.2661 294/500 [================>.............] - ETA: 51s - loss: 1.6541 - regression_loss: 1.3880 - classification_loss: 0.2661 295/500 [================>.............] - ETA: 51s - loss: 1.6508 - regression_loss: 1.3833 - classification_loss: 0.2675 296/500 [================>.............] - ETA: 51s - loss: 1.6508 - regression_loss: 1.3832 - classification_loss: 0.2675 297/500 [================>.............] - ETA: 51s - loss: 1.6497 - regression_loss: 1.3826 - classification_loss: 0.2671 298/500 [================>.............] - ETA: 50s - loss: 1.6511 - regression_loss: 1.3834 - classification_loss: 0.2677 299/500 [================>.............] - ETA: 50s - loss: 1.6505 - regression_loss: 1.3827 - classification_loss: 0.2678 300/500 [=================>............] - ETA: 50s - loss: 1.6503 - regression_loss: 1.3823 - classification_loss: 0.2680 301/500 [=================>............] - ETA: 50s - loss: 1.6494 - regression_loss: 1.3817 - classification_loss: 0.2677 302/500 [=================>............] 
[per-step progress updates for steps 303–499 of epoch 55 omitted; loss hovered around 1.63–1.65]
500/500 [==============================] - 126s 252ms/step - loss: 1.6410 - regression_loss: 1.3699 - classification_loss: 0.2712
326 instances of class plum with average precision: 0.7942
mAP: 0.7942
Epoch 00055: saving model to ./training/snapshots/resnet50_pascal_55.h5
Epoch 56/150
[per-step progress updates for steps 1–137 of epoch 56 omitted]
- ETA: 1:31 - loss: 1.6427 - regression_loss: 1.3847 - classification_loss: 0.2580 138/500 [=======>......................] - ETA: 1:30 - loss: 1.6416 - regression_loss: 1.3841 - classification_loss: 0.2575 139/500 [=======>......................] - ETA: 1:30 - loss: 1.6410 - regression_loss: 1.3834 - classification_loss: 0.2576 140/500 [=======>......................] - ETA: 1:30 - loss: 1.6402 - regression_loss: 1.3825 - classification_loss: 0.2578 141/500 [=======>......................] - ETA: 1:30 - loss: 1.6423 - regression_loss: 1.3836 - classification_loss: 0.2587 142/500 [=======>......................] - ETA: 1:29 - loss: 1.6415 - regression_loss: 1.3834 - classification_loss: 0.2581 143/500 [=======>......................] - ETA: 1:29 - loss: 1.6404 - regression_loss: 1.3823 - classification_loss: 0.2581 144/500 [=======>......................] - ETA: 1:29 - loss: 1.6404 - regression_loss: 1.3826 - classification_loss: 0.2578 145/500 [=======>......................] - ETA: 1:29 - loss: 1.6448 - regression_loss: 1.3860 - classification_loss: 0.2588 146/500 [=======>......................] - ETA: 1:28 - loss: 1.6472 - regression_loss: 1.3879 - classification_loss: 0.2593 147/500 [=======>......................] - ETA: 1:28 - loss: 1.6457 - regression_loss: 1.3865 - classification_loss: 0.2591 148/500 [=======>......................] - ETA: 1:28 - loss: 1.6438 - regression_loss: 1.3851 - classification_loss: 0.2587 149/500 [=======>......................] - ETA: 1:28 - loss: 1.6455 - regression_loss: 1.3844 - classification_loss: 0.2612 150/500 [========>.....................] - ETA: 1:27 - loss: 1.6458 - regression_loss: 1.3847 - classification_loss: 0.2611 151/500 [========>.....................] - ETA: 1:27 - loss: 1.6441 - regression_loss: 1.3835 - classification_loss: 0.2606 152/500 [========>.....................] - ETA: 1:27 - loss: 1.6466 - regression_loss: 1.3847 - classification_loss: 0.2619 153/500 [========>.....................] 
- ETA: 1:27 - loss: 1.6386 - regression_loss: 1.3781 - classification_loss: 0.2604 154/500 [========>.....................] - ETA: 1:26 - loss: 1.6360 - regression_loss: 1.3761 - classification_loss: 0.2599 155/500 [========>.....................] - ETA: 1:26 - loss: 1.6348 - regression_loss: 1.3752 - classification_loss: 0.2595 156/500 [========>.....................] - ETA: 1:26 - loss: 1.6286 - regression_loss: 1.3703 - classification_loss: 0.2583 157/500 [========>.....................] - ETA: 1:26 - loss: 1.6300 - regression_loss: 1.3710 - classification_loss: 0.2590 158/500 [========>.....................] - ETA: 1:25 - loss: 1.6302 - regression_loss: 1.3713 - classification_loss: 0.2589 159/500 [========>.....................] - ETA: 1:25 - loss: 1.6307 - regression_loss: 1.3722 - classification_loss: 0.2585 160/500 [========>.....................] - ETA: 1:25 - loss: 1.6269 - regression_loss: 1.3692 - classification_loss: 0.2577 161/500 [========>.....................] - ETA: 1:25 - loss: 1.6270 - regression_loss: 1.3695 - classification_loss: 0.2576 162/500 [========>.....................] - ETA: 1:24 - loss: 1.6281 - regression_loss: 1.3705 - classification_loss: 0.2576 163/500 [========>.....................] - ETA: 1:24 - loss: 1.6329 - regression_loss: 1.3744 - classification_loss: 0.2586 164/500 [========>.....................] - ETA: 1:24 - loss: 1.6304 - regression_loss: 1.3727 - classification_loss: 0.2577 165/500 [========>.....................] - ETA: 1:24 - loss: 1.6354 - regression_loss: 1.3767 - classification_loss: 0.2587 166/500 [========>.....................] - ETA: 1:23 - loss: 1.6414 - regression_loss: 1.3818 - classification_loss: 0.2596 167/500 [=========>....................] - ETA: 1:23 - loss: 1.6447 - regression_loss: 1.3845 - classification_loss: 0.2601 168/500 [=========>....................] - ETA: 1:23 - loss: 1.6480 - regression_loss: 1.3867 - classification_loss: 0.2613 169/500 [=========>....................] 
- ETA: 1:23 - loss: 1.6463 - regression_loss: 1.3855 - classification_loss: 0.2608 170/500 [=========>....................] - ETA: 1:22 - loss: 1.6563 - regression_loss: 1.3932 - classification_loss: 0.2631 171/500 [=========>....................] - ETA: 1:22 - loss: 1.6573 - regression_loss: 1.3941 - classification_loss: 0.2632 172/500 [=========>....................] - ETA: 1:22 - loss: 1.6586 - regression_loss: 1.3952 - classification_loss: 0.2634 173/500 [=========>....................] - ETA: 1:22 - loss: 1.6565 - regression_loss: 1.3938 - classification_loss: 0.2627 174/500 [=========>....................] - ETA: 1:22 - loss: 1.6611 - regression_loss: 1.3969 - classification_loss: 0.2642 175/500 [=========>....................] - ETA: 1:21 - loss: 1.6637 - regression_loss: 1.3992 - classification_loss: 0.2645 176/500 [=========>....................] - ETA: 1:21 - loss: 1.6658 - regression_loss: 1.4007 - classification_loss: 0.2652 177/500 [=========>....................] - ETA: 1:21 - loss: 1.6654 - regression_loss: 1.4008 - classification_loss: 0.2646 178/500 [=========>....................] - ETA: 1:20 - loss: 1.6587 - regression_loss: 1.3950 - classification_loss: 0.2637 179/500 [=========>....................] - ETA: 1:20 - loss: 1.6591 - regression_loss: 1.3954 - classification_loss: 0.2637 180/500 [=========>....................] - ETA: 1:20 - loss: 1.6593 - regression_loss: 1.3957 - classification_loss: 0.2637 181/500 [=========>....................] - ETA: 1:20 - loss: 1.6576 - regression_loss: 1.3943 - classification_loss: 0.2633 182/500 [=========>....................] - ETA: 1:19 - loss: 1.6581 - regression_loss: 1.3944 - classification_loss: 0.2637 183/500 [=========>....................] - ETA: 1:19 - loss: 1.6593 - regression_loss: 1.3957 - classification_loss: 0.2635 184/500 [==========>...................] - ETA: 1:19 - loss: 1.6622 - regression_loss: 1.3981 - classification_loss: 0.2641 185/500 [==========>...................] 
- ETA: 1:19 - loss: 1.6630 - regression_loss: 1.3977 - classification_loss: 0.2653 186/500 [==========>...................] - ETA: 1:18 - loss: 1.6578 - regression_loss: 1.3936 - classification_loss: 0.2642 187/500 [==========>...................] - ETA: 1:18 - loss: 1.6668 - regression_loss: 1.4008 - classification_loss: 0.2661 188/500 [==========>...................] - ETA: 1:18 - loss: 1.6683 - regression_loss: 1.4021 - classification_loss: 0.2662 189/500 [==========>...................] - ETA: 1:18 - loss: 1.6693 - regression_loss: 1.4029 - classification_loss: 0.2664 190/500 [==========>...................] - ETA: 1:17 - loss: 1.6712 - regression_loss: 1.4046 - classification_loss: 0.2666 191/500 [==========>...................] - ETA: 1:17 - loss: 1.6674 - regression_loss: 1.4015 - classification_loss: 0.2659 192/500 [==========>...................] - ETA: 1:17 - loss: 1.6665 - regression_loss: 1.4010 - classification_loss: 0.2655 193/500 [==========>...................] - ETA: 1:17 - loss: 1.6641 - regression_loss: 1.3990 - classification_loss: 0.2651 194/500 [==========>...................] - ETA: 1:16 - loss: 1.6646 - regression_loss: 1.3996 - classification_loss: 0.2649 195/500 [==========>...................] - ETA: 1:16 - loss: 1.6649 - regression_loss: 1.4001 - classification_loss: 0.2648 196/500 [==========>...................] - ETA: 1:16 - loss: 1.6639 - regression_loss: 1.3994 - classification_loss: 0.2645 197/500 [==========>...................] - ETA: 1:16 - loss: 1.6644 - regression_loss: 1.4001 - classification_loss: 0.2643 198/500 [==========>...................] - ETA: 1:15 - loss: 1.6642 - regression_loss: 1.3997 - classification_loss: 0.2645 199/500 [==========>...................] - ETA: 1:15 - loss: 1.6638 - regression_loss: 1.3996 - classification_loss: 0.2642 200/500 [===========>..................] - ETA: 1:15 - loss: 1.6632 - regression_loss: 1.3987 - classification_loss: 0.2645 201/500 [===========>..................] 
- ETA: 1:15 - loss: 1.6639 - regression_loss: 1.3994 - classification_loss: 0.2645 202/500 [===========>..................] - ETA: 1:14 - loss: 1.6595 - regression_loss: 1.3955 - classification_loss: 0.2639 203/500 [===========>..................] - ETA: 1:14 - loss: 1.6564 - regression_loss: 1.3930 - classification_loss: 0.2634 204/500 [===========>..................] - ETA: 1:14 - loss: 1.6560 - regression_loss: 1.3925 - classification_loss: 0.2636 205/500 [===========>..................] - ETA: 1:14 - loss: 1.6529 - regression_loss: 1.3900 - classification_loss: 0.2629 206/500 [===========>..................] - ETA: 1:13 - loss: 1.6550 - regression_loss: 1.3920 - classification_loss: 0.2630 207/500 [===========>..................] - ETA: 1:13 - loss: 1.6551 - regression_loss: 1.3923 - classification_loss: 0.2628 208/500 [===========>..................] - ETA: 1:13 - loss: 1.6560 - regression_loss: 1.3926 - classification_loss: 0.2634 209/500 [===========>..................] - ETA: 1:13 - loss: 1.6549 - regression_loss: 1.3921 - classification_loss: 0.2628 210/500 [===========>..................] - ETA: 1:12 - loss: 1.6551 - regression_loss: 1.3925 - classification_loss: 0.2627 211/500 [===========>..................] - ETA: 1:12 - loss: 1.6536 - regression_loss: 1.3913 - classification_loss: 0.2623 212/500 [===========>..................] - ETA: 1:12 - loss: 1.6548 - regression_loss: 1.3922 - classification_loss: 0.2625 213/500 [===========>..................] - ETA: 1:12 - loss: 1.6525 - regression_loss: 1.3906 - classification_loss: 0.2619 214/500 [===========>..................] - ETA: 1:11 - loss: 1.6508 - regression_loss: 1.3891 - classification_loss: 0.2617 215/500 [===========>..................] - ETA: 1:11 - loss: 1.6495 - regression_loss: 1.3882 - classification_loss: 0.2613 216/500 [===========>..................] - ETA: 1:11 - loss: 1.6509 - regression_loss: 1.3894 - classification_loss: 0.2615 217/500 [============>.................] 
- ETA: 1:11 - loss: 1.6529 - regression_loss: 1.3910 - classification_loss: 0.2619 218/500 [============>.................] - ETA: 1:10 - loss: 1.6543 - regression_loss: 1.3924 - classification_loss: 0.2619 219/500 [============>.................] - ETA: 1:10 - loss: 1.6521 - regression_loss: 1.3907 - classification_loss: 0.2614 220/500 [============>.................] - ETA: 1:10 - loss: 1.6509 - regression_loss: 1.3901 - classification_loss: 0.2608 221/500 [============>.................] - ETA: 1:09 - loss: 1.6547 - regression_loss: 1.3929 - classification_loss: 0.2618 222/500 [============>.................] - ETA: 1:09 - loss: 1.6533 - regression_loss: 1.3919 - classification_loss: 0.2614 223/500 [============>.................] - ETA: 1:09 - loss: 1.6529 - regression_loss: 1.3914 - classification_loss: 0.2616 224/500 [============>.................] - ETA: 1:09 - loss: 1.6550 - regression_loss: 1.3929 - classification_loss: 0.2621 225/500 [============>.................] - ETA: 1:08 - loss: 1.6545 - regression_loss: 1.3924 - classification_loss: 0.2621 226/500 [============>.................] - ETA: 1:08 - loss: 1.6575 - regression_loss: 1.3942 - classification_loss: 0.2633 227/500 [============>.................] - ETA: 1:08 - loss: 1.6603 - regression_loss: 1.3967 - classification_loss: 0.2636 228/500 [============>.................] - ETA: 1:07 - loss: 1.6620 - regression_loss: 1.3982 - classification_loss: 0.2637 229/500 [============>.................] - ETA: 1:07 - loss: 1.6608 - regression_loss: 1.3974 - classification_loss: 0.2634 230/500 [============>.................] - ETA: 1:07 - loss: 1.6561 - regression_loss: 1.3934 - classification_loss: 0.2626 231/500 [============>.................] - ETA: 1:07 - loss: 1.6583 - regression_loss: 1.3949 - classification_loss: 0.2634 232/500 [============>.................] - ETA: 1:06 - loss: 1.6587 - regression_loss: 1.3952 - classification_loss: 0.2635 233/500 [============>.................] 
- ETA: 1:06 - loss: 1.6627 - regression_loss: 1.3978 - classification_loss: 0.2649 234/500 [=============>................] - ETA: 1:06 - loss: 1.6615 - regression_loss: 1.3970 - classification_loss: 0.2645 235/500 [=============>................] - ETA: 1:06 - loss: 1.6567 - regression_loss: 1.3930 - classification_loss: 0.2637 236/500 [=============>................] - ETA: 1:05 - loss: 1.6563 - regression_loss: 1.3929 - classification_loss: 0.2634 237/500 [=============>................] - ETA: 1:05 - loss: 1.6547 - regression_loss: 1.3918 - classification_loss: 0.2630 238/500 [=============>................] - ETA: 1:05 - loss: 1.6514 - regression_loss: 1.3891 - classification_loss: 0.2624 239/500 [=============>................] - ETA: 1:05 - loss: 1.6494 - regression_loss: 1.3875 - classification_loss: 0.2618 240/500 [=============>................] - ETA: 1:04 - loss: 1.6522 - regression_loss: 1.3898 - classification_loss: 0.2624 241/500 [=============>................] - ETA: 1:04 - loss: 1.6523 - regression_loss: 1.3901 - classification_loss: 0.2622 242/500 [=============>................] - ETA: 1:04 - loss: 1.6501 - regression_loss: 1.3882 - classification_loss: 0.2619 243/500 [=============>................] - ETA: 1:04 - loss: 1.6483 - regression_loss: 1.3868 - classification_loss: 0.2616 244/500 [=============>................] - ETA: 1:03 - loss: 1.6472 - regression_loss: 1.3859 - classification_loss: 0.2613 245/500 [=============>................] - ETA: 1:03 - loss: 1.6452 - regression_loss: 1.3843 - classification_loss: 0.2609 246/500 [=============>................] - ETA: 1:03 - loss: 1.6427 - regression_loss: 1.3823 - classification_loss: 0.2604 247/500 [=============>................] - ETA: 1:03 - loss: 1.6404 - regression_loss: 1.3803 - classification_loss: 0.2601 248/500 [=============>................] - ETA: 1:02 - loss: 1.6435 - regression_loss: 1.3823 - classification_loss: 0.2611 249/500 [=============>................] 
- ETA: 1:02 - loss: 1.6451 - regression_loss: 1.3836 - classification_loss: 0.2616 250/500 [==============>...............] - ETA: 1:02 - loss: 1.6454 - regression_loss: 1.3838 - classification_loss: 0.2616 251/500 [==============>...............] - ETA: 1:02 - loss: 1.6471 - regression_loss: 1.3855 - classification_loss: 0.2616 252/500 [==============>...............] - ETA: 1:01 - loss: 1.6457 - regression_loss: 1.3843 - classification_loss: 0.2615 253/500 [==============>...............] - ETA: 1:01 - loss: 1.6425 - regression_loss: 1.3817 - classification_loss: 0.2608 254/500 [==============>...............] - ETA: 1:01 - loss: 1.6441 - regression_loss: 1.3829 - classification_loss: 0.2612 255/500 [==============>...............] - ETA: 1:01 - loss: 1.6454 - regression_loss: 1.3837 - classification_loss: 0.2617 256/500 [==============>...............] - ETA: 1:00 - loss: 1.6451 - regression_loss: 1.3836 - classification_loss: 0.2615 257/500 [==============>...............] - ETA: 1:00 - loss: 1.6462 - regression_loss: 1.3845 - classification_loss: 0.2616 258/500 [==============>...............] - ETA: 1:00 - loss: 1.6475 - regression_loss: 1.3852 - classification_loss: 0.2622 259/500 [==============>...............] - ETA: 1:00 - loss: 1.6484 - regression_loss: 1.3861 - classification_loss: 0.2623 260/500 [==============>...............] - ETA: 59s - loss: 1.6476 - regression_loss: 1.3853 - classification_loss: 0.2623  261/500 [==============>...............] - ETA: 59s - loss: 1.6502 - regression_loss: 1.3871 - classification_loss: 0.2631 262/500 [==============>...............] - ETA: 59s - loss: 1.6531 - regression_loss: 1.3891 - classification_loss: 0.2640 263/500 [==============>...............] - ETA: 59s - loss: 1.6536 - regression_loss: 1.3894 - classification_loss: 0.2641 264/500 [==============>...............] - ETA: 58s - loss: 1.6553 - regression_loss: 1.3904 - classification_loss: 0.2649 265/500 [==============>...............] 
- ETA: 58s - loss: 1.6527 - regression_loss: 1.3852 - classification_loss: 0.2675 266/500 [==============>...............] - ETA: 58s - loss: 1.6520 - regression_loss: 1.3847 - classification_loss: 0.2673 267/500 [===============>..............] - ETA: 58s - loss: 1.6531 - regression_loss: 1.3857 - classification_loss: 0.2674 268/500 [===============>..............] - ETA: 57s - loss: 1.6520 - regression_loss: 1.3850 - classification_loss: 0.2671 269/500 [===============>..............] - ETA: 57s - loss: 1.6492 - regression_loss: 1.3828 - classification_loss: 0.2665 270/500 [===============>..............] - ETA: 57s - loss: 1.6497 - regression_loss: 1.3830 - classification_loss: 0.2666 271/500 [===============>..............] - ETA: 57s - loss: 1.6512 - regression_loss: 1.3843 - classification_loss: 0.2669 272/500 [===============>..............] - ETA: 56s - loss: 1.6486 - regression_loss: 1.3823 - classification_loss: 0.2663 273/500 [===============>..............] - ETA: 56s - loss: 1.6494 - regression_loss: 1.3829 - classification_loss: 0.2666 274/500 [===============>..............] - ETA: 56s - loss: 1.6504 - regression_loss: 1.3834 - classification_loss: 0.2671 275/500 [===============>..............] - ETA: 56s - loss: 1.6506 - regression_loss: 1.3834 - classification_loss: 0.2671 276/500 [===============>..............] - ETA: 55s - loss: 1.6509 - regression_loss: 1.3836 - classification_loss: 0.2674 277/500 [===============>..............] - ETA: 55s - loss: 1.6508 - regression_loss: 1.3834 - classification_loss: 0.2674 278/500 [===============>..............] - ETA: 55s - loss: 1.6522 - regression_loss: 1.3848 - classification_loss: 0.2675 279/500 [===============>..............] - ETA: 55s - loss: 1.6511 - regression_loss: 1.3841 - classification_loss: 0.2670 280/500 [===============>..............] - ETA: 54s - loss: 1.6500 - regression_loss: 1.3834 - classification_loss: 0.2666 281/500 [===============>..............] 
- ETA: 54s - loss: 1.6503 - regression_loss: 1.3839 - classification_loss: 0.2664 282/500 [===============>..............] - ETA: 54s - loss: 1.6510 - regression_loss: 1.3846 - classification_loss: 0.2663 283/500 [===============>..............] - ETA: 54s - loss: 1.6497 - regression_loss: 1.3838 - classification_loss: 0.2659 284/500 [================>.............] - ETA: 53s - loss: 1.6497 - regression_loss: 1.3835 - classification_loss: 0.2662 285/500 [================>.............] - ETA: 53s - loss: 1.6475 - regression_loss: 1.3819 - classification_loss: 0.2656 286/500 [================>.............] - ETA: 53s - loss: 1.6473 - regression_loss: 1.3815 - classification_loss: 0.2658 287/500 [================>.............] - ETA: 53s - loss: 1.6470 - regression_loss: 1.3815 - classification_loss: 0.2656 288/500 [================>.............] - ETA: 52s - loss: 1.6473 - regression_loss: 1.3816 - classification_loss: 0.2657 289/500 [================>.............] - ETA: 52s - loss: 1.6474 - regression_loss: 1.3816 - classification_loss: 0.2658 290/500 [================>.............] - ETA: 52s - loss: 1.6471 - regression_loss: 1.3812 - classification_loss: 0.2659 291/500 [================>.............] - ETA: 52s - loss: 1.6484 - regression_loss: 1.3821 - classification_loss: 0.2663 292/500 [================>.............] - ETA: 51s - loss: 1.6479 - regression_loss: 1.3816 - classification_loss: 0.2664 293/500 [================>.............] - ETA: 51s - loss: 1.6442 - regression_loss: 1.3787 - classification_loss: 0.2656 294/500 [================>.............] - ETA: 51s - loss: 1.6454 - regression_loss: 1.3798 - classification_loss: 0.2656 295/500 [================>.............] - ETA: 51s - loss: 1.6483 - regression_loss: 1.3828 - classification_loss: 0.2655 296/500 [================>.............] - ETA: 50s - loss: 1.6442 - regression_loss: 1.3793 - classification_loss: 0.2649 297/500 [================>.............] 
- ETA: 50s - loss: 1.6432 - regression_loss: 1.3782 - classification_loss: 0.2650 298/500 [================>.............] - ETA: 50s - loss: 1.6403 - regression_loss: 1.3759 - classification_loss: 0.2644 299/500 [================>.............] - ETA: 50s - loss: 1.6402 - regression_loss: 1.3759 - classification_loss: 0.2643 300/500 [=================>............] - ETA: 49s - loss: 1.6411 - regression_loss: 1.3763 - classification_loss: 0.2648 301/500 [=================>............] - ETA: 49s - loss: 1.6437 - regression_loss: 1.3784 - classification_loss: 0.2653 302/500 [=================>............] - ETA: 49s - loss: 1.6415 - regression_loss: 1.3767 - classification_loss: 0.2648 303/500 [=================>............] - ETA: 49s - loss: 1.6423 - regression_loss: 1.3773 - classification_loss: 0.2650 304/500 [=================>............] - ETA: 48s - loss: 1.6420 - regression_loss: 1.3777 - classification_loss: 0.2644 305/500 [=================>............] - ETA: 48s - loss: 1.6430 - regression_loss: 1.3786 - classification_loss: 0.2644 306/500 [=================>............] - ETA: 48s - loss: 1.6423 - regression_loss: 1.3781 - classification_loss: 0.2642 307/500 [=================>............] - ETA: 48s - loss: 1.6429 - regression_loss: 1.3789 - classification_loss: 0.2640 308/500 [=================>............] - ETA: 47s - loss: 1.6418 - regression_loss: 1.3782 - classification_loss: 0.2636 309/500 [=================>............] - ETA: 47s - loss: 1.6419 - regression_loss: 1.3786 - classification_loss: 0.2633 310/500 [=================>............] - ETA: 47s - loss: 1.6418 - regression_loss: 1.3785 - classification_loss: 0.2633 311/500 [=================>............] - ETA: 47s - loss: 1.6418 - regression_loss: 1.3784 - classification_loss: 0.2633 312/500 [=================>............] - ETA: 46s - loss: 1.6424 - regression_loss: 1.3788 - classification_loss: 0.2636 313/500 [=================>............] 
- ETA: 46s - loss: 1.6434 - regression_loss: 1.3796 - classification_loss: 0.2638 314/500 [=================>............] - ETA: 46s - loss: 1.6457 - regression_loss: 1.3814 - classification_loss: 0.2642 315/500 [=================>............] - ETA: 46s - loss: 1.6445 - regression_loss: 1.3806 - classification_loss: 0.2639 316/500 [=================>............] - ETA: 45s - loss: 1.6438 - regression_loss: 1.3802 - classification_loss: 0.2637 317/500 [==================>...........] - ETA: 45s - loss: 1.6448 - regression_loss: 1.3810 - classification_loss: 0.2639 318/500 [==================>...........] - ETA: 45s - loss: 1.6447 - regression_loss: 1.3810 - classification_loss: 0.2637 319/500 [==================>...........] - ETA: 45s - loss: 1.6448 - regression_loss: 1.3814 - classification_loss: 0.2634 320/500 [==================>...........] - ETA: 44s - loss: 1.6456 - regression_loss: 1.3822 - classification_loss: 0.2633 321/500 [==================>...........] - ETA: 44s - loss: 1.6477 - regression_loss: 1.3836 - classification_loss: 0.2642 322/500 [==================>...........] - ETA: 44s - loss: 1.6489 - regression_loss: 1.3845 - classification_loss: 0.2644 323/500 [==================>...........] - ETA: 44s - loss: 1.6486 - regression_loss: 1.3843 - classification_loss: 0.2643 324/500 [==================>...........] - ETA: 43s - loss: 1.6483 - regression_loss: 1.3840 - classification_loss: 0.2643 325/500 [==================>...........] - ETA: 43s - loss: 1.6484 - regression_loss: 1.3841 - classification_loss: 0.2642 326/500 [==================>...........] - ETA: 43s - loss: 1.6498 - regression_loss: 1.3856 - classification_loss: 0.2642 327/500 [==================>...........] - ETA: 43s - loss: 1.6527 - regression_loss: 1.3878 - classification_loss: 0.2649 328/500 [==================>...........] - ETA: 42s - loss: 1.6516 - regression_loss: 1.3871 - classification_loss: 0.2646 329/500 [==================>...........] 
- ETA: 42s - loss: 1.6529 - regression_loss: 1.3879 - classification_loss: 0.2650 330/500 [==================>...........] - ETA: 42s - loss: 1.6554 - regression_loss: 1.3904 - classification_loss: 0.2650 331/500 [==================>...........] - ETA: 42s - loss: 1.6551 - regression_loss: 1.3901 - classification_loss: 0.2650 332/500 [==================>...........] - ETA: 41s - loss: 1.6558 - regression_loss: 1.3905 - classification_loss: 0.2652 333/500 [==================>...........] - ETA: 41s - loss: 1.6544 - regression_loss: 1.3895 - classification_loss: 0.2649 334/500 [===================>..........] - ETA: 41s - loss: 1.6533 - regression_loss: 1.3887 - classification_loss: 0.2645 335/500 [===================>..........] - ETA: 41s - loss: 1.6512 - regression_loss: 1.3869 - classification_loss: 0.2642 336/500 [===================>..........] - ETA: 40s - loss: 1.6517 - regression_loss: 1.3875 - classification_loss: 0.2643 337/500 [===================>..........] - ETA: 40s - loss: 1.6490 - regression_loss: 1.3853 - classification_loss: 0.2637 338/500 [===================>..........] - ETA: 40s - loss: 1.6486 - regression_loss: 1.3849 - classification_loss: 0.2637 339/500 [===================>..........] - ETA: 40s - loss: 1.6511 - regression_loss: 1.3867 - classification_loss: 0.2644 340/500 [===================>..........] - ETA: 39s - loss: 1.6541 - regression_loss: 1.3895 - classification_loss: 0.2646 341/500 [===================>..........] - ETA: 39s - loss: 1.6543 - regression_loss: 1.3897 - classification_loss: 0.2645 342/500 [===================>..........] - ETA: 39s - loss: 1.6512 - regression_loss: 1.3873 - classification_loss: 0.2639 343/500 [===================>..........] - ETA: 39s - loss: 1.6501 - regression_loss: 1.3865 - classification_loss: 0.2636 344/500 [===================>..........] - ETA: 38s - loss: 1.6507 - regression_loss: 1.3870 - classification_loss: 0.2637 345/500 [===================>..........] 
500/500 [==============================] - 125s 249ms/step - loss: 1.6271 - regression_loss: 1.3665 - classification_loss: 0.2606
326 instances of class plum with average precision: 0.7785
mAP: 0.7785
Epoch 00056: saving model to ./training/snapshots/resnet50_pascal_56.h5
Epoch 57/150
- ETA: 1:19 - loss: 1.6872 - regression_loss: 1.4096 - classification_loss: 0.2776 181/500 [=========>....................] - ETA: 1:19 - loss: 1.6851 - regression_loss: 1.4081 - classification_loss: 0.2770 182/500 [=========>....................] - ETA: 1:19 - loss: 1.6831 - regression_loss: 1.4067 - classification_loss: 0.2764 183/500 [=========>....................] - ETA: 1:19 - loss: 1.6802 - regression_loss: 1.4046 - classification_loss: 0.2756 184/500 [==========>...................] - ETA: 1:18 - loss: 1.6792 - regression_loss: 1.4035 - classification_loss: 0.2757 185/500 [==========>...................] - ETA: 1:18 - loss: 1.6753 - regression_loss: 1.4001 - classification_loss: 0.2752 186/500 [==========>...................] - ETA: 1:18 - loss: 1.6745 - regression_loss: 1.3996 - classification_loss: 0.2749 187/500 [==========>...................] - ETA: 1:18 - loss: 1.6743 - regression_loss: 1.3991 - classification_loss: 0.2753 188/500 [==========>...................] - ETA: 1:17 - loss: 1.6742 - regression_loss: 1.3991 - classification_loss: 0.2751 189/500 [==========>...................] - ETA: 1:17 - loss: 1.6757 - regression_loss: 1.4000 - classification_loss: 0.2757 190/500 [==========>...................] - ETA: 1:17 - loss: 1.6745 - regression_loss: 1.3991 - classification_loss: 0.2754 191/500 [==========>...................] - ETA: 1:17 - loss: 1.6738 - regression_loss: 1.3983 - classification_loss: 0.2755 192/500 [==========>...................] - ETA: 1:16 - loss: 1.6727 - regression_loss: 1.3975 - classification_loss: 0.2752 193/500 [==========>...................] - ETA: 1:16 - loss: 1.6692 - regression_loss: 1.3949 - classification_loss: 0.2743 194/500 [==========>...................] - ETA: 1:16 - loss: 1.6689 - regression_loss: 1.3947 - classification_loss: 0.2742 195/500 [==========>...................] - ETA: 1:16 - loss: 1.6701 - regression_loss: 1.3959 - classification_loss: 0.2742 196/500 [==========>...................] 
- ETA: 1:15 - loss: 1.6651 - regression_loss: 1.3919 - classification_loss: 0.2732 197/500 [==========>...................] - ETA: 1:15 - loss: 1.6635 - regression_loss: 1.3907 - classification_loss: 0.2728 198/500 [==========>...................] - ETA: 1:15 - loss: 1.6644 - regression_loss: 1.3917 - classification_loss: 0.2727 199/500 [==========>...................] - ETA: 1:15 - loss: 1.6674 - regression_loss: 1.3946 - classification_loss: 0.2728 200/500 [===========>..................] - ETA: 1:14 - loss: 1.6669 - regression_loss: 1.3941 - classification_loss: 0.2727 201/500 [===========>..................] - ETA: 1:14 - loss: 1.6657 - regression_loss: 1.3935 - classification_loss: 0.2723 202/500 [===========>..................] - ETA: 1:14 - loss: 1.6711 - regression_loss: 1.3980 - classification_loss: 0.2730 203/500 [===========>..................] - ETA: 1:14 - loss: 1.6678 - regression_loss: 1.3953 - classification_loss: 0.2725 204/500 [===========>..................] - ETA: 1:14 - loss: 1.6687 - regression_loss: 1.3961 - classification_loss: 0.2726 205/500 [===========>..................] - ETA: 1:13 - loss: 1.6677 - regression_loss: 1.3957 - classification_loss: 0.2720 206/500 [===========>..................] - ETA: 1:13 - loss: 1.6696 - regression_loss: 1.3971 - classification_loss: 0.2725 207/500 [===========>..................] - ETA: 1:13 - loss: 1.6675 - regression_loss: 1.3954 - classification_loss: 0.2722 208/500 [===========>..................] - ETA: 1:13 - loss: 1.6666 - regression_loss: 1.3943 - classification_loss: 0.2724 209/500 [===========>..................] - ETA: 1:12 - loss: 1.6676 - regression_loss: 1.3951 - classification_loss: 0.2725 210/500 [===========>..................] - ETA: 1:12 - loss: 1.6666 - regression_loss: 1.3947 - classification_loss: 0.2719 211/500 [===========>..................] - ETA: 1:12 - loss: 1.6696 - regression_loss: 1.3968 - classification_loss: 0.2728 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.6644 - regression_loss: 1.3923 - classification_loss: 0.2722 213/500 [===========>..................] - ETA: 1:11 - loss: 1.6610 - regression_loss: 1.3896 - classification_loss: 0.2714 214/500 [===========>..................] - ETA: 1:11 - loss: 1.6598 - regression_loss: 1.3888 - classification_loss: 0.2710 215/500 [===========>..................] - ETA: 1:11 - loss: 1.6594 - regression_loss: 1.3890 - classification_loss: 0.2704 216/500 [===========>..................] - ETA: 1:11 - loss: 1.6575 - regression_loss: 1.3878 - classification_loss: 0.2698 217/500 [============>.................] - ETA: 1:10 - loss: 1.6595 - regression_loss: 1.3895 - classification_loss: 0.2700 218/500 [============>.................] - ETA: 1:10 - loss: 1.6564 - regression_loss: 1.3871 - classification_loss: 0.2693 219/500 [============>.................] - ETA: 1:10 - loss: 1.6545 - regression_loss: 1.3855 - classification_loss: 0.2690 220/500 [============>.................] - ETA: 1:10 - loss: 1.6579 - regression_loss: 1.3887 - classification_loss: 0.2692 221/500 [============>.................] - ETA: 1:09 - loss: 1.6600 - regression_loss: 1.3905 - classification_loss: 0.2695 222/500 [============>.................] - ETA: 1:09 - loss: 1.6584 - regression_loss: 1.3894 - classification_loss: 0.2690 223/500 [============>.................] - ETA: 1:09 - loss: 1.6617 - regression_loss: 1.3921 - classification_loss: 0.2697 224/500 [============>.................] - ETA: 1:09 - loss: 1.6664 - regression_loss: 1.3940 - classification_loss: 0.2724 225/500 [============>.................] - ETA: 1:08 - loss: 1.6688 - regression_loss: 1.3959 - classification_loss: 0.2728 226/500 [============>.................] - ETA: 1:08 - loss: 1.6713 - regression_loss: 1.3970 - classification_loss: 0.2743 227/500 [============>.................] - ETA: 1:08 - loss: 1.6713 - regression_loss: 1.3972 - classification_loss: 0.2740 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.6730 - regression_loss: 1.3987 - classification_loss: 0.2744 229/500 [============>.................] - ETA: 1:07 - loss: 1.6749 - regression_loss: 1.4007 - classification_loss: 0.2741 230/500 [============>.................] - ETA: 1:07 - loss: 1.6740 - regression_loss: 1.4001 - classification_loss: 0.2739 231/500 [============>.................] - ETA: 1:07 - loss: 1.6723 - regression_loss: 1.3989 - classification_loss: 0.2734 232/500 [============>.................] - ETA: 1:07 - loss: 1.6682 - regression_loss: 1.3954 - classification_loss: 0.2728 233/500 [============>.................] - ETA: 1:06 - loss: 1.6672 - regression_loss: 1.3944 - classification_loss: 0.2728 234/500 [=============>................] - ETA: 1:06 - loss: 1.6667 - regression_loss: 1.3937 - classification_loss: 0.2730 235/500 [=============>................] - ETA: 1:06 - loss: 1.6646 - regression_loss: 1.3919 - classification_loss: 0.2727 236/500 [=============>................] - ETA: 1:06 - loss: 1.6664 - regression_loss: 1.3934 - classification_loss: 0.2731 237/500 [=============>................] - ETA: 1:05 - loss: 1.6659 - regression_loss: 1.3932 - classification_loss: 0.2727 238/500 [=============>................] - ETA: 1:05 - loss: 1.6681 - regression_loss: 1.3955 - classification_loss: 0.2726 239/500 [=============>................] - ETA: 1:05 - loss: 1.6710 - regression_loss: 1.3978 - classification_loss: 0.2732 240/500 [=============>................] - ETA: 1:05 - loss: 1.6706 - regression_loss: 1.3979 - classification_loss: 0.2728 241/500 [=============>................] - ETA: 1:04 - loss: 1.6693 - regression_loss: 1.3971 - classification_loss: 0.2722 242/500 [=============>................] - ETA: 1:04 - loss: 1.6674 - regression_loss: 1.3955 - classification_loss: 0.2719 243/500 [=============>................] - ETA: 1:04 - loss: 1.6649 - regression_loss: 1.3935 - classification_loss: 0.2714 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.6649 - regression_loss: 1.3935 - classification_loss: 0.2714 245/500 [=============>................] - ETA: 1:03 - loss: 1.6611 - regression_loss: 1.3905 - classification_loss: 0.2706 246/500 [=============>................] - ETA: 1:03 - loss: 1.6643 - regression_loss: 1.3936 - classification_loss: 0.2707 247/500 [=============>................] - ETA: 1:03 - loss: 1.6645 - regression_loss: 1.3941 - classification_loss: 0.2704 248/500 [=============>................] - ETA: 1:03 - loss: 1.6635 - regression_loss: 1.3936 - classification_loss: 0.2700 249/500 [=============>................] - ETA: 1:02 - loss: 1.6635 - regression_loss: 1.3937 - classification_loss: 0.2698 250/500 [==============>...............] - ETA: 1:02 - loss: 1.6660 - regression_loss: 1.3951 - classification_loss: 0.2709 251/500 [==============>...............] - ETA: 1:02 - loss: 1.6681 - regression_loss: 1.3953 - classification_loss: 0.2728 252/500 [==============>...............] - ETA: 1:02 - loss: 1.6652 - regression_loss: 1.3926 - classification_loss: 0.2727 253/500 [==============>...............] - ETA: 1:01 - loss: 1.6654 - regression_loss: 1.3930 - classification_loss: 0.2724 254/500 [==============>...............] - ETA: 1:01 - loss: 1.6615 - regression_loss: 1.3897 - classification_loss: 0.2719 255/500 [==============>...............] - ETA: 1:01 - loss: 1.6633 - regression_loss: 1.3910 - classification_loss: 0.2722 256/500 [==============>...............] - ETA: 1:01 - loss: 1.6635 - regression_loss: 1.3914 - classification_loss: 0.2722 257/500 [==============>...............] - ETA: 1:00 - loss: 1.6626 - regression_loss: 1.3908 - classification_loss: 0.2717 258/500 [==============>...............] - ETA: 1:00 - loss: 1.6613 - regression_loss: 1.3896 - classification_loss: 0.2717 259/500 [==============>...............] - ETA: 1:00 - loss: 1.6612 - regression_loss: 1.3896 - classification_loss: 0.2716 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.6593 - regression_loss: 1.3882 - classification_loss: 0.2711 261/500 [==============>...............] - ETA: 59s - loss: 1.6583 - regression_loss: 1.3874 - classification_loss: 0.2710  262/500 [==============>...............] - ETA: 59s - loss: 1.6536 - regression_loss: 1.3834 - classification_loss: 0.2701 263/500 [==============>...............] - ETA: 59s - loss: 1.6536 - regression_loss: 1.3838 - classification_loss: 0.2699 264/500 [==============>...............] - ETA: 59s - loss: 1.6523 - regression_loss: 1.3830 - classification_loss: 0.2692 265/500 [==============>...............] - ETA: 58s - loss: 1.6541 - regression_loss: 1.3848 - classification_loss: 0.2692 266/500 [==============>...............] - ETA: 58s - loss: 1.6549 - regression_loss: 1.3856 - classification_loss: 0.2693 267/500 [===============>..............] - ETA: 58s - loss: 1.6557 - regression_loss: 1.3865 - classification_loss: 0.2693 268/500 [===============>..............] - ETA: 58s - loss: 1.6545 - regression_loss: 1.3856 - classification_loss: 0.2689 269/500 [===============>..............] - ETA: 57s - loss: 1.6549 - regression_loss: 1.3862 - classification_loss: 0.2687 270/500 [===============>..............] - ETA: 57s - loss: 1.6529 - regression_loss: 1.3811 - classification_loss: 0.2718 271/500 [===============>..............] - ETA: 57s - loss: 1.6559 - regression_loss: 1.3833 - classification_loss: 0.2725 272/500 [===============>..............] - ETA: 57s - loss: 1.6559 - regression_loss: 1.3837 - classification_loss: 0.2722 273/500 [===============>..............] - ETA: 56s - loss: 1.6552 - regression_loss: 1.3835 - classification_loss: 0.2716 274/500 [===============>..............] - ETA: 56s - loss: 1.6534 - regression_loss: 1.3819 - classification_loss: 0.2715 275/500 [===============>..............] - ETA: 56s - loss: 1.6493 - regression_loss: 1.3787 - classification_loss: 0.2706 276/500 [===============>..............] 
- ETA: 56s - loss: 1.6488 - regression_loss: 1.3783 - classification_loss: 0.2705 277/500 [===============>..............] - ETA: 55s - loss: 1.6444 - regression_loss: 1.3748 - classification_loss: 0.2696 278/500 [===============>..............] - ETA: 55s - loss: 1.6478 - regression_loss: 1.3774 - classification_loss: 0.2704 279/500 [===============>..............] - ETA: 55s - loss: 1.6474 - regression_loss: 1.3772 - classification_loss: 0.2702 280/500 [===============>..............] - ETA: 55s - loss: 1.6511 - regression_loss: 1.3790 - classification_loss: 0.2720 281/500 [===============>..............] - ETA: 54s - loss: 1.6544 - regression_loss: 1.3817 - classification_loss: 0.2727 282/500 [===============>..............] - ETA: 54s - loss: 1.6556 - regression_loss: 1.3825 - classification_loss: 0.2731 283/500 [===============>..............] - ETA: 54s - loss: 1.6558 - regression_loss: 1.3826 - classification_loss: 0.2732 284/500 [================>.............] - ETA: 54s - loss: 1.6539 - regression_loss: 1.3813 - classification_loss: 0.2726 285/500 [================>.............] - ETA: 53s - loss: 1.6559 - regression_loss: 1.3826 - classification_loss: 0.2734 286/500 [================>.............] - ETA: 53s - loss: 1.6547 - regression_loss: 1.3814 - classification_loss: 0.2733 287/500 [================>.............] - ETA: 53s - loss: 1.6542 - regression_loss: 1.3811 - classification_loss: 0.2731 288/500 [================>.............] - ETA: 53s - loss: 1.6518 - regression_loss: 1.3794 - classification_loss: 0.2724 289/500 [================>.............] - ETA: 52s - loss: 1.6515 - regression_loss: 1.3791 - classification_loss: 0.2724 290/500 [================>.............] - ETA: 52s - loss: 1.6507 - regression_loss: 1.3785 - classification_loss: 0.2722 291/500 [================>.............] - ETA: 52s - loss: 1.6554 - regression_loss: 1.3825 - classification_loss: 0.2729 292/500 [================>.............] 
- ETA: 52s - loss: 1.6556 - regression_loss: 1.3825 - classification_loss: 0.2730 293/500 [================>.............] - ETA: 51s - loss: 1.6551 - regression_loss: 1.3826 - classification_loss: 0.2725 294/500 [================>.............] - ETA: 51s - loss: 1.6566 - regression_loss: 1.3835 - classification_loss: 0.2731 295/500 [================>.............] - ETA: 51s - loss: 1.6571 - regression_loss: 1.3836 - classification_loss: 0.2734 296/500 [================>.............] - ETA: 51s - loss: 1.6571 - regression_loss: 1.3832 - classification_loss: 0.2739 297/500 [================>.............] - ETA: 50s - loss: 1.6557 - regression_loss: 1.3821 - classification_loss: 0.2736 298/500 [================>.............] - ETA: 50s - loss: 1.6560 - regression_loss: 1.3825 - classification_loss: 0.2735 299/500 [================>.............] - ETA: 50s - loss: 1.6567 - regression_loss: 1.3830 - classification_loss: 0.2737 300/500 [=================>............] - ETA: 50s - loss: 1.6555 - regression_loss: 1.3822 - classification_loss: 0.2733 301/500 [=================>............] - ETA: 49s - loss: 1.6544 - regression_loss: 1.3811 - classification_loss: 0.2733 302/500 [=================>............] - ETA: 49s - loss: 1.6532 - regression_loss: 1.3803 - classification_loss: 0.2729 303/500 [=================>............] - ETA: 49s - loss: 1.6532 - regression_loss: 1.3803 - classification_loss: 0.2730 304/500 [=================>............] - ETA: 49s - loss: 1.6550 - regression_loss: 1.3818 - classification_loss: 0.2732 305/500 [=================>............] - ETA: 48s - loss: 1.6539 - regression_loss: 1.3813 - classification_loss: 0.2727 306/500 [=================>............] - ETA: 48s - loss: 1.6536 - regression_loss: 1.3813 - classification_loss: 0.2723 307/500 [=================>............] - ETA: 48s - loss: 1.6540 - regression_loss: 1.3818 - classification_loss: 0.2722 308/500 [=================>............] 
- ETA: 48s - loss: 1.6525 - regression_loss: 1.3806 - classification_loss: 0.2719 309/500 [=================>............] - ETA: 47s - loss: 1.6522 - regression_loss: 1.3807 - classification_loss: 0.2715 310/500 [=================>............] - ETA: 47s - loss: 1.6535 - regression_loss: 1.3818 - classification_loss: 0.2717 311/500 [=================>............] - ETA: 47s - loss: 1.6513 - regression_loss: 1.3800 - classification_loss: 0.2713 312/500 [=================>............] - ETA: 47s - loss: 1.6513 - regression_loss: 1.3801 - classification_loss: 0.2712 313/500 [=================>............] - ETA: 46s - loss: 1.6518 - regression_loss: 1.3804 - classification_loss: 0.2714 314/500 [=================>............] - ETA: 46s - loss: 1.6527 - regression_loss: 1.3813 - classification_loss: 0.2714 315/500 [=================>............] - ETA: 46s - loss: 1.6530 - regression_loss: 1.3813 - classification_loss: 0.2717 316/500 [=================>............] - ETA: 46s - loss: 1.6559 - regression_loss: 1.3836 - classification_loss: 0.2723 317/500 [==================>...........] - ETA: 45s - loss: 1.6558 - regression_loss: 1.3838 - classification_loss: 0.2719 318/500 [==================>...........] - ETA: 45s - loss: 1.6534 - regression_loss: 1.3820 - classification_loss: 0.2713 319/500 [==================>...........] - ETA: 45s - loss: 1.6538 - regression_loss: 1.3824 - classification_loss: 0.2714 320/500 [==================>...........] - ETA: 45s - loss: 1.6516 - regression_loss: 1.3805 - classification_loss: 0.2710 321/500 [==================>...........] - ETA: 44s - loss: 1.6491 - regression_loss: 1.3786 - classification_loss: 0.2705 322/500 [==================>...........] - ETA: 44s - loss: 1.6467 - regression_loss: 1.3767 - classification_loss: 0.2700 323/500 [==================>...........] - ETA: 44s - loss: 1.6479 - regression_loss: 1.3774 - classification_loss: 0.2704 324/500 [==================>...........] 
- ETA: 44s - loss: 1.6458 - regression_loss: 1.3759 - classification_loss: 0.2699 325/500 [==================>...........] - ETA: 43s - loss: 1.6447 - regression_loss: 1.3750 - classification_loss: 0.2697 326/500 [==================>...........] - ETA: 43s - loss: 1.6426 - regression_loss: 1.3734 - classification_loss: 0.2693 327/500 [==================>...........] - ETA: 43s - loss: 1.6433 - regression_loss: 1.3739 - classification_loss: 0.2694 328/500 [==================>...........] - ETA: 43s - loss: 1.6443 - regression_loss: 1.3746 - classification_loss: 0.2697 329/500 [==================>...........] - ETA: 42s - loss: 1.6447 - regression_loss: 1.3750 - classification_loss: 0.2697 330/500 [==================>...........] - ETA: 42s - loss: 1.6454 - regression_loss: 1.3755 - classification_loss: 0.2699 331/500 [==================>...........] - ETA: 42s - loss: 1.6455 - regression_loss: 1.3753 - classification_loss: 0.2702 332/500 [==================>...........] - ETA: 42s - loss: 1.6463 - regression_loss: 1.3759 - classification_loss: 0.2704 333/500 [==================>...........] - ETA: 41s - loss: 1.6459 - regression_loss: 1.3755 - classification_loss: 0.2704 334/500 [===================>..........] - ETA: 41s - loss: 1.6471 - regression_loss: 1.3765 - classification_loss: 0.2705 335/500 [===================>..........] - ETA: 41s - loss: 1.6464 - regression_loss: 1.3762 - classification_loss: 0.2702 336/500 [===================>..........] - ETA: 41s - loss: 1.6467 - regression_loss: 1.3763 - classification_loss: 0.2704 337/500 [===================>..........] - ETA: 40s - loss: 1.6481 - regression_loss: 1.3780 - classification_loss: 0.2701 338/500 [===================>..........] - ETA: 40s - loss: 1.6466 - regression_loss: 1.3769 - classification_loss: 0.2697 339/500 [===================>..........] - ETA: 40s - loss: 1.6454 - regression_loss: 1.3760 - classification_loss: 0.2694 340/500 [===================>..........] 
- ETA: 40s - loss: 1.6469 - regression_loss: 1.3771 - classification_loss: 0.2699 341/500 [===================>..........] - ETA: 39s - loss: 1.6490 - regression_loss: 1.3786 - classification_loss: 0.2704 342/500 [===================>..........] - ETA: 39s - loss: 1.6493 - regression_loss: 1.3787 - classification_loss: 0.2706 343/500 [===================>..........] - ETA: 39s - loss: 1.6483 - regression_loss: 1.3781 - classification_loss: 0.2702 344/500 [===================>..........] - ETA: 39s - loss: 1.6490 - regression_loss: 1.3787 - classification_loss: 0.2703 345/500 [===================>..........] - ETA: 38s - loss: 1.6498 - regression_loss: 1.3790 - classification_loss: 0.2708 346/500 [===================>..........] - ETA: 38s - loss: 1.6495 - regression_loss: 1.3789 - classification_loss: 0.2706 347/500 [===================>..........] - ETA: 38s - loss: 1.6507 - regression_loss: 1.3799 - classification_loss: 0.2708 348/500 [===================>..........] - ETA: 38s - loss: 1.6505 - regression_loss: 1.3798 - classification_loss: 0.2706 349/500 [===================>..........] - ETA: 37s - loss: 1.6529 - regression_loss: 1.3821 - classification_loss: 0.2707 350/500 [====================>.........] - ETA: 37s - loss: 1.6513 - regression_loss: 1.3809 - classification_loss: 0.2704 351/500 [====================>.........] - ETA: 37s - loss: 1.6522 - regression_loss: 1.3816 - classification_loss: 0.2706 352/500 [====================>.........] - ETA: 37s - loss: 1.6489 - regression_loss: 1.3788 - classification_loss: 0.2700 353/500 [====================>.........] - ETA: 36s - loss: 1.6451 - regression_loss: 1.3749 - classification_loss: 0.2701 354/500 [====================>.........] - ETA: 36s - loss: 1.6439 - regression_loss: 1.3741 - classification_loss: 0.2698 355/500 [====================>.........] - ETA: 36s - loss: 1.6434 - regression_loss: 1.3739 - classification_loss: 0.2695 356/500 [====================>.........] 
- ETA: 36s - loss: 1.6416 - regression_loss: 1.3724 - classification_loss: 0.2691 357/500 [====================>.........] - ETA: 35s - loss: 1.6433 - regression_loss: 1.3739 - classification_loss: 0.2694 358/500 [====================>.........] - ETA: 35s - loss: 1.6434 - regression_loss: 1.3737 - classification_loss: 0.2697 359/500 [====================>.........] - ETA: 35s - loss: 1.6412 - regression_loss: 1.3721 - classification_loss: 0.2691 360/500 [====================>.........] - ETA: 35s - loss: 1.6415 - regression_loss: 1.3722 - classification_loss: 0.2693 361/500 [====================>.........] - ETA: 34s - loss: 1.6402 - regression_loss: 1.3713 - classification_loss: 0.2689 362/500 [====================>.........] - ETA: 34s - loss: 1.6405 - regression_loss: 1.3717 - classification_loss: 0.2688 363/500 [====================>.........] - ETA: 34s - loss: 1.6408 - regression_loss: 1.3721 - classification_loss: 0.2687 364/500 [====================>.........] - ETA: 34s - loss: 1.6404 - regression_loss: 1.3719 - classification_loss: 0.2686 365/500 [====================>.........] - ETA: 33s - loss: 1.6386 - regression_loss: 1.3705 - classification_loss: 0.2680 366/500 [====================>.........] - ETA: 33s - loss: 1.6377 - regression_loss: 1.3697 - classification_loss: 0.2679 367/500 [=====================>........] - ETA: 33s - loss: 1.6382 - regression_loss: 1.3703 - classification_loss: 0.2679 368/500 [=====================>........] - ETA: 33s - loss: 1.6387 - regression_loss: 1.3709 - classification_loss: 0.2678 369/500 [=====================>........] - ETA: 32s - loss: 1.6424 - regression_loss: 1.3739 - classification_loss: 0.2685 370/500 [=====================>........] - ETA: 32s - loss: 1.6422 - regression_loss: 1.3740 - classification_loss: 0.2682 371/500 [=====================>........] - ETA: 32s - loss: 1.6421 - regression_loss: 1.3740 - classification_loss: 0.2681 372/500 [=====================>........] 
- ETA: 32s - loss: 1.6415 - regression_loss: 1.3735 - classification_loss: 0.2680 373/500 [=====================>........] - ETA: 31s - loss: 1.6413 - regression_loss: 1.3732 - classification_loss: 0.2680 374/500 [=====================>........] - ETA: 31s - loss: 1.6412 - regression_loss: 1.3730 - classification_loss: 0.2682 375/500 [=====================>........] - ETA: 31s - loss: 1.6406 - regression_loss: 1.3729 - classification_loss: 0.2678 376/500 [=====================>........] - ETA: 31s - loss: 1.6400 - regression_loss: 1.3724 - classification_loss: 0.2675 377/500 [=====================>........] - ETA: 30s - loss: 1.6390 - regression_loss: 1.3717 - classification_loss: 0.2673 378/500 [=====================>........] - ETA: 30s - loss: 1.6420 - regression_loss: 1.3739 - classification_loss: 0.2680 379/500 [=====================>........] - ETA: 30s - loss: 1.6432 - regression_loss: 1.3748 - classification_loss: 0.2683 380/500 [=====================>........] - ETA: 30s - loss: 1.6415 - regression_loss: 1.3736 - classification_loss: 0.2679 381/500 [=====================>........] - ETA: 29s - loss: 1.6409 - regression_loss: 1.3732 - classification_loss: 0.2677 382/500 [=====================>........] - ETA: 29s - loss: 1.6375 - regression_loss: 1.3704 - classification_loss: 0.2671 383/500 [=====================>........] - ETA: 29s - loss: 1.6369 - regression_loss: 1.3700 - classification_loss: 0.2669 384/500 [======================>.......] - ETA: 29s - loss: 1.6352 - regression_loss: 1.3685 - classification_loss: 0.2667 385/500 [======================>.......] - ETA: 28s - loss: 1.6368 - regression_loss: 1.3700 - classification_loss: 0.2668 386/500 [======================>.......] - ETA: 28s - loss: 1.6373 - regression_loss: 1.3701 - classification_loss: 0.2672 387/500 [======================>.......] - ETA: 28s - loss: 1.6366 - regression_loss: 1.3697 - classification_loss: 0.2669 388/500 [======================>.......] 
[... per-step progress updates for epoch 57 (steps 389-499) omitted; these are carriage-return overwrites flattened by the capture ...]
500/500 [==============================] - 125s 251ms/step - loss: 1.6519 - regression_loss: 1.3853 - classification_loss: 0.2666
326 instances of class plum with average precision: 0.7576
mAP: 0.7576
Epoch 00057: saving model to ./training/snapshots/resnet50_pascal_57.h5
Epoch 58/150
[... per-step progress updates for epoch 58 (steps 1-221) omitted; last visible state: 221/500 - ETA: 1:09 - loss: 1.5658 - regression_loss: 1.3208 - classification_loss: 0.2450; log truncated mid-epoch at step 222/500 ...]
- ETA: 1:09 - loss: 1.5646 - regression_loss: 1.3200 - classification_loss: 0.2446 223/500 [============>.................] - ETA: 1:09 - loss: 1.5667 - regression_loss: 1.3217 - classification_loss: 0.2451 224/500 [============>.................] - ETA: 1:09 - loss: 1.5662 - regression_loss: 1.3211 - classification_loss: 0.2451 225/500 [============>.................] - ETA: 1:08 - loss: 1.5667 - regression_loss: 1.3216 - classification_loss: 0.2451 226/500 [============>.................] - ETA: 1:08 - loss: 1.5688 - regression_loss: 1.3230 - classification_loss: 0.2457 227/500 [============>.................] - ETA: 1:08 - loss: 1.5687 - regression_loss: 1.3229 - classification_loss: 0.2458 228/500 [============>.................] - ETA: 1:08 - loss: 1.5642 - regression_loss: 1.3193 - classification_loss: 0.2449 229/500 [============>.................] - ETA: 1:07 - loss: 1.5643 - regression_loss: 1.3196 - classification_loss: 0.2447 230/500 [============>.................] - ETA: 1:07 - loss: 1.5637 - regression_loss: 1.3193 - classification_loss: 0.2444 231/500 [============>.................] - ETA: 1:07 - loss: 1.5609 - regression_loss: 1.3172 - classification_loss: 0.2437 232/500 [============>.................] - ETA: 1:07 - loss: 1.5590 - regression_loss: 1.3158 - classification_loss: 0.2432 233/500 [============>.................] - ETA: 1:06 - loss: 1.5591 - regression_loss: 1.3162 - classification_loss: 0.2428 234/500 [=============>................] - ETA: 1:06 - loss: 1.5593 - regression_loss: 1.3168 - classification_loss: 0.2425 235/500 [=============>................] - ETA: 1:06 - loss: 1.5594 - regression_loss: 1.3166 - classification_loss: 0.2428 236/500 [=============>................] - ETA: 1:06 - loss: 1.5605 - regression_loss: 1.3177 - classification_loss: 0.2429 237/500 [=============>................] - ETA: 1:05 - loss: 1.5608 - regression_loss: 1.3181 - classification_loss: 0.2427 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.5605 - regression_loss: 1.3178 - classification_loss: 0.2427 239/500 [=============>................] - ETA: 1:05 - loss: 1.5609 - regression_loss: 1.3181 - classification_loss: 0.2428 240/500 [=============>................] - ETA: 1:05 - loss: 1.5629 - regression_loss: 1.3192 - classification_loss: 0.2437 241/500 [=============>................] - ETA: 1:04 - loss: 1.5636 - regression_loss: 1.3201 - classification_loss: 0.2434 242/500 [=============>................] - ETA: 1:04 - loss: 1.5609 - regression_loss: 1.3182 - classification_loss: 0.2428 243/500 [=============>................] - ETA: 1:04 - loss: 1.5652 - regression_loss: 1.3199 - classification_loss: 0.2452 244/500 [=============>................] - ETA: 1:04 - loss: 1.5633 - regression_loss: 1.3183 - classification_loss: 0.2450 245/500 [=============>................] - ETA: 1:03 - loss: 1.5601 - regression_loss: 1.3158 - classification_loss: 0.2444 246/500 [=============>................] - ETA: 1:03 - loss: 1.5602 - regression_loss: 1.3158 - classification_loss: 0.2444 247/500 [=============>................] - ETA: 1:03 - loss: 1.5599 - regression_loss: 1.3158 - classification_loss: 0.2440 248/500 [=============>................] - ETA: 1:03 - loss: 1.5589 - regression_loss: 1.3151 - classification_loss: 0.2438 249/500 [=============>................] - ETA: 1:02 - loss: 1.5600 - regression_loss: 1.3155 - classification_loss: 0.2445 250/500 [==============>...............] - ETA: 1:02 - loss: 1.5593 - regression_loss: 1.3150 - classification_loss: 0.2444 251/500 [==============>...............] - ETA: 1:02 - loss: 1.5578 - regression_loss: 1.3138 - classification_loss: 0.2441 252/500 [==============>...............] - ETA: 1:02 - loss: 1.5583 - regression_loss: 1.3141 - classification_loss: 0.2442 253/500 [==============>...............] - ETA: 1:01 - loss: 1.5581 - regression_loss: 1.3142 - classification_loss: 0.2439 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.5605 - regression_loss: 1.3161 - classification_loss: 0.2444 255/500 [==============>...............] - ETA: 1:01 - loss: 1.5609 - regression_loss: 1.3166 - classification_loss: 0.2443 256/500 [==============>...............] - ETA: 1:01 - loss: 1.5606 - regression_loss: 1.3149 - classification_loss: 0.2457 257/500 [==============>...............] - ETA: 1:00 - loss: 1.5607 - regression_loss: 1.3149 - classification_loss: 0.2458 258/500 [==============>...............] - ETA: 1:00 - loss: 1.5628 - regression_loss: 1.3161 - classification_loss: 0.2467 259/500 [==============>...............] - ETA: 1:00 - loss: 1.5631 - regression_loss: 1.3164 - classification_loss: 0.2467 260/500 [==============>...............] - ETA: 1:00 - loss: 1.5603 - regression_loss: 1.3137 - classification_loss: 0.2467 261/500 [==============>...............] - ETA: 59s - loss: 1.5611 - regression_loss: 1.3141 - classification_loss: 0.2470  262/500 [==============>...............] - ETA: 59s - loss: 1.5599 - regression_loss: 1.3133 - classification_loss: 0.2466 263/500 [==============>...............] - ETA: 59s - loss: 1.5591 - regression_loss: 1.3128 - classification_loss: 0.2463 264/500 [==============>...............] - ETA: 59s - loss: 1.5578 - regression_loss: 1.3116 - classification_loss: 0.2462 265/500 [==============>...............] - ETA: 58s - loss: 1.5610 - regression_loss: 1.3140 - classification_loss: 0.2469 266/500 [==============>...............] - ETA: 58s - loss: 1.5663 - regression_loss: 1.3190 - classification_loss: 0.2472 267/500 [===============>..............] - ETA: 58s - loss: 1.5646 - regression_loss: 1.3176 - classification_loss: 0.2470 268/500 [===============>..............] - ETA: 58s - loss: 1.5648 - regression_loss: 1.3179 - classification_loss: 0.2469 269/500 [===============>..............] - ETA: 57s - loss: 1.5651 - regression_loss: 1.3180 - classification_loss: 0.2471 270/500 [===============>..............] 
- ETA: 57s - loss: 1.5651 - regression_loss: 1.3178 - classification_loss: 0.2473 271/500 [===============>..............] - ETA: 57s - loss: 1.5630 - regression_loss: 1.3162 - classification_loss: 0.2468 272/500 [===============>..............] - ETA: 57s - loss: 1.5653 - regression_loss: 1.3172 - classification_loss: 0.2481 273/500 [===============>..............] - ETA: 56s - loss: 1.5647 - regression_loss: 1.3167 - classification_loss: 0.2480 274/500 [===============>..............] - ETA: 56s - loss: 1.5650 - regression_loss: 1.3166 - classification_loss: 0.2484 275/500 [===============>..............] - ETA: 56s - loss: 1.5618 - regression_loss: 1.3141 - classification_loss: 0.2477 276/500 [===============>..............] - ETA: 56s - loss: 1.5615 - regression_loss: 1.3139 - classification_loss: 0.2476 277/500 [===============>..............] - ETA: 55s - loss: 1.5612 - regression_loss: 1.3137 - classification_loss: 0.2475 278/500 [===============>..............] - ETA: 55s - loss: 1.5630 - regression_loss: 1.3154 - classification_loss: 0.2476 279/500 [===============>..............] - ETA: 55s - loss: 1.5641 - regression_loss: 1.3163 - classification_loss: 0.2478 280/500 [===============>..............] - ETA: 55s - loss: 1.5655 - regression_loss: 1.3173 - classification_loss: 0.2482 281/500 [===============>..............] - ETA: 54s - loss: 1.5657 - regression_loss: 1.3174 - classification_loss: 0.2483 282/500 [===============>..............] - ETA: 54s - loss: 1.5664 - regression_loss: 1.3179 - classification_loss: 0.2485 283/500 [===============>..............] - ETA: 54s - loss: 1.5670 - regression_loss: 1.3187 - classification_loss: 0.2483 284/500 [================>.............] - ETA: 54s - loss: 1.5676 - regression_loss: 1.3191 - classification_loss: 0.2485 285/500 [================>.............] - ETA: 53s - loss: 1.5673 - regression_loss: 1.3184 - classification_loss: 0.2488 286/500 [================>.............] 
- ETA: 53s - loss: 1.5657 - regression_loss: 1.3173 - classification_loss: 0.2484 287/500 [================>.............] - ETA: 53s - loss: 1.5680 - regression_loss: 1.3194 - classification_loss: 0.2487 288/500 [================>.............] - ETA: 53s - loss: 1.5691 - regression_loss: 1.3197 - classification_loss: 0.2493 289/500 [================>.............] - ETA: 52s - loss: 1.5686 - regression_loss: 1.3195 - classification_loss: 0.2491 290/500 [================>.............] - ETA: 52s - loss: 1.5686 - regression_loss: 1.3196 - classification_loss: 0.2491 291/500 [================>.............] - ETA: 52s - loss: 1.5697 - regression_loss: 1.3207 - classification_loss: 0.2490 292/500 [================>.............] - ETA: 51s - loss: 1.5675 - regression_loss: 1.3189 - classification_loss: 0.2486 293/500 [================>.............] - ETA: 51s - loss: 1.5709 - regression_loss: 1.3217 - classification_loss: 0.2492 294/500 [================>.............] - ETA: 51s - loss: 1.5734 - regression_loss: 1.3238 - classification_loss: 0.2496 295/500 [================>.............] - ETA: 51s - loss: 1.5752 - regression_loss: 1.3257 - classification_loss: 0.2495 296/500 [================>.............] - ETA: 51s - loss: 1.5759 - regression_loss: 1.3263 - classification_loss: 0.2496 297/500 [================>.............] - ETA: 50s - loss: 1.5769 - regression_loss: 1.3273 - classification_loss: 0.2496 298/500 [================>.............] - ETA: 50s - loss: 1.5760 - regression_loss: 1.3267 - classification_loss: 0.2493 299/500 [================>.............] - ETA: 50s - loss: 1.5753 - regression_loss: 1.3264 - classification_loss: 0.2489 300/500 [=================>............] - ETA: 49s - loss: 1.5750 - regression_loss: 1.3261 - classification_loss: 0.2489 301/500 [=================>............] - ETA: 49s - loss: 1.5747 - regression_loss: 1.3259 - classification_loss: 0.2487 302/500 [=================>............] 
- ETA: 49s - loss: 1.5756 - regression_loss: 1.3266 - classification_loss: 0.2490 303/500 [=================>............] - ETA: 49s - loss: 1.5750 - regression_loss: 1.3262 - classification_loss: 0.2488 304/500 [=================>............] - ETA: 48s - loss: 1.5758 - regression_loss: 1.3265 - classification_loss: 0.2492 305/500 [=================>............] - ETA: 48s - loss: 1.5758 - regression_loss: 1.3268 - classification_loss: 0.2490 306/500 [=================>............] - ETA: 48s - loss: 1.5785 - regression_loss: 1.3285 - classification_loss: 0.2500 307/500 [=================>............] - ETA: 48s - loss: 1.5788 - regression_loss: 1.3289 - classification_loss: 0.2499 308/500 [=================>............] - ETA: 47s - loss: 1.5781 - regression_loss: 1.3285 - classification_loss: 0.2496 309/500 [=================>............] - ETA: 47s - loss: 1.5775 - regression_loss: 1.3281 - classification_loss: 0.2495 310/500 [=================>............] - ETA: 47s - loss: 1.5777 - regression_loss: 1.3283 - classification_loss: 0.2495 311/500 [=================>............] - ETA: 47s - loss: 1.5785 - regression_loss: 1.3291 - classification_loss: 0.2494 312/500 [=================>............] - ETA: 46s - loss: 1.5775 - regression_loss: 1.3282 - classification_loss: 0.2493 313/500 [=================>............] - ETA: 46s - loss: 1.5770 - regression_loss: 1.3279 - classification_loss: 0.2492 314/500 [=================>............] - ETA: 46s - loss: 1.5839 - regression_loss: 1.3334 - classification_loss: 0.2505 315/500 [=================>............] - ETA: 46s - loss: 1.5834 - regression_loss: 1.3333 - classification_loss: 0.2501 316/500 [=================>............] - ETA: 45s - loss: 1.5869 - regression_loss: 1.3361 - classification_loss: 0.2508 317/500 [==================>...........] - ETA: 45s - loss: 1.5865 - regression_loss: 1.3354 - classification_loss: 0.2511 318/500 [==================>...........] 
- ETA: 45s - loss: 1.5873 - regression_loss: 1.3363 - classification_loss: 0.2510 319/500 [==================>...........] - ETA: 45s - loss: 1.5893 - regression_loss: 1.3378 - classification_loss: 0.2514 320/500 [==================>...........] - ETA: 44s - loss: 1.5908 - regression_loss: 1.3388 - classification_loss: 0.2520 321/500 [==================>...........] - ETA: 44s - loss: 1.5906 - regression_loss: 1.3389 - classification_loss: 0.2517 322/500 [==================>...........] - ETA: 44s - loss: 1.5930 - regression_loss: 1.3413 - classification_loss: 0.2517 323/500 [==================>...........] - ETA: 44s - loss: 1.5939 - regression_loss: 1.3418 - classification_loss: 0.2521 324/500 [==================>...........] - ETA: 43s - loss: 1.5925 - regression_loss: 1.3408 - classification_loss: 0.2517 325/500 [==================>...........] - ETA: 43s - loss: 1.5919 - regression_loss: 1.3403 - classification_loss: 0.2516 326/500 [==================>...........] - ETA: 43s - loss: 1.5934 - regression_loss: 1.3416 - classification_loss: 0.2519 327/500 [==================>...........] - ETA: 43s - loss: 1.5920 - regression_loss: 1.3406 - classification_loss: 0.2514 328/500 [==================>...........] - ETA: 42s - loss: 1.5920 - regression_loss: 1.3407 - classification_loss: 0.2513 329/500 [==================>...........] - ETA: 42s - loss: 1.5932 - regression_loss: 1.3413 - classification_loss: 0.2518 330/500 [==================>...........] - ETA: 42s - loss: 1.5916 - regression_loss: 1.3401 - classification_loss: 0.2514 331/500 [==================>...........] - ETA: 42s - loss: 1.5921 - regression_loss: 1.3409 - classification_loss: 0.2512 332/500 [==================>...........] - ETA: 41s - loss: 1.5916 - regression_loss: 1.3406 - classification_loss: 0.2510 333/500 [==================>...........] - ETA: 41s - loss: 1.5969 - regression_loss: 1.3448 - classification_loss: 0.2521 334/500 [===================>..........] 
- ETA: 41s - loss: 1.5973 - regression_loss: 1.3451 - classification_loss: 0.2522 335/500 [===================>..........] - ETA: 41s - loss: 1.5974 - regression_loss: 1.3454 - classification_loss: 0.2520 336/500 [===================>..........] - ETA: 40s - loss: 1.5985 - regression_loss: 1.3464 - classification_loss: 0.2522 337/500 [===================>..........] - ETA: 40s - loss: 1.5992 - regression_loss: 1.3470 - classification_loss: 0.2523 338/500 [===================>..........] - ETA: 40s - loss: 1.5987 - regression_loss: 1.3468 - classification_loss: 0.2519 339/500 [===================>..........] - ETA: 40s - loss: 1.6011 - regression_loss: 1.3486 - classification_loss: 0.2525 340/500 [===================>..........] - ETA: 39s - loss: 1.6036 - regression_loss: 1.3492 - classification_loss: 0.2545 341/500 [===================>..........] - ETA: 39s - loss: 1.6036 - regression_loss: 1.3493 - classification_loss: 0.2543 342/500 [===================>..........] - ETA: 39s - loss: 1.6014 - regression_loss: 1.3474 - classification_loss: 0.2540 343/500 [===================>..........] - ETA: 39s - loss: 1.6033 - regression_loss: 1.3489 - classification_loss: 0.2544 344/500 [===================>..........] - ETA: 38s - loss: 1.6030 - regression_loss: 1.3487 - classification_loss: 0.2543 345/500 [===================>..........] - ETA: 38s - loss: 1.6039 - regression_loss: 1.3493 - classification_loss: 0.2546 346/500 [===================>..........] - ETA: 38s - loss: 1.6052 - regression_loss: 1.3503 - classification_loss: 0.2549 347/500 [===================>..........] - ETA: 38s - loss: 1.6060 - regression_loss: 1.3509 - classification_loss: 0.2551 348/500 [===================>..........] - ETA: 37s - loss: 1.6065 - regression_loss: 1.3512 - classification_loss: 0.2553 349/500 [===================>..........] - ETA: 37s - loss: 1.6048 - regression_loss: 1.3499 - classification_loss: 0.2549 350/500 [====================>.........] 
- ETA: 37s - loss: 1.6034 - regression_loss: 1.3489 - classification_loss: 0.2545 351/500 [====================>.........] - ETA: 37s - loss: 1.6017 - regression_loss: 1.3474 - classification_loss: 0.2542 352/500 [====================>.........] - ETA: 36s - loss: 1.6021 - regression_loss: 1.3477 - classification_loss: 0.2544 353/500 [====================>.........] - ETA: 36s - loss: 1.5993 - regression_loss: 1.3453 - classification_loss: 0.2540 354/500 [====================>.........] - ETA: 36s - loss: 1.5972 - regression_loss: 1.3437 - classification_loss: 0.2535 355/500 [====================>.........] - ETA: 36s - loss: 1.5967 - regression_loss: 1.3433 - classification_loss: 0.2534 356/500 [====================>.........] - ETA: 35s - loss: 1.5982 - regression_loss: 1.3441 - classification_loss: 0.2540 357/500 [====================>.........] - ETA: 35s - loss: 1.6015 - regression_loss: 1.3470 - classification_loss: 0.2545 358/500 [====================>.........] - ETA: 35s - loss: 1.6005 - regression_loss: 1.3462 - classification_loss: 0.2543 359/500 [====================>.........] - ETA: 35s - loss: 1.6019 - regression_loss: 1.3475 - classification_loss: 0.2544 360/500 [====================>.........] - ETA: 34s - loss: 1.6028 - regression_loss: 1.3483 - classification_loss: 0.2546 361/500 [====================>.........] - ETA: 34s - loss: 1.6018 - regression_loss: 1.3474 - classification_loss: 0.2544 362/500 [====================>.........] - ETA: 34s - loss: 1.6046 - regression_loss: 1.3503 - classification_loss: 0.2543 363/500 [====================>.........] - ETA: 34s - loss: 1.6035 - regression_loss: 1.3495 - classification_loss: 0.2540 364/500 [====================>.........] - ETA: 33s - loss: 1.6050 - regression_loss: 1.3506 - classification_loss: 0.2544 365/500 [====================>.........] - ETA: 33s - loss: 1.6064 - regression_loss: 1.3519 - classification_loss: 0.2545 366/500 [====================>.........] 
- ETA: 33s - loss: 1.6079 - regression_loss: 1.3534 - classification_loss: 0.2545 367/500 [=====================>........] - ETA: 33s - loss: 1.6077 - regression_loss: 1.3534 - classification_loss: 0.2543 368/500 [=====================>........] - ETA: 32s - loss: 1.6076 - regression_loss: 1.3533 - classification_loss: 0.2542 369/500 [=====================>........] - ETA: 32s - loss: 1.6075 - regression_loss: 1.3534 - classification_loss: 0.2541 370/500 [=====================>........] - ETA: 32s - loss: 1.6083 - regression_loss: 1.3542 - classification_loss: 0.2542 371/500 [=====================>........] - ETA: 32s - loss: 1.6091 - regression_loss: 1.3547 - classification_loss: 0.2545 372/500 [=====================>........] - ETA: 31s - loss: 1.6113 - regression_loss: 1.3562 - classification_loss: 0.2551 373/500 [=====================>........] - ETA: 31s - loss: 1.6106 - regression_loss: 1.3557 - classification_loss: 0.2549 374/500 [=====================>........] - ETA: 31s - loss: 1.6121 - regression_loss: 1.3568 - classification_loss: 0.2552 375/500 [=====================>........] - ETA: 31s - loss: 1.6105 - regression_loss: 1.3554 - classification_loss: 0.2551 376/500 [=====================>........] - ETA: 30s - loss: 1.6110 - regression_loss: 1.3558 - classification_loss: 0.2552 377/500 [=====================>........] - ETA: 30s - loss: 1.6096 - regression_loss: 1.3547 - classification_loss: 0.2549 378/500 [=====================>........] - ETA: 30s - loss: 1.6080 - regression_loss: 1.3535 - classification_loss: 0.2545 379/500 [=====================>........] - ETA: 30s - loss: 1.6090 - regression_loss: 1.3543 - classification_loss: 0.2547 380/500 [=====================>........] - ETA: 29s - loss: 1.6099 - regression_loss: 1.3553 - classification_loss: 0.2546 381/500 [=====================>........] - ETA: 29s - loss: 1.6092 - regression_loss: 1.3548 - classification_loss: 0.2544 382/500 [=====================>........] 
- ETA: 29s - loss: 1.6115 - regression_loss: 1.3565 - classification_loss: 0.2550 383/500 [=====================>........] - ETA: 29s - loss: 1.6125 - regression_loss: 1.3574 - classification_loss: 0.2551 384/500 [======================>.......] - ETA: 28s - loss: 1.6120 - regression_loss: 1.3570 - classification_loss: 0.2549 385/500 [======================>.......] - ETA: 28s - loss: 1.6112 - regression_loss: 1.3564 - classification_loss: 0.2548 386/500 [======================>.......] - ETA: 28s - loss: 1.6100 - regression_loss: 1.3556 - classification_loss: 0.2545 387/500 [======================>.......] - ETA: 28s - loss: 1.6099 - regression_loss: 1.3554 - classification_loss: 0.2545 388/500 [======================>.......] - ETA: 27s - loss: 1.6095 - regression_loss: 1.3550 - classification_loss: 0.2545 389/500 [======================>.......] - ETA: 27s - loss: 1.6098 - regression_loss: 1.3550 - classification_loss: 0.2548 390/500 [======================>.......] - ETA: 27s - loss: 1.6095 - regression_loss: 1.3548 - classification_loss: 0.2547 391/500 [======================>.......] - ETA: 27s - loss: 1.6091 - regression_loss: 1.3546 - classification_loss: 0.2545 392/500 [======================>.......] - ETA: 26s - loss: 1.6079 - regression_loss: 1.3537 - classification_loss: 0.2542 393/500 [======================>.......] - ETA: 26s - loss: 1.6070 - regression_loss: 1.3532 - classification_loss: 0.2538 394/500 [======================>.......] - ETA: 26s - loss: 1.6067 - regression_loss: 1.3531 - classification_loss: 0.2536 395/500 [======================>.......] - ETA: 26s - loss: 1.6051 - regression_loss: 1.3518 - classification_loss: 0.2532 396/500 [======================>.......] - ETA: 25s - loss: 1.6063 - regression_loss: 1.3531 - classification_loss: 0.2532 397/500 [======================>.......] - ETA: 25s - loss: 1.6059 - regression_loss: 1.3528 - classification_loss: 0.2530 398/500 [======================>.......] 
- ETA: 25s - loss: 1.6047 - regression_loss: 1.3517 - classification_loss: 0.2529 399/500 [======================>.......] - ETA: 25s - loss: 1.6043 - regression_loss: 1.3516 - classification_loss: 0.2527 400/500 [=======================>......] - ETA: 24s - loss: 1.6051 - regression_loss: 1.3521 - classification_loss: 0.2530 401/500 [=======================>......] - ETA: 24s - loss: 1.6060 - regression_loss: 1.3528 - classification_loss: 0.2531 402/500 [=======================>......] - ETA: 24s - loss: 1.6073 - regression_loss: 1.3539 - classification_loss: 0.2534 403/500 [=======================>......] - ETA: 24s - loss: 1.6079 - regression_loss: 1.3545 - classification_loss: 0.2534 404/500 [=======================>......] - ETA: 23s - loss: 1.6080 - regression_loss: 1.3545 - classification_loss: 0.2536 405/500 [=======================>......] - ETA: 23s - loss: 1.6075 - regression_loss: 1.3541 - classification_loss: 0.2534 406/500 [=======================>......] - ETA: 23s - loss: 1.6065 - regression_loss: 1.3531 - classification_loss: 0.2534 407/500 [=======================>......] - ETA: 23s - loss: 1.6066 - regression_loss: 1.3531 - classification_loss: 0.2535 408/500 [=======================>......] - ETA: 22s - loss: 1.6066 - regression_loss: 1.3533 - classification_loss: 0.2533 409/500 [=======================>......] - ETA: 22s - loss: 1.6057 - regression_loss: 1.3526 - classification_loss: 0.2531 410/500 [=======================>......] - ETA: 22s - loss: 1.6073 - regression_loss: 1.3534 - classification_loss: 0.2539 411/500 [=======================>......] - ETA: 22s - loss: 1.6090 - regression_loss: 1.3548 - classification_loss: 0.2541 412/500 [=======================>......] - ETA: 21s - loss: 1.6089 - regression_loss: 1.3548 - classification_loss: 0.2540 413/500 [=======================>......] - ETA: 21s - loss: 1.6097 - regression_loss: 1.3553 - classification_loss: 0.2544 414/500 [=======================>......] 
- ETA: 21s - loss: 1.6116 - regression_loss: 1.3568 - classification_loss: 0.2547 415/500 [=======================>......] - ETA: 21s - loss: 1.6113 - regression_loss: 1.3564 - classification_loss: 0.2549 416/500 [=======================>......] - ETA: 20s - loss: 1.6093 - regression_loss: 1.3546 - classification_loss: 0.2546 417/500 [========================>.....] - ETA: 20s - loss: 1.6102 - regression_loss: 1.3550 - classification_loss: 0.2552 418/500 [========================>.....] - ETA: 20s - loss: 1.6111 - regression_loss: 1.3556 - classification_loss: 0.2555 419/500 [========================>.....] - ETA: 20s - loss: 1.6102 - regression_loss: 1.3549 - classification_loss: 0.2553 420/500 [========================>.....] - ETA: 19s - loss: 1.6103 - regression_loss: 1.3549 - classification_loss: 0.2554 421/500 [========================>.....] - ETA: 19s - loss: 1.6123 - regression_loss: 1.3566 - classification_loss: 0.2557 422/500 [========================>.....] - ETA: 19s - loss: 1.6124 - regression_loss: 1.3566 - classification_loss: 0.2558 423/500 [========================>.....] - ETA: 19s - loss: 1.6121 - regression_loss: 1.3562 - classification_loss: 0.2558 424/500 [========================>.....] - ETA: 18s - loss: 1.6128 - regression_loss: 1.3568 - classification_loss: 0.2560 425/500 [========================>.....] - ETA: 18s - loss: 1.6121 - regression_loss: 1.3564 - classification_loss: 0.2557 426/500 [========================>.....] - ETA: 18s - loss: 1.6137 - regression_loss: 1.3575 - classification_loss: 0.2561 427/500 [========================>.....] - ETA: 18s - loss: 1.6133 - regression_loss: 1.3573 - classification_loss: 0.2560 428/500 [========================>.....] - ETA: 17s - loss: 1.6139 - regression_loss: 1.3577 - classification_loss: 0.2562 429/500 [========================>.....] - ETA: 17s - loss: 1.6130 - regression_loss: 1.3570 - classification_loss: 0.2560 430/500 [========================>.....] 
- ETA: 17s - loss: 1.6125 - regression_loss: 1.3565 - classification_loss: 0.2559
[per-batch progress updates for batches 431-499 elided]
500/500 [==============================] - 125s 250ms/step - loss: 1.6129 - regression_loss: 1.3516 - classification_loss: 0.2614
326 instances of class plum with average precision: 0.7919
mAP: 0.7919
Epoch 00058: saving model to ./training/snapshots/resnet50_pascal_58.h5
Epoch 59/150
[per-batch progress updates for batches 1-264 elided]
265/500 [==============>...............]
- ETA: 58s - loss: 1.6339 - regression_loss: 1.3726 - classification_loss: 0.2613 266/500 [==============>...............] - ETA: 58s - loss: 1.6352 - regression_loss: 1.3742 - classification_loss: 0.2610 267/500 [===============>..............] - ETA: 58s - loss: 1.6311 - regression_loss: 1.3708 - classification_loss: 0.2603 268/500 [===============>..............] - ETA: 57s - loss: 1.6323 - regression_loss: 1.3716 - classification_loss: 0.2607 269/500 [===============>..............] - ETA: 57s - loss: 1.6331 - regression_loss: 1.3720 - classification_loss: 0.2611 270/500 [===============>..............] - ETA: 57s - loss: 1.6376 - regression_loss: 1.3756 - classification_loss: 0.2620 271/500 [===============>..............] - ETA: 57s - loss: 1.6363 - regression_loss: 1.3705 - classification_loss: 0.2658 272/500 [===============>..............] - ETA: 56s - loss: 1.6373 - regression_loss: 1.3714 - classification_loss: 0.2659 273/500 [===============>..............] - ETA: 56s - loss: 1.6327 - regression_loss: 1.3675 - classification_loss: 0.2652 274/500 [===============>..............] - ETA: 56s - loss: 1.6308 - regression_loss: 1.3658 - classification_loss: 0.2650 275/500 [===============>..............] - ETA: 56s - loss: 1.6298 - regression_loss: 1.3653 - classification_loss: 0.2646 276/500 [===============>..............] - ETA: 55s - loss: 1.6281 - regression_loss: 1.3640 - classification_loss: 0.2641 277/500 [===============>..............] - ETA: 55s - loss: 1.6308 - regression_loss: 1.3663 - classification_loss: 0.2644 278/500 [===============>..............] - ETA: 55s - loss: 1.6310 - regression_loss: 1.3667 - classification_loss: 0.2643 279/500 [===============>..............] - ETA: 55s - loss: 1.6306 - regression_loss: 1.3664 - classification_loss: 0.2642 280/500 [===============>..............] - ETA: 54s - loss: 1.6302 - regression_loss: 1.3660 - classification_loss: 0.2642 281/500 [===============>..............] 
- ETA: 54s - loss: 1.6286 - regression_loss: 1.3647 - classification_loss: 0.2639 282/500 [===============>..............] - ETA: 54s - loss: 1.6284 - regression_loss: 1.3640 - classification_loss: 0.2643 283/500 [===============>..............] - ETA: 54s - loss: 1.6249 - regression_loss: 1.3611 - classification_loss: 0.2638 284/500 [================>.............] - ETA: 53s - loss: 1.6220 - regression_loss: 1.3588 - classification_loss: 0.2632 285/500 [================>.............] - ETA: 53s - loss: 1.6227 - regression_loss: 1.3594 - classification_loss: 0.2633 286/500 [================>.............] - ETA: 53s - loss: 1.6226 - regression_loss: 1.3596 - classification_loss: 0.2630 287/500 [================>.............] - ETA: 53s - loss: 1.6216 - regression_loss: 1.3586 - classification_loss: 0.2630 288/500 [================>.............] - ETA: 52s - loss: 1.6234 - regression_loss: 1.3596 - classification_loss: 0.2638 289/500 [================>.............] - ETA: 52s - loss: 1.6230 - regression_loss: 1.3595 - classification_loss: 0.2635 290/500 [================>.............] - ETA: 52s - loss: 1.6207 - regression_loss: 1.3576 - classification_loss: 0.2631 291/500 [================>.............] - ETA: 52s - loss: 1.6202 - regression_loss: 1.3574 - classification_loss: 0.2627 292/500 [================>.............] - ETA: 51s - loss: 1.6200 - regression_loss: 1.3575 - classification_loss: 0.2625 293/500 [================>.............] - ETA: 51s - loss: 1.6192 - regression_loss: 1.3571 - classification_loss: 0.2620 294/500 [================>.............] - ETA: 51s - loss: 1.6171 - regression_loss: 1.3554 - classification_loss: 0.2617 295/500 [================>.............] - ETA: 51s - loss: 1.6173 - regression_loss: 1.3555 - classification_loss: 0.2618 296/500 [================>.............] - ETA: 50s - loss: 1.6154 - regression_loss: 1.3541 - classification_loss: 0.2613 297/500 [================>.............] 
- ETA: 50s - loss: 1.6187 - regression_loss: 1.3566 - classification_loss: 0.2620 298/500 [================>.............] - ETA: 50s - loss: 1.6198 - regression_loss: 1.3577 - classification_loss: 0.2622 299/500 [================>.............] - ETA: 50s - loss: 1.6175 - regression_loss: 1.3560 - classification_loss: 0.2615 300/500 [=================>............] - ETA: 49s - loss: 1.6172 - regression_loss: 1.3557 - classification_loss: 0.2615 301/500 [=================>............] - ETA: 49s - loss: 1.6243 - regression_loss: 1.3612 - classification_loss: 0.2631 302/500 [=================>............] - ETA: 49s - loss: 1.6239 - regression_loss: 1.3607 - classification_loss: 0.2632 303/500 [=================>............] - ETA: 49s - loss: 1.6239 - regression_loss: 1.3607 - classification_loss: 0.2632 304/500 [=================>............] - ETA: 48s - loss: 1.6232 - regression_loss: 1.3600 - classification_loss: 0.2633 305/500 [=================>............] - ETA: 48s - loss: 1.6244 - regression_loss: 1.3610 - classification_loss: 0.2635 306/500 [=================>............] - ETA: 48s - loss: 1.6233 - regression_loss: 1.3602 - classification_loss: 0.2630 307/500 [=================>............] - ETA: 48s - loss: 1.6275 - regression_loss: 1.3636 - classification_loss: 0.2639 308/500 [=================>............] - ETA: 47s - loss: 1.6281 - regression_loss: 1.3645 - classification_loss: 0.2635 309/500 [=================>............] - ETA: 47s - loss: 1.6294 - regression_loss: 1.3658 - classification_loss: 0.2635 310/500 [=================>............] - ETA: 47s - loss: 1.6285 - regression_loss: 1.3653 - classification_loss: 0.2632 311/500 [=================>............] - ETA: 47s - loss: 1.6274 - regression_loss: 1.3647 - classification_loss: 0.2627 312/500 [=================>............] - ETA: 46s - loss: 1.6256 - regression_loss: 1.3634 - classification_loss: 0.2622 313/500 [=================>............] 
- ETA: 46s - loss: 1.6224 - regression_loss: 1.3608 - classification_loss: 0.2616 314/500 [=================>............] - ETA: 46s - loss: 1.6216 - regression_loss: 1.3604 - classification_loss: 0.2612 315/500 [=================>............] - ETA: 46s - loss: 1.6228 - regression_loss: 1.3616 - classification_loss: 0.2612 316/500 [=================>............] - ETA: 45s - loss: 1.6188 - regression_loss: 1.3583 - classification_loss: 0.2605 317/500 [==================>...........] - ETA: 45s - loss: 1.6220 - regression_loss: 1.3613 - classification_loss: 0.2608 318/500 [==================>...........] - ETA: 45s - loss: 1.6232 - regression_loss: 1.3624 - classification_loss: 0.2608 319/500 [==================>...........] - ETA: 45s - loss: 1.6262 - regression_loss: 1.3645 - classification_loss: 0.2617 320/500 [==================>...........] - ETA: 44s - loss: 1.6256 - regression_loss: 1.3642 - classification_loss: 0.2614 321/500 [==================>...........] - ETA: 44s - loss: 1.6243 - regression_loss: 1.3630 - classification_loss: 0.2614 322/500 [==================>...........] - ETA: 44s - loss: 1.6248 - regression_loss: 1.3631 - classification_loss: 0.2617 323/500 [==================>...........] - ETA: 44s - loss: 1.6241 - regression_loss: 1.3627 - classification_loss: 0.2614 324/500 [==================>...........] - ETA: 43s - loss: 1.6235 - regression_loss: 1.3624 - classification_loss: 0.2611 325/500 [==================>...........] - ETA: 43s - loss: 1.6209 - regression_loss: 1.3604 - classification_loss: 0.2605 326/500 [==================>...........] - ETA: 43s - loss: 1.6204 - regression_loss: 1.3600 - classification_loss: 0.2604 327/500 [==================>...........] - ETA: 43s - loss: 1.6202 - regression_loss: 1.3603 - classification_loss: 0.2599 328/500 [==================>...........] - ETA: 42s - loss: 1.6222 - regression_loss: 1.3617 - classification_loss: 0.2605 329/500 [==================>...........] 
- ETA: 42s - loss: 1.6222 - regression_loss: 1.3616 - classification_loss: 0.2606 330/500 [==================>...........] - ETA: 42s - loss: 1.6218 - regression_loss: 1.3612 - classification_loss: 0.2606 331/500 [==================>...........] - ETA: 42s - loss: 1.6234 - regression_loss: 1.3626 - classification_loss: 0.2608 332/500 [==================>...........] - ETA: 41s - loss: 1.6264 - regression_loss: 1.3649 - classification_loss: 0.2615 333/500 [==================>...........] - ETA: 41s - loss: 1.6278 - regression_loss: 1.3659 - classification_loss: 0.2619 334/500 [===================>..........] - ETA: 41s - loss: 1.6279 - regression_loss: 1.3662 - classification_loss: 0.2617 335/500 [===================>..........] - ETA: 41s - loss: 1.6299 - regression_loss: 1.3675 - classification_loss: 0.2624 336/500 [===================>..........] - ETA: 40s - loss: 1.6321 - regression_loss: 1.3689 - classification_loss: 0.2632 337/500 [===================>..........] - ETA: 40s - loss: 1.6302 - regression_loss: 1.3673 - classification_loss: 0.2628 338/500 [===================>..........] - ETA: 40s - loss: 1.6292 - regression_loss: 1.3633 - classification_loss: 0.2659 339/500 [===================>..........] - ETA: 40s - loss: 1.6270 - regression_loss: 1.3617 - classification_loss: 0.2653 340/500 [===================>..........] - ETA: 39s - loss: 1.6278 - regression_loss: 1.3623 - classification_loss: 0.2654 341/500 [===================>..........] - ETA: 39s - loss: 1.6263 - regression_loss: 1.3612 - classification_loss: 0.2651 342/500 [===================>..........] - ETA: 39s - loss: 1.6285 - regression_loss: 1.3630 - classification_loss: 0.2655 343/500 [===================>..........] - ETA: 39s - loss: 1.6282 - regression_loss: 1.3629 - classification_loss: 0.2653 344/500 [===================>..........] - ETA: 38s - loss: 1.6279 - regression_loss: 1.3628 - classification_loss: 0.2651 345/500 [===================>..........] 
- ETA: 38s - loss: 1.6291 - regression_loss: 1.3638 - classification_loss: 0.2654 346/500 [===================>..........] - ETA: 38s - loss: 1.6282 - regression_loss: 1.3629 - classification_loss: 0.2653 347/500 [===================>..........] - ETA: 38s - loss: 1.6284 - regression_loss: 1.3632 - classification_loss: 0.2652 348/500 [===================>..........] - ETA: 37s - loss: 1.6291 - regression_loss: 1.3639 - classification_loss: 0.2651 349/500 [===================>..........] - ETA: 37s - loss: 1.6275 - regression_loss: 1.3627 - classification_loss: 0.2648 350/500 [====================>.........] - ETA: 37s - loss: 1.6294 - regression_loss: 1.3638 - classification_loss: 0.2656 351/500 [====================>.........] - ETA: 37s - loss: 1.6282 - regression_loss: 1.3626 - classification_loss: 0.2656 352/500 [====================>.........] - ETA: 36s - loss: 1.6291 - regression_loss: 1.3632 - classification_loss: 0.2658 353/500 [====================>.........] - ETA: 36s - loss: 1.6296 - regression_loss: 1.3639 - classification_loss: 0.2657 354/500 [====================>.........] - ETA: 36s - loss: 1.6286 - regression_loss: 1.3633 - classification_loss: 0.2653 355/500 [====================>.........] - ETA: 36s - loss: 1.6288 - regression_loss: 1.3636 - classification_loss: 0.2652 356/500 [====================>.........] - ETA: 35s - loss: 1.6288 - regression_loss: 1.3639 - classification_loss: 0.2649 357/500 [====================>.........] - ETA: 35s - loss: 1.6294 - regression_loss: 1.3645 - classification_loss: 0.2649 358/500 [====================>.........] - ETA: 35s - loss: 1.6289 - regression_loss: 1.3639 - classification_loss: 0.2651 359/500 [====================>.........] - ETA: 35s - loss: 1.6267 - regression_loss: 1.3621 - classification_loss: 0.2646 360/500 [====================>.........] - ETA: 34s - loss: 1.6261 - regression_loss: 1.3613 - classification_loss: 0.2647 361/500 [====================>.........] 
- ETA: 34s - loss: 1.6237 - regression_loss: 1.3594 - classification_loss: 0.2643 362/500 [====================>.........] - ETA: 34s - loss: 1.6253 - regression_loss: 1.3612 - classification_loss: 0.2640 363/500 [====================>.........] - ETA: 34s - loss: 1.6254 - regression_loss: 1.3616 - classification_loss: 0.2638 364/500 [====================>.........] - ETA: 33s - loss: 1.6241 - regression_loss: 1.3600 - classification_loss: 0.2641 365/500 [====================>.........] - ETA: 33s - loss: 1.6262 - regression_loss: 1.3618 - classification_loss: 0.2644 366/500 [====================>.........] - ETA: 33s - loss: 1.6277 - regression_loss: 1.3630 - classification_loss: 0.2647 367/500 [=====================>........] - ETA: 33s - loss: 1.6254 - regression_loss: 1.3611 - classification_loss: 0.2643 368/500 [=====================>........] - ETA: 32s - loss: 1.6251 - regression_loss: 1.3608 - classification_loss: 0.2643 369/500 [=====================>........] - ETA: 32s - loss: 1.6271 - regression_loss: 1.3624 - classification_loss: 0.2647 370/500 [=====================>........] - ETA: 32s - loss: 1.6286 - regression_loss: 1.3634 - classification_loss: 0.2653 371/500 [=====================>........] - ETA: 32s - loss: 1.6279 - regression_loss: 1.3629 - classification_loss: 0.2650 372/500 [=====================>........] - ETA: 31s - loss: 1.6296 - regression_loss: 1.3643 - classification_loss: 0.2653 373/500 [=====================>........] - ETA: 31s - loss: 1.6302 - regression_loss: 1.3649 - classification_loss: 0.2653 374/500 [=====================>........] - ETA: 31s - loss: 1.6287 - regression_loss: 1.3638 - classification_loss: 0.2649 375/500 [=====================>........] - ETA: 31s - loss: 1.6285 - regression_loss: 1.3635 - classification_loss: 0.2649 376/500 [=====================>........] - ETA: 30s - loss: 1.6282 - regression_loss: 1.3632 - classification_loss: 0.2650 377/500 [=====================>........] 
- ETA: 30s - loss: 1.6271 - regression_loss: 1.3624 - classification_loss: 0.2647 378/500 [=====================>........] - ETA: 30s - loss: 1.6264 - regression_loss: 1.3619 - classification_loss: 0.2644 379/500 [=====================>........] - ETA: 30s - loss: 1.6279 - regression_loss: 1.3629 - classification_loss: 0.2650 380/500 [=====================>........] - ETA: 29s - loss: 1.6269 - regression_loss: 1.3621 - classification_loss: 0.2648 381/500 [=====================>........] - ETA: 29s - loss: 1.6250 - regression_loss: 1.3607 - classification_loss: 0.2643 382/500 [=====================>........] - ETA: 29s - loss: 1.6245 - regression_loss: 1.3604 - classification_loss: 0.2641 383/500 [=====================>........] - ETA: 29s - loss: 1.6241 - regression_loss: 1.3604 - classification_loss: 0.2637 384/500 [======================>.......] - ETA: 28s - loss: 1.6262 - regression_loss: 1.3620 - classification_loss: 0.2642 385/500 [======================>.......] - ETA: 28s - loss: 1.6259 - regression_loss: 1.3619 - classification_loss: 0.2640 386/500 [======================>.......] - ETA: 28s - loss: 1.6258 - regression_loss: 1.3619 - classification_loss: 0.2640 387/500 [======================>.......] - ETA: 28s - loss: 1.6245 - regression_loss: 1.3607 - classification_loss: 0.2638 388/500 [======================>.......] - ETA: 27s - loss: 1.6261 - regression_loss: 1.3621 - classification_loss: 0.2640 389/500 [======================>.......] - ETA: 27s - loss: 1.6274 - regression_loss: 1.3630 - classification_loss: 0.2644 390/500 [======================>.......] - ETA: 27s - loss: 1.6296 - regression_loss: 1.3648 - classification_loss: 0.2648 391/500 [======================>.......] - ETA: 27s - loss: 1.6302 - regression_loss: 1.3651 - classification_loss: 0.2650 392/500 [======================>.......] - ETA: 26s - loss: 1.6306 - regression_loss: 1.3656 - classification_loss: 0.2650 393/500 [======================>.......] 
- ETA: 26s - loss: 1.6287 - regression_loss: 1.3636 - classification_loss: 0.2651 394/500 [======================>.......] - ETA: 26s - loss: 1.6303 - regression_loss: 1.3650 - classification_loss: 0.2653 395/500 [======================>.......] - ETA: 26s - loss: 1.6307 - regression_loss: 1.3655 - classification_loss: 0.2652 396/500 [======================>.......] - ETA: 25s - loss: 1.6303 - regression_loss: 1.3652 - classification_loss: 0.2650 397/500 [======================>.......] - ETA: 25s - loss: 1.6289 - regression_loss: 1.3642 - classification_loss: 0.2647 398/500 [======================>.......] - ETA: 25s - loss: 1.6289 - regression_loss: 1.3645 - classification_loss: 0.2644 399/500 [======================>.......] - ETA: 25s - loss: 1.6277 - regression_loss: 1.3635 - classification_loss: 0.2642 400/500 [=======================>......] - ETA: 24s - loss: 1.6288 - regression_loss: 1.3644 - classification_loss: 0.2645 401/500 [=======================>......] - ETA: 24s - loss: 1.6299 - regression_loss: 1.3652 - classification_loss: 0.2647 402/500 [=======================>......] - ETA: 24s - loss: 1.6297 - regression_loss: 1.3651 - classification_loss: 0.2645 403/500 [=======================>......] - ETA: 24s - loss: 1.6288 - regression_loss: 1.3645 - classification_loss: 0.2643 404/500 [=======================>......] - ETA: 23s - loss: 1.6299 - regression_loss: 1.3658 - classification_loss: 0.2641 405/500 [=======================>......] - ETA: 23s - loss: 1.6309 - regression_loss: 1.3666 - classification_loss: 0.2644 406/500 [=======================>......] - ETA: 23s - loss: 1.6310 - regression_loss: 1.3668 - classification_loss: 0.2642 407/500 [=======================>......] - ETA: 23s - loss: 1.6314 - regression_loss: 1.3672 - classification_loss: 0.2642 408/500 [=======================>......] - ETA: 22s - loss: 1.6324 - regression_loss: 1.3683 - classification_loss: 0.2641 409/500 [=======================>......] 
- ETA: 22s - loss: 1.6342 - regression_loss: 1.3696 - classification_loss: 0.2645 410/500 [=======================>......] - ETA: 22s - loss: 1.6348 - regression_loss: 1.3702 - classification_loss: 0.2646 411/500 [=======================>......] - ETA: 22s - loss: 1.6354 - regression_loss: 1.3707 - classification_loss: 0.2647 412/500 [=======================>......] - ETA: 21s - loss: 1.6348 - regression_loss: 1.3703 - classification_loss: 0.2645 413/500 [=======================>......] - ETA: 21s - loss: 1.6353 - regression_loss: 1.3707 - classification_loss: 0.2645 414/500 [=======================>......] - ETA: 21s - loss: 1.6360 - regression_loss: 1.3714 - classification_loss: 0.2646 415/500 [=======================>......] - ETA: 21s - loss: 1.6360 - regression_loss: 1.3712 - classification_loss: 0.2649 416/500 [=======================>......] - ETA: 20s - loss: 1.6366 - regression_loss: 1.3715 - classification_loss: 0.2652 417/500 [========================>.....] - ETA: 20s - loss: 1.6359 - regression_loss: 1.3709 - classification_loss: 0.2650 418/500 [========================>.....] - ETA: 20s - loss: 1.6355 - regression_loss: 1.3704 - classification_loss: 0.2650 419/500 [========================>.....] - ETA: 20s - loss: 1.6325 - regression_loss: 1.3680 - classification_loss: 0.2646 420/500 [========================>.....] - ETA: 19s - loss: 1.6317 - regression_loss: 1.3674 - classification_loss: 0.2643 421/500 [========================>.....] - ETA: 19s - loss: 1.6320 - regression_loss: 1.3676 - classification_loss: 0.2644 422/500 [========================>.....] - ETA: 19s - loss: 1.6324 - regression_loss: 1.3680 - classification_loss: 0.2645 423/500 [========================>.....] - ETA: 19s - loss: 1.6318 - regression_loss: 1.3675 - classification_loss: 0.2642 424/500 [========================>.....] - ETA: 18s - loss: 1.6308 - regression_loss: 1.3669 - classification_loss: 0.2640 425/500 [========================>.....] 
- ETA: 18s - loss: 1.6308 - regression_loss: 1.3669 - classification_loss: 0.2639 426/500 [========================>.....] - ETA: 18s - loss: 1.6312 - regression_loss: 1.3671 - classification_loss: 0.2641 427/500 [========================>.....] - ETA: 18s - loss: 1.6309 - regression_loss: 1.3666 - classification_loss: 0.2642 428/500 [========================>.....] - ETA: 17s - loss: 1.6305 - regression_loss: 1.3664 - classification_loss: 0.2641 429/500 [========================>.....] - ETA: 17s - loss: 1.6297 - regression_loss: 1.3657 - classification_loss: 0.2639 430/500 [========================>.....] - ETA: 17s - loss: 1.6282 - regression_loss: 1.3647 - classification_loss: 0.2636 431/500 [========================>.....] - ETA: 17s - loss: 1.6295 - regression_loss: 1.3655 - classification_loss: 0.2640 432/500 [========================>.....] - ETA: 16s - loss: 1.6313 - regression_loss: 1.3668 - classification_loss: 0.2645 433/500 [========================>.....] - ETA: 16s - loss: 1.6310 - regression_loss: 1.3667 - classification_loss: 0.2643 434/500 [=========================>....] - ETA: 16s - loss: 1.6308 - regression_loss: 1.3667 - classification_loss: 0.2641 435/500 [=========================>....] - ETA: 16s - loss: 1.6296 - regression_loss: 1.3658 - classification_loss: 0.2638 436/500 [=========================>....] - ETA: 15s - loss: 1.6326 - regression_loss: 1.3674 - classification_loss: 0.2652 437/500 [=========================>....] - ETA: 15s - loss: 1.6324 - regression_loss: 1.3676 - classification_loss: 0.2648 438/500 [=========================>....] - ETA: 15s - loss: 1.6315 - regression_loss: 1.3667 - classification_loss: 0.2648 439/500 [=========================>....] - ETA: 15s - loss: 1.6306 - regression_loss: 1.3660 - classification_loss: 0.2646 440/500 [=========================>....] - ETA: 14s - loss: 1.6312 - regression_loss: 1.3665 - classification_loss: 0.2648 441/500 [=========================>....] 
- ETA: 14s - loss: 1.6305 - regression_loss: 1.3658 - classification_loss: 0.2647 442/500 [=========================>....] - ETA: 14s - loss: 1.6298 - regression_loss: 1.3652 - classification_loss: 0.2645 443/500 [=========================>....] - ETA: 14s - loss: 1.6292 - regression_loss: 1.3646 - classification_loss: 0.2646 444/500 [=========================>....] - ETA: 13s - loss: 1.6295 - regression_loss: 1.3649 - classification_loss: 0.2646 445/500 [=========================>....] - ETA: 13s - loss: 1.6282 - regression_loss: 1.3639 - classification_loss: 0.2643 446/500 [=========================>....] - ETA: 13s - loss: 1.6290 - regression_loss: 1.3647 - classification_loss: 0.2643 447/500 [=========================>....] - ETA: 13s - loss: 1.6329 - regression_loss: 1.3616 - classification_loss: 0.2713 448/500 [=========================>....] - ETA: 12s - loss: 1.6314 - regression_loss: 1.3602 - classification_loss: 0.2712 449/500 [=========================>....] - ETA: 12s - loss: 1.6326 - regression_loss: 1.3612 - classification_loss: 0.2714 450/500 [==========================>...] - ETA: 12s - loss: 1.6329 - regression_loss: 1.3614 - classification_loss: 0.2715 451/500 [==========================>...] - ETA: 12s - loss: 1.6319 - regression_loss: 1.3606 - classification_loss: 0.2713 452/500 [==========================>...] - ETA: 11s - loss: 1.6314 - regression_loss: 1.3604 - classification_loss: 0.2711 453/500 [==========================>...] - ETA: 11s - loss: 1.6328 - regression_loss: 1.3611 - classification_loss: 0.2716 454/500 [==========================>...] - ETA: 11s - loss: 1.6307 - regression_loss: 1.3595 - classification_loss: 0.2712 455/500 [==========================>...] - ETA: 11s - loss: 1.6310 - regression_loss: 1.3597 - classification_loss: 0.2713 456/500 [==========================>...] - ETA: 10s - loss: 1.6324 - regression_loss: 1.3607 - classification_loss: 0.2717 457/500 [==========================>...] 
- ETA: 10s - loss: 1.6313 - regression_loss: 1.3599 - classification_loss: 0.2715 458/500 [==========================>...] - ETA: 10s - loss: 1.6308 - regression_loss: 1.3597 - classification_loss: 0.2711 459/500 [==========================>...] - ETA: 10s - loss: 1.6320 - regression_loss: 1.3607 - classification_loss: 0.2713 460/500 [==========================>...] - ETA: 9s - loss: 1.6324 - regression_loss: 1.3610 - classification_loss: 0.2714  461/500 [==========================>...] - ETA: 9s - loss: 1.6328 - regression_loss: 1.3612 - classification_loss: 0.2716 462/500 [==========================>...] - ETA: 9s - loss: 1.6342 - regression_loss: 1.3622 - classification_loss: 0.2720 463/500 [==========================>...] - ETA: 9s - loss: 1.6324 - regression_loss: 1.3608 - classification_loss: 0.2716 464/500 [==========================>...] - ETA: 8s - loss: 1.6321 - regression_loss: 1.3607 - classification_loss: 0.2714 465/500 [==========================>...] - ETA: 8s - loss: 1.6337 - regression_loss: 1.3623 - classification_loss: 0.2714 466/500 [==========================>...] - ETA: 8s - loss: 1.6338 - regression_loss: 1.3624 - classification_loss: 0.2713 467/500 [===========================>..] - ETA: 8s - loss: 1.6336 - regression_loss: 1.3625 - classification_loss: 0.2711 468/500 [===========================>..] - ETA: 7s - loss: 1.6371 - regression_loss: 1.3596 - classification_loss: 0.2775 469/500 [===========================>..] - ETA: 7s - loss: 1.6377 - regression_loss: 1.3599 - classification_loss: 0.2778 470/500 [===========================>..] - ETA: 7s - loss: 1.6387 - regression_loss: 1.3605 - classification_loss: 0.2782 471/500 [===========================>..] - ETA: 7s - loss: 1.6385 - regression_loss: 1.3605 - classification_loss: 0.2780 472/500 [===========================>..] - ETA: 6s - loss: 1.6388 - regression_loss: 1.3609 - classification_loss: 0.2779 473/500 [===========================>..] 
500/500 [==============================] - 125s 250ms/step - loss: 1.6305 - regression_loss: 1.3542 - classification_loss: 0.2763
326 instances of class plum with average precision: 0.7921
mAP: 0.7921
Epoch 00059: saving model to ./training/snapshots/resnet50_pascal_59.h5
Epoch 60/150
307/500 [=================>............] - ETA: 48s - loss: 1.6189 - regression_loss: 1.3549 - classification_loss: 0.2640
- ETA: 48s - loss: 1.6190 - regression_loss: 1.3551 - classification_loss: 0.2639 309/500 [=================>............] - ETA: 47s - loss: 1.6178 - regression_loss: 1.3543 - classification_loss: 0.2634 310/500 [=================>............] - ETA: 47s - loss: 1.6137 - regression_loss: 1.3509 - classification_loss: 0.2628 311/500 [=================>............] - ETA: 47s - loss: 1.6165 - regression_loss: 1.3536 - classification_loss: 0.2628 312/500 [=================>............] - ETA: 47s - loss: 1.6154 - regression_loss: 1.3526 - classification_loss: 0.2628 313/500 [=================>............] - ETA: 46s - loss: 1.6152 - regression_loss: 1.3525 - classification_loss: 0.2627 314/500 [=================>............] - ETA: 46s - loss: 1.6131 - regression_loss: 1.3508 - classification_loss: 0.2623 315/500 [=================>............] - ETA: 46s - loss: 1.6144 - regression_loss: 1.3518 - classification_loss: 0.2626 316/500 [=================>............] - ETA: 46s - loss: 1.6150 - regression_loss: 1.3525 - classification_loss: 0.2625 317/500 [==================>...........] - ETA: 45s - loss: 1.6159 - regression_loss: 1.3532 - classification_loss: 0.2627 318/500 [==================>...........] - ETA: 45s - loss: 1.6166 - regression_loss: 1.3539 - classification_loss: 0.2628 319/500 [==================>...........] - ETA: 45s - loss: 1.6153 - regression_loss: 1.3529 - classification_loss: 0.2624 320/500 [==================>...........] - ETA: 45s - loss: 1.6145 - regression_loss: 1.3521 - classification_loss: 0.2624 321/500 [==================>...........] - ETA: 44s - loss: 1.6143 - regression_loss: 1.3522 - classification_loss: 0.2621 322/500 [==================>...........] - ETA: 44s - loss: 1.6148 - regression_loss: 1.3525 - classification_loss: 0.2623 323/500 [==================>...........] - ETA: 44s - loss: 1.6156 - regression_loss: 1.3532 - classification_loss: 0.2624 324/500 [==================>...........] 
- ETA: 44s - loss: 1.6189 - regression_loss: 1.3552 - classification_loss: 0.2638 325/500 [==================>...........] - ETA: 43s - loss: 1.6173 - regression_loss: 1.3538 - classification_loss: 0.2634 326/500 [==================>...........] - ETA: 43s - loss: 1.6186 - regression_loss: 1.3548 - classification_loss: 0.2638 327/500 [==================>...........] - ETA: 43s - loss: 1.6184 - regression_loss: 1.3548 - classification_loss: 0.2636 328/500 [==================>...........] - ETA: 43s - loss: 1.6200 - regression_loss: 1.3558 - classification_loss: 0.2641 329/500 [==================>...........] - ETA: 42s - loss: 1.6259 - regression_loss: 1.3614 - classification_loss: 0.2645 330/500 [==================>...........] - ETA: 42s - loss: 1.6258 - regression_loss: 1.3612 - classification_loss: 0.2646 331/500 [==================>...........] - ETA: 42s - loss: 1.6277 - regression_loss: 1.3630 - classification_loss: 0.2647 332/500 [==================>...........] - ETA: 42s - loss: 1.6262 - regression_loss: 1.3620 - classification_loss: 0.2643 333/500 [==================>...........] - ETA: 41s - loss: 1.6270 - regression_loss: 1.3629 - classification_loss: 0.2642 334/500 [===================>..........] - ETA: 41s - loss: 1.6271 - regression_loss: 1.3628 - classification_loss: 0.2643 335/500 [===================>..........] - ETA: 41s - loss: 1.6264 - regression_loss: 1.3625 - classification_loss: 0.2639 336/500 [===================>..........] - ETA: 41s - loss: 1.6302 - regression_loss: 1.3612 - classification_loss: 0.2691 337/500 [===================>..........] - ETA: 40s - loss: 1.6298 - regression_loss: 1.3610 - classification_loss: 0.2688 338/500 [===================>..........] - ETA: 40s - loss: 1.6283 - regression_loss: 1.3599 - classification_loss: 0.2685 339/500 [===================>..........] - ETA: 40s - loss: 1.6294 - regression_loss: 1.3609 - classification_loss: 0.2685 340/500 [===================>..........] 
- ETA: 40s - loss: 1.6312 - regression_loss: 1.3623 - classification_loss: 0.2689 341/500 [===================>..........] - ETA: 39s - loss: 1.6312 - regression_loss: 1.3626 - classification_loss: 0.2686 342/500 [===================>..........] - ETA: 39s - loss: 1.6301 - regression_loss: 1.3617 - classification_loss: 0.2683 343/500 [===================>..........] - ETA: 39s - loss: 1.6307 - regression_loss: 1.3624 - classification_loss: 0.2683 344/500 [===================>..........] - ETA: 39s - loss: 1.6296 - regression_loss: 1.3616 - classification_loss: 0.2679 345/500 [===================>..........] - ETA: 38s - loss: 1.6265 - regression_loss: 1.3592 - classification_loss: 0.2673 346/500 [===================>..........] - ETA: 38s - loss: 1.6257 - regression_loss: 1.3588 - classification_loss: 0.2670 347/500 [===================>..........] - ETA: 38s - loss: 1.6272 - regression_loss: 1.3601 - classification_loss: 0.2672 348/500 [===================>..........] - ETA: 38s - loss: 1.6278 - regression_loss: 1.3605 - classification_loss: 0.2673 349/500 [===================>..........] - ETA: 37s - loss: 1.6277 - regression_loss: 1.3607 - classification_loss: 0.2670 350/500 [====================>.........] - ETA: 37s - loss: 1.6263 - regression_loss: 1.3596 - classification_loss: 0.2667 351/500 [====================>.........] - ETA: 37s - loss: 1.6263 - regression_loss: 1.3593 - classification_loss: 0.2670 352/500 [====================>.........] - ETA: 37s - loss: 1.6253 - regression_loss: 1.3586 - classification_loss: 0.2666 353/500 [====================>.........] - ETA: 36s - loss: 1.6236 - regression_loss: 1.3572 - classification_loss: 0.2664 354/500 [====================>.........] - ETA: 36s - loss: 1.6239 - regression_loss: 1.3573 - classification_loss: 0.2666 355/500 [====================>.........] - ETA: 36s - loss: 1.6230 - regression_loss: 1.3564 - classification_loss: 0.2666 356/500 [====================>.........] 
- ETA: 36s - loss: 1.6227 - regression_loss: 1.3562 - classification_loss: 0.2666 357/500 [====================>.........] - ETA: 35s - loss: 1.6252 - regression_loss: 1.3580 - classification_loss: 0.2672 358/500 [====================>.........] - ETA: 35s - loss: 1.6251 - regression_loss: 1.3579 - classification_loss: 0.2672 359/500 [====================>.........] - ETA: 35s - loss: 1.6242 - regression_loss: 1.3572 - classification_loss: 0.2669 360/500 [====================>.........] - ETA: 35s - loss: 1.6248 - regression_loss: 1.3577 - classification_loss: 0.2671 361/500 [====================>.........] - ETA: 34s - loss: 1.6269 - regression_loss: 1.3593 - classification_loss: 0.2676 362/500 [====================>.........] - ETA: 34s - loss: 1.6269 - regression_loss: 1.3594 - classification_loss: 0.2675 363/500 [====================>.........] - ETA: 34s - loss: 1.6273 - regression_loss: 1.3598 - classification_loss: 0.2675 364/500 [====================>.........] - ETA: 34s - loss: 1.6310 - regression_loss: 1.3634 - classification_loss: 0.2676 365/500 [====================>.........] - ETA: 33s - loss: 1.6339 - regression_loss: 1.3655 - classification_loss: 0.2684 366/500 [====================>.........] - ETA: 33s - loss: 1.6330 - regression_loss: 1.3649 - classification_loss: 0.2681 367/500 [=====================>........] - ETA: 33s - loss: 1.6318 - regression_loss: 1.3641 - classification_loss: 0.2677 368/500 [=====================>........] - ETA: 33s - loss: 1.6324 - regression_loss: 1.3647 - classification_loss: 0.2676 369/500 [=====================>........] - ETA: 32s - loss: 1.6346 - regression_loss: 1.3667 - classification_loss: 0.2679 370/500 [=====================>........] - ETA: 32s - loss: 1.6326 - regression_loss: 1.3651 - classification_loss: 0.2674 371/500 [=====================>........] - ETA: 32s - loss: 1.6360 - regression_loss: 1.3678 - classification_loss: 0.2682 372/500 [=====================>........] 
- ETA: 32s - loss: 1.6350 - regression_loss: 1.3671 - classification_loss: 0.2679 373/500 [=====================>........] - ETA: 31s - loss: 1.6333 - regression_loss: 1.3659 - classification_loss: 0.2674 374/500 [=====================>........] - ETA: 31s - loss: 1.6343 - regression_loss: 1.3671 - classification_loss: 0.2672 375/500 [=====================>........] - ETA: 31s - loss: 1.6328 - regression_loss: 1.3660 - classification_loss: 0.2668 376/500 [=====================>........] - ETA: 31s - loss: 1.6337 - regression_loss: 1.3666 - classification_loss: 0.2671 377/500 [=====================>........] - ETA: 30s - loss: 1.6317 - regression_loss: 1.3650 - classification_loss: 0.2667 378/500 [=====================>........] - ETA: 30s - loss: 1.6314 - regression_loss: 1.3649 - classification_loss: 0.2665 379/500 [=====================>........] - ETA: 30s - loss: 1.6315 - regression_loss: 1.3648 - classification_loss: 0.2667 380/500 [=====================>........] - ETA: 30s - loss: 1.6331 - regression_loss: 1.3660 - classification_loss: 0.2671 381/500 [=====================>........] - ETA: 29s - loss: 1.6298 - regression_loss: 1.3634 - classification_loss: 0.2664 382/500 [=====================>........] - ETA: 29s - loss: 1.6291 - regression_loss: 1.3627 - classification_loss: 0.2664 383/500 [=====================>........] - ETA: 29s - loss: 1.6295 - regression_loss: 1.3632 - classification_loss: 0.2663 384/500 [======================>.......] - ETA: 29s - loss: 1.6324 - regression_loss: 1.3654 - classification_loss: 0.2670 385/500 [======================>.......] - ETA: 28s - loss: 1.6323 - regression_loss: 1.3654 - classification_loss: 0.2669 386/500 [======================>.......] - ETA: 28s - loss: 1.6310 - regression_loss: 1.3643 - classification_loss: 0.2668 387/500 [======================>.......] - ETA: 28s - loss: 1.6314 - regression_loss: 1.3644 - classification_loss: 0.2669 388/500 [======================>.......] 
- ETA: 28s - loss: 1.6317 - regression_loss: 1.3646 - classification_loss: 0.2670 389/500 [======================>.......] - ETA: 27s - loss: 1.6327 - regression_loss: 1.3654 - classification_loss: 0.2673 390/500 [======================>.......] - ETA: 27s - loss: 1.6308 - regression_loss: 1.3639 - classification_loss: 0.2670 391/500 [======================>.......] - ETA: 27s - loss: 1.6318 - regression_loss: 1.3646 - classification_loss: 0.2672 392/500 [======================>.......] - ETA: 27s - loss: 1.6321 - regression_loss: 1.3648 - classification_loss: 0.2673 393/500 [======================>.......] - ETA: 26s - loss: 1.6325 - regression_loss: 1.3652 - classification_loss: 0.2673 394/500 [======================>.......] - ETA: 26s - loss: 1.6316 - regression_loss: 1.3646 - classification_loss: 0.2670 395/500 [======================>.......] - ETA: 26s - loss: 1.6309 - regression_loss: 1.3640 - classification_loss: 0.2669 396/500 [======================>.......] - ETA: 26s - loss: 1.6305 - regression_loss: 1.3636 - classification_loss: 0.2669 397/500 [======================>.......] - ETA: 25s - loss: 1.6312 - regression_loss: 1.3640 - classification_loss: 0.2672 398/500 [======================>.......] - ETA: 25s - loss: 1.6311 - regression_loss: 1.3638 - classification_loss: 0.2673 399/500 [======================>.......] - ETA: 25s - loss: 1.6311 - regression_loss: 1.3639 - classification_loss: 0.2672 400/500 [=======================>......] - ETA: 25s - loss: 1.6297 - regression_loss: 1.3627 - classification_loss: 0.2670 401/500 [=======================>......] - ETA: 24s - loss: 1.6315 - regression_loss: 1.3637 - classification_loss: 0.2677 402/500 [=======================>......] - ETA: 24s - loss: 1.6286 - regression_loss: 1.3613 - classification_loss: 0.2673 403/500 [=======================>......] - ETA: 24s - loss: 1.6282 - regression_loss: 1.3610 - classification_loss: 0.2672 404/500 [=======================>......] 
- ETA: 24s - loss: 1.6271 - regression_loss: 1.3602 - classification_loss: 0.2669 405/500 [=======================>......] - ETA: 23s - loss: 1.6264 - regression_loss: 1.3599 - classification_loss: 0.2666 406/500 [=======================>......] - ETA: 23s - loss: 1.6262 - regression_loss: 1.3595 - classification_loss: 0.2666 407/500 [=======================>......] - ETA: 23s - loss: 1.6263 - regression_loss: 1.3596 - classification_loss: 0.2667 408/500 [=======================>......] - ETA: 23s - loss: 1.6252 - regression_loss: 1.3588 - classification_loss: 0.2664 409/500 [=======================>......] - ETA: 22s - loss: 1.6256 - regression_loss: 1.3592 - classification_loss: 0.2664 410/500 [=======================>......] - ETA: 22s - loss: 1.6234 - regression_loss: 1.3572 - classification_loss: 0.2662 411/500 [=======================>......] - ETA: 22s - loss: 1.6237 - regression_loss: 1.3573 - classification_loss: 0.2663 412/500 [=======================>......] - ETA: 22s - loss: 1.6234 - regression_loss: 1.3571 - classification_loss: 0.2663 413/500 [=======================>......] - ETA: 21s - loss: 1.6231 - regression_loss: 1.3571 - classification_loss: 0.2660 414/500 [=======================>......] - ETA: 21s - loss: 1.6241 - regression_loss: 1.3576 - classification_loss: 0.2666 415/500 [=======================>......] - ETA: 21s - loss: 1.6231 - regression_loss: 1.3570 - classification_loss: 0.2662 416/500 [=======================>......] - ETA: 21s - loss: 1.6226 - regression_loss: 1.3567 - classification_loss: 0.2659 417/500 [========================>.....] - ETA: 20s - loss: 1.6214 - regression_loss: 1.3559 - classification_loss: 0.2655 418/500 [========================>.....] - ETA: 20s - loss: 1.6196 - regression_loss: 1.3545 - classification_loss: 0.2651 419/500 [========================>.....] - ETA: 20s - loss: 1.6186 - regression_loss: 1.3536 - classification_loss: 0.2649 420/500 [========================>.....] 
- ETA: 20s - loss: 1.6172 - regression_loss: 1.3524 - classification_loss: 0.2647 421/500 [========================>.....] - ETA: 19s - loss: 1.6207 - regression_loss: 1.3545 - classification_loss: 0.2662 422/500 [========================>.....] - ETA: 19s - loss: 1.6217 - regression_loss: 1.3553 - classification_loss: 0.2664 423/500 [========================>.....] - ETA: 19s - loss: 1.6210 - regression_loss: 1.3549 - classification_loss: 0.2662 424/500 [========================>.....] - ETA: 19s - loss: 1.6203 - regression_loss: 1.3542 - classification_loss: 0.2661 425/500 [========================>.....] - ETA: 18s - loss: 1.6202 - regression_loss: 1.3541 - classification_loss: 0.2661 426/500 [========================>.....] - ETA: 18s - loss: 1.6181 - regression_loss: 1.3524 - classification_loss: 0.2657 427/500 [========================>.....] - ETA: 18s - loss: 1.6176 - regression_loss: 1.3520 - classification_loss: 0.2656 428/500 [========================>.....] - ETA: 18s - loss: 1.6190 - regression_loss: 1.3530 - classification_loss: 0.2660 429/500 [========================>.....] - ETA: 17s - loss: 1.6176 - regression_loss: 1.3520 - classification_loss: 0.2656 430/500 [========================>.....] - ETA: 17s - loss: 1.6161 - regression_loss: 1.3508 - classification_loss: 0.2654 431/500 [========================>.....] - ETA: 17s - loss: 1.6157 - regression_loss: 1.3506 - classification_loss: 0.2651 432/500 [========================>.....] - ETA: 17s - loss: 1.6175 - regression_loss: 1.3521 - classification_loss: 0.2655 433/500 [========================>.....] - ETA: 16s - loss: 1.6181 - regression_loss: 1.3527 - classification_loss: 0.2654 434/500 [=========================>....] - ETA: 16s - loss: 1.6185 - regression_loss: 1.3529 - classification_loss: 0.2656 435/500 [=========================>....] - ETA: 16s - loss: 1.6194 - regression_loss: 1.3536 - classification_loss: 0.2658 436/500 [=========================>....] 
- ETA: 16s - loss: 1.6190 - regression_loss: 1.3533 - classification_loss: 0.2657 437/500 [=========================>....] - ETA: 15s - loss: 1.6184 - regression_loss: 1.3529 - classification_loss: 0.2655 438/500 [=========================>....] - ETA: 15s - loss: 1.6191 - regression_loss: 1.3537 - classification_loss: 0.2654 439/500 [=========================>....] - ETA: 15s - loss: 1.6224 - regression_loss: 1.3559 - classification_loss: 0.2665 440/500 [=========================>....] - ETA: 15s - loss: 1.6226 - regression_loss: 1.3563 - classification_loss: 0.2662 441/500 [=========================>....] - ETA: 14s - loss: 1.6245 - regression_loss: 1.3580 - classification_loss: 0.2664 442/500 [=========================>....] - ETA: 14s - loss: 1.6239 - regression_loss: 1.3576 - classification_loss: 0.2663 443/500 [=========================>....] - ETA: 14s - loss: 1.6234 - regression_loss: 1.3573 - classification_loss: 0.2661 444/500 [=========================>....] - ETA: 14s - loss: 1.6251 - regression_loss: 1.3589 - classification_loss: 0.2662 445/500 [=========================>....] - ETA: 13s - loss: 1.6248 - regression_loss: 1.3587 - classification_loss: 0.2661 446/500 [=========================>....] - ETA: 13s - loss: 1.6252 - regression_loss: 1.3590 - classification_loss: 0.2662 447/500 [=========================>....] - ETA: 13s - loss: 1.6254 - regression_loss: 1.3593 - classification_loss: 0.2661 448/500 [=========================>....] - ETA: 13s - loss: 1.6273 - regression_loss: 1.3608 - classification_loss: 0.2665 449/500 [=========================>....] - ETA: 12s - loss: 1.6280 - regression_loss: 1.3613 - classification_loss: 0.2666 450/500 [==========================>...] - ETA: 12s - loss: 1.6278 - regression_loss: 1.3612 - classification_loss: 0.2666 451/500 [==========================>...] - ETA: 12s - loss: 1.6281 - regression_loss: 1.3615 - classification_loss: 0.2666 452/500 [==========================>...] 
- ETA: 12s - loss: 1.6279 - regression_loss: 1.3615 - classification_loss: 0.2664 453/500 [==========================>...] - ETA: 11s - loss: 1.6286 - regression_loss: 1.3619 - classification_loss: 0.2667 454/500 [==========================>...] - ETA: 11s - loss: 1.6294 - regression_loss: 1.3626 - classification_loss: 0.2668 455/500 [==========================>...] - ETA: 11s - loss: 1.6290 - regression_loss: 1.3624 - classification_loss: 0.2666 456/500 [==========================>...] - ETA: 11s - loss: 1.6294 - regression_loss: 1.3628 - classification_loss: 0.2665 457/500 [==========================>...] - ETA: 10s - loss: 1.6288 - regression_loss: 1.3625 - classification_loss: 0.2663 458/500 [==========================>...] - ETA: 10s - loss: 1.6298 - regression_loss: 1.3636 - classification_loss: 0.2663 459/500 [==========================>...] - ETA: 10s - loss: 1.6313 - regression_loss: 1.3647 - classification_loss: 0.2666 460/500 [==========================>...] - ETA: 10s - loss: 1.6312 - regression_loss: 1.3646 - classification_loss: 0.2666 461/500 [==========================>...] - ETA: 9s - loss: 1.6318 - regression_loss: 1.3647 - classification_loss: 0.2671  462/500 [==========================>...] - ETA: 9s - loss: 1.6320 - regression_loss: 1.3648 - classification_loss: 0.2672 463/500 [==========================>...] - ETA: 9s - loss: 1.6312 - regression_loss: 1.3641 - classification_loss: 0.2671 464/500 [==========================>...] - ETA: 9s - loss: 1.6292 - regression_loss: 1.3624 - classification_loss: 0.2668 465/500 [==========================>...] - ETA: 8s - loss: 1.6286 - regression_loss: 1.3620 - classification_loss: 0.2665 466/500 [==========================>...] - ETA: 8s - loss: 1.6276 - regression_loss: 1.3613 - classification_loss: 0.2663 467/500 [===========================>..] - ETA: 8s - loss: 1.6266 - regression_loss: 1.3604 - classification_loss: 0.2662 468/500 [===========================>..] 
- ETA: 8s - loss: 1.6261 - regression_loss: 1.3600 - classification_loss: 0.2662 469/500 [===========================>..] - ETA: 7s - loss: 1.6255 - regression_loss: 1.3596 - classification_loss: 0.2659 470/500 [===========================>..] - ETA: 7s - loss: 1.6252 - regression_loss: 1.3594 - classification_loss: 0.2658 471/500 [===========================>..] - ETA: 7s - loss: 1.6237 - regression_loss: 1.3582 - classification_loss: 0.2655 472/500 [===========================>..] - ETA: 7s - loss: 1.6236 - regression_loss: 1.3583 - classification_loss: 0.2653 473/500 [===========================>..] - ETA: 6s - loss: 1.6241 - regression_loss: 1.3587 - classification_loss: 0.2655 474/500 [===========================>..] - ETA: 6s - loss: 1.6257 - regression_loss: 1.3599 - classification_loss: 0.2657 475/500 [===========================>..] - ETA: 6s - loss: 1.6256 - regression_loss: 1.3598 - classification_loss: 0.2658 476/500 [===========================>..] - ETA: 6s - loss: 1.6259 - regression_loss: 1.3599 - classification_loss: 0.2660 477/500 [===========================>..] - ETA: 5s - loss: 1.6268 - regression_loss: 1.3605 - classification_loss: 0.2663 478/500 [===========================>..] - ETA: 5s - loss: 1.6251 - regression_loss: 1.3592 - classification_loss: 0.2659 479/500 [===========================>..] - ETA: 5s - loss: 1.6257 - regression_loss: 1.3595 - classification_loss: 0.2662 480/500 [===========================>..] - ETA: 5s - loss: 1.6244 - regression_loss: 1.3584 - classification_loss: 0.2660 481/500 [===========================>..] - ETA: 4s - loss: 1.6230 - regression_loss: 1.3573 - classification_loss: 0.2657 482/500 [===========================>..] - ETA: 4s - loss: 1.6227 - regression_loss: 1.3570 - classification_loss: 0.2657 483/500 [===========================>..] - ETA: 4s - loss: 1.6227 - regression_loss: 1.3571 - classification_loss: 0.2655 484/500 [============================>.] 
- ETA: 4s - loss: 1.6235 - regression_loss: 1.3579 - classification_loss: 0.2657 485/500 [============================>.] - ETA: 3s - loss: 1.6249 - regression_loss: 1.3589 - classification_loss: 0.2660 486/500 [============================>.] - ETA: 3s - loss: 1.6235 - regression_loss: 1.3577 - classification_loss: 0.2657 487/500 [============================>.] - ETA: 3s - loss: 1.6236 - regression_loss: 1.3578 - classification_loss: 0.2658 488/500 [============================>.] - ETA: 3s - loss: 1.6251 - regression_loss: 1.3591 - classification_loss: 0.2661 489/500 [============================>.] - ETA: 2s - loss: 1.6252 - regression_loss: 1.3593 - classification_loss: 0.2660 490/500 [============================>.] - ETA: 2s - loss: 1.6245 - regression_loss: 1.3588 - classification_loss: 0.2658 491/500 [============================>.] - ETA: 2s - loss: 1.6287 - regression_loss: 1.3617 - classification_loss: 0.2670 492/500 [============================>.] - ETA: 2s - loss: 1.6280 - regression_loss: 1.3608 - classification_loss: 0.2671 493/500 [============================>.] - ETA: 1s - loss: 1.6270 - regression_loss: 1.3601 - classification_loss: 0.2669 494/500 [============================>.] - ETA: 1s - loss: 1.6293 - regression_loss: 1.3620 - classification_loss: 0.2672 495/500 [============================>.] - ETA: 1s - loss: 1.6282 - regression_loss: 1.3613 - classification_loss: 0.2669 496/500 [============================>.] - ETA: 1s - loss: 1.6269 - regression_loss: 1.3602 - classification_loss: 0.2667 497/500 [============================>.] - ETA: 0s - loss: 1.6261 - regression_loss: 1.3598 - classification_loss: 0.2664 498/500 [============================>.] - ETA: 0s - loss: 1.6259 - regression_loss: 1.3598 - classification_loss: 0.2661 499/500 [============================>.] 
- ETA: 0s - loss: 1.6259 - regression_loss: 1.3598 - classification_loss: 0.2661 500/500 [==============================] - 125s 250ms/step - loss: 1.6256 - regression_loss: 1.3596 - classification_loss: 0.2660 326 instances of class plum with average precision: 0.7961 mAP: 0.7961 Epoch 00060: saving model to ./training/snapshots/resnet50_pascal_60.h5 Epoch 61/150 1/500 [..............................] - ETA: 1:58 - loss: 1.4514 - regression_loss: 1.3145 - classification_loss: 0.1368 2/500 [..............................] - ETA: 1:57 - loss: 1.3070 - regression_loss: 1.1862 - classification_loss: 0.1208 3/500 [..............................] - ETA: 1:58 - loss: 1.0648 - regression_loss: 0.9653 - classification_loss: 0.0995 4/500 [..............................] - ETA: 1:59 - loss: 1.1263 - regression_loss: 0.9789 - classification_loss: 0.1474 5/500 [..............................] - ETA: 2:00 - loss: 1.1327 - regression_loss: 0.9875 - classification_loss: 0.1452 6/500 [..............................] - ETA: 2:00 - loss: 1.0693 - regression_loss: 0.9399 - classification_loss: 0.1294 7/500 [..............................] - ETA: 2:00 - loss: 1.2483 - regression_loss: 1.0940 - classification_loss: 0.1543 8/500 [..............................] - ETA: 2:01 - loss: 1.2965 - regression_loss: 1.1316 - classification_loss: 0.1649 9/500 [..............................] - ETA: 2:00 - loss: 1.3267 - regression_loss: 1.1477 - classification_loss: 0.1790 10/500 [..............................] - ETA: 2:01 - loss: 1.3465 - regression_loss: 1.1625 - classification_loss: 0.1839 11/500 [..............................] - ETA: 2:01 - loss: 1.4354 - regression_loss: 1.2327 - classification_loss: 0.2028 12/500 [..............................] - ETA: 2:01 - loss: 1.4314 - regression_loss: 1.2297 - classification_loss: 0.2018 13/500 [..............................] - ETA: 2:01 - loss: 1.4695 - regression_loss: 1.2503 - classification_loss: 0.2193 14/500 [..............................] 
- ETA: 2:01 - loss: 1.4620 - regression_loss: 1.2427 - classification_loss: 0.2193
20/500 [>.............................] - ETA: 1:57 - loss: 1.5047 - regression_loss: 1.2797 - classification_loss: 0.2250
30/500 [>.............................] - ETA: 1:56 - loss: 1.5341 - regression_loss: 1.2992 - classification_loss: 0.2349
40/500 [=>............................] - ETA: 1:53 - loss: 1.6229 - regression_loss: 1.3724 - classification_loss: 0.2505
50/500 [==>...........................] - ETA: 1:51 - loss: 1.6152 - regression_loss: 1.3616 - classification_loss: 0.2536
60/500 [==>...........................] - ETA: 1:48 - loss: 1.5771 - regression_loss: 1.3298 - classification_loss: 0.2473
70/500 [===>..........................] - ETA: 1:46 - loss: 1.5929 - regression_loss: 1.3412 - classification_loss: 0.2517
78/500 [===>..........................]
- ETA: 1:45 - loss: 1.5802 - regression_loss: 1.3322 - classification_loss: 0.2480 79/500 [===>..........................] - ETA: 1:44 - loss: 1.5654 - regression_loss: 1.3200 - classification_loss: 0.2454 80/500 [===>..........................] - ETA: 1:44 - loss: 1.5660 - regression_loss: 1.3204 - classification_loss: 0.2456 81/500 [===>..........................] - ETA: 1:44 - loss: 1.5659 - regression_loss: 1.3210 - classification_loss: 0.2449 82/500 [===>..........................] - ETA: 1:44 - loss: 1.5736 - regression_loss: 1.3254 - classification_loss: 0.2482 83/500 [===>..........................] - ETA: 1:43 - loss: 1.5751 - regression_loss: 1.3262 - classification_loss: 0.2489 84/500 [====>.........................] - ETA: 1:43 - loss: 1.5772 - regression_loss: 1.3272 - classification_loss: 0.2499 85/500 [====>.........................] - ETA: 1:43 - loss: 1.5733 - regression_loss: 1.3234 - classification_loss: 0.2499 86/500 [====>.........................] - ETA: 1:43 - loss: 1.5743 - regression_loss: 1.3251 - classification_loss: 0.2492 87/500 [====>.........................] - ETA: 1:42 - loss: 1.5704 - regression_loss: 1.3226 - classification_loss: 0.2478 88/500 [====>.........................] - ETA: 1:42 - loss: 1.5700 - regression_loss: 1.3226 - classification_loss: 0.2474 89/500 [====>.........................] - ETA: 1:42 - loss: 1.5684 - regression_loss: 1.3218 - classification_loss: 0.2465 90/500 [====>.........................] - ETA: 1:42 - loss: 1.5671 - regression_loss: 1.3207 - classification_loss: 0.2464 91/500 [====>.........................] - ETA: 1:41 - loss: 1.5701 - regression_loss: 1.3234 - classification_loss: 0.2467 92/500 [====>.........................] - ETA: 1:41 - loss: 1.5646 - regression_loss: 1.3191 - classification_loss: 0.2455 93/500 [====>.........................] - ETA: 1:41 - loss: 1.5588 - regression_loss: 1.3135 - classification_loss: 0.2453 94/500 [====>.........................] 
- ETA: 1:41 - loss: 1.5568 - regression_loss: 1.3125 - classification_loss: 0.2443 95/500 [====>.........................] - ETA: 1:40 - loss: 1.5582 - regression_loss: 1.3128 - classification_loss: 0.2453 96/500 [====>.........................] - ETA: 1:40 - loss: 1.5613 - regression_loss: 1.3148 - classification_loss: 0.2465 97/500 [====>.........................] - ETA: 1:40 - loss: 1.5616 - regression_loss: 1.3150 - classification_loss: 0.2467 98/500 [====>.........................] - ETA: 1:39 - loss: 1.5574 - regression_loss: 1.3116 - classification_loss: 0.2458 99/500 [====>.........................] - ETA: 1:39 - loss: 1.5486 - regression_loss: 1.3038 - classification_loss: 0.2447 100/500 [=====>........................] - ETA: 1:39 - loss: 1.5490 - regression_loss: 1.3039 - classification_loss: 0.2452 101/500 [=====>........................] - ETA: 1:38 - loss: 1.5383 - regression_loss: 1.2909 - classification_loss: 0.2473 102/500 [=====>........................] - ETA: 1:38 - loss: 1.5360 - regression_loss: 1.2895 - classification_loss: 0.2466 103/500 [=====>........................] - ETA: 1:38 - loss: 1.5303 - regression_loss: 1.2847 - classification_loss: 0.2456 104/500 [=====>........................] - ETA: 1:38 - loss: 1.5315 - regression_loss: 1.2859 - classification_loss: 0.2456 105/500 [=====>........................] - ETA: 1:38 - loss: 1.5355 - regression_loss: 1.2895 - classification_loss: 0.2459 106/500 [=====>........................] - ETA: 1:37 - loss: 1.5333 - regression_loss: 1.2882 - classification_loss: 0.2451 107/500 [=====>........................] - ETA: 1:37 - loss: 1.5321 - regression_loss: 1.2874 - classification_loss: 0.2447 108/500 [=====>........................] - ETA: 1:37 - loss: 1.5299 - regression_loss: 1.2848 - classification_loss: 0.2451 109/500 [=====>........................] - ETA: 1:37 - loss: 1.5312 - regression_loss: 1.2870 - classification_loss: 0.2442 110/500 [=====>........................] 
- ETA: 1:37 - loss: 1.5343 - regression_loss: 1.2895 - classification_loss: 0.2449 111/500 [=====>........................] - ETA: 1:36 - loss: 1.5264 - regression_loss: 1.2832 - classification_loss: 0.2432 112/500 [=====>........................] - ETA: 1:36 - loss: 1.5263 - regression_loss: 1.2835 - classification_loss: 0.2427 113/500 [=====>........................] - ETA: 1:36 - loss: 1.5263 - regression_loss: 1.2836 - classification_loss: 0.2427 114/500 [=====>........................] - ETA: 1:36 - loss: 1.5265 - regression_loss: 1.2841 - classification_loss: 0.2424 115/500 [=====>........................] - ETA: 1:35 - loss: 1.5165 - regression_loss: 1.2760 - classification_loss: 0.2405 116/500 [=====>........................] - ETA: 1:35 - loss: 1.5224 - regression_loss: 1.2806 - classification_loss: 0.2417 117/500 [======>.......................] - ETA: 1:35 - loss: 1.5279 - regression_loss: 1.2849 - classification_loss: 0.2429 118/500 [======>.......................] - ETA: 1:35 - loss: 1.5272 - regression_loss: 1.2839 - classification_loss: 0.2433 119/500 [======>.......................] - ETA: 1:34 - loss: 1.5246 - regression_loss: 1.2824 - classification_loss: 0.2422 120/500 [======>.......................] - ETA: 1:34 - loss: 1.5284 - regression_loss: 1.2851 - classification_loss: 0.2434 121/500 [======>.......................] - ETA: 1:34 - loss: 1.5265 - regression_loss: 1.2839 - classification_loss: 0.2425 122/500 [======>.......................] - ETA: 1:34 - loss: 1.5335 - regression_loss: 1.2894 - classification_loss: 0.2441 123/500 [======>.......................] - ETA: 1:33 - loss: 1.5424 - regression_loss: 1.2973 - classification_loss: 0.2451 124/500 [======>.......................] - ETA: 1:33 - loss: 1.5446 - regression_loss: 1.2993 - classification_loss: 0.2453 125/500 [======>.......................] - ETA: 1:33 - loss: 1.5393 - regression_loss: 1.2954 - classification_loss: 0.2439 126/500 [======>.......................] 
- ETA: 1:33 - loss: 1.5461 - regression_loss: 1.3003 - classification_loss: 0.2459 127/500 [======>.......................] - ETA: 1:32 - loss: 1.5438 - regression_loss: 1.2985 - classification_loss: 0.2453 128/500 [======>.......................] - ETA: 1:32 - loss: 1.5392 - regression_loss: 1.2945 - classification_loss: 0.2447 129/500 [======>.......................] - ETA: 1:32 - loss: 1.5358 - regression_loss: 1.2918 - classification_loss: 0.2440 130/500 [======>.......................] - ETA: 1:32 - loss: 1.5382 - regression_loss: 1.2941 - classification_loss: 0.2441 131/500 [======>.......................] - ETA: 1:31 - loss: 1.5384 - regression_loss: 1.2935 - classification_loss: 0.2448 132/500 [======>.......................] - ETA: 1:31 - loss: 1.5363 - regression_loss: 1.2918 - classification_loss: 0.2445 133/500 [======>.......................] - ETA: 1:31 - loss: 1.5355 - regression_loss: 1.2916 - classification_loss: 0.2439 134/500 [=======>......................] - ETA: 1:31 - loss: 1.5455 - regression_loss: 1.3000 - classification_loss: 0.2456 135/500 [=======>......................] - ETA: 1:31 - loss: 1.5374 - regression_loss: 1.2931 - classification_loss: 0.2443 136/500 [=======>......................] - ETA: 1:30 - loss: 1.5431 - regression_loss: 1.2971 - classification_loss: 0.2460 137/500 [=======>......................] - ETA: 1:30 - loss: 1.5436 - regression_loss: 1.2974 - classification_loss: 0.2462 138/500 [=======>......................] - ETA: 1:30 - loss: 1.5412 - regression_loss: 1.2953 - classification_loss: 0.2458 139/500 [=======>......................] - ETA: 1:30 - loss: 1.5382 - regression_loss: 1.2933 - classification_loss: 0.2450 140/500 [=======>......................] - ETA: 1:29 - loss: 1.5352 - regression_loss: 1.2908 - classification_loss: 0.2444 141/500 [=======>......................] - ETA: 1:29 - loss: 1.5358 - regression_loss: 1.2917 - classification_loss: 0.2441 142/500 [=======>......................] 
- ETA: 1:29 - loss: 1.5359 - regression_loss: 1.2920 - classification_loss: 0.2439 143/500 [=======>......................] - ETA: 1:29 - loss: 1.5355 - regression_loss: 1.2909 - classification_loss: 0.2447 144/500 [=======>......................] - ETA: 1:28 - loss: 1.5322 - regression_loss: 1.2885 - classification_loss: 0.2437 145/500 [=======>......................] - ETA: 1:28 - loss: 1.5295 - regression_loss: 1.2863 - classification_loss: 0.2432 146/500 [=======>......................] - ETA: 1:28 - loss: 1.5232 - regression_loss: 1.2813 - classification_loss: 0.2419 147/500 [=======>......................] - ETA: 1:28 - loss: 1.5268 - regression_loss: 1.2850 - classification_loss: 0.2418 148/500 [=======>......................] - ETA: 1:27 - loss: 1.5307 - regression_loss: 1.2887 - classification_loss: 0.2420 149/500 [=======>......................] - ETA: 1:27 - loss: 1.5310 - regression_loss: 1.2891 - classification_loss: 0.2418 150/500 [========>.....................] - ETA: 1:27 - loss: 1.5322 - regression_loss: 1.2900 - classification_loss: 0.2422 151/500 [========>.....................] - ETA: 1:27 - loss: 1.5292 - regression_loss: 1.2879 - classification_loss: 0.2414 152/500 [========>.....................] - ETA: 1:26 - loss: 1.5296 - regression_loss: 1.2885 - classification_loss: 0.2411 153/500 [========>.....................] - ETA: 1:26 - loss: 1.5336 - regression_loss: 1.2913 - classification_loss: 0.2423 154/500 [========>.....................] - ETA: 1:26 - loss: 1.5316 - regression_loss: 1.2902 - classification_loss: 0.2414 155/500 [========>.....................] - ETA: 1:26 - loss: 1.5291 - regression_loss: 1.2878 - classification_loss: 0.2413 156/500 [========>.....................] - ETA: 1:25 - loss: 1.5340 - regression_loss: 1.2909 - classification_loss: 0.2430 157/500 [========>.....................] - ETA: 1:25 - loss: 1.5328 - regression_loss: 1.2899 - classification_loss: 0.2429 158/500 [========>.....................] 
- ETA: 1:25 - loss: 1.5337 - regression_loss: 1.2903 - classification_loss: 0.2434 159/500 [========>.....................] - ETA: 1:25 - loss: 1.5366 - regression_loss: 1.2929 - classification_loss: 0.2437 160/500 [========>.....................] - ETA: 1:24 - loss: 1.5373 - regression_loss: 1.2934 - classification_loss: 0.2438 161/500 [========>.....................] - ETA: 1:24 - loss: 1.5397 - regression_loss: 1.2961 - classification_loss: 0.2436 162/500 [========>.....................] - ETA: 1:24 - loss: 1.5435 - regression_loss: 1.2994 - classification_loss: 0.2440 163/500 [========>.....................] - ETA: 1:24 - loss: 1.5464 - regression_loss: 1.3018 - classification_loss: 0.2446 164/500 [========>.....................] - ETA: 1:23 - loss: 1.5455 - regression_loss: 1.3011 - classification_loss: 0.2444 165/500 [========>.....................] - ETA: 1:23 - loss: 1.5459 - regression_loss: 1.3014 - classification_loss: 0.2445 166/500 [========>.....................] - ETA: 1:23 - loss: 1.5470 - regression_loss: 1.3021 - classification_loss: 0.2449 167/500 [=========>....................] - ETA: 1:23 - loss: 1.5477 - regression_loss: 1.3029 - classification_loss: 0.2448 168/500 [=========>....................] - ETA: 1:22 - loss: 1.5476 - regression_loss: 1.3031 - classification_loss: 0.2445 169/500 [=========>....................] - ETA: 1:22 - loss: 1.5449 - regression_loss: 1.3013 - classification_loss: 0.2437 170/500 [=========>....................] - ETA: 1:22 - loss: 1.5443 - regression_loss: 1.3006 - classification_loss: 0.2436 171/500 [=========>....................] - ETA: 1:22 - loss: 1.5406 - regression_loss: 1.2977 - classification_loss: 0.2428 172/500 [=========>....................] - ETA: 1:21 - loss: 1.5406 - regression_loss: 1.2980 - classification_loss: 0.2426 173/500 [=========>....................] - ETA: 1:21 - loss: 1.5396 - regression_loss: 1.2964 - classification_loss: 0.2433 174/500 [=========>....................] 
- ETA: 1:21 - loss: 1.5400 - regression_loss: 1.2971 - classification_loss: 0.2430 175/500 [=========>....................] - ETA: 1:21 - loss: 1.5417 - regression_loss: 1.2982 - classification_loss: 0.2434 176/500 [=========>....................] - ETA: 1:20 - loss: 1.5453 - regression_loss: 1.3012 - classification_loss: 0.2441 177/500 [=========>....................] - ETA: 1:20 - loss: 1.5435 - regression_loss: 1.2994 - classification_loss: 0.2441 178/500 [=========>....................] - ETA: 1:20 - loss: 1.5455 - regression_loss: 1.3008 - classification_loss: 0.2447 179/500 [=========>....................] - ETA: 1:20 - loss: 1.5405 - regression_loss: 1.2968 - classification_loss: 0.2437 180/500 [=========>....................] - ETA: 1:20 - loss: 1.5445 - regression_loss: 1.2996 - classification_loss: 0.2449 181/500 [=========>....................] - ETA: 1:19 - loss: 1.5410 - regression_loss: 1.2968 - classification_loss: 0.2442 182/500 [=========>....................] - ETA: 1:19 - loss: 1.5408 - regression_loss: 1.2967 - classification_loss: 0.2441 183/500 [=========>....................] - ETA: 1:19 - loss: 1.5394 - regression_loss: 1.2957 - classification_loss: 0.2437 184/500 [==========>...................] - ETA: 1:19 - loss: 1.5390 - regression_loss: 1.2949 - classification_loss: 0.2441 185/500 [==========>...................] - ETA: 1:18 - loss: 1.5379 - regression_loss: 1.2942 - classification_loss: 0.2438 186/500 [==========>...................] - ETA: 1:18 - loss: 1.5416 - regression_loss: 1.2968 - classification_loss: 0.2448 187/500 [==========>...................] - ETA: 1:18 - loss: 1.5435 - regression_loss: 1.2985 - classification_loss: 0.2450 188/500 [==========>...................] - ETA: 1:18 - loss: 1.5442 - regression_loss: 1.2986 - classification_loss: 0.2456 189/500 [==========>...................] - ETA: 1:17 - loss: 1.5503 - regression_loss: 1.3031 - classification_loss: 0.2473 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.5469 - regression_loss: 1.2962 - classification_loss: 0.2507 191/500 [==========>...................] - ETA: 1:17 - loss: 1.5473 - regression_loss: 1.2966 - classification_loss: 0.2507 192/500 [==========>...................] - ETA: 1:17 - loss: 1.5462 - regression_loss: 1.2962 - classification_loss: 0.2500 193/500 [==========>...................] - ETA: 1:16 - loss: 1.5463 - regression_loss: 1.2962 - classification_loss: 0.2501 194/500 [==========>...................] - ETA: 1:16 - loss: 1.5465 - regression_loss: 1.2966 - classification_loss: 0.2499 195/500 [==========>...................] - ETA: 1:16 - loss: 1.5464 - regression_loss: 1.2964 - classification_loss: 0.2500 196/500 [==========>...................] - ETA: 1:16 - loss: 1.5470 - regression_loss: 1.2970 - classification_loss: 0.2501 197/500 [==========>...................] - ETA: 1:15 - loss: 1.5488 - regression_loss: 1.2979 - classification_loss: 0.2509 198/500 [==========>...................] - ETA: 1:15 - loss: 1.5468 - regression_loss: 1.2963 - classification_loss: 0.2505 199/500 [==========>...................] - ETA: 1:15 - loss: 1.5496 - regression_loss: 1.2982 - classification_loss: 0.2514 200/500 [===========>..................] - ETA: 1:15 - loss: 1.5447 - regression_loss: 1.2943 - classification_loss: 0.2504 201/500 [===========>..................] - ETA: 1:14 - loss: 1.5484 - regression_loss: 1.2975 - classification_loss: 0.2509 202/500 [===========>..................] - ETA: 1:14 - loss: 1.5446 - regression_loss: 1.2911 - classification_loss: 0.2535 203/500 [===========>..................] - ETA: 1:14 - loss: 1.5436 - regression_loss: 1.2905 - classification_loss: 0.2532 204/500 [===========>..................] - ETA: 1:13 - loss: 1.5468 - regression_loss: 1.2931 - classification_loss: 0.2537 205/500 [===========>..................] - ETA: 1:13 - loss: 1.5451 - regression_loss: 1.2918 - classification_loss: 0.2533 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.5448 - regression_loss: 1.2916 - classification_loss: 0.2532 207/500 [===========>..................] - ETA: 1:13 - loss: 1.5453 - regression_loss: 1.2916 - classification_loss: 0.2537 208/500 [===========>..................] - ETA: 1:12 - loss: 1.5474 - regression_loss: 1.2935 - classification_loss: 0.2539 209/500 [===========>..................] - ETA: 1:12 - loss: 1.5500 - regression_loss: 1.2955 - classification_loss: 0.2545 210/500 [===========>..................] - ETA: 1:12 - loss: 1.5488 - regression_loss: 1.2943 - classification_loss: 0.2545 211/500 [===========>..................] - ETA: 1:12 - loss: 1.5499 - regression_loss: 1.2950 - classification_loss: 0.2548 212/500 [===========>..................] - ETA: 1:11 - loss: 1.5506 - regression_loss: 1.2954 - classification_loss: 0.2552 213/500 [===========>..................] - ETA: 1:11 - loss: 1.5485 - regression_loss: 1.2937 - classification_loss: 0.2548 214/500 [===========>..................] - ETA: 1:11 - loss: 1.5483 - regression_loss: 1.2937 - classification_loss: 0.2546 215/500 [===========>..................] - ETA: 1:11 - loss: 1.5483 - regression_loss: 1.2937 - classification_loss: 0.2546 216/500 [===========>..................] - ETA: 1:10 - loss: 1.5491 - regression_loss: 1.2952 - classification_loss: 0.2539 217/500 [============>.................] - ETA: 1:10 - loss: 1.5490 - regression_loss: 1.2954 - classification_loss: 0.2535 218/500 [============>.................] - ETA: 1:10 - loss: 1.5486 - regression_loss: 1.2949 - classification_loss: 0.2536 219/500 [============>.................] - ETA: 1:10 - loss: 1.5471 - regression_loss: 1.2942 - classification_loss: 0.2529 220/500 [============>.................] - ETA: 1:09 - loss: 1.5457 - regression_loss: 1.2931 - classification_loss: 0.2526 221/500 [============>.................] - ETA: 1:09 - loss: 1.5449 - regression_loss: 1.2922 - classification_loss: 0.2527 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.5472 - regression_loss: 1.2943 - classification_loss: 0.2529 223/500 [============>.................] - ETA: 1:09 - loss: 1.5442 - regression_loss: 1.2919 - classification_loss: 0.2524 224/500 [============>.................] - ETA: 1:08 - loss: 1.5452 - regression_loss: 1.2921 - classification_loss: 0.2531 225/500 [============>.................] - ETA: 1:08 - loss: 1.5442 - regression_loss: 1.2912 - classification_loss: 0.2530 226/500 [============>.................] - ETA: 1:08 - loss: 1.5437 - regression_loss: 1.2907 - classification_loss: 0.2530 227/500 [============>.................] - ETA: 1:08 - loss: 1.5453 - regression_loss: 1.2916 - classification_loss: 0.2537 228/500 [============>.................] - ETA: 1:07 - loss: 1.5403 - regression_loss: 1.2875 - classification_loss: 0.2528 229/500 [============>.................] - ETA: 1:07 - loss: 1.5413 - regression_loss: 1.2879 - classification_loss: 0.2534 230/500 [============>.................] - ETA: 1:07 - loss: 1.5426 - regression_loss: 1.2890 - classification_loss: 0.2536 231/500 [============>.................] - ETA: 1:07 - loss: 1.5471 - regression_loss: 1.2934 - classification_loss: 0.2537 232/500 [============>.................] - ETA: 1:06 - loss: 1.5479 - regression_loss: 1.2940 - classification_loss: 0.2539 233/500 [============>.................] - ETA: 1:06 - loss: 1.5503 - regression_loss: 1.2964 - classification_loss: 0.2538 234/500 [=============>................] - ETA: 1:06 - loss: 1.5483 - regression_loss: 1.2952 - classification_loss: 0.2531 235/500 [=============>................] - ETA: 1:06 - loss: 1.5549 - regression_loss: 1.3007 - classification_loss: 0.2542 236/500 [=============>................] - ETA: 1:05 - loss: 1.5551 - regression_loss: 1.3008 - classification_loss: 0.2543 237/500 [=============>................] - ETA: 1:05 - loss: 1.5634 - regression_loss: 1.3063 - classification_loss: 0.2571 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.5624 - regression_loss: 1.3058 - classification_loss: 0.2566 239/500 [=============>................] - ETA: 1:05 - loss: 1.5590 - regression_loss: 1.3031 - classification_loss: 0.2559 240/500 [=============>................] - ETA: 1:04 - loss: 1.5599 - regression_loss: 1.3039 - classification_loss: 0.2560 241/500 [=============>................] - ETA: 1:04 - loss: 1.5593 - regression_loss: 1.3032 - classification_loss: 0.2561 242/500 [=============>................] - ETA: 1:04 - loss: 1.5588 - regression_loss: 1.3030 - classification_loss: 0.2558 243/500 [=============>................] - ETA: 1:04 - loss: 1.5584 - regression_loss: 1.3024 - classification_loss: 0.2560 244/500 [=============>................] - ETA: 1:03 - loss: 1.5633 - regression_loss: 1.3060 - classification_loss: 0.2573 245/500 [=============>................] - ETA: 1:03 - loss: 1.5641 - regression_loss: 1.3066 - classification_loss: 0.2575 246/500 [=============>................] - ETA: 1:03 - loss: 1.5646 - regression_loss: 1.3075 - classification_loss: 0.2571 247/500 [=============>................] - ETA: 1:03 - loss: 1.5609 - regression_loss: 1.3045 - classification_loss: 0.2565 248/500 [=============>................] - ETA: 1:03 - loss: 1.5609 - regression_loss: 1.3046 - classification_loss: 0.2562 249/500 [=============>................] - ETA: 1:02 - loss: 1.5598 - regression_loss: 1.3039 - classification_loss: 0.2559 250/500 [==============>...............] - ETA: 1:02 - loss: 1.5576 - regression_loss: 1.3023 - classification_loss: 0.2553 251/500 [==============>...............] - ETA: 1:02 - loss: 1.5566 - regression_loss: 1.3017 - classification_loss: 0.2550 252/500 [==============>...............] - ETA: 1:02 - loss: 1.5564 - regression_loss: 1.3015 - classification_loss: 0.2549 253/500 [==============>...............] - ETA: 1:01 - loss: 1.5574 - regression_loss: 1.3025 - classification_loss: 0.2549 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.5580 - regression_loss: 1.3033 - classification_loss: 0.2547 255/500 [==============>...............] - ETA: 1:01 - loss: 1.5571 - regression_loss: 1.3027 - classification_loss: 0.2544 256/500 [==============>...............] - ETA: 1:01 - loss: 1.5582 - regression_loss: 1.3034 - classification_loss: 0.2547 257/500 [==============>...............] - ETA: 1:00 - loss: 1.5610 - regression_loss: 1.3054 - classification_loss: 0.2556 258/500 [==============>...............] - ETA: 1:00 - loss: 1.5626 - regression_loss: 1.3062 - classification_loss: 0.2564 259/500 [==============>...............] - ETA: 1:00 - loss: 1.5620 - regression_loss: 1.3062 - classification_loss: 0.2558 260/500 [==============>...............] - ETA: 1:00 - loss: 1.5589 - regression_loss: 1.3035 - classification_loss: 0.2555 261/500 [==============>...............] - ETA: 59s - loss: 1.5636 - regression_loss: 1.3067 - classification_loss: 0.2569  262/500 [==============>...............] - ETA: 59s - loss: 1.5655 - regression_loss: 1.3085 - classification_loss: 0.2570 263/500 [==============>...............] - ETA: 59s - loss: 1.5654 - regression_loss: 1.3084 - classification_loss: 0.2570 264/500 [==============>...............] - ETA: 59s - loss: 1.5635 - regression_loss: 1.3073 - classification_loss: 0.2563 265/500 [==============>...............] - ETA: 58s - loss: 1.5605 - regression_loss: 1.3048 - classification_loss: 0.2558 266/500 [==============>...............] - ETA: 58s - loss: 1.5575 - regression_loss: 1.3024 - classification_loss: 0.2551 267/500 [===============>..............] - ETA: 58s - loss: 1.5565 - regression_loss: 1.3019 - classification_loss: 0.2547 268/500 [===============>..............] - ETA: 58s - loss: 1.5581 - regression_loss: 1.3031 - classification_loss: 0.2550 269/500 [===============>..............] - ETA: 57s - loss: 1.5579 - regression_loss: 1.3031 - classification_loss: 0.2547 270/500 [===============>..............] 
- ETA: 57s - loss: 1.5590 - regression_loss: 1.3041 - classification_loss: 0.2549 271/500 [===============>..............] - ETA: 57s - loss: 1.5584 - regression_loss: 1.3036 - classification_loss: 0.2548 272/500 [===============>..............] - ETA: 57s - loss: 1.5585 - regression_loss: 1.3039 - classification_loss: 0.2546 273/500 [===============>..............] - ETA: 56s - loss: 1.5569 - regression_loss: 1.3026 - classification_loss: 0.2543 274/500 [===============>..............] - ETA: 56s - loss: 1.5531 - regression_loss: 1.2996 - classification_loss: 0.2535 275/500 [===============>..............] - ETA: 56s - loss: 1.5541 - regression_loss: 1.3007 - classification_loss: 0.2534 276/500 [===============>..............] - ETA: 55s - loss: 1.5574 - regression_loss: 1.3033 - classification_loss: 0.2540 277/500 [===============>..............] - ETA: 55s - loss: 1.5577 - regression_loss: 1.3043 - classification_loss: 0.2534 278/500 [===============>..............] - ETA: 55s - loss: 1.5581 - regression_loss: 1.3047 - classification_loss: 0.2534 279/500 [===============>..............] - ETA: 55s - loss: 1.5572 - regression_loss: 1.3042 - classification_loss: 0.2530 280/500 [===============>..............] - ETA: 54s - loss: 1.5583 - regression_loss: 1.3053 - classification_loss: 0.2530 281/500 [===============>..............] - ETA: 54s - loss: 1.5594 - regression_loss: 1.3066 - classification_loss: 0.2527 282/500 [===============>..............] - ETA: 54s - loss: 1.5600 - regression_loss: 1.3065 - classification_loss: 0.2535 283/500 [===============>..............] - ETA: 54s - loss: 1.5624 - regression_loss: 1.3081 - classification_loss: 0.2543 284/500 [================>.............] - ETA: 53s - loss: 1.5606 - regression_loss: 1.3066 - classification_loss: 0.2540 285/500 [================>.............] - ETA: 53s - loss: 1.5648 - regression_loss: 1.3098 - classification_loss: 0.2550 286/500 [================>.............] 
[Epoch 61 per-batch progress elided; final step shown]
500/500 [==============================] - 125s 250ms/step - loss: 1.5749 - regression_loss: 1.3172 - classification_loss: 0.2577
326 instances of class plum with average precision: 0.7945
mAP: 0.7945
Epoch 00061: saving model to ./training/snapshots/resnet50_pascal_61.h5
Epoch 62/150
- ETA: 1:35 - loss: 1.6676 - regression_loss: 1.4082 - classification_loss: 0.2595 122/500 [======>.......................] - ETA: 1:34 - loss: 1.6687 - regression_loss: 1.4085 - classification_loss: 0.2602 123/500 [======>.......................] - ETA: 1:34 - loss: 1.6662 - regression_loss: 1.4063 - classification_loss: 0.2599 124/500 [======>.......................] - ETA: 1:34 - loss: 1.6737 - regression_loss: 1.4120 - classification_loss: 0.2617 125/500 [======>.......................] - ETA: 1:33 - loss: 1.6738 - regression_loss: 1.4121 - classification_loss: 0.2616 126/500 [======>.......................] - ETA: 1:33 - loss: 1.6740 - regression_loss: 1.4122 - classification_loss: 0.2617 127/500 [======>.......................] - ETA: 1:33 - loss: 1.6718 - regression_loss: 1.4104 - classification_loss: 0.2614 128/500 [======>.......................] - ETA: 1:32 - loss: 1.6723 - regression_loss: 1.4104 - classification_loss: 0.2619 129/500 [======>.......................] - ETA: 1:32 - loss: 1.6689 - regression_loss: 1.4080 - classification_loss: 0.2609 130/500 [======>.......................] - ETA: 1:32 - loss: 1.6637 - regression_loss: 1.4040 - classification_loss: 0.2597 131/500 [======>.......................] - ETA: 1:32 - loss: 1.6585 - regression_loss: 1.3933 - classification_loss: 0.2652 132/500 [======>.......................] - ETA: 1:31 - loss: 1.6548 - regression_loss: 1.3907 - classification_loss: 0.2641 133/500 [======>.......................] - ETA: 1:31 - loss: 1.6557 - regression_loss: 1.3910 - classification_loss: 0.2647 134/500 [=======>......................] - ETA: 1:31 - loss: 1.6543 - regression_loss: 1.3900 - classification_loss: 0.2643 135/500 [=======>......................] - ETA: 1:31 - loss: 1.6534 - regression_loss: 1.3888 - classification_loss: 0.2646 136/500 [=======>......................] - ETA: 1:30 - loss: 1.6519 - regression_loss: 1.3876 - classification_loss: 0.2643 137/500 [=======>......................] 
- ETA: 1:30 - loss: 1.6450 - regression_loss: 1.3821 - classification_loss: 0.2629 138/500 [=======>......................] - ETA: 1:30 - loss: 1.6445 - regression_loss: 1.3821 - classification_loss: 0.2624 139/500 [=======>......................] - ETA: 1:30 - loss: 1.6425 - regression_loss: 1.3806 - classification_loss: 0.2619 140/500 [=======>......................] - ETA: 1:29 - loss: 1.6408 - regression_loss: 1.3792 - classification_loss: 0.2616 141/500 [=======>......................] - ETA: 1:29 - loss: 1.6396 - regression_loss: 1.3787 - classification_loss: 0.2610 142/500 [=======>......................] - ETA: 1:29 - loss: 1.6370 - regression_loss: 1.3766 - classification_loss: 0.2604 143/500 [=======>......................] - ETA: 1:29 - loss: 1.6388 - regression_loss: 1.3793 - classification_loss: 0.2595 144/500 [=======>......................] - ETA: 1:28 - loss: 1.6407 - regression_loss: 1.3805 - classification_loss: 0.2602 145/500 [=======>......................] - ETA: 1:28 - loss: 1.6416 - regression_loss: 1.3812 - classification_loss: 0.2605 146/500 [=======>......................] - ETA: 1:28 - loss: 1.6410 - regression_loss: 1.3812 - classification_loss: 0.2598 147/500 [=======>......................] - ETA: 1:28 - loss: 1.6400 - regression_loss: 1.3800 - classification_loss: 0.2601 148/500 [=======>......................] - ETA: 1:27 - loss: 1.6361 - regression_loss: 1.3770 - classification_loss: 0.2591 149/500 [=======>......................] - ETA: 1:27 - loss: 1.6297 - regression_loss: 1.3715 - classification_loss: 0.2582 150/500 [========>.....................] - ETA: 1:27 - loss: 1.6299 - regression_loss: 1.3720 - classification_loss: 0.2580 151/500 [========>.....................] - ETA: 1:27 - loss: 1.6286 - regression_loss: 1.3711 - classification_loss: 0.2575 152/500 [========>.....................] - ETA: 1:26 - loss: 1.6311 - regression_loss: 1.3727 - classification_loss: 0.2583 153/500 [========>.....................] 
- ETA: 1:26 - loss: 1.6291 - regression_loss: 1.3710 - classification_loss: 0.2581 154/500 [========>.....................] - ETA: 1:26 - loss: 1.6282 - regression_loss: 1.3705 - classification_loss: 0.2577 155/500 [========>.....................] - ETA: 1:26 - loss: 1.6261 - regression_loss: 1.3689 - classification_loss: 0.2572 156/500 [========>.....................] - ETA: 1:25 - loss: 1.6285 - regression_loss: 1.3715 - classification_loss: 0.2569 157/500 [========>.....................] - ETA: 1:25 - loss: 1.6256 - regression_loss: 1.3694 - classification_loss: 0.2562 158/500 [========>.....................] - ETA: 1:25 - loss: 1.6238 - regression_loss: 1.3683 - classification_loss: 0.2555 159/500 [========>.....................] - ETA: 1:25 - loss: 1.6235 - regression_loss: 1.3685 - classification_loss: 0.2550 160/500 [========>.....................] - ETA: 1:24 - loss: 1.6204 - regression_loss: 1.3661 - classification_loss: 0.2542 161/500 [========>.....................] - ETA: 1:24 - loss: 1.6227 - regression_loss: 1.3656 - classification_loss: 0.2572 162/500 [========>.....................] - ETA: 1:24 - loss: 1.6220 - regression_loss: 1.3649 - classification_loss: 0.2571 163/500 [========>.....................] - ETA: 1:24 - loss: 1.6213 - regression_loss: 1.3643 - classification_loss: 0.2570 164/500 [========>.....................] - ETA: 1:23 - loss: 1.6239 - regression_loss: 1.3661 - classification_loss: 0.2579 165/500 [========>.....................] - ETA: 1:23 - loss: 1.6212 - regression_loss: 1.3638 - classification_loss: 0.2575 166/500 [========>.....................] - ETA: 1:23 - loss: 1.6220 - regression_loss: 1.3644 - classification_loss: 0.2576 167/500 [=========>....................] - ETA: 1:23 - loss: 1.6224 - regression_loss: 1.3647 - classification_loss: 0.2577 168/500 [=========>....................] - ETA: 1:23 - loss: 1.6190 - regression_loss: 1.3619 - classification_loss: 0.2571 169/500 [=========>....................] 
- ETA: 1:22 - loss: 1.6191 - regression_loss: 1.3620 - classification_loss: 0.2571 170/500 [=========>....................] - ETA: 1:22 - loss: 1.6155 - regression_loss: 1.3593 - classification_loss: 0.2562 171/500 [=========>....................] - ETA: 1:22 - loss: 1.6167 - regression_loss: 1.3601 - classification_loss: 0.2566 172/500 [=========>....................] - ETA: 1:22 - loss: 1.6251 - regression_loss: 1.3670 - classification_loss: 0.2581 173/500 [=========>....................] - ETA: 1:21 - loss: 1.6290 - regression_loss: 1.3697 - classification_loss: 0.2593 174/500 [=========>....................] - ETA: 1:21 - loss: 1.6294 - regression_loss: 1.3704 - classification_loss: 0.2591 175/500 [=========>....................] - ETA: 1:21 - loss: 1.6267 - regression_loss: 1.3679 - classification_loss: 0.2587 176/500 [=========>....................] - ETA: 1:21 - loss: 1.6281 - regression_loss: 1.3686 - classification_loss: 0.2595 177/500 [=========>....................] - ETA: 1:20 - loss: 1.6299 - regression_loss: 1.3699 - classification_loss: 0.2600 178/500 [=========>....................] - ETA: 1:20 - loss: 1.6241 - regression_loss: 1.3654 - classification_loss: 0.2588 179/500 [=========>....................] - ETA: 1:20 - loss: 1.6272 - regression_loss: 1.3677 - classification_loss: 0.2595 180/500 [=========>....................] - ETA: 1:20 - loss: 1.6270 - regression_loss: 1.3677 - classification_loss: 0.2593 181/500 [=========>....................] - ETA: 1:19 - loss: 1.6249 - regression_loss: 1.3662 - classification_loss: 0.2587 182/500 [=========>....................] - ETA: 1:19 - loss: 1.6249 - regression_loss: 1.3666 - classification_loss: 0.2582 183/500 [=========>....................] - ETA: 1:19 - loss: 1.6279 - regression_loss: 1.3693 - classification_loss: 0.2586 184/500 [==========>...................] - ETA: 1:19 - loss: 1.6261 - regression_loss: 1.3675 - classification_loss: 0.2586 185/500 [==========>...................] 
- ETA: 1:18 - loss: 1.6293 - regression_loss: 1.3707 - classification_loss: 0.2586 186/500 [==========>...................] - ETA: 1:18 - loss: 1.6229 - regression_loss: 1.3655 - classification_loss: 0.2574 187/500 [==========>...................] - ETA: 1:18 - loss: 1.6228 - regression_loss: 1.3651 - classification_loss: 0.2576 188/500 [==========>...................] - ETA: 1:18 - loss: 1.6243 - regression_loss: 1.3665 - classification_loss: 0.2578 189/500 [==========>...................] - ETA: 1:17 - loss: 1.6239 - regression_loss: 1.3662 - classification_loss: 0.2577 190/500 [==========>...................] - ETA: 1:17 - loss: 1.6240 - regression_loss: 1.3661 - classification_loss: 0.2578 191/500 [==========>...................] - ETA: 1:17 - loss: 1.6249 - regression_loss: 1.3668 - classification_loss: 0.2581 192/500 [==========>...................] - ETA: 1:17 - loss: 1.6253 - regression_loss: 1.3670 - classification_loss: 0.2583 193/500 [==========>...................] - ETA: 1:16 - loss: 1.6245 - regression_loss: 1.3664 - classification_loss: 0.2581 194/500 [==========>...................] - ETA: 1:16 - loss: 1.6244 - regression_loss: 1.3662 - classification_loss: 0.2583 195/500 [==========>...................] - ETA: 1:16 - loss: 1.6208 - regression_loss: 1.3632 - classification_loss: 0.2575 196/500 [==========>...................] - ETA: 1:16 - loss: 1.6229 - regression_loss: 1.3651 - classification_loss: 0.2578 197/500 [==========>...................] - ETA: 1:15 - loss: 1.6227 - regression_loss: 1.3652 - classification_loss: 0.2575 198/500 [==========>...................] - ETA: 1:15 - loss: 1.6312 - regression_loss: 1.3722 - classification_loss: 0.2591 199/500 [==========>...................] - ETA: 1:15 - loss: 1.6325 - regression_loss: 1.3730 - classification_loss: 0.2595 200/500 [===========>..................] - ETA: 1:15 - loss: 1.6316 - regression_loss: 1.3725 - classification_loss: 0.2591 201/500 [===========>..................] 
- ETA: 1:14 - loss: 1.6445 - regression_loss: 1.3657 - classification_loss: 0.2788 202/500 [===========>..................] - ETA: 1:14 - loss: 1.6423 - regression_loss: 1.3640 - classification_loss: 0.2783 203/500 [===========>..................] - ETA: 1:14 - loss: 1.6400 - regression_loss: 1.3623 - classification_loss: 0.2777 204/500 [===========>..................] - ETA: 1:14 - loss: 1.6382 - regression_loss: 1.3612 - classification_loss: 0.2770 205/500 [===========>..................] - ETA: 1:13 - loss: 1.6379 - regression_loss: 1.3613 - classification_loss: 0.2766 206/500 [===========>..................] - ETA: 1:13 - loss: 1.6415 - regression_loss: 1.3643 - classification_loss: 0.2772 207/500 [===========>..................] - ETA: 1:13 - loss: 1.6397 - regression_loss: 1.3630 - classification_loss: 0.2767 208/500 [===========>..................] - ETA: 1:13 - loss: 1.6382 - regression_loss: 1.3621 - classification_loss: 0.2761 209/500 [===========>..................] - ETA: 1:12 - loss: 1.6404 - regression_loss: 1.3637 - classification_loss: 0.2767 210/500 [===========>..................] - ETA: 1:12 - loss: 1.6404 - regression_loss: 1.3635 - classification_loss: 0.2768 211/500 [===========>..................] - ETA: 1:12 - loss: 1.6441 - regression_loss: 1.3665 - classification_loss: 0.2776 212/500 [===========>..................] - ETA: 1:12 - loss: 1.6467 - regression_loss: 1.3682 - classification_loss: 0.2785 213/500 [===========>..................] - ETA: 1:11 - loss: 1.6450 - regression_loss: 1.3667 - classification_loss: 0.2782 214/500 [===========>..................] - ETA: 1:11 - loss: 1.6441 - regression_loss: 1.3660 - classification_loss: 0.2781 215/500 [===========>..................] - ETA: 1:11 - loss: 1.6457 - regression_loss: 1.3674 - classification_loss: 0.2783 216/500 [===========>..................] - ETA: 1:11 - loss: 1.6451 - regression_loss: 1.3672 - classification_loss: 0.2779 217/500 [============>.................] 
- ETA: 1:10 - loss: 1.6440 - regression_loss: 1.3663 - classification_loss: 0.2778 218/500 [============>.................] - ETA: 1:10 - loss: 1.6441 - regression_loss: 1.3662 - classification_loss: 0.2779 219/500 [============>.................] - ETA: 1:10 - loss: 1.6471 - regression_loss: 1.3682 - classification_loss: 0.2789 220/500 [============>.................] - ETA: 1:10 - loss: 1.6463 - regression_loss: 1.3679 - classification_loss: 0.2784 221/500 [============>.................] - ETA: 1:09 - loss: 1.6487 - regression_loss: 1.3698 - classification_loss: 0.2789 222/500 [============>.................] - ETA: 1:09 - loss: 1.6464 - regression_loss: 1.3683 - classification_loss: 0.2781 223/500 [============>.................] - ETA: 1:09 - loss: 1.6458 - regression_loss: 1.3679 - classification_loss: 0.2779 224/500 [============>.................] - ETA: 1:09 - loss: 1.6459 - regression_loss: 1.3682 - classification_loss: 0.2777 225/500 [============>.................] - ETA: 1:08 - loss: 1.6453 - regression_loss: 1.3673 - classification_loss: 0.2780 226/500 [============>.................] - ETA: 1:08 - loss: 1.6461 - regression_loss: 1.3681 - classification_loss: 0.2780 227/500 [============>.................] - ETA: 1:08 - loss: 1.6484 - regression_loss: 1.3698 - classification_loss: 0.2785 228/500 [============>.................] - ETA: 1:08 - loss: 1.6478 - regression_loss: 1.3697 - classification_loss: 0.2781 229/500 [============>.................] - ETA: 1:07 - loss: 1.6479 - regression_loss: 1.3700 - classification_loss: 0.2779 230/500 [============>.................] - ETA: 1:07 - loss: 1.6492 - regression_loss: 1.3708 - classification_loss: 0.2785 231/500 [============>.................] - ETA: 1:07 - loss: 1.6458 - regression_loss: 1.3677 - classification_loss: 0.2781 232/500 [============>.................] - ETA: 1:07 - loss: 1.6472 - regression_loss: 1.3689 - classification_loss: 0.2783 233/500 [============>.................] 
- ETA: 1:06 - loss: 1.6472 - regression_loss: 1.3694 - classification_loss: 0.2778 234/500 [=============>................] - ETA: 1:06 - loss: 1.6444 - regression_loss: 1.3673 - classification_loss: 0.2771 235/500 [=============>................] - ETA: 1:06 - loss: 1.6457 - regression_loss: 1.3685 - classification_loss: 0.2772 236/500 [=============>................] - ETA: 1:06 - loss: 1.6447 - regression_loss: 1.3681 - classification_loss: 0.2766 237/500 [=============>................] - ETA: 1:05 - loss: 1.6426 - regression_loss: 1.3666 - classification_loss: 0.2760 238/500 [=============>................] - ETA: 1:05 - loss: 1.6444 - regression_loss: 1.3680 - classification_loss: 0.2764 239/500 [=============>................] - ETA: 1:05 - loss: 1.6450 - regression_loss: 1.3687 - classification_loss: 0.2764 240/500 [=============>................] - ETA: 1:05 - loss: 1.6399 - regression_loss: 1.3646 - classification_loss: 0.2753 241/500 [=============>................] - ETA: 1:04 - loss: 1.6385 - regression_loss: 1.3634 - classification_loss: 0.2751 242/500 [=============>................] - ETA: 1:04 - loss: 1.6375 - regression_loss: 1.3627 - classification_loss: 0.2748 243/500 [=============>................] - ETA: 1:04 - loss: 1.6366 - regression_loss: 1.3620 - classification_loss: 0.2745 244/500 [=============>................] - ETA: 1:04 - loss: 1.6338 - regression_loss: 1.3600 - classification_loss: 0.2739 245/500 [=============>................] - ETA: 1:03 - loss: 1.6342 - regression_loss: 1.3602 - classification_loss: 0.2740 246/500 [=============>................] - ETA: 1:03 - loss: 1.6327 - regression_loss: 1.3590 - classification_loss: 0.2736 247/500 [=============>................] - ETA: 1:03 - loss: 1.6328 - regression_loss: 1.3592 - classification_loss: 0.2735 248/500 [=============>................] - ETA: 1:03 - loss: 1.6331 - regression_loss: 1.3592 - classification_loss: 0.2739 249/500 [=============>................] 
- ETA: 1:02 - loss: 1.6362 - regression_loss: 1.3615 - classification_loss: 0.2747 250/500 [==============>...............] - ETA: 1:02 - loss: 1.6384 - regression_loss: 1.3634 - classification_loss: 0.2750 251/500 [==============>...............] - ETA: 1:02 - loss: 1.6389 - regression_loss: 1.3633 - classification_loss: 0.2755 252/500 [==============>...............] - ETA: 1:02 - loss: 1.6409 - regression_loss: 1.3655 - classification_loss: 0.2754 253/500 [==============>...............] - ETA: 1:01 - loss: 1.6442 - regression_loss: 1.3677 - classification_loss: 0.2765 254/500 [==============>...............] - ETA: 1:01 - loss: 1.6407 - regression_loss: 1.3649 - classification_loss: 0.2758 255/500 [==============>...............] - ETA: 1:01 - loss: 1.6414 - regression_loss: 1.3658 - classification_loss: 0.2756 256/500 [==============>...............] - ETA: 1:01 - loss: 1.6390 - regression_loss: 1.3640 - classification_loss: 0.2750 257/500 [==============>...............] - ETA: 1:00 - loss: 1.6352 - regression_loss: 1.3610 - classification_loss: 0.2742 258/500 [==============>...............] - ETA: 1:00 - loss: 1.6344 - regression_loss: 1.3610 - classification_loss: 0.2734 259/500 [==============>...............] - ETA: 1:00 - loss: 1.6303 - regression_loss: 1.3577 - classification_loss: 0.2726 260/500 [==============>...............] - ETA: 59s - loss: 1.6297 - regression_loss: 1.3574 - classification_loss: 0.2723  261/500 [==============>...............] - ETA: 59s - loss: 1.6306 - regression_loss: 1.3582 - classification_loss: 0.2723 262/500 [==============>...............] - ETA: 59s - loss: 1.6288 - regression_loss: 1.3570 - classification_loss: 0.2718 263/500 [==============>...............] - ETA: 59s - loss: 1.6288 - regression_loss: 1.3574 - classification_loss: 0.2713 264/500 [==============>...............] - ETA: 58s - loss: 1.6282 - regression_loss: 1.3572 - classification_loss: 0.2710 265/500 [==============>...............] 
- ETA: 58s - loss: 1.6283 - regression_loss: 1.3570 - classification_loss: 0.2712 266/500 [==============>...............] - ETA: 58s - loss: 1.6281 - regression_loss: 1.3572 - classification_loss: 0.2710 267/500 [===============>..............] - ETA: 58s - loss: 1.6253 - regression_loss: 1.3548 - classification_loss: 0.2704 268/500 [===============>..............] - ETA: 57s - loss: 1.6251 - regression_loss: 1.3547 - classification_loss: 0.2704 269/500 [===============>..............] - ETA: 57s - loss: 1.6262 - regression_loss: 1.3560 - classification_loss: 0.2703 270/500 [===============>..............] - ETA: 57s - loss: 1.6286 - regression_loss: 1.3581 - classification_loss: 0.2705 271/500 [===============>..............] - ETA: 57s - loss: 1.6299 - regression_loss: 1.3592 - classification_loss: 0.2707 272/500 [===============>..............] - ETA: 56s - loss: 1.6291 - regression_loss: 1.3586 - classification_loss: 0.2705 273/500 [===============>..............] - ETA: 56s - loss: 1.6290 - regression_loss: 1.3586 - classification_loss: 0.2704 274/500 [===============>..............] - ETA: 56s - loss: 1.6301 - regression_loss: 1.3595 - classification_loss: 0.2706 275/500 [===============>..............] - ETA: 56s - loss: 1.6301 - regression_loss: 1.3597 - classification_loss: 0.2704 276/500 [===============>..............] - ETA: 55s - loss: 1.6322 - regression_loss: 1.3614 - classification_loss: 0.2707 277/500 [===============>..............] - ETA: 55s - loss: 1.6349 - regression_loss: 1.3632 - classification_loss: 0.2717 278/500 [===============>..............] - ETA: 55s - loss: 1.6349 - regression_loss: 1.3630 - classification_loss: 0.2719 279/500 [===============>..............] - ETA: 55s - loss: 1.6341 - regression_loss: 1.3625 - classification_loss: 0.2716 280/500 [===============>..............] - ETA: 54s - loss: 1.6336 - regression_loss: 1.3623 - classification_loss: 0.2712 281/500 [===============>..............] 
- ETA: 54s - loss: 1.6324 - regression_loss: 1.3616 - classification_loss: 0.2708 282/500 [===============>..............] - ETA: 54s - loss: 1.6312 - regression_loss: 1.3604 - classification_loss: 0.2707 283/500 [===============>..............] - ETA: 54s - loss: 1.6318 - regression_loss: 1.3609 - classification_loss: 0.2708 284/500 [================>.............] - ETA: 53s - loss: 1.6319 - regression_loss: 1.3614 - classification_loss: 0.2705 285/500 [================>.............] - ETA: 53s - loss: 1.6309 - regression_loss: 1.3608 - classification_loss: 0.2701 286/500 [================>.............] - ETA: 53s - loss: 1.6329 - regression_loss: 1.3626 - classification_loss: 0.2703 287/500 [================>.............] - ETA: 53s - loss: 1.6309 - regression_loss: 1.3609 - classification_loss: 0.2700 288/500 [================>.............] - ETA: 52s - loss: 1.6287 - regression_loss: 1.3593 - classification_loss: 0.2694 289/500 [================>.............] - ETA: 52s - loss: 1.6280 - regression_loss: 1.3590 - classification_loss: 0.2690 290/500 [================>.............] - ETA: 52s - loss: 1.6259 - regression_loss: 1.3576 - classification_loss: 0.2683 291/500 [================>.............] - ETA: 51s - loss: 1.6273 - regression_loss: 1.3587 - classification_loss: 0.2685 292/500 [================>.............] - ETA: 51s - loss: 1.6278 - regression_loss: 1.3594 - classification_loss: 0.2683 293/500 [================>.............] - ETA: 51s - loss: 1.6281 - regression_loss: 1.3599 - classification_loss: 0.2682 294/500 [================>.............] - ETA: 51s - loss: 1.6304 - regression_loss: 1.3616 - classification_loss: 0.2688 295/500 [================>.............] - ETA: 50s - loss: 1.6297 - regression_loss: 1.3612 - classification_loss: 0.2685 296/500 [================>.............] - ETA: 50s - loss: 1.6309 - regression_loss: 1.3622 - classification_loss: 0.2687 297/500 [================>.............] 
- ETA: 50s - loss: 1.6287 - regression_loss: 1.3604 - classification_loss: 0.2683 298/500 [================>.............] - ETA: 50s - loss: 1.6323 - regression_loss: 1.3637 - classification_loss: 0.2686 299/500 [================>.............] - ETA: 49s - loss: 1.6334 - regression_loss: 1.3645 - classification_loss: 0.2689 300/500 [=================>............] - ETA: 49s - loss: 1.6340 - regression_loss: 1.3653 - classification_loss: 0.2687 301/500 [=================>............] - ETA: 49s - loss: 1.6307 - regression_loss: 1.3627 - classification_loss: 0.2680 302/500 [=================>............] - ETA: 49s - loss: 1.6305 - regression_loss: 1.3628 - classification_loss: 0.2677 303/500 [=================>............] - ETA: 48s - loss: 1.6320 - regression_loss: 1.3640 - classification_loss: 0.2680 304/500 [=================>............] - ETA: 48s - loss: 1.6319 - regression_loss: 1.3640 - classification_loss: 0.2679 305/500 [=================>............] - ETA: 48s - loss: 1.6340 - regression_loss: 1.3653 - classification_loss: 0.2686 306/500 [=================>............] - ETA: 48s - loss: 1.6322 - regression_loss: 1.3640 - classification_loss: 0.2682 307/500 [=================>............] - ETA: 47s - loss: 1.6322 - regression_loss: 1.3639 - classification_loss: 0.2683 308/500 [=================>............] - ETA: 47s - loss: 1.6319 - regression_loss: 1.3630 - classification_loss: 0.2689 309/500 [=================>............] - ETA: 47s - loss: 1.6307 - regression_loss: 1.3621 - classification_loss: 0.2686 310/500 [=================>............] - ETA: 47s - loss: 1.6304 - regression_loss: 1.3622 - classification_loss: 0.2682 311/500 [=================>............] - ETA: 46s - loss: 1.6284 - regression_loss: 1.3607 - classification_loss: 0.2677 312/500 [=================>............] - ETA: 46s - loss: 1.6284 - regression_loss: 1.3606 - classification_loss: 0.2678 313/500 [=================>............] 
- ETA: 46s - loss: 1.6281 - regression_loss: 1.3606 - classification_loss: 0.2675 314/500 [=================>............] - ETA: 46s - loss: 1.6289 - regression_loss: 1.3616 - classification_loss: 0.2672 315/500 [=================>............] - ETA: 45s - loss: 1.6253 - regression_loss: 1.3585 - classification_loss: 0.2667 316/500 [=================>............] - ETA: 45s - loss: 1.6219 - regression_loss: 1.3557 - classification_loss: 0.2662 317/500 [==================>...........] - ETA: 45s - loss: 1.6221 - regression_loss: 1.3561 - classification_loss: 0.2661 318/500 [==================>...........] - ETA: 45s - loss: 1.6217 - regression_loss: 1.3556 - classification_loss: 0.2660 319/500 [==================>...........] - ETA: 44s - loss: 1.6211 - regression_loss: 1.3554 - classification_loss: 0.2657 320/500 [==================>...........] - ETA: 44s - loss: 1.6211 - regression_loss: 1.3555 - classification_loss: 0.2655 321/500 [==================>...........] - ETA: 44s - loss: 1.6192 - regression_loss: 1.3542 - classification_loss: 0.2650 322/500 [==================>...........] - ETA: 44s - loss: 1.6187 - regression_loss: 1.3535 - classification_loss: 0.2651 323/500 [==================>...........] - ETA: 43s - loss: 1.6174 - regression_loss: 1.3523 - classification_loss: 0.2651 324/500 [==================>...........] - ETA: 43s - loss: 1.6180 - regression_loss: 1.3527 - classification_loss: 0.2653 325/500 [==================>...........] - ETA: 43s - loss: 1.6167 - regression_loss: 1.3518 - classification_loss: 0.2649 326/500 [==================>...........] - ETA: 43s - loss: 1.6180 - regression_loss: 1.3530 - classification_loss: 0.2651 327/500 [==================>...........] - ETA: 42s - loss: 1.6203 - regression_loss: 1.3548 - classification_loss: 0.2655 328/500 [==================>...........] - ETA: 42s - loss: 1.6213 - regression_loss: 1.3553 - classification_loss: 0.2660 329/500 [==================>...........] 
500/500 [==============================] - 124s 248ms/step - loss: 1.6011 - regression_loss: 1.3372 - classification_loss: 0.2639
326 instances of class plum with average precision: 0.7817
mAP: 0.7817
Epoch 00062: saving model to ./training/snapshots/resnet50_pascal_62.h5
Epoch 63/150
163/500 [========>.....................] - ETA: 1:23 - loss: 1.5880 - regression_loss: 1.3241 - classification_loss: 0.2640 164/500 [========>.....................] 
- ETA: 1:22 - loss: 1.5858 - regression_loss: 1.3223 - classification_loss: 0.2634 165/500 [========>.....................] - ETA: 1:22 - loss: 1.5884 - regression_loss: 1.3243 - classification_loss: 0.2641 166/500 [========>.....................] - ETA: 1:22 - loss: 1.5894 - regression_loss: 1.3254 - classification_loss: 0.2640 167/500 [=========>....................] - ETA: 1:22 - loss: 1.5843 - regression_loss: 1.3211 - classification_loss: 0.2632 168/500 [=========>....................] - ETA: 1:21 - loss: 1.5881 - regression_loss: 1.3244 - classification_loss: 0.2637 169/500 [=========>....................] - ETA: 1:21 - loss: 1.5869 - regression_loss: 1.3236 - classification_loss: 0.2633 170/500 [=========>....................] - ETA: 1:21 - loss: 1.5859 - regression_loss: 1.3231 - classification_loss: 0.2628 171/500 [=========>....................] - ETA: 1:21 - loss: 1.5863 - regression_loss: 1.3239 - classification_loss: 0.2625 172/500 [=========>....................] - ETA: 1:20 - loss: 1.5872 - regression_loss: 1.3252 - classification_loss: 0.2620 173/500 [=========>....................] - ETA: 1:20 - loss: 1.5898 - regression_loss: 1.3274 - classification_loss: 0.2625 174/500 [=========>....................] - ETA: 1:20 - loss: 1.5907 - regression_loss: 1.3282 - classification_loss: 0.2625 175/500 [=========>....................] - ETA: 1:20 - loss: 1.5909 - regression_loss: 1.3290 - classification_loss: 0.2619 176/500 [=========>....................] - ETA: 1:19 - loss: 1.5908 - regression_loss: 1.3292 - classification_loss: 0.2616 177/500 [=========>....................] - ETA: 1:19 - loss: 1.5900 - regression_loss: 1.3288 - classification_loss: 0.2611 178/500 [=========>....................] - ETA: 1:19 - loss: 1.5904 - regression_loss: 1.3293 - classification_loss: 0.2610 179/500 [=========>....................] - ETA: 1:19 - loss: 1.5871 - regression_loss: 1.3264 - classification_loss: 0.2607 180/500 [=========>....................] 
- ETA: 1:18 - loss: 1.5864 - regression_loss: 1.3261 - classification_loss: 0.2603 181/500 [=========>....................] - ETA: 1:18 - loss: 1.5895 - regression_loss: 1.3290 - classification_loss: 0.2604 182/500 [=========>....................] - ETA: 1:18 - loss: 1.5864 - regression_loss: 1.3265 - classification_loss: 0.2599 183/500 [=========>....................] - ETA: 1:18 - loss: 1.5860 - regression_loss: 1.3262 - classification_loss: 0.2598 184/500 [==========>...................] - ETA: 1:18 - loss: 1.5812 - regression_loss: 1.3224 - classification_loss: 0.2588 185/500 [==========>...................] - ETA: 1:17 - loss: 1.5864 - regression_loss: 1.3270 - classification_loss: 0.2594 186/500 [==========>...................] - ETA: 1:17 - loss: 1.5879 - regression_loss: 1.3283 - classification_loss: 0.2595 187/500 [==========>...................] - ETA: 1:17 - loss: 1.5858 - regression_loss: 1.3266 - classification_loss: 0.2591 188/500 [==========>...................] - ETA: 1:17 - loss: 1.5872 - regression_loss: 1.3283 - classification_loss: 0.2589 189/500 [==========>...................] - ETA: 1:16 - loss: 1.5887 - regression_loss: 1.3292 - classification_loss: 0.2595 190/500 [==========>...................] - ETA: 1:16 - loss: 1.5869 - regression_loss: 1.3278 - classification_loss: 0.2591 191/500 [==========>...................] - ETA: 1:16 - loss: 1.5884 - regression_loss: 1.3292 - classification_loss: 0.2592 192/500 [==========>...................] - ETA: 1:15 - loss: 1.5871 - regression_loss: 1.3285 - classification_loss: 0.2586 193/500 [==========>...................] - ETA: 1:15 - loss: 1.5862 - regression_loss: 1.3280 - classification_loss: 0.2582 194/500 [==========>...................] - ETA: 1:15 - loss: 1.5830 - regression_loss: 1.3259 - classification_loss: 0.2572 195/500 [==========>...................] - ETA: 1:15 - loss: 1.5815 - regression_loss: 1.3246 - classification_loss: 0.2569 196/500 [==========>...................] 
- ETA: 1:14 - loss: 1.5763 - regression_loss: 1.3202 - classification_loss: 0.2561 197/500 [==========>...................] - ETA: 1:14 - loss: 1.5768 - regression_loss: 1.3207 - classification_loss: 0.2561 198/500 [==========>...................] - ETA: 1:14 - loss: 1.5732 - regression_loss: 1.3176 - classification_loss: 0.2556 199/500 [==========>...................] - ETA: 1:14 - loss: 1.5703 - regression_loss: 1.3153 - classification_loss: 0.2550 200/500 [===========>..................] - ETA: 1:13 - loss: 1.5701 - regression_loss: 1.3150 - classification_loss: 0.2552 201/500 [===========>..................] - ETA: 1:13 - loss: 1.5678 - regression_loss: 1.3132 - classification_loss: 0.2546 202/500 [===========>..................] - ETA: 1:13 - loss: 1.5686 - regression_loss: 1.3138 - classification_loss: 0.2548 203/500 [===========>..................] - ETA: 1:13 - loss: 1.5719 - regression_loss: 1.3162 - classification_loss: 0.2557 204/500 [===========>..................] - ETA: 1:13 - loss: 1.5695 - regression_loss: 1.3146 - classification_loss: 0.2549 205/500 [===========>..................] - ETA: 1:12 - loss: 1.5704 - regression_loss: 1.3158 - classification_loss: 0.2546 206/500 [===========>..................] - ETA: 1:12 - loss: 1.5681 - regression_loss: 1.3141 - classification_loss: 0.2540 207/500 [===========>..................] - ETA: 1:12 - loss: 1.5695 - regression_loss: 1.3153 - classification_loss: 0.2542 208/500 [===========>..................] - ETA: 1:12 - loss: 1.5686 - regression_loss: 1.3148 - classification_loss: 0.2538 209/500 [===========>..................] - ETA: 1:11 - loss: 1.5680 - regression_loss: 1.3145 - classification_loss: 0.2535 210/500 [===========>..................] - ETA: 1:11 - loss: 1.5644 - regression_loss: 1.3115 - classification_loss: 0.2529 211/500 [===========>..................] - ETA: 1:11 - loss: 1.5642 - regression_loss: 1.3117 - classification_loss: 0.2525 212/500 [===========>..................] 
- ETA: 1:11 - loss: 1.5648 - regression_loss: 1.3123 - classification_loss: 0.2525 213/500 [===========>..................] - ETA: 1:10 - loss: 1.5657 - regression_loss: 1.3133 - classification_loss: 0.2524 214/500 [===========>..................] - ETA: 1:10 - loss: 1.5641 - regression_loss: 1.3119 - classification_loss: 0.2522 215/500 [===========>..................] - ETA: 1:10 - loss: 1.5673 - regression_loss: 1.3150 - classification_loss: 0.2524 216/500 [===========>..................] - ETA: 1:10 - loss: 1.5654 - regression_loss: 1.3135 - classification_loss: 0.2519 217/500 [============>.................] - ETA: 1:09 - loss: 1.5687 - regression_loss: 1.3154 - classification_loss: 0.2533 218/500 [============>.................] - ETA: 1:09 - loss: 1.5684 - regression_loss: 1.3152 - classification_loss: 0.2532 219/500 [============>.................] - ETA: 1:09 - loss: 1.5662 - regression_loss: 1.3138 - classification_loss: 0.2524 220/500 [============>.................] - ETA: 1:09 - loss: 1.5684 - regression_loss: 1.3153 - classification_loss: 0.2531 221/500 [============>.................] - ETA: 1:08 - loss: 1.5682 - regression_loss: 1.3154 - classification_loss: 0.2528 222/500 [============>.................] - ETA: 1:08 - loss: 1.5679 - regression_loss: 1.3155 - classification_loss: 0.2524 223/500 [============>.................] - ETA: 1:08 - loss: 1.5692 - regression_loss: 1.3169 - classification_loss: 0.2523 224/500 [============>.................] - ETA: 1:08 - loss: 1.5704 - regression_loss: 1.3176 - classification_loss: 0.2527 225/500 [============>.................] - ETA: 1:07 - loss: 1.5696 - regression_loss: 1.3173 - classification_loss: 0.2523 226/500 [============>.................] - ETA: 1:07 - loss: 1.5722 - regression_loss: 1.3192 - classification_loss: 0.2530 227/500 [============>.................] - ETA: 1:07 - loss: 1.5713 - regression_loss: 1.3186 - classification_loss: 0.2527 228/500 [============>.................] 
- ETA: 1:07 - loss: 1.5701 - regression_loss: 1.3177 - classification_loss: 0.2524 229/500 [============>.................] - ETA: 1:06 - loss: 1.5714 - regression_loss: 1.3190 - classification_loss: 0.2524 230/500 [============>.................] - ETA: 1:06 - loss: 1.5699 - regression_loss: 1.3179 - classification_loss: 0.2519 231/500 [============>.................] - ETA: 1:06 - loss: 1.5706 - regression_loss: 1.3184 - classification_loss: 0.2522 232/500 [============>.................] - ETA: 1:06 - loss: 1.5720 - regression_loss: 1.3194 - classification_loss: 0.2525 233/500 [============>.................] - ETA: 1:05 - loss: 1.5733 - regression_loss: 1.3205 - classification_loss: 0.2529 234/500 [=============>................] - ETA: 1:05 - loss: 1.5724 - regression_loss: 1.3199 - classification_loss: 0.2525 235/500 [=============>................] - ETA: 1:05 - loss: 1.5706 - regression_loss: 1.3182 - classification_loss: 0.2523 236/500 [=============>................] - ETA: 1:05 - loss: 1.5718 - regression_loss: 1.3183 - classification_loss: 0.2535 237/500 [=============>................] - ETA: 1:04 - loss: 1.5723 - regression_loss: 1.3186 - classification_loss: 0.2537 238/500 [=============>................] - ETA: 1:04 - loss: 1.5741 - regression_loss: 1.3203 - classification_loss: 0.2538 239/500 [=============>................] - ETA: 1:04 - loss: 1.5708 - regression_loss: 1.3176 - classification_loss: 0.2532 240/500 [=============>................] - ETA: 1:04 - loss: 1.5735 - regression_loss: 1.3195 - classification_loss: 0.2539 241/500 [=============>................] - ETA: 1:03 - loss: 1.5736 - regression_loss: 1.3198 - classification_loss: 0.2538 242/500 [=============>................] - ETA: 1:03 - loss: 1.5694 - regression_loss: 1.3164 - classification_loss: 0.2530 243/500 [=============>................] - ETA: 1:03 - loss: 1.5678 - regression_loss: 1.3151 - classification_loss: 0.2527 244/500 [=============>................] 
- ETA: 1:03 - loss: 1.5727 - regression_loss: 1.3185 - classification_loss: 0.2543 245/500 [=============>................] - ETA: 1:02 - loss: 1.5712 - regression_loss: 1.3172 - classification_loss: 0.2540 246/500 [=============>................] - ETA: 1:02 - loss: 1.5685 - regression_loss: 1.3150 - classification_loss: 0.2535 247/500 [=============>................] - ETA: 1:02 - loss: 1.5674 - regression_loss: 1.3140 - classification_loss: 0.2533 248/500 [=============>................] - ETA: 1:02 - loss: 1.5680 - regression_loss: 1.3146 - classification_loss: 0.2533 249/500 [=============>................] - ETA: 1:01 - loss: 1.5737 - regression_loss: 1.3168 - classification_loss: 0.2569 250/500 [==============>...............] - ETA: 1:01 - loss: 1.5767 - regression_loss: 1.3192 - classification_loss: 0.2575 251/500 [==============>...............] - ETA: 1:01 - loss: 1.5811 - regression_loss: 1.3225 - classification_loss: 0.2586 252/500 [==============>...............] - ETA: 1:01 - loss: 1.5827 - regression_loss: 1.3238 - classification_loss: 0.2589 253/500 [==============>...............] - ETA: 1:01 - loss: 1.5847 - regression_loss: 1.3254 - classification_loss: 0.2592 254/500 [==============>...............] - ETA: 1:00 - loss: 1.5863 - regression_loss: 1.3269 - classification_loss: 0.2594 255/500 [==============>...............] - ETA: 1:00 - loss: 1.5857 - regression_loss: 1.3267 - classification_loss: 0.2591 256/500 [==============>...............] - ETA: 1:00 - loss: 1.5824 - regression_loss: 1.3239 - classification_loss: 0.2585 257/500 [==============>...............] - ETA: 1:00 - loss: 1.5841 - regression_loss: 1.3251 - classification_loss: 0.2590 258/500 [==============>...............] - ETA: 59s - loss: 1.5855 - regression_loss: 1.3256 - classification_loss: 0.2599  259/500 [==============>...............] - ETA: 59s - loss: 1.5840 - regression_loss: 1.3246 - classification_loss: 0.2594 260/500 [==============>...............] 
- ETA: 59s - loss: 1.5825 - regression_loss: 1.3236 - classification_loss: 0.2590 261/500 [==============>...............] - ETA: 59s - loss: 1.5830 - regression_loss: 1.3238 - classification_loss: 0.2591 262/500 [==============>...............] - ETA: 58s - loss: 1.5821 - regression_loss: 1.3233 - classification_loss: 0.2588 263/500 [==============>...............] - ETA: 58s - loss: 1.5778 - regression_loss: 1.3198 - classification_loss: 0.2580 264/500 [==============>...............] - ETA: 58s - loss: 1.5764 - regression_loss: 1.3183 - classification_loss: 0.2581 265/500 [==============>...............] - ETA: 58s - loss: 1.5751 - regression_loss: 1.3173 - classification_loss: 0.2578 266/500 [==============>...............] - ETA: 57s - loss: 1.5761 - regression_loss: 1.3181 - classification_loss: 0.2580 267/500 [===============>..............] - ETA: 57s - loss: 1.5765 - regression_loss: 1.3180 - classification_loss: 0.2585 268/500 [===============>..............] - ETA: 57s - loss: 1.5757 - regression_loss: 1.3173 - classification_loss: 0.2584 269/500 [===============>..............] - ETA: 57s - loss: 1.5746 - regression_loss: 1.3164 - classification_loss: 0.2581 270/500 [===============>..............] - ETA: 56s - loss: 1.5746 - regression_loss: 1.3166 - classification_loss: 0.2579 271/500 [===============>..............] - ETA: 56s - loss: 1.5725 - regression_loss: 1.3152 - classification_loss: 0.2573 272/500 [===============>..............] - ETA: 56s - loss: 1.5729 - regression_loss: 1.3153 - classification_loss: 0.2576 273/500 [===============>..............] - ETA: 56s - loss: 1.5740 - regression_loss: 1.3161 - classification_loss: 0.2579 274/500 [===============>..............] - ETA: 55s - loss: 1.5733 - regression_loss: 1.3155 - classification_loss: 0.2578 275/500 [===============>..............] - ETA: 55s - loss: 1.5717 - regression_loss: 1.3141 - classification_loss: 0.2576 276/500 [===============>..............] 
- ETA: 55s - loss: 1.5706 - regression_loss: 1.3134 - classification_loss: 0.2572 277/500 [===============>..............] - ETA: 55s - loss: 1.5703 - regression_loss: 1.3135 - classification_loss: 0.2568 278/500 [===============>..............] - ETA: 54s - loss: 1.5685 - regression_loss: 1.3122 - classification_loss: 0.2563 279/500 [===============>..............] - ETA: 54s - loss: 1.5683 - regression_loss: 1.3120 - classification_loss: 0.2563 280/500 [===============>..............] - ETA: 54s - loss: 1.5644 - regression_loss: 1.3088 - classification_loss: 0.2556 281/500 [===============>..............] - ETA: 54s - loss: 1.5668 - regression_loss: 1.3105 - classification_loss: 0.2563 282/500 [===============>..............] - ETA: 53s - loss: 1.5658 - regression_loss: 1.3097 - classification_loss: 0.2561 283/500 [===============>..............] - ETA: 53s - loss: 1.5652 - regression_loss: 1.3090 - classification_loss: 0.2562 284/500 [================>.............] - ETA: 53s - loss: 1.5633 - regression_loss: 1.3075 - classification_loss: 0.2558 285/500 [================>.............] - ETA: 53s - loss: 1.5641 - regression_loss: 1.3083 - classification_loss: 0.2559 286/500 [================>.............] - ETA: 52s - loss: 1.5671 - regression_loss: 1.3108 - classification_loss: 0.2563 287/500 [================>.............] - ETA: 52s - loss: 1.5678 - regression_loss: 1.3120 - classification_loss: 0.2559 288/500 [================>.............] - ETA: 52s - loss: 1.5683 - regression_loss: 1.3125 - classification_loss: 0.2558 289/500 [================>.............] - ETA: 52s - loss: 1.5686 - regression_loss: 1.3127 - classification_loss: 0.2558 290/500 [================>.............] - ETA: 51s - loss: 1.5694 - regression_loss: 1.3133 - classification_loss: 0.2562 291/500 [================>.............] - ETA: 51s - loss: 1.5680 - regression_loss: 1.3124 - classification_loss: 0.2556 292/500 [================>.............] 
- ETA: 51s - loss: 1.5673 - regression_loss: 1.3121 - classification_loss: 0.2553 293/500 [================>.............] - ETA: 51s - loss: 1.5645 - regression_loss: 1.3098 - classification_loss: 0.2547 294/500 [================>.............] - ETA: 50s - loss: 1.5652 - regression_loss: 1.3105 - classification_loss: 0.2546 295/500 [================>.............] - ETA: 50s - loss: 1.5677 - regression_loss: 1.3126 - classification_loss: 0.2551 296/500 [================>.............] - ETA: 50s - loss: 1.5684 - regression_loss: 1.3134 - classification_loss: 0.2551 297/500 [================>.............] - ETA: 50s - loss: 1.5680 - regression_loss: 1.3131 - classification_loss: 0.2549 298/500 [================>.............] - ETA: 49s - loss: 1.5682 - regression_loss: 1.3137 - classification_loss: 0.2545 299/500 [================>.............] - ETA: 49s - loss: 1.5687 - regression_loss: 1.3141 - classification_loss: 0.2546 300/500 [=================>............] - ETA: 49s - loss: 1.5663 - regression_loss: 1.3122 - classification_loss: 0.2541 301/500 [=================>............] - ETA: 49s - loss: 1.5675 - regression_loss: 1.3134 - classification_loss: 0.2541 302/500 [=================>............] - ETA: 48s - loss: 1.5679 - regression_loss: 1.3137 - classification_loss: 0.2542 303/500 [=================>............] - ETA: 48s - loss: 1.5676 - regression_loss: 1.3138 - classification_loss: 0.2538 304/500 [=================>............] - ETA: 48s - loss: 1.5719 - regression_loss: 1.3172 - classification_loss: 0.2546 305/500 [=================>............] - ETA: 48s - loss: 1.5719 - regression_loss: 1.3174 - classification_loss: 0.2546 306/500 [=================>............] - ETA: 47s - loss: 1.5737 - regression_loss: 1.3187 - classification_loss: 0.2550 307/500 [=================>............] - ETA: 47s - loss: 1.5738 - regression_loss: 1.3188 - classification_loss: 0.2550 308/500 [=================>............] 
- ETA: 47s - loss: 1.5740 - regression_loss: 1.3191 - classification_loss: 0.2549 309/500 [=================>............] - ETA: 47s - loss: 1.5733 - regression_loss: 1.3186 - classification_loss: 0.2547 310/500 [=================>............] - ETA: 46s - loss: 1.5710 - regression_loss: 1.3167 - classification_loss: 0.2543 311/500 [=================>............] - ETA: 46s - loss: 1.5720 - regression_loss: 1.3174 - classification_loss: 0.2546 312/500 [=================>............] - ETA: 46s - loss: 1.5745 - regression_loss: 1.3196 - classification_loss: 0.2549 313/500 [=================>............] - ETA: 46s - loss: 1.5740 - regression_loss: 1.3192 - classification_loss: 0.2547 314/500 [=================>............] - ETA: 46s - loss: 1.5732 - regression_loss: 1.3187 - classification_loss: 0.2546 315/500 [=================>............] - ETA: 45s - loss: 1.5740 - regression_loss: 1.3193 - classification_loss: 0.2548 316/500 [=================>............] - ETA: 45s - loss: 1.5759 - regression_loss: 1.3206 - classification_loss: 0.2553 317/500 [==================>...........] - ETA: 45s - loss: 1.5738 - regression_loss: 1.3191 - classification_loss: 0.2548 318/500 [==================>...........] - ETA: 45s - loss: 1.5754 - regression_loss: 1.3204 - classification_loss: 0.2549 319/500 [==================>...........] - ETA: 44s - loss: 1.5759 - regression_loss: 1.3210 - classification_loss: 0.2549 320/500 [==================>...........] - ETA: 44s - loss: 1.5771 - regression_loss: 1.3221 - classification_loss: 0.2550 321/500 [==================>...........] - ETA: 44s - loss: 1.5756 - regression_loss: 1.3209 - classification_loss: 0.2547 322/500 [==================>...........] - ETA: 44s - loss: 1.5774 - regression_loss: 1.3226 - classification_loss: 0.2548 323/500 [==================>...........] - ETA: 43s - loss: 1.5793 - regression_loss: 1.3242 - classification_loss: 0.2551 324/500 [==================>...........] 
- ETA: 43s - loss: 1.5794 - regression_loss: 1.3245 - classification_loss: 0.2549 325/500 [==================>...........] - ETA: 43s - loss: 1.5785 - regression_loss: 1.3236 - classification_loss: 0.2549 326/500 [==================>...........] - ETA: 43s - loss: 1.5766 - regression_loss: 1.3220 - classification_loss: 0.2546 327/500 [==================>...........] - ETA: 42s - loss: 1.5742 - regression_loss: 1.3202 - classification_loss: 0.2541 328/500 [==================>...........] - ETA: 42s - loss: 1.5778 - regression_loss: 1.3236 - classification_loss: 0.2543 329/500 [==================>...........] - ETA: 42s - loss: 1.5745 - regression_loss: 1.3209 - classification_loss: 0.2536 330/500 [==================>...........] - ETA: 42s - loss: 1.5729 - regression_loss: 1.3197 - classification_loss: 0.2532 331/500 [==================>...........] - ETA: 41s - loss: 1.5731 - regression_loss: 1.3201 - classification_loss: 0.2530 332/500 [==================>...........] - ETA: 41s - loss: 1.5718 - regression_loss: 1.3189 - classification_loss: 0.2529 333/500 [==================>...........] - ETA: 41s - loss: 1.5714 - regression_loss: 1.3186 - classification_loss: 0.2527 334/500 [===================>..........] - ETA: 41s - loss: 1.5718 - regression_loss: 1.3191 - classification_loss: 0.2527 335/500 [===================>..........] - ETA: 40s - loss: 1.5747 - regression_loss: 1.3217 - classification_loss: 0.2530 336/500 [===================>..........] - ETA: 40s - loss: 1.5727 - regression_loss: 1.3201 - classification_loss: 0.2525 337/500 [===================>..........] - ETA: 40s - loss: 1.5717 - regression_loss: 1.3193 - classification_loss: 0.2524 338/500 [===================>..........] - ETA: 40s - loss: 1.5712 - regression_loss: 1.3190 - classification_loss: 0.2522 339/500 [===================>..........] - ETA: 39s - loss: 1.5704 - regression_loss: 1.3181 - classification_loss: 0.2523 340/500 [===================>..........] 
- ETA: 39s - loss: 1.5779 - regression_loss: 1.3217 - classification_loss: 0.2563 341/500 [===================>..........] - ETA: 39s - loss: 1.5765 - regression_loss: 1.3206 - classification_loss: 0.2559 342/500 [===================>..........] - ETA: 39s - loss: 1.5770 - regression_loss: 1.3210 - classification_loss: 0.2560 343/500 [===================>..........] - ETA: 38s - loss: 1.5777 - regression_loss: 1.3216 - classification_loss: 0.2561 344/500 [===================>..........] - ETA: 38s - loss: 1.5787 - regression_loss: 1.3224 - classification_loss: 0.2563 345/500 [===================>..........] - ETA: 38s - loss: 1.5789 - regression_loss: 1.3218 - classification_loss: 0.2570 346/500 [===================>..........] - ETA: 38s - loss: 1.5782 - regression_loss: 1.3215 - classification_loss: 0.2567 347/500 [===================>..........] - ETA: 37s - loss: 1.5763 - regression_loss: 1.3200 - classification_loss: 0.2562 348/500 [===================>..........] - ETA: 37s - loss: 1.5776 - regression_loss: 1.3210 - classification_loss: 0.2566 349/500 [===================>..........] - ETA: 37s - loss: 1.5783 - regression_loss: 1.3216 - classification_loss: 0.2567 350/500 [====================>.........] - ETA: 37s - loss: 1.5784 - regression_loss: 1.3217 - classification_loss: 0.2566 351/500 [====================>.........] - ETA: 36s - loss: 1.5796 - regression_loss: 1.3230 - classification_loss: 0.2566 352/500 [====================>.........] - ETA: 36s - loss: 1.5792 - regression_loss: 1.3227 - classification_loss: 0.2564 353/500 [====================>.........] - ETA: 36s - loss: 1.5781 - regression_loss: 1.3220 - classification_loss: 0.2562 354/500 [====================>.........] - ETA: 36s - loss: 1.5825 - regression_loss: 1.3257 - classification_loss: 0.2567 355/500 [====================>.........] - ETA: 35s - loss: 1.5833 - regression_loss: 1.3265 - classification_loss: 0.2568 356/500 [====================>.........] 
- ETA: 35s - loss: 1.5852 - regression_loss: 1.3276 - classification_loss: 0.2576 357/500 [====================>.........] - ETA: 35s - loss: 1.5869 - regression_loss: 1.3288 - classification_loss: 0.2581 358/500 [====================>.........] - ETA: 35s - loss: 1.5873 - regression_loss: 1.3290 - classification_loss: 0.2583 359/500 [====================>.........] - ETA: 34s - loss: 1.5897 - regression_loss: 1.3309 - classification_loss: 0.2588 360/500 [====================>.........] - ETA: 34s - loss: 1.5887 - regression_loss: 1.3301 - classification_loss: 0.2586 361/500 [====================>.........] - ETA: 34s - loss: 1.5895 - regression_loss: 1.3310 - classification_loss: 0.2585 362/500 [====================>.........] - ETA: 34s - loss: 1.5909 - regression_loss: 1.3322 - classification_loss: 0.2587 363/500 [====================>.........] - ETA: 33s - loss: 1.5930 - regression_loss: 1.3342 - classification_loss: 0.2588 364/500 [====================>.........] - ETA: 33s - loss: 1.5977 - regression_loss: 1.3382 - classification_loss: 0.2596 365/500 [====================>.........] - ETA: 33s - loss: 1.5976 - regression_loss: 1.3382 - classification_loss: 0.2595 366/500 [====================>.........] - ETA: 33s - loss: 1.5988 - regression_loss: 1.3393 - classification_loss: 0.2594 367/500 [=====================>........] - ETA: 32s - loss: 1.5980 - regression_loss: 1.3389 - classification_loss: 0.2591 368/500 [=====================>........] - ETA: 32s - loss: 1.5981 - regression_loss: 1.3389 - classification_loss: 0.2592 369/500 [=====================>........] - ETA: 32s - loss: 1.5972 - regression_loss: 1.3383 - classification_loss: 0.2590 370/500 [=====================>........] - ETA: 32s - loss: 1.5990 - regression_loss: 1.3395 - classification_loss: 0.2595 371/500 [=====================>........] - ETA: 31s - loss: 1.5978 - regression_loss: 1.3386 - classification_loss: 0.2592 372/500 [=====================>........] 
500/500 [==============================] - 124s 247ms/step - loss: 1.5821 - regression_loss: 1.3273 - classification_loss: 0.2548
326 instances of class plum with average precision: 0.7718
mAP: 0.7718
Epoch 00063: saving model to ./training/snapshots/resnet50_pascal_63.h5
Epoch 64/150
- ETA: 1:12 - loss: 1.5871 - regression_loss: 1.3118 - classification_loss: 0.2753 207/500 [===========>..................] - ETA: 1:12 - loss: 1.5874 - regression_loss: 1.3125 - classification_loss: 0.2748 208/500 [===========>..................] - ETA: 1:12 - loss: 1.5868 - regression_loss: 1.3119 - classification_loss: 0.2749 209/500 [===========>..................] - ETA: 1:11 - loss: 1.5858 - regression_loss: 1.3113 - classification_loss: 0.2744 210/500 [===========>..................] - ETA: 1:11 - loss: 1.5870 - regression_loss: 1.3122 - classification_loss: 0.2748 211/500 [===========>..................] - ETA: 1:11 - loss: 1.5837 - regression_loss: 1.3096 - classification_loss: 0.2741 212/500 [===========>..................] - ETA: 1:11 - loss: 1.5847 - regression_loss: 1.3108 - classification_loss: 0.2739 213/500 [===========>..................] - ETA: 1:10 - loss: 1.5834 - regression_loss: 1.3099 - classification_loss: 0.2735 214/500 [===========>..................] - ETA: 1:10 - loss: 1.5888 - regression_loss: 1.3140 - classification_loss: 0.2748 215/500 [===========>..................] - ETA: 1:10 - loss: 1.5886 - regression_loss: 1.3138 - classification_loss: 0.2748 216/500 [===========>..................] - ETA: 1:10 - loss: 1.5845 - regression_loss: 1.3106 - classification_loss: 0.2739 217/500 [============>.................] - ETA: 1:09 - loss: 1.5854 - regression_loss: 1.3117 - classification_loss: 0.2737 218/500 [============>.................] - ETA: 1:09 - loss: 1.5826 - regression_loss: 1.3098 - classification_loss: 0.2728 219/500 [============>.................] - ETA: 1:09 - loss: 1.5783 - regression_loss: 1.3062 - classification_loss: 0.2721 220/500 [============>.................] - ETA: 1:09 - loss: 1.5786 - regression_loss: 1.3057 - classification_loss: 0.2729 221/500 [============>.................] - ETA: 1:08 - loss: 1.5737 - regression_loss: 1.3018 - classification_loss: 0.2719 222/500 [============>.................] 
- ETA: 1:08 - loss: 1.5770 - regression_loss: 1.3040 - classification_loss: 0.2731 223/500 [============>.................] - ETA: 1:08 - loss: 1.5781 - regression_loss: 1.3050 - classification_loss: 0.2731 224/500 [============>.................] - ETA: 1:08 - loss: 1.5784 - regression_loss: 1.3056 - classification_loss: 0.2729 225/500 [============>.................] - ETA: 1:07 - loss: 1.5873 - regression_loss: 1.3119 - classification_loss: 0.2755 226/500 [============>.................] - ETA: 1:07 - loss: 1.5872 - regression_loss: 1.3118 - classification_loss: 0.2754 227/500 [============>.................] - ETA: 1:07 - loss: 1.5876 - regression_loss: 1.3124 - classification_loss: 0.2752 228/500 [============>.................] - ETA: 1:07 - loss: 1.5864 - regression_loss: 1.3117 - classification_loss: 0.2746 229/500 [============>.................] - ETA: 1:06 - loss: 1.5844 - regression_loss: 1.3105 - classification_loss: 0.2739 230/500 [============>.................] - ETA: 1:06 - loss: 1.5818 - regression_loss: 1.3086 - classification_loss: 0.2733 231/500 [============>.................] - ETA: 1:06 - loss: 1.5822 - regression_loss: 1.3087 - classification_loss: 0.2735 232/500 [============>.................] - ETA: 1:06 - loss: 1.5855 - regression_loss: 1.3123 - classification_loss: 0.2733 233/500 [============>.................] - ETA: 1:05 - loss: 1.5836 - regression_loss: 1.3110 - classification_loss: 0.2726 234/500 [=============>................] - ETA: 1:05 - loss: 1.5795 - regression_loss: 1.3078 - classification_loss: 0.2718 235/500 [=============>................] - ETA: 1:05 - loss: 1.5796 - regression_loss: 1.3082 - classification_loss: 0.2714 236/500 [=============>................] - ETA: 1:05 - loss: 1.5817 - regression_loss: 1.3096 - classification_loss: 0.2720 237/500 [=============>................] - ETA: 1:04 - loss: 1.5811 - regression_loss: 1.3094 - classification_loss: 0.2716 238/500 [=============>................] 
- ETA: 1:04 - loss: 1.5824 - regression_loss: 1.3108 - classification_loss: 0.2716 239/500 [=============>................] - ETA: 1:04 - loss: 1.5795 - regression_loss: 1.3087 - classification_loss: 0.2708 240/500 [=============>................] - ETA: 1:04 - loss: 1.5831 - regression_loss: 1.3116 - classification_loss: 0.2716 241/500 [=============>................] - ETA: 1:03 - loss: 1.5792 - regression_loss: 1.3085 - classification_loss: 0.2706 242/500 [=============>................] - ETA: 1:03 - loss: 1.5818 - regression_loss: 1.3107 - classification_loss: 0.2711 243/500 [=============>................] - ETA: 1:03 - loss: 1.5805 - regression_loss: 1.3101 - classification_loss: 0.2704 244/500 [=============>................] - ETA: 1:03 - loss: 1.5800 - regression_loss: 1.3098 - classification_loss: 0.2702 245/500 [=============>................] - ETA: 1:02 - loss: 1.5794 - regression_loss: 1.3094 - classification_loss: 0.2699 246/500 [=============>................] - ETA: 1:02 - loss: 1.5815 - regression_loss: 1.3111 - classification_loss: 0.2704 247/500 [=============>................] - ETA: 1:02 - loss: 1.5801 - regression_loss: 1.3103 - classification_loss: 0.2698 248/500 [=============>................] - ETA: 1:02 - loss: 1.5810 - regression_loss: 1.3111 - classification_loss: 0.2699 249/500 [=============>................] - ETA: 1:01 - loss: 1.5807 - regression_loss: 1.3111 - classification_loss: 0.2697 250/500 [==============>...............] - ETA: 1:01 - loss: 1.5788 - regression_loss: 1.3097 - classification_loss: 0.2690 251/500 [==============>...............] - ETA: 1:01 - loss: 1.5782 - regression_loss: 1.3094 - classification_loss: 0.2688 252/500 [==============>...............] - ETA: 1:01 - loss: 1.5790 - regression_loss: 1.3100 - classification_loss: 0.2690 253/500 [==============>...............] - ETA: 1:00 - loss: 1.5804 - regression_loss: 1.3116 - classification_loss: 0.2688 254/500 [==============>...............] 
- ETA: 1:00 - loss: 1.5800 - regression_loss: 1.3114 - classification_loss: 0.2686 255/500 [==============>...............] - ETA: 1:00 - loss: 1.5793 - regression_loss: 1.3111 - classification_loss: 0.2682 256/500 [==============>...............] - ETA: 1:00 - loss: 1.5792 - regression_loss: 1.3111 - classification_loss: 0.2681 257/500 [==============>...............] - ETA: 59s - loss: 1.5800 - regression_loss: 1.3118 - classification_loss: 0.2682  258/500 [==============>...............] - ETA: 59s - loss: 1.5808 - regression_loss: 1.3124 - classification_loss: 0.2684 259/500 [==============>...............] - ETA: 59s - loss: 1.5793 - regression_loss: 1.3114 - classification_loss: 0.2679 260/500 [==============>...............] - ETA: 59s - loss: 1.5812 - regression_loss: 1.3131 - classification_loss: 0.2681 261/500 [==============>...............] - ETA: 58s - loss: 1.5822 - regression_loss: 1.3136 - classification_loss: 0.2686 262/500 [==============>...............] - ETA: 58s - loss: 1.5824 - regression_loss: 1.3140 - classification_loss: 0.2683 263/500 [==============>...............] - ETA: 58s - loss: 1.5849 - regression_loss: 1.3163 - classification_loss: 0.2687 264/500 [==============>...............] - ETA: 58s - loss: 1.5853 - regression_loss: 1.3171 - classification_loss: 0.2682 265/500 [==============>...............] - ETA: 57s - loss: 1.5842 - regression_loss: 1.3163 - classification_loss: 0.2679 266/500 [==============>...............] - ETA: 57s - loss: 1.5848 - regression_loss: 1.3167 - classification_loss: 0.2681 267/500 [===============>..............] - ETA: 57s - loss: 1.5880 - regression_loss: 1.3194 - classification_loss: 0.2686 268/500 [===============>..............] - ETA: 57s - loss: 1.5907 - regression_loss: 1.3213 - classification_loss: 0.2693 269/500 [===============>..............] - ETA: 57s - loss: 1.5903 - regression_loss: 1.3209 - classification_loss: 0.2694 270/500 [===============>..............] 
- ETA: 56s - loss: 1.5883 - regression_loss: 1.3192 - classification_loss: 0.2691 271/500 [===============>..............] - ETA: 56s - loss: 1.5912 - regression_loss: 1.3215 - classification_loss: 0.2697 272/500 [===============>..............] - ETA: 56s - loss: 1.5870 - regression_loss: 1.3179 - classification_loss: 0.2691 273/500 [===============>..............] - ETA: 56s - loss: 1.5853 - regression_loss: 1.3167 - classification_loss: 0.2686 274/500 [===============>..............] - ETA: 55s - loss: 1.5811 - regression_loss: 1.3133 - classification_loss: 0.2678 275/500 [===============>..............] - ETA: 55s - loss: 1.5788 - regression_loss: 1.3116 - classification_loss: 0.2672 276/500 [===============>..............] - ETA: 55s - loss: 1.5781 - regression_loss: 1.3111 - classification_loss: 0.2669 277/500 [===============>..............] - ETA: 55s - loss: 1.5762 - regression_loss: 1.3098 - classification_loss: 0.2664 278/500 [===============>..............] - ETA: 54s - loss: 1.5757 - regression_loss: 1.3095 - classification_loss: 0.2661 279/500 [===============>..............] - ETA: 54s - loss: 1.5736 - regression_loss: 1.3082 - classification_loss: 0.2654 280/500 [===============>..............] - ETA: 54s - loss: 1.5720 - regression_loss: 1.3071 - classification_loss: 0.2649 281/500 [===============>..............] - ETA: 54s - loss: 1.5721 - regression_loss: 1.3073 - classification_loss: 0.2648 282/500 [===============>..............] - ETA: 53s - loss: 1.5730 - regression_loss: 1.3083 - classification_loss: 0.2647 283/500 [===============>..............] - ETA: 53s - loss: 1.5708 - regression_loss: 1.3066 - classification_loss: 0.2642 284/500 [================>.............] - ETA: 53s - loss: 1.5734 - regression_loss: 1.3087 - classification_loss: 0.2647 285/500 [================>.............] - ETA: 53s - loss: 1.5721 - regression_loss: 1.3079 - classification_loss: 0.2643 286/500 [================>.............] 
- ETA: 52s - loss: 1.5723 - regression_loss: 1.3078 - classification_loss: 0.2645 287/500 [================>.............] - ETA: 52s - loss: 1.5729 - regression_loss: 1.3082 - classification_loss: 0.2648 288/500 [================>.............] - ETA: 52s - loss: 1.5731 - regression_loss: 1.3076 - classification_loss: 0.2656 289/500 [================>.............] - ETA: 52s - loss: 1.5746 - regression_loss: 1.3090 - classification_loss: 0.2655 290/500 [================>.............] - ETA: 51s - loss: 1.5759 - regression_loss: 1.3103 - classification_loss: 0.2656 291/500 [================>.............] - ETA: 51s - loss: 1.5744 - regression_loss: 1.3092 - classification_loss: 0.2652 292/500 [================>.............] - ETA: 51s - loss: 1.5753 - regression_loss: 1.3096 - classification_loss: 0.2657 293/500 [================>.............] - ETA: 51s - loss: 1.5747 - regression_loss: 1.3093 - classification_loss: 0.2654 294/500 [================>.............] - ETA: 50s - loss: 1.5742 - regression_loss: 1.3088 - classification_loss: 0.2654 295/500 [================>.............] - ETA: 50s - loss: 1.5757 - regression_loss: 1.3099 - classification_loss: 0.2658 296/500 [================>.............] - ETA: 50s - loss: 1.5746 - regression_loss: 1.3090 - classification_loss: 0.2655 297/500 [================>.............] - ETA: 50s - loss: 1.5709 - regression_loss: 1.3061 - classification_loss: 0.2648 298/500 [================>.............] - ETA: 49s - loss: 1.5680 - regression_loss: 1.3039 - classification_loss: 0.2641 299/500 [================>.............] - ETA: 49s - loss: 1.5699 - regression_loss: 1.3051 - classification_loss: 0.2648 300/500 [=================>............] - ETA: 49s - loss: 1.5715 - regression_loss: 1.3067 - classification_loss: 0.2648 301/500 [=================>............] - ETA: 49s - loss: 1.5713 - regression_loss: 1.3067 - classification_loss: 0.2645 302/500 [=================>............] 
- ETA: 48s - loss: 1.5724 - regression_loss: 1.3076 - classification_loss: 0.2647 303/500 [=================>............] - ETA: 48s - loss: 1.5721 - regression_loss: 1.3076 - classification_loss: 0.2645 304/500 [=================>............] - ETA: 48s - loss: 1.5704 - regression_loss: 1.3062 - classification_loss: 0.2642 305/500 [=================>............] - ETA: 48s - loss: 1.5682 - regression_loss: 1.3045 - classification_loss: 0.2637 306/500 [=================>............] - ETA: 47s - loss: 1.5694 - regression_loss: 1.3056 - classification_loss: 0.2638 307/500 [=================>............] - ETA: 47s - loss: 1.5683 - regression_loss: 1.3048 - classification_loss: 0.2634 308/500 [=================>............] - ETA: 47s - loss: 1.5683 - regression_loss: 1.3050 - classification_loss: 0.2632 309/500 [=================>............] - ETA: 47s - loss: 1.5693 - regression_loss: 1.3060 - classification_loss: 0.2634 310/500 [=================>............] - ETA: 46s - loss: 1.5683 - regression_loss: 1.3053 - classification_loss: 0.2630 311/500 [=================>............] - ETA: 46s - loss: 1.5679 - regression_loss: 1.3050 - classification_loss: 0.2630 312/500 [=================>............] - ETA: 46s - loss: 1.5688 - regression_loss: 1.3054 - classification_loss: 0.2634 313/500 [=================>............] - ETA: 46s - loss: 1.5672 - regression_loss: 1.3039 - classification_loss: 0.2633 314/500 [=================>............] - ETA: 45s - loss: 1.5696 - regression_loss: 1.3058 - classification_loss: 0.2638 315/500 [=================>............] - ETA: 45s - loss: 1.5736 - regression_loss: 1.3089 - classification_loss: 0.2647 316/500 [=================>............] - ETA: 45s - loss: 1.5728 - regression_loss: 1.3085 - classification_loss: 0.2643 317/500 [==================>...........] - ETA: 45s - loss: 1.5718 - regression_loss: 1.3079 - classification_loss: 0.2640 318/500 [==================>...........] 
- ETA: 45s - loss: 1.5744 - regression_loss: 1.3093 - classification_loss: 0.2651 319/500 [==================>...........] - ETA: 44s - loss: 1.5756 - regression_loss: 1.3105 - classification_loss: 0.2652 320/500 [==================>...........] - ETA: 44s - loss: 1.5752 - regression_loss: 1.3099 - classification_loss: 0.2653 321/500 [==================>...........] - ETA: 44s - loss: 1.5749 - regression_loss: 1.3098 - classification_loss: 0.2651 322/500 [==================>...........] - ETA: 44s - loss: 1.5763 - regression_loss: 1.3106 - classification_loss: 0.2657 323/500 [==================>...........] - ETA: 43s - loss: 1.5740 - regression_loss: 1.3088 - classification_loss: 0.2652 324/500 [==================>...........] - ETA: 43s - loss: 1.5723 - regression_loss: 1.3075 - classification_loss: 0.2648 325/500 [==================>...........] - ETA: 43s - loss: 1.5729 - regression_loss: 1.3081 - classification_loss: 0.2648 326/500 [==================>...........] - ETA: 43s - loss: 1.5737 - regression_loss: 1.3086 - classification_loss: 0.2651 327/500 [==================>...........] - ETA: 42s - loss: 1.5771 - regression_loss: 1.3109 - classification_loss: 0.2661 328/500 [==================>...........] - ETA: 42s - loss: 1.5771 - regression_loss: 1.3111 - classification_loss: 0.2661 329/500 [==================>...........] - ETA: 42s - loss: 1.5764 - regression_loss: 1.3106 - classification_loss: 0.2658 330/500 [==================>...........] - ETA: 42s - loss: 1.5798 - regression_loss: 1.3135 - classification_loss: 0.2664 331/500 [==================>...........] - ETA: 41s - loss: 1.5784 - regression_loss: 1.3126 - classification_loss: 0.2659 332/500 [==================>...........] - ETA: 41s - loss: 1.5810 - regression_loss: 1.3145 - classification_loss: 0.2665 333/500 [==================>...........] - ETA: 41s - loss: 1.5813 - regression_loss: 1.3149 - classification_loss: 0.2664 334/500 [===================>..........] 
- ETA: 41s - loss: 1.5825 - regression_loss: 1.3162 - classification_loss: 0.2662 335/500 [===================>..........] - ETA: 40s - loss: 1.5814 - regression_loss: 1.3155 - classification_loss: 0.2659 336/500 [===================>..........] - ETA: 40s - loss: 1.5810 - regression_loss: 1.3153 - classification_loss: 0.2657 337/500 [===================>..........] - ETA: 40s - loss: 1.5884 - regression_loss: 1.3114 - classification_loss: 0.2770 338/500 [===================>..........] - ETA: 40s - loss: 1.5891 - regression_loss: 1.3119 - classification_loss: 0.2772 339/500 [===================>..........] - ETA: 39s - loss: 1.5880 - regression_loss: 1.3112 - classification_loss: 0.2768 340/500 [===================>..........] - ETA: 39s - loss: 1.5878 - regression_loss: 1.3113 - classification_loss: 0.2766 341/500 [===================>..........] - ETA: 39s - loss: 1.5871 - regression_loss: 1.3109 - classification_loss: 0.2762 342/500 [===================>..........] - ETA: 39s - loss: 1.5858 - regression_loss: 1.3099 - classification_loss: 0.2759 343/500 [===================>..........] - ETA: 38s - loss: 1.5897 - regression_loss: 1.3130 - classification_loss: 0.2767 344/500 [===================>..........] - ETA: 38s - loss: 1.5903 - regression_loss: 1.3136 - classification_loss: 0.2767 345/500 [===================>..........] - ETA: 38s - loss: 1.5890 - regression_loss: 1.3127 - classification_loss: 0.2763 346/500 [===================>..........] - ETA: 38s - loss: 1.5869 - regression_loss: 1.3111 - classification_loss: 0.2758 347/500 [===================>..........] - ETA: 37s - loss: 1.5876 - regression_loss: 1.3117 - classification_loss: 0.2759 348/500 [===================>..........] - ETA: 37s - loss: 1.5891 - regression_loss: 1.3131 - classification_loss: 0.2760 349/500 [===================>..........] - ETA: 37s - loss: 1.5896 - regression_loss: 1.3134 - classification_loss: 0.2762 350/500 [====================>.........] 
- ETA: 37s - loss: 1.5863 - regression_loss: 1.3108 - classification_loss: 0.2755 351/500 [====================>.........] - ETA: 36s - loss: 1.5858 - regression_loss: 1.3103 - classification_loss: 0.2755 352/500 [====================>.........] - ETA: 36s - loss: 1.5882 - regression_loss: 1.3119 - classification_loss: 0.2763 353/500 [====================>.........] - ETA: 36s - loss: 1.5894 - regression_loss: 1.3129 - classification_loss: 0.2765 354/500 [====================>.........] - ETA: 36s - loss: 1.5894 - regression_loss: 1.3130 - classification_loss: 0.2764 355/500 [====================>.........] - ETA: 35s - loss: 1.5877 - regression_loss: 1.3117 - classification_loss: 0.2760 356/500 [====================>.........] - ETA: 35s - loss: 1.5871 - regression_loss: 1.3114 - classification_loss: 0.2757 357/500 [====================>.........] - ETA: 35s - loss: 1.5874 - regression_loss: 1.3120 - classification_loss: 0.2755 358/500 [====================>.........] - ETA: 35s - loss: 1.5865 - regression_loss: 1.3114 - classification_loss: 0.2751 359/500 [====================>.........] - ETA: 34s - loss: 1.5865 - regression_loss: 1.3114 - classification_loss: 0.2751 360/500 [====================>.........] - ETA: 34s - loss: 1.5850 - regression_loss: 1.3102 - classification_loss: 0.2747 361/500 [====================>.........] - ETA: 34s - loss: 1.5879 - regression_loss: 1.3091 - classification_loss: 0.2788 362/500 [====================>.........] - ETA: 34s - loss: 1.5898 - regression_loss: 1.3105 - classification_loss: 0.2793 363/500 [====================>.........] - ETA: 33s - loss: 1.5886 - regression_loss: 1.3097 - classification_loss: 0.2789 364/500 [====================>.........] - ETA: 33s - loss: 1.5889 - regression_loss: 1.3101 - classification_loss: 0.2788 365/500 [====================>.........] - ETA: 33s - loss: 1.5880 - regression_loss: 1.3096 - classification_loss: 0.2784 366/500 [====================>.........] 
- ETA: 33s - loss: 1.5880 - regression_loss: 1.3098 - classification_loss: 0.2782 367/500 [=====================>........] - ETA: 32s - loss: 1.5877 - regression_loss: 1.3094 - classification_loss: 0.2783 368/500 [=====================>........] - ETA: 32s - loss: 1.5859 - regression_loss: 1.3081 - classification_loss: 0.2778 369/500 [=====================>........] - ETA: 32s - loss: 1.5873 - regression_loss: 1.3091 - classification_loss: 0.2782 370/500 [=====================>........] - ETA: 32s - loss: 1.5869 - regression_loss: 1.3090 - classification_loss: 0.2779 371/500 [=====================>........] - ETA: 31s - loss: 1.5837 - regression_loss: 1.3064 - classification_loss: 0.2773 372/500 [=====================>........] - ETA: 31s - loss: 1.5820 - regression_loss: 1.3052 - classification_loss: 0.2768 373/500 [=====================>........] - ETA: 31s - loss: 1.5824 - regression_loss: 1.3054 - classification_loss: 0.2770 374/500 [=====================>........] - ETA: 31s - loss: 1.5816 - regression_loss: 1.3049 - classification_loss: 0.2767 375/500 [=====================>........] - ETA: 30s - loss: 1.5820 - regression_loss: 1.3053 - classification_loss: 0.2767 376/500 [=====================>........] - ETA: 30s - loss: 1.5819 - regression_loss: 1.3054 - classification_loss: 0.2765 377/500 [=====================>........] - ETA: 30s - loss: 1.5834 - regression_loss: 1.3066 - classification_loss: 0.2768 378/500 [=====================>........] - ETA: 30s - loss: 1.5843 - regression_loss: 1.3073 - classification_loss: 0.2770 379/500 [=====================>........] - ETA: 29s - loss: 1.5815 - regression_loss: 1.3050 - classification_loss: 0.2765 380/500 [=====================>........] - ETA: 29s - loss: 1.5822 - regression_loss: 1.3054 - classification_loss: 0.2768 381/500 [=====================>........] - ETA: 29s - loss: 1.5819 - regression_loss: 1.3054 - classification_loss: 0.2765 382/500 [=====================>........] 
- ETA: 29s - loss: 1.5832 - regression_loss: 1.3063 - classification_loss: 0.2769 383/500 [=====================>........] - ETA: 28s - loss: 1.5813 - regression_loss: 1.3048 - classification_loss: 0.2765 384/500 [======================>.......] - ETA: 28s - loss: 1.5832 - regression_loss: 1.3065 - classification_loss: 0.2766 385/500 [======================>.......] - ETA: 28s - loss: 1.5837 - regression_loss: 1.3070 - classification_loss: 0.2767 386/500 [======================>.......] - ETA: 28s - loss: 1.5831 - regression_loss: 1.3067 - classification_loss: 0.2765 387/500 [======================>.......] - ETA: 27s - loss: 1.5848 - regression_loss: 1.3080 - classification_loss: 0.2769 388/500 [======================>.......] - ETA: 27s - loss: 1.5842 - regression_loss: 1.3076 - classification_loss: 0.2766 389/500 [======================>.......] - ETA: 27s - loss: 1.5847 - regression_loss: 1.3079 - classification_loss: 0.2768 390/500 [======================>.......] - ETA: 27s - loss: 1.5839 - regression_loss: 1.3075 - classification_loss: 0.2765 391/500 [======================>.......] - ETA: 26s - loss: 1.5834 - regression_loss: 1.3070 - classification_loss: 0.2763 392/500 [======================>.......] - ETA: 26s - loss: 1.5830 - regression_loss: 1.3069 - classification_loss: 0.2760 393/500 [======================>.......] - ETA: 26s - loss: 1.5823 - regression_loss: 1.3064 - classification_loss: 0.2759 394/500 [======================>.......] - ETA: 26s - loss: 1.5831 - regression_loss: 1.3072 - classification_loss: 0.2759 395/500 [======================>.......] - ETA: 25s - loss: 1.5817 - regression_loss: 1.3062 - classification_loss: 0.2755 396/500 [======================>.......] - ETA: 25s - loss: 1.5806 - regression_loss: 1.3054 - classification_loss: 0.2752 397/500 [======================>.......] - ETA: 25s - loss: 1.5794 - regression_loss: 1.3045 - classification_loss: 0.2749 398/500 [======================>.......] 
- ETA: 25s - loss: 1.5801 - regression_loss: 1.3052 - classification_loss: 0.2749 399/500 [======================>.......] - ETA: 24s - loss: 1.5794 - regression_loss: 1.3048 - classification_loss: 0.2745 400/500 [=======================>......] - ETA: 24s - loss: 1.5791 - regression_loss: 1.3047 - classification_loss: 0.2744 401/500 [=======================>......] - ETA: 24s - loss: 1.5784 - regression_loss: 1.3043 - classification_loss: 0.2741 402/500 [=======================>......] - ETA: 24s - loss: 1.5789 - regression_loss: 1.3047 - classification_loss: 0.2742 403/500 [=======================>......] - ETA: 23s - loss: 1.5799 - regression_loss: 1.3056 - classification_loss: 0.2744 404/500 [=======================>......] - ETA: 23s - loss: 1.5800 - regression_loss: 1.3056 - classification_loss: 0.2743 405/500 [=======================>......] - ETA: 23s - loss: 1.5808 - regression_loss: 1.3063 - classification_loss: 0.2745 406/500 [=======================>......] - ETA: 23s - loss: 1.5808 - regression_loss: 1.3064 - classification_loss: 0.2744 407/500 [=======================>......] - ETA: 23s - loss: 1.5793 - regression_loss: 1.3052 - classification_loss: 0.2741 408/500 [=======================>......] - ETA: 22s - loss: 1.5781 - regression_loss: 1.3046 - classification_loss: 0.2736 409/500 [=======================>......] - ETA: 22s - loss: 1.5767 - regression_loss: 1.3034 - classification_loss: 0.2734 410/500 [=======================>......] - ETA: 22s - loss: 1.5781 - regression_loss: 1.3045 - classification_loss: 0.2735 411/500 [=======================>......] - ETA: 22s - loss: 1.5758 - regression_loss: 1.3027 - classification_loss: 0.2730 412/500 [=======================>......] - ETA: 21s - loss: 1.5757 - regression_loss: 1.3027 - classification_loss: 0.2729 413/500 [=======================>......] - ETA: 21s - loss: 1.5767 - regression_loss: 1.3036 - classification_loss: 0.2731 414/500 [=======================>......] 
[epoch 64 per-batch progress (steps 415–499) omitted]
500/500 [==============================] - 124s 248ms/step - loss: 1.5711 - regression_loss: 1.3011 - classification_loss: 0.2700
326 instances of class plum with average precision: 0.7678
mAP: 0.7678
Epoch 00064: saving model to ./training/snapshots/resnet50_pascal_64.h5
Epoch 65/150
[epoch 65 per-batch progress omitted; last step logged in this excerpt: 248/500 - loss: 1.5858 - regression_loss: 1.3241 - classification_loss: 0.2617]
- ETA: 1:02 - loss: 1.5861 - regression_loss: 1.3244 - classification_loss: 0.2617 250/500 [==============>...............] - ETA: 1:01 - loss: 1.5847 - regression_loss: 1.3233 - classification_loss: 0.2614 251/500 [==============>...............] - ETA: 1:01 - loss: 1.5835 - regression_loss: 1.3226 - classification_loss: 0.2609 252/500 [==============>...............] - ETA: 1:01 - loss: 1.5818 - regression_loss: 1.3214 - classification_loss: 0.2605 253/500 [==============>...............] - ETA: 1:01 - loss: 1.5811 - regression_loss: 1.3208 - classification_loss: 0.2602 254/500 [==============>...............] - ETA: 1:00 - loss: 1.5788 - regression_loss: 1.3190 - classification_loss: 0.2597 255/500 [==============>...............] - ETA: 1:00 - loss: 1.5778 - regression_loss: 1.3183 - classification_loss: 0.2595 256/500 [==============>...............] - ETA: 1:00 - loss: 1.5772 - regression_loss: 1.3178 - classification_loss: 0.2594 257/500 [==============>...............] - ETA: 1:00 - loss: 1.5789 - regression_loss: 1.3192 - classification_loss: 0.2597 258/500 [==============>...............] - ETA: 59s - loss: 1.5798 - regression_loss: 1.3199 - classification_loss: 0.2599  259/500 [==============>...............] - ETA: 59s - loss: 1.5840 - regression_loss: 1.3232 - classification_loss: 0.2609 260/500 [==============>...............] - ETA: 59s - loss: 1.5829 - regression_loss: 1.3225 - classification_loss: 0.2604 261/500 [==============>...............] - ETA: 59s - loss: 1.5858 - regression_loss: 1.3245 - classification_loss: 0.2614 262/500 [==============>...............] - ETA: 58s - loss: 1.5850 - regression_loss: 1.3238 - classification_loss: 0.2612 263/500 [==============>...............] - ETA: 58s - loss: 1.5860 - regression_loss: 1.3245 - classification_loss: 0.2615 264/500 [==============>...............] - ETA: 58s - loss: 1.5860 - regression_loss: 1.3243 - classification_loss: 0.2617 265/500 [==============>...............] 
- ETA: 58s - loss: 1.5851 - regression_loss: 1.3237 - classification_loss: 0.2614 266/500 [==============>...............] - ETA: 57s - loss: 1.5835 - regression_loss: 1.3226 - classification_loss: 0.2609 267/500 [===============>..............] - ETA: 57s - loss: 1.5837 - regression_loss: 1.3223 - classification_loss: 0.2614 268/500 [===============>..............] - ETA: 57s - loss: 1.5821 - regression_loss: 1.3212 - classification_loss: 0.2609 269/500 [===============>..............] - ETA: 57s - loss: 1.5837 - regression_loss: 1.3222 - classification_loss: 0.2615 270/500 [===============>..............] - ETA: 56s - loss: 1.5845 - regression_loss: 1.3226 - classification_loss: 0.2619 271/500 [===============>..............] - ETA: 56s - loss: 1.5835 - regression_loss: 1.3217 - classification_loss: 0.2618 272/500 [===============>..............] - ETA: 56s - loss: 1.5894 - regression_loss: 1.3259 - classification_loss: 0.2634 273/500 [===============>..............] - ETA: 56s - loss: 1.5889 - regression_loss: 1.3257 - classification_loss: 0.2633 274/500 [===============>..............] - ETA: 55s - loss: 1.5901 - regression_loss: 1.3266 - classification_loss: 0.2635 275/500 [===============>..............] - ETA: 55s - loss: 1.5905 - regression_loss: 1.3272 - classification_loss: 0.2633 276/500 [===============>..............] - ETA: 55s - loss: 1.5891 - regression_loss: 1.3262 - classification_loss: 0.2629 277/500 [===============>..............] - ETA: 55s - loss: 1.5897 - regression_loss: 1.3268 - classification_loss: 0.2629 278/500 [===============>..............] - ETA: 54s - loss: 1.5892 - regression_loss: 1.3265 - classification_loss: 0.2627 279/500 [===============>..............] - ETA: 54s - loss: 1.5874 - regression_loss: 1.3251 - classification_loss: 0.2622 280/500 [===============>..............] - ETA: 54s - loss: 1.5883 - regression_loss: 1.3260 - classification_loss: 0.2624 281/500 [===============>..............] 
- ETA: 54s - loss: 1.5869 - regression_loss: 1.3250 - classification_loss: 0.2619 282/500 [===============>..............] - ETA: 53s - loss: 1.5870 - regression_loss: 1.3253 - classification_loss: 0.2617 283/500 [===============>..............] - ETA: 53s - loss: 1.5861 - regression_loss: 1.3246 - classification_loss: 0.2615 284/500 [================>.............] - ETA: 53s - loss: 1.5886 - regression_loss: 1.3268 - classification_loss: 0.2619 285/500 [================>.............] - ETA: 53s - loss: 1.5919 - regression_loss: 1.3288 - classification_loss: 0.2631 286/500 [================>.............] - ETA: 52s - loss: 1.5877 - regression_loss: 1.3253 - classification_loss: 0.2625 287/500 [================>.............] - ETA: 52s - loss: 1.5864 - regression_loss: 1.3242 - classification_loss: 0.2622 288/500 [================>.............] - ETA: 52s - loss: 1.5881 - regression_loss: 1.3258 - classification_loss: 0.2623 289/500 [================>.............] - ETA: 52s - loss: 1.5885 - regression_loss: 1.3258 - classification_loss: 0.2626 290/500 [================>.............] - ETA: 51s - loss: 1.5886 - regression_loss: 1.3258 - classification_loss: 0.2627 291/500 [================>.............] - ETA: 51s - loss: 1.5857 - regression_loss: 1.3233 - classification_loss: 0.2624 292/500 [================>.............] - ETA: 51s - loss: 1.5849 - regression_loss: 1.3230 - classification_loss: 0.2620 293/500 [================>.............] - ETA: 51s - loss: 1.5842 - regression_loss: 1.3227 - classification_loss: 0.2615 294/500 [================>.............] - ETA: 50s - loss: 1.5828 - regression_loss: 1.3216 - classification_loss: 0.2612 295/500 [================>.............] - ETA: 50s - loss: 1.5792 - regression_loss: 1.3187 - classification_loss: 0.2606 296/500 [================>.............] - ETA: 50s - loss: 1.5811 - regression_loss: 1.3204 - classification_loss: 0.2607 297/500 [================>.............] 
- ETA: 50s - loss: 1.5806 - regression_loss: 1.3203 - classification_loss: 0.2604 298/500 [================>.............] - ETA: 49s - loss: 1.5803 - regression_loss: 1.3201 - classification_loss: 0.2602 299/500 [================>.............] - ETA: 49s - loss: 1.5797 - regression_loss: 1.3197 - classification_loss: 0.2600 300/500 [=================>............] - ETA: 49s - loss: 1.5805 - regression_loss: 1.3208 - classification_loss: 0.2597 301/500 [=================>............] - ETA: 49s - loss: 1.5813 - regression_loss: 1.3211 - classification_loss: 0.2602 302/500 [=================>............] - ETA: 49s - loss: 1.5828 - regression_loss: 1.3222 - classification_loss: 0.2606 303/500 [=================>............] - ETA: 48s - loss: 1.5824 - regression_loss: 1.3219 - classification_loss: 0.2605 304/500 [=================>............] - ETA: 48s - loss: 1.5829 - regression_loss: 1.3226 - classification_loss: 0.2603 305/500 [=================>............] - ETA: 48s - loss: 1.5830 - regression_loss: 1.3226 - classification_loss: 0.2604 306/500 [=================>............] - ETA: 48s - loss: 1.5851 - regression_loss: 1.3246 - classification_loss: 0.2605 307/500 [=================>............] - ETA: 47s - loss: 1.5842 - regression_loss: 1.3240 - classification_loss: 0.2602 308/500 [=================>............] - ETA: 47s - loss: 1.5829 - regression_loss: 1.3230 - classification_loss: 0.2599 309/500 [=================>............] - ETA: 47s - loss: 1.5805 - regression_loss: 1.3209 - classification_loss: 0.2596 310/500 [=================>............] - ETA: 47s - loss: 1.5804 - regression_loss: 1.3207 - classification_loss: 0.2596 311/500 [=================>............] - ETA: 46s - loss: 1.5782 - regression_loss: 1.3189 - classification_loss: 0.2594 312/500 [=================>............] - ETA: 46s - loss: 1.5796 - regression_loss: 1.3201 - classification_loss: 0.2596 313/500 [=================>............] 
- ETA: 46s - loss: 1.5789 - regression_loss: 1.3196 - classification_loss: 0.2593 314/500 [=================>............] - ETA: 46s - loss: 1.5777 - regression_loss: 1.3187 - classification_loss: 0.2589 315/500 [=================>............] - ETA: 45s - loss: 1.5779 - regression_loss: 1.3186 - classification_loss: 0.2594 316/500 [=================>............] - ETA: 45s - loss: 1.5774 - regression_loss: 1.3184 - classification_loss: 0.2590 317/500 [==================>...........] - ETA: 45s - loss: 1.5769 - regression_loss: 1.3182 - classification_loss: 0.2587 318/500 [==================>...........] - ETA: 45s - loss: 1.5729 - regression_loss: 1.3148 - classification_loss: 0.2581 319/500 [==================>...........] - ETA: 44s - loss: 1.5729 - regression_loss: 1.3149 - classification_loss: 0.2580 320/500 [==================>...........] - ETA: 44s - loss: 1.5768 - regression_loss: 1.3183 - classification_loss: 0.2585 321/500 [==================>...........] - ETA: 44s - loss: 1.5769 - regression_loss: 1.3185 - classification_loss: 0.2584 322/500 [==================>...........] - ETA: 44s - loss: 1.5801 - regression_loss: 1.3209 - classification_loss: 0.2592 323/500 [==================>...........] - ETA: 43s - loss: 1.5797 - regression_loss: 1.3207 - classification_loss: 0.2590 324/500 [==================>...........] - ETA: 43s - loss: 1.5798 - regression_loss: 1.3207 - classification_loss: 0.2590 325/500 [==================>...........] - ETA: 43s - loss: 1.5786 - regression_loss: 1.3196 - classification_loss: 0.2590 326/500 [==================>...........] - ETA: 43s - loss: 1.5786 - regression_loss: 1.3195 - classification_loss: 0.2591 327/500 [==================>...........] - ETA: 42s - loss: 1.5783 - regression_loss: 1.3193 - classification_loss: 0.2590 328/500 [==================>...........] - ETA: 42s - loss: 1.5784 - regression_loss: 1.3196 - classification_loss: 0.2588 329/500 [==================>...........] 
- ETA: 42s - loss: 1.5759 - regression_loss: 1.3176 - classification_loss: 0.2582 330/500 [==================>...........] - ETA: 42s - loss: 1.5775 - regression_loss: 1.3191 - classification_loss: 0.2584 331/500 [==================>...........] - ETA: 41s - loss: 1.5784 - regression_loss: 1.3196 - classification_loss: 0.2587 332/500 [==================>...........] - ETA: 41s - loss: 1.5785 - regression_loss: 1.3199 - classification_loss: 0.2586 333/500 [==================>...........] - ETA: 41s - loss: 1.5764 - regression_loss: 1.3183 - classification_loss: 0.2581 334/500 [===================>..........] - ETA: 41s - loss: 1.5779 - regression_loss: 1.3192 - classification_loss: 0.2587 335/500 [===================>..........] - ETA: 40s - loss: 1.5767 - regression_loss: 1.3183 - classification_loss: 0.2585 336/500 [===================>..........] - ETA: 40s - loss: 1.5787 - regression_loss: 1.3197 - classification_loss: 0.2589 337/500 [===================>..........] - ETA: 40s - loss: 1.5785 - regression_loss: 1.3195 - classification_loss: 0.2590 338/500 [===================>..........] - ETA: 40s - loss: 1.5775 - regression_loss: 1.3188 - classification_loss: 0.2587 339/500 [===================>..........] - ETA: 39s - loss: 1.5761 - regression_loss: 1.3179 - classification_loss: 0.2582 340/500 [===================>..........] - ETA: 39s - loss: 1.5757 - regression_loss: 1.3175 - classification_loss: 0.2582 341/500 [===================>..........] - ETA: 39s - loss: 1.5758 - regression_loss: 1.3175 - classification_loss: 0.2582 342/500 [===================>..........] - ETA: 39s - loss: 1.5752 - regression_loss: 1.3172 - classification_loss: 0.2580 343/500 [===================>..........] - ETA: 38s - loss: 1.5744 - regression_loss: 1.3167 - classification_loss: 0.2577 344/500 [===================>..........] - ETA: 38s - loss: 1.5739 - regression_loss: 1.3166 - classification_loss: 0.2573 345/500 [===================>..........] 
- ETA: 38s - loss: 1.5752 - regression_loss: 1.3178 - classification_loss: 0.2574 346/500 [===================>..........] - ETA: 38s - loss: 1.5747 - regression_loss: 1.3175 - classification_loss: 0.2572 347/500 [===================>..........] - ETA: 37s - loss: 1.5741 - regression_loss: 1.3170 - classification_loss: 0.2571 348/500 [===================>..........] - ETA: 37s - loss: 1.5737 - regression_loss: 1.3166 - classification_loss: 0.2571 349/500 [===================>..........] - ETA: 37s - loss: 1.5724 - regression_loss: 1.3155 - classification_loss: 0.2569 350/500 [====================>.........] - ETA: 37s - loss: 1.5717 - regression_loss: 1.3149 - classification_loss: 0.2568 351/500 [====================>.........] - ETA: 36s - loss: 1.5717 - regression_loss: 1.3149 - classification_loss: 0.2568 352/500 [====================>.........] - ETA: 36s - loss: 1.5722 - regression_loss: 1.3152 - classification_loss: 0.2570 353/500 [====================>.........] - ETA: 36s - loss: 1.5729 - regression_loss: 1.3158 - classification_loss: 0.2571 354/500 [====================>.........] - ETA: 36s - loss: 1.5714 - regression_loss: 1.3145 - classification_loss: 0.2569 355/500 [====================>.........] - ETA: 35s - loss: 1.5720 - regression_loss: 1.3152 - classification_loss: 0.2569 356/500 [====================>.........] - ETA: 35s - loss: 1.5722 - regression_loss: 1.3155 - classification_loss: 0.2566 357/500 [====================>.........] - ETA: 35s - loss: 1.5721 - regression_loss: 1.3160 - classification_loss: 0.2561 358/500 [====================>.........] - ETA: 35s - loss: 1.5701 - regression_loss: 1.3144 - classification_loss: 0.2558 359/500 [====================>.........] - ETA: 34s - loss: 1.5719 - regression_loss: 1.3159 - classification_loss: 0.2561 360/500 [====================>.........] - ETA: 34s - loss: 1.5741 - regression_loss: 1.3175 - classification_loss: 0.2567 361/500 [====================>.........] 
- ETA: 34s - loss: 1.5733 - regression_loss: 1.3167 - classification_loss: 0.2566 362/500 [====================>.........] - ETA: 34s - loss: 1.5718 - regression_loss: 1.3155 - classification_loss: 0.2563 363/500 [====================>.........] - ETA: 33s - loss: 1.5723 - regression_loss: 1.3158 - classification_loss: 0.2566 364/500 [====================>.........] - ETA: 33s - loss: 1.5733 - regression_loss: 1.3165 - classification_loss: 0.2568 365/500 [====================>.........] - ETA: 33s - loss: 1.5746 - regression_loss: 1.3179 - classification_loss: 0.2567 366/500 [====================>.........] - ETA: 33s - loss: 1.5767 - regression_loss: 1.3197 - classification_loss: 0.2571 367/500 [=====================>........] - ETA: 32s - loss: 1.5765 - regression_loss: 1.3194 - classification_loss: 0.2571 368/500 [=====================>........] - ETA: 32s - loss: 1.5785 - regression_loss: 1.3210 - classification_loss: 0.2574 369/500 [=====================>........] - ETA: 32s - loss: 1.5773 - regression_loss: 1.3202 - classification_loss: 0.2571 370/500 [=====================>........] - ETA: 32s - loss: 1.5762 - regression_loss: 1.3194 - classification_loss: 0.2568 371/500 [=====================>........] - ETA: 31s - loss: 1.5774 - regression_loss: 1.3201 - classification_loss: 0.2572 372/500 [=====================>........] - ETA: 31s - loss: 1.5782 - regression_loss: 1.3210 - classification_loss: 0.2572 373/500 [=====================>........] - ETA: 31s - loss: 1.5772 - regression_loss: 1.3202 - classification_loss: 0.2570 374/500 [=====================>........] - ETA: 31s - loss: 1.5781 - regression_loss: 1.3206 - classification_loss: 0.2576 375/500 [=====================>........] - ETA: 30s - loss: 1.5803 - regression_loss: 1.3222 - classification_loss: 0.2581 376/500 [=====================>........] - ETA: 30s - loss: 1.5822 - regression_loss: 1.3237 - classification_loss: 0.2585 377/500 [=====================>........] 
- ETA: 30s - loss: 1.5806 - regression_loss: 1.3226 - classification_loss: 0.2581 378/500 [=====================>........] - ETA: 30s - loss: 1.5806 - regression_loss: 1.3227 - classification_loss: 0.2579 379/500 [=====================>........] - ETA: 29s - loss: 1.5820 - regression_loss: 1.3241 - classification_loss: 0.2579 380/500 [=====================>........] - ETA: 29s - loss: 1.5821 - regression_loss: 1.3242 - classification_loss: 0.2579 381/500 [=====================>........] - ETA: 29s - loss: 1.5816 - regression_loss: 1.3237 - classification_loss: 0.2578 382/500 [=====================>........] - ETA: 29s - loss: 1.5814 - regression_loss: 1.3237 - classification_loss: 0.2578 383/500 [=====================>........] - ETA: 28s - loss: 1.5794 - regression_loss: 1.3220 - classification_loss: 0.2574 384/500 [======================>.......] - ETA: 28s - loss: 1.5789 - regression_loss: 1.3218 - classification_loss: 0.2571 385/500 [======================>.......] - ETA: 28s - loss: 1.5807 - regression_loss: 1.3232 - classification_loss: 0.2575 386/500 [======================>.......] - ETA: 28s - loss: 1.5811 - regression_loss: 1.3236 - classification_loss: 0.2575 387/500 [======================>.......] - ETA: 27s - loss: 1.5803 - regression_loss: 1.3230 - classification_loss: 0.2573 388/500 [======================>.......] - ETA: 27s - loss: 1.5817 - regression_loss: 1.3242 - classification_loss: 0.2575 389/500 [======================>.......] - ETA: 27s - loss: 1.5802 - regression_loss: 1.3231 - classification_loss: 0.2572 390/500 [======================>.......] - ETA: 27s - loss: 1.5791 - regression_loss: 1.3222 - classification_loss: 0.2569 391/500 [======================>.......] - ETA: 26s - loss: 1.5803 - regression_loss: 1.3233 - classification_loss: 0.2571 392/500 [======================>.......] - ETA: 26s - loss: 1.5824 - regression_loss: 1.3248 - classification_loss: 0.2576 393/500 [======================>.......] 
- ETA: 26s - loss: 1.5826 - regression_loss: 1.3250 - classification_loss: 0.2577 394/500 [======================>.......] - ETA: 26s - loss: 1.5834 - regression_loss: 1.3257 - classification_loss: 0.2577 395/500 [======================>.......] - ETA: 25s - loss: 1.5868 - regression_loss: 1.3288 - classification_loss: 0.2581 396/500 [======================>.......] - ETA: 25s - loss: 1.5864 - regression_loss: 1.3283 - classification_loss: 0.2581 397/500 [======================>.......] - ETA: 25s - loss: 1.5863 - regression_loss: 1.3283 - classification_loss: 0.2580 398/500 [======================>.......] - ETA: 25s - loss: 1.5899 - regression_loss: 1.3312 - classification_loss: 0.2587 399/500 [======================>.......] - ETA: 24s - loss: 1.5911 - regression_loss: 1.3321 - classification_loss: 0.2590 400/500 [=======================>......] - ETA: 24s - loss: 1.5914 - regression_loss: 1.3324 - classification_loss: 0.2590 401/500 [=======================>......] - ETA: 24s - loss: 1.5931 - regression_loss: 1.3335 - classification_loss: 0.2596 402/500 [=======================>......] - ETA: 24s - loss: 1.5906 - regression_loss: 1.3314 - classification_loss: 0.2592 403/500 [=======================>......] - ETA: 23s - loss: 1.5919 - regression_loss: 1.3320 - classification_loss: 0.2599 404/500 [=======================>......] - ETA: 23s - loss: 1.5918 - regression_loss: 1.3318 - classification_loss: 0.2600 405/500 [=======================>......] - ETA: 23s - loss: 1.5922 - regression_loss: 1.3324 - classification_loss: 0.2598 406/500 [=======================>......] - ETA: 23s - loss: 1.5898 - regression_loss: 1.3304 - classification_loss: 0.2594 407/500 [=======================>......] - ETA: 22s - loss: 1.5905 - regression_loss: 1.3312 - classification_loss: 0.2593 408/500 [=======================>......] - ETA: 22s - loss: 1.5898 - regression_loss: 1.3306 - classification_loss: 0.2591 409/500 [=======================>......] 
- ETA: 22s - loss: 1.5896 - regression_loss: 1.3305 - classification_loss: 0.2590 410/500 [=======================>......] - ETA: 22s - loss: 1.5895 - regression_loss: 1.3305 - classification_loss: 0.2590 411/500 [=======================>......] - ETA: 22s - loss: 1.5892 - regression_loss: 1.3304 - classification_loss: 0.2589 412/500 [=======================>......] - ETA: 21s - loss: 1.5883 - regression_loss: 1.3296 - classification_loss: 0.2587 413/500 [=======================>......] - ETA: 21s - loss: 1.5890 - regression_loss: 1.3302 - classification_loss: 0.2588 414/500 [=======================>......] - ETA: 21s - loss: 1.5888 - regression_loss: 1.3301 - classification_loss: 0.2587 415/500 [=======================>......] - ETA: 21s - loss: 1.5890 - regression_loss: 1.3303 - classification_loss: 0.2586 416/500 [=======================>......] - ETA: 20s - loss: 1.5883 - regression_loss: 1.3299 - classification_loss: 0.2584 417/500 [========================>.....] - ETA: 20s - loss: 1.5888 - regression_loss: 1.3301 - classification_loss: 0.2586 418/500 [========================>.....] - ETA: 20s - loss: 1.5902 - regression_loss: 1.3314 - classification_loss: 0.2589 419/500 [========================>.....] - ETA: 20s - loss: 1.5891 - regression_loss: 1.3306 - classification_loss: 0.2585 420/500 [========================>.....] - ETA: 19s - loss: 1.5895 - regression_loss: 1.3306 - classification_loss: 0.2589 421/500 [========================>.....] - ETA: 19s - loss: 1.5890 - regression_loss: 1.3303 - classification_loss: 0.2587 422/500 [========================>.....] - ETA: 19s - loss: 1.5882 - regression_loss: 1.3298 - classification_loss: 0.2585 423/500 [========================>.....] - ETA: 19s - loss: 1.5884 - regression_loss: 1.3298 - classification_loss: 0.2586 424/500 [========================>.....] - ETA: 18s - loss: 1.5885 - regression_loss: 1.3301 - classification_loss: 0.2584 425/500 [========================>.....] 
- ETA: 18s - loss: 1.5868 - regression_loss: 1.3289 - classification_loss: 0.2579 426/500 [========================>.....] - ETA: 18s - loss: 1.5858 - regression_loss: 1.3282 - classification_loss: 0.2576 427/500 [========================>.....] - ETA: 18s - loss: 1.5851 - regression_loss: 1.3277 - classification_loss: 0.2574 428/500 [========================>.....] - ETA: 17s - loss: 1.5821 - regression_loss: 1.3252 - classification_loss: 0.2569 429/500 [========================>.....] - ETA: 17s - loss: 1.5837 - regression_loss: 1.3263 - classification_loss: 0.2574 430/500 [========================>.....] - ETA: 17s - loss: 1.5830 - regression_loss: 1.3260 - classification_loss: 0.2570 431/500 [========================>.....] - ETA: 17s - loss: 1.5820 - regression_loss: 1.3252 - classification_loss: 0.2568 432/500 [========================>.....] - ETA: 16s - loss: 1.5810 - regression_loss: 1.3246 - classification_loss: 0.2564 433/500 [========================>.....] - ETA: 16s - loss: 1.5819 - regression_loss: 1.3253 - classification_loss: 0.2566 434/500 [=========================>....] - ETA: 16s - loss: 1.5807 - regression_loss: 1.3244 - classification_loss: 0.2563 435/500 [=========================>....] - ETA: 16s - loss: 1.5823 - regression_loss: 1.3255 - classification_loss: 0.2568 436/500 [=========================>....] - ETA: 15s - loss: 1.5793 - regression_loss: 1.3230 - classification_loss: 0.2563 437/500 [=========================>....] - ETA: 15s - loss: 1.5802 - regression_loss: 1.3238 - classification_loss: 0.2564 438/500 [=========================>....] - ETA: 15s - loss: 1.5815 - regression_loss: 1.3247 - classification_loss: 0.2568 439/500 [=========================>....] - ETA: 15s - loss: 1.5809 - regression_loss: 1.3243 - classification_loss: 0.2566 440/500 [=========================>....] - ETA: 14s - loss: 1.5814 - regression_loss: 1.3250 - classification_loss: 0.2565 441/500 [=========================>....] 
- ETA: 14s - loss: 1.5818 - regression_loss: 1.3251 - classification_loss: 0.2567 442/500 [=========================>....] - ETA: 14s - loss: 1.5818 - regression_loss: 1.3251 - classification_loss: 0.2567 443/500 [=========================>....] - ETA: 14s - loss: 1.5814 - regression_loss: 1.3249 - classification_loss: 0.2565 444/500 [=========================>....] - ETA: 13s - loss: 1.5817 - regression_loss: 1.3253 - classification_loss: 0.2564 445/500 [=========================>....] - ETA: 13s - loss: 1.5794 - regression_loss: 1.3235 - classification_loss: 0.2560 446/500 [=========================>....] - ETA: 13s - loss: 1.5796 - regression_loss: 1.3236 - classification_loss: 0.2560 447/500 [=========================>....] - ETA: 13s - loss: 1.5796 - regression_loss: 1.3236 - classification_loss: 0.2560 448/500 [=========================>....] - ETA: 12s - loss: 1.5800 - regression_loss: 1.3239 - classification_loss: 0.2561 449/500 [=========================>....] - ETA: 12s - loss: 1.5807 - regression_loss: 1.3244 - classification_loss: 0.2563 450/500 [==========================>...] - ETA: 12s - loss: 1.5817 - regression_loss: 1.3250 - classification_loss: 0.2567 451/500 [==========================>...] - ETA: 12s - loss: 1.5818 - regression_loss: 1.3250 - classification_loss: 0.2568 452/500 [==========================>...] - ETA: 11s - loss: 1.5820 - regression_loss: 1.3252 - classification_loss: 0.2568 453/500 [==========================>...] - ETA: 11s - loss: 1.5818 - regression_loss: 1.3252 - classification_loss: 0.2565 454/500 [==========================>...] - ETA: 11s - loss: 1.5812 - regression_loss: 1.3223 - classification_loss: 0.2589 455/500 [==========================>...] - ETA: 11s - loss: 1.5816 - regression_loss: 1.3228 - classification_loss: 0.2588 456/500 [==========================>...] - ETA: 10s - loss: 1.5802 - regression_loss: 1.3218 - classification_loss: 0.2585 457/500 [==========================>...] 
[per-batch progress updates for epoch 65, batches 458-499, elided]
500/500 [==============================] - 124s 248ms/step - loss: 1.5779 - regression_loss: 1.3218 - classification_loss: 0.2561
326 instances of class plum with average precision: 0.7805
mAP: 0.7805
Epoch 00065: saving model to ./training/snapshots/resnet50_pascal_65.h5
Epoch 66/150
[per-batch progress updates for epoch 66, batches 1-291, elided]
- ETA: 50s - loss: 1.5239 - regression_loss: 1.2838 - classification_loss: 0.2400 293/500 [================>.............] - ETA: 50s - loss: 1.5201 - regression_loss: 1.2808 - classification_loss: 0.2393 294/500 [================>.............] - ETA: 50s - loss: 1.5181 - regression_loss: 1.2791 - classification_loss: 0.2390 295/500 [================>.............] - ETA: 49s - loss: 1.5185 - regression_loss: 1.2798 - classification_loss: 0.2388 296/500 [================>.............] - ETA: 49s - loss: 1.5172 - regression_loss: 1.2788 - classification_loss: 0.2383 297/500 [================>.............] - ETA: 49s - loss: 1.5176 - regression_loss: 1.2790 - classification_loss: 0.2386 298/500 [================>.............] - ETA: 49s - loss: 1.5176 - regression_loss: 1.2791 - classification_loss: 0.2385 299/500 [================>.............] - ETA: 48s - loss: 1.5156 - regression_loss: 1.2775 - classification_loss: 0.2381 300/500 [=================>............] - ETA: 48s - loss: 1.5217 - regression_loss: 1.2823 - classification_loss: 0.2394 301/500 [=================>............] - ETA: 48s - loss: 1.5227 - regression_loss: 1.2832 - classification_loss: 0.2395 302/500 [=================>............] - ETA: 48s - loss: 1.5234 - regression_loss: 1.2839 - classification_loss: 0.2395 303/500 [=================>............] - ETA: 47s - loss: 1.5272 - regression_loss: 1.2870 - classification_loss: 0.2402 304/500 [=================>............] - ETA: 47s - loss: 1.5287 - regression_loss: 1.2882 - classification_loss: 0.2405 305/500 [=================>............] - ETA: 47s - loss: 1.5307 - regression_loss: 1.2896 - classification_loss: 0.2411 306/500 [=================>............] - ETA: 47s - loss: 1.5337 - regression_loss: 1.2921 - classification_loss: 0.2416 307/500 [=================>............] - ETA: 46s - loss: 1.5296 - regression_loss: 1.2879 - classification_loss: 0.2417 308/500 [=================>............] 
- ETA: 46s - loss: 1.5296 - regression_loss: 1.2879 - classification_loss: 0.2417 309/500 [=================>............] - ETA: 46s - loss: 1.5300 - regression_loss: 1.2885 - classification_loss: 0.2415 310/500 [=================>............] - ETA: 46s - loss: 1.5303 - regression_loss: 1.2886 - classification_loss: 0.2416 311/500 [=================>............] - ETA: 45s - loss: 1.5300 - regression_loss: 1.2884 - classification_loss: 0.2415 312/500 [=================>............] - ETA: 45s - loss: 1.5282 - regression_loss: 1.2870 - classification_loss: 0.2412 313/500 [=================>............] - ETA: 45s - loss: 1.5283 - regression_loss: 1.2871 - classification_loss: 0.2413 314/500 [=================>............] - ETA: 45s - loss: 1.5276 - regression_loss: 1.2864 - classification_loss: 0.2412 315/500 [=================>............] - ETA: 44s - loss: 1.5299 - regression_loss: 1.2877 - classification_loss: 0.2422 316/500 [=================>............] - ETA: 44s - loss: 1.5293 - regression_loss: 1.2875 - classification_loss: 0.2417 317/500 [==================>...........] - ETA: 44s - loss: 1.5310 - regression_loss: 1.2886 - classification_loss: 0.2425 318/500 [==================>...........] - ETA: 44s - loss: 1.5324 - regression_loss: 1.2896 - classification_loss: 0.2428 319/500 [==================>...........] - ETA: 43s - loss: 1.5349 - regression_loss: 1.2913 - classification_loss: 0.2435 320/500 [==================>...........] - ETA: 43s - loss: 1.5361 - regression_loss: 1.2922 - classification_loss: 0.2439 321/500 [==================>...........] - ETA: 43s - loss: 1.5364 - regression_loss: 1.2921 - classification_loss: 0.2442 322/500 [==================>...........] - ETA: 43s - loss: 1.5339 - regression_loss: 1.2901 - classification_loss: 0.2438 323/500 [==================>...........] - ETA: 43s - loss: 1.5315 - regression_loss: 1.2881 - classification_loss: 0.2434 324/500 [==================>...........] 
- ETA: 42s - loss: 1.5313 - regression_loss: 1.2881 - classification_loss: 0.2432 325/500 [==================>...........] - ETA: 42s - loss: 1.5294 - regression_loss: 1.2861 - classification_loss: 0.2433 326/500 [==================>...........] - ETA: 42s - loss: 1.5289 - regression_loss: 1.2858 - classification_loss: 0.2431 327/500 [==================>...........] - ETA: 42s - loss: 1.5297 - regression_loss: 1.2864 - classification_loss: 0.2433 328/500 [==================>...........] - ETA: 41s - loss: 1.5285 - regression_loss: 1.2855 - classification_loss: 0.2430 329/500 [==================>...........] - ETA: 41s - loss: 1.5259 - regression_loss: 1.2835 - classification_loss: 0.2424 330/500 [==================>...........] - ETA: 41s - loss: 1.5259 - regression_loss: 1.2834 - classification_loss: 0.2425 331/500 [==================>...........] - ETA: 41s - loss: 1.5268 - regression_loss: 1.2842 - classification_loss: 0.2425 332/500 [==================>...........] - ETA: 40s - loss: 1.5262 - regression_loss: 1.2835 - classification_loss: 0.2427 333/500 [==================>...........] - ETA: 40s - loss: 1.5253 - regression_loss: 1.2828 - classification_loss: 0.2425 334/500 [===================>..........] - ETA: 40s - loss: 1.5251 - regression_loss: 1.2828 - classification_loss: 0.2423 335/500 [===================>..........] - ETA: 40s - loss: 1.5259 - regression_loss: 1.2838 - classification_loss: 0.2421 336/500 [===================>..........] - ETA: 39s - loss: 1.5275 - regression_loss: 1.2850 - classification_loss: 0.2425 337/500 [===================>..........] - ETA: 39s - loss: 1.5261 - regression_loss: 1.2840 - classification_loss: 0.2421 338/500 [===================>..........] - ETA: 39s - loss: 1.5288 - regression_loss: 1.2859 - classification_loss: 0.2429 339/500 [===================>..........] - ETA: 39s - loss: 1.5291 - regression_loss: 1.2862 - classification_loss: 0.2428 340/500 [===================>..........] 
- ETA: 38s - loss: 1.5289 - regression_loss: 1.2860 - classification_loss: 0.2429 341/500 [===================>..........] - ETA: 38s - loss: 1.5283 - regression_loss: 1.2858 - classification_loss: 0.2426 342/500 [===================>..........] - ETA: 38s - loss: 1.5299 - regression_loss: 1.2873 - classification_loss: 0.2426 343/500 [===================>..........] - ETA: 38s - loss: 1.5305 - regression_loss: 1.2878 - classification_loss: 0.2427 344/500 [===================>..........] - ETA: 37s - loss: 1.5297 - regression_loss: 1.2872 - classification_loss: 0.2424 345/500 [===================>..........] - ETA: 37s - loss: 1.5316 - regression_loss: 1.2891 - classification_loss: 0.2425 346/500 [===================>..........] - ETA: 37s - loss: 1.5300 - regression_loss: 1.2878 - classification_loss: 0.2422 347/500 [===================>..........] - ETA: 37s - loss: 1.5300 - regression_loss: 1.2879 - classification_loss: 0.2421 348/500 [===================>..........] - ETA: 36s - loss: 1.5299 - regression_loss: 1.2878 - classification_loss: 0.2421 349/500 [===================>..........] - ETA: 36s - loss: 1.5303 - regression_loss: 1.2881 - classification_loss: 0.2423 350/500 [====================>.........] - ETA: 36s - loss: 1.5329 - regression_loss: 1.2901 - classification_loss: 0.2428 351/500 [====================>.........] - ETA: 36s - loss: 1.5312 - regression_loss: 1.2886 - classification_loss: 0.2426 352/500 [====================>.........] - ETA: 36s - loss: 1.5300 - regression_loss: 1.2877 - classification_loss: 0.2423 353/500 [====================>.........] - ETA: 35s - loss: 1.5300 - regression_loss: 1.2877 - classification_loss: 0.2424 354/500 [====================>.........] - ETA: 35s - loss: 1.5307 - regression_loss: 1.2883 - classification_loss: 0.2423 355/500 [====================>.........] - ETA: 35s - loss: 1.5300 - regression_loss: 1.2879 - classification_loss: 0.2421 356/500 [====================>.........] 
- ETA: 35s - loss: 1.5308 - regression_loss: 1.2885 - classification_loss: 0.2423 357/500 [====================>.........] - ETA: 34s - loss: 1.5297 - regression_loss: 1.2875 - classification_loss: 0.2422 358/500 [====================>.........] - ETA: 34s - loss: 1.5316 - regression_loss: 1.2897 - classification_loss: 0.2419 359/500 [====================>.........] - ETA: 34s - loss: 1.5309 - regression_loss: 1.2892 - classification_loss: 0.2417 360/500 [====================>.........] - ETA: 34s - loss: 1.5309 - regression_loss: 1.2893 - classification_loss: 0.2416 361/500 [====================>.........] - ETA: 33s - loss: 1.5315 - regression_loss: 1.2898 - classification_loss: 0.2417 362/500 [====================>.........] - ETA: 33s - loss: 1.5325 - regression_loss: 1.2908 - classification_loss: 0.2418 363/500 [====================>.........] - ETA: 33s - loss: 1.5332 - regression_loss: 1.2915 - classification_loss: 0.2417 364/500 [====================>.........] - ETA: 33s - loss: 1.5341 - regression_loss: 1.2926 - classification_loss: 0.2416 365/500 [====================>.........] - ETA: 32s - loss: 1.5351 - regression_loss: 1.2935 - classification_loss: 0.2416 366/500 [====================>.........] - ETA: 32s - loss: 1.5358 - regression_loss: 1.2936 - classification_loss: 0.2421 367/500 [=====================>........] - ETA: 32s - loss: 1.5354 - regression_loss: 1.2934 - classification_loss: 0.2420 368/500 [=====================>........] - ETA: 32s - loss: 1.5353 - regression_loss: 1.2934 - classification_loss: 0.2418 369/500 [=====================>........] - ETA: 31s - loss: 1.5382 - regression_loss: 1.2954 - classification_loss: 0.2428 370/500 [=====================>........] - ETA: 31s - loss: 1.5351 - regression_loss: 1.2928 - classification_loss: 0.2423 371/500 [=====================>........] - ETA: 31s - loss: 1.5345 - regression_loss: 1.2923 - classification_loss: 0.2422 372/500 [=====================>........] 
- ETA: 31s - loss: 1.5346 - regression_loss: 1.2921 - classification_loss: 0.2424 373/500 [=====================>........] - ETA: 30s - loss: 1.5335 - regression_loss: 1.2914 - classification_loss: 0.2422 374/500 [=====================>........] - ETA: 30s - loss: 1.5343 - regression_loss: 1.2920 - classification_loss: 0.2423 375/500 [=====================>........] - ETA: 30s - loss: 1.5339 - regression_loss: 1.2917 - classification_loss: 0.2422 376/500 [=====================>........] - ETA: 30s - loss: 1.5330 - regression_loss: 1.2912 - classification_loss: 0.2418 377/500 [=====================>........] - ETA: 29s - loss: 1.5334 - regression_loss: 1.2915 - classification_loss: 0.2419 378/500 [=====================>........] - ETA: 29s - loss: 1.5331 - regression_loss: 1.2911 - classification_loss: 0.2420 379/500 [=====================>........] - ETA: 29s - loss: 1.5307 - regression_loss: 1.2892 - classification_loss: 0.2415 380/500 [=====================>........] - ETA: 29s - loss: 1.5319 - regression_loss: 1.2902 - classification_loss: 0.2417 381/500 [=====================>........] - ETA: 28s - loss: 1.5328 - regression_loss: 1.2912 - classification_loss: 0.2416 382/500 [=====================>........] - ETA: 28s - loss: 1.5330 - regression_loss: 1.2915 - classification_loss: 0.2415 383/500 [=====================>........] - ETA: 28s - loss: 1.5331 - regression_loss: 1.2918 - classification_loss: 0.2413 384/500 [======================>.......] - ETA: 28s - loss: 1.5326 - regression_loss: 1.2915 - classification_loss: 0.2410 385/500 [======================>.......] - ETA: 27s - loss: 1.5318 - regression_loss: 1.2906 - classification_loss: 0.2412 386/500 [======================>.......] - ETA: 27s - loss: 1.5319 - regression_loss: 1.2906 - classification_loss: 0.2413 387/500 [======================>.......] - ETA: 27s - loss: 1.5315 - regression_loss: 1.2903 - classification_loss: 0.2412 388/500 [======================>.......] 
- ETA: 27s - loss: 1.5311 - regression_loss: 1.2899 - classification_loss: 0.2412 389/500 [======================>.......] - ETA: 27s - loss: 1.5302 - regression_loss: 1.2893 - classification_loss: 0.2409 390/500 [======================>.......] - ETA: 26s - loss: 1.5298 - regression_loss: 1.2890 - classification_loss: 0.2409 391/500 [======================>.......] - ETA: 26s - loss: 1.5301 - regression_loss: 1.2892 - classification_loss: 0.2408 392/500 [======================>.......] - ETA: 26s - loss: 1.5304 - regression_loss: 1.2892 - classification_loss: 0.2412 393/500 [======================>.......] - ETA: 26s - loss: 1.5283 - regression_loss: 1.2872 - classification_loss: 0.2411 394/500 [======================>.......] - ETA: 25s - loss: 1.5289 - regression_loss: 1.2877 - classification_loss: 0.2412 395/500 [======================>.......] - ETA: 25s - loss: 1.5271 - regression_loss: 1.2860 - classification_loss: 0.2411 396/500 [======================>.......] - ETA: 25s - loss: 1.5264 - regression_loss: 1.2855 - classification_loss: 0.2409 397/500 [======================>.......] - ETA: 25s - loss: 1.5260 - regression_loss: 1.2853 - classification_loss: 0.2407 398/500 [======================>.......] - ETA: 24s - loss: 1.5254 - regression_loss: 1.2849 - classification_loss: 0.2405 399/500 [======================>.......] - ETA: 24s - loss: 1.5242 - regression_loss: 1.2839 - classification_loss: 0.2403 400/500 [=======================>......] - ETA: 24s - loss: 1.5307 - regression_loss: 1.2887 - classification_loss: 0.2420 401/500 [=======================>......] - ETA: 24s - loss: 1.5277 - regression_loss: 1.2862 - classification_loss: 0.2415 402/500 [=======================>......] - ETA: 23s - loss: 1.5270 - regression_loss: 1.2855 - classification_loss: 0.2415 403/500 [=======================>......] - ETA: 23s - loss: 1.5280 - regression_loss: 1.2862 - classification_loss: 0.2418 404/500 [=======================>......] 
- ETA: 23s - loss: 1.5283 - regression_loss: 1.2864 - classification_loss: 0.2419 405/500 [=======================>......] - ETA: 23s - loss: 1.5296 - regression_loss: 1.2875 - classification_loss: 0.2421 406/500 [=======================>......] - ETA: 22s - loss: 1.5308 - regression_loss: 1.2882 - classification_loss: 0.2426 407/500 [=======================>......] - ETA: 22s - loss: 1.5320 - regression_loss: 1.2891 - classification_loss: 0.2429 408/500 [=======================>......] - ETA: 22s - loss: 1.5324 - regression_loss: 1.2896 - classification_loss: 0.2429 409/500 [=======================>......] - ETA: 22s - loss: 1.5299 - regression_loss: 1.2875 - classification_loss: 0.2425 410/500 [=======================>......] - ETA: 21s - loss: 1.5310 - regression_loss: 1.2881 - classification_loss: 0.2429 411/500 [=======================>......] - ETA: 21s - loss: 1.5294 - regression_loss: 1.2869 - classification_loss: 0.2425 412/500 [=======================>......] - ETA: 21s - loss: 1.5302 - regression_loss: 1.2874 - classification_loss: 0.2428 413/500 [=======================>......] - ETA: 21s - loss: 1.5298 - regression_loss: 1.2871 - classification_loss: 0.2427 414/500 [=======================>......] - ETA: 20s - loss: 1.5292 - regression_loss: 1.2866 - classification_loss: 0.2426 415/500 [=======================>......] - ETA: 20s - loss: 1.5298 - regression_loss: 1.2871 - classification_loss: 0.2427 416/500 [=======================>......] - ETA: 20s - loss: 1.5294 - regression_loss: 1.2866 - classification_loss: 0.2428 417/500 [========================>.....] - ETA: 20s - loss: 1.5285 - regression_loss: 1.2860 - classification_loss: 0.2425 418/500 [========================>.....] - ETA: 19s - loss: 1.5298 - regression_loss: 1.2872 - classification_loss: 0.2426 419/500 [========================>.....] - ETA: 19s - loss: 1.5304 - regression_loss: 1.2876 - classification_loss: 0.2428 420/500 [========================>.....] 
- ETA: 19s - loss: 1.5307 - regression_loss: 1.2879 - classification_loss: 0.2428 421/500 [========================>.....] - ETA: 19s - loss: 1.5301 - regression_loss: 1.2875 - classification_loss: 0.2425 422/500 [========================>.....] - ETA: 18s - loss: 1.5297 - regression_loss: 1.2873 - classification_loss: 0.2424 423/500 [========================>.....] - ETA: 18s - loss: 1.5304 - regression_loss: 1.2879 - classification_loss: 0.2424 424/500 [========================>.....] - ETA: 18s - loss: 1.5303 - regression_loss: 1.2880 - classification_loss: 0.2423 425/500 [========================>.....] - ETA: 18s - loss: 1.5302 - regression_loss: 1.2880 - classification_loss: 0.2423 426/500 [========================>.....] - ETA: 17s - loss: 1.5297 - regression_loss: 1.2876 - classification_loss: 0.2421 427/500 [========================>.....] - ETA: 17s - loss: 1.5291 - regression_loss: 1.2872 - classification_loss: 0.2419 428/500 [========================>.....] - ETA: 17s - loss: 1.5279 - regression_loss: 1.2859 - classification_loss: 0.2420 429/500 [========================>.....] - ETA: 17s - loss: 1.5281 - regression_loss: 1.2862 - classification_loss: 0.2419 430/500 [========================>.....] - ETA: 16s - loss: 1.5284 - regression_loss: 1.2861 - classification_loss: 0.2423 431/500 [========================>.....] - ETA: 16s - loss: 1.5291 - regression_loss: 1.2867 - classification_loss: 0.2424 432/500 [========================>.....] - ETA: 16s - loss: 1.5290 - regression_loss: 1.2868 - classification_loss: 0.2422 433/500 [========================>.....] - ETA: 16s - loss: 1.5298 - regression_loss: 1.2874 - classification_loss: 0.2423 434/500 [=========================>....] - ETA: 16s - loss: 1.5271 - regression_loss: 1.2852 - classification_loss: 0.2419 435/500 [=========================>....] - ETA: 15s - loss: 1.5277 - regression_loss: 1.2858 - classification_loss: 0.2419 436/500 [=========================>....] 
- ETA: 15s - loss: 1.5263 - regression_loss: 1.2847 - classification_loss: 0.2416 437/500 [=========================>....] - ETA: 15s - loss: 1.5262 - regression_loss: 1.2848 - classification_loss: 0.2414 438/500 [=========================>....] - ETA: 15s - loss: 1.5267 - regression_loss: 1.2854 - classification_loss: 0.2413 439/500 [=========================>....] - ETA: 14s - loss: 1.5275 - regression_loss: 1.2860 - classification_loss: 0.2415 440/500 [=========================>....] - ETA: 14s - loss: 1.5268 - regression_loss: 1.2856 - classification_loss: 0.2413 441/500 [=========================>....] - ETA: 14s - loss: 1.5269 - regression_loss: 1.2858 - classification_loss: 0.2411 442/500 [=========================>....] - ETA: 14s - loss: 1.5277 - regression_loss: 1.2863 - classification_loss: 0.2414 443/500 [=========================>....] - ETA: 13s - loss: 1.5307 - regression_loss: 1.2886 - classification_loss: 0.2422 444/500 [=========================>....] - ETA: 13s - loss: 1.5304 - regression_loss: 1.2884 - classification_loss: 0.2420 445/500 [=========================>....] - ETA: 13s - loss: 1.5320 - regression_loss: 1.2893 - classification_loss: 0.2427 446/500 [=========================>....] - ETA: 13s - loss: 1.5328 - regression_loss: 1.2898 - classification_loss: 0.2430 447/500 [=========================>....] - ETA: 12s - loss: 1.5328 - regression_loss: 1.2898 - classification_loss: 0.2430 448/500 [=========================>....] - ETA: 12s - loss: 1.5339 - regression_loss: 1.2906 - classification_loss: 0.2433 449/500 [=========================>....] - ETA: 12s - loss: 1.5331 - regression_loss: 1.2898 - classification_loss: 0.2432 450/500 [==========================>...] - ETA: 12s - loss: 1.5324 - regression_loss: 1.2893 - classification_loss: 0.2431 451/500 [==========================>...] - ETA: 11s - loss: 1.5318 - regression_loss: 1.2889 - classification_loss: 0.2429 452/500 [==========================>...] 
- ETA: 11s - loss: 1.5322 - regression_loss: 1.2893 - classification_loss: 0.2429 453/500 [==========================>...] - ETA: 11s - loss: 1.5318 - regression_loss: 1.2890 - classification_loss: 0.2427 454/500 [==========================>...] - ETA: 11s - loss: 1.5305 - regression_loss: 1.2880 - classification_loss: 0.2425 455/500 [==========================>...] - ETA: 10s - loss: 1.5309 - regression_loss: 1.2883 - classification_loss: 0.2426 456/500 [==========================>...] - ETA: 10s - loss: 1.5327 - regression_loss: 1.2896 - classification_loss: 0.2431 457/500 [==========================>...] - ETA: 10s - loss: 1.5324 - regression_loss: 1.2893 - classification_loss: 0.2432 458/500 [==========================>...] - ETA: 10s - loss: 1.5329 - regression_loss: 1.2896 - classification_loss: 0.2433 459/500 [==========================>...] - ETA: 9s - loss: 1.5325 - regression_loss: 1.2894 - classification_loss: 0.2431  460/500 [==========================>...] - ETA: 9s - loss: 1.5323 - regression_loss: 1.2892 - classification_loss: 0.2431 461/500 [==========================>...] - ETA: 9s - loss: 1.5319 - regression_loss: 1.2885 - classification_loss: 0.2434 462/500 [==========================>...] - ETA: 9s - loss: 1.5322 - regression_loss: 1.2889 - classification_loss: 0.2434 463/500 [==========================>...] - ETA: 8s - loss: 1.5316 - regression_loss: 1.2885 - classification_loss: 0.2431 464/500 [==========================>...] - ETA: 8s - loss: 1.5314 - regression_loss: 1.2882 - classification_loss: 0.2432 465/500 [==========================>...] - ETA: 8s - loss: 1.5327 - regression_loss: 1.2891 - classification_loss: 0.2436 466/500 [==========================>...] - ETA: 8s - loss: 1.5326 - regression_loss: 1.2891 - classification_loss: 0.2436 467/500 [===========================>..] - ETA: 8s - loss: 1.5327 - regression_loss: 1.2890 - classification_loss: 0.2438 468/500 [===========================>..] 
- ETA: 7s - loss: 1.5335 - regression_loss: 1.2897 - classification_loss: 0.2438 469/500 [===========================>..] - ETA: 7s - loss: 1.5354 - regression_loss: 1.2911 - classification_loss: 0.2443 470/500 [===========================>..] - ETA: 7s - loss: 1.5360 - regression_loss: 1.2916 - classification_loss: 0.2444 471/500 [===========================>..] - ETA: 7s - loss: 1.5353 - regression_loss: 1.2911 - classification_loss: 0.2442 472/500 [===========================>..] - ETA: 6s - loss: 1.5362 - regression_loss: 1.2918 - classification_loss: 0.2444 473/500 [===========================>..] - ETA: 6s - loss: 1.5366 - regression_loss: 1.2920 - classification_loss: 0.2447 474/500 [===========================>..] - ETA: 6s - loss: 1.5361 - regression_loss: 1.2916 - classification_loss: 0.2445 475/500 [===========================>..] - ETA: 6s - loss: 1.5366 - regression_loss: 1.2924 - classification_loss: 0.2442 476/500 [===========================>..] - ETA: 5s - loss: 1.5360 - regression_loss: 1.2918 - classification_loss: 0.2442 477/500 [===========================>..] - ETA: 5s - loss: 1.5360 - regression_loss: 1.2919 - classification_loss: 0.2441 478/500 [===========================>..] - ETA: 5s - loss: 1.5371 - regression_loss: 1.2926 - classification_loss: 0.2445 479/500 [===========================>..] - ETA: 5s - loss: 1.5362 - regression_loss: 1.2919 - classification_loss: 0.2443 480/500 [===========================>..] - ETA: 4s - loss: 1.5357 - regression_loss: 1.2914 - classification_loss: 0.2443 481/500 [===========================>..] - ETA: 4s - loss: 1.5348 - regression_loss: 1.2907 - classification_loss: 0.2441 482/500 [===========================>..] - ETA: 4s - loss: 1.5388 - regression_loss: 1.2922 - classification_loss: 0.2466 483/500 [===========================>..] - ETA: 4s - loss: 1.5381 - regression_loss: 1.2917 - classification_loss: 0.2464 484/500 [============================>.] 
- ETA: 3s - loss: 1.5375 - regression_loss: 1.2911 - classification_loss: 0.2464 485/500 [============================>.] - ETA: 3s - loss: 1.5373 - regression_loss: 1.2909 - classification_loss: 0.2463 486/500 [============================>.] - ETA: 3s - loss: 1.5374 - regression_loss: 1.2910 - classification_loss: 0.2464 487/500 [============================>.] - ETA: 3s - loss: 1.5372 - regression_loss: 1.2910 - classification_loss: 0.2462 488/500 [============================>.] - ETA: 2s - loss: 1.5371 - regression_loss: 1.2910 - classification_loss: 0.2462 489/500 [============================>.] - ETA: 2s - loss: 1.5363 - regression_loss: 1.2902 - classification_loss: 0.2461 490/500 [============================>.] - ETA: 2s - loss: 1.5356 - regression_loss: 1.2896 - classification_loss: 0.2460 491/500 [============================>.] - ETA: 2s - loss: 1.5365 - regression_loss: 1.2901 - classification_loss: 0.2464 492/500 [============================>.] - ETA: 1s - loss: 1.5369 - regression_loss: 1.2905 - classification_loss: 0.2464 493/500 [============================>.] - ETA: 1s - loss: 1.5363 - regression_loss: 1.2900 - classification_loss: 0.2463 494/500 [============================>.] - ETA: 1s - loss: 1.5361 - regression_loss: 1.2900 - classification_loss: 0.2461 495/500 [============================>.] - ETA: 1s - loss: 1.5349 - regression_loss: 1.2890 - classification_loss: 0.2459 496/500 [============================>.] - ETA: 0s - loss: 1.5343 - regression_loss: 1.2885 - classification_loss: 0.2458 497/500 [============================>.] - ETA: 0s - loss: 1.5337 - regression_loss: 1.2880 - classification_loss: 0.2456 498/500 [============================>.] - ETA: 0s - loss: 1.5321 - regression_loss: 1.2867 - classification_loss: 0.2454 499/500 [============================>.] 
- ETA: 0s - loss: 1.5319 - regression_loss: 1.2864 - classification_loss: 0.2455 500/500 [==============================] - 122s 243ms/step - loss: 1.5322 - regression_loss: 1.2866 - classification_loss: 0.2456 326 instances of class plum with average precision: 0.7959 mAP: 0.7959 Epoch 00066: saving model to ./training/snapshots/resnet50_pascal_66.h5 Epoch 67/150 1/500 [..............................] - ETA: 1:56 - loss: 1.0590 - regression_loss: 0.9081 - classification_loss: 0.1508 2/500 [..............................] - ETA: 1:59 - loss: 1.5424 - regression_loss: 1.3309 - classification_loss: 0.2114 3/500 [..............................] - ETA: 1:58 - loss: 1.3605 - regression_loss: 1.1681 - classification_loss: 0.1924 4/500 [..............................] - ETA: 1:58 - loss: 1.3486 - regression_loss: 1.1544 - classification_loss: 0.1942 5/500 [..............................] - ETA: 1:58 - loss: 1.4913 - regression_loss: 1.2602 - classification_loss: 0.2311 6/500 [..............................] - ETA: 1:59 - loss: 1.4730 - regression_loss: 1.2396 - classification_loss: 0.2334 7/500 [..............................] - ETA: 1:59 - loss: 1.3928 - regression_loss: 1.1771 - classification_loss: 0.2157 8/500 [..............................] - ETA: 2:00 - loss: 1.3301 - regression_loss: 1.1273 - classification_loss: 0.2028 9/500 [..............................] - ETA: 2:00 - loss: 1.4849 - regression_loss: 1.2617 - classification_loss: 0.2231 10/500 [..............................] - ETA: 2:00 - loss: 1.4839 - regression_loss: 1.2665 - classification_loss: 0.2175 11/500 [..............................] - ETA: 2:00 - loss: 1.4704 - regression_loss: 1.2465 - classification_loss: 0.2239 12/500 [..............................] - ETA: 1:59 - loss: 1.4780 - regression_loss: 1.2521 - classification_loss: 0.2258 13/500 [..............................] - ETA: 1:59 - loss: 1.4871 - regression_loss: 1.2617 - classification_loss: 0.2254 14/500 [..............................] 
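The per-epoch summary lines above follow the standard Keras progress-bar format, where the total loss is the sum of the regression and classification components. A minimal sketch of extracting those values from a log line (the regex and helper name are illustrative, not part of keras-retinanet):

```python
import re

# Matches the loss fields of a Keras progress-bar line, e.g.
# "500/500 [...] - 122s 243ms/step - loss: 1.5322 - regression_loss: 1.2866 - classification_loss: 0.2456"
LOSS_RE = re.compile(
    r"loss: ([\d.]+) - regression_loss: ([\d.]+) - classification_loss: ([\d.]+)"
)

def parse_losses(line):
    """Return (loss, regression_loss, classification_loss) or None if absent."""
    m = LOSS_RE.search(line)
    if m is None:
        return None
    return tuple(float(g) for g in m.groups())

line = ("500/500 [==============================] - 122s 243ms/step "
        "- loss: 1.5322 - regression_loss: 1.2866 - classification_loss: 0.2456")
loss, reg, cls = parse_losses(line)
# Total loss equals the sum of the two components (up to log rounding).
assert abs(loss - (reg + cls)) < 1e-3
```

This kind of parser is handy for plotting loss curves from a captured console log when no TensorBoard/CSV logger was attached during training.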
- ETA: 1:47 - loss: 1.5390 - regression_loss: 1.3019 - classification_loss: 0.2371 63/500 [==>...........................] - ETA: 1:46 - loss: 1.5509 - regression_loss: 1.3103 - classification_loss: 0.2406 64/500 [==>...........................] - ETA: 1:46 - loss: 1.5902 - regression_loss: 1.2899 - classification_loss: 0.3004 65/500 [==>...........................] - ETA: 1:46 - loss: 1.5996 - regression_loss: 1.2976 - classification_loss: 0.3020 66/500 [==>...........................] - ETA: 1:46 - loss: 1.5973 - regression_loss: 1.2973 - classification_loss: 0.3000 67/500 [===>..........................] - ETA: 1:45 - loss: 1.5808 - regression_loss: 1.2842 - classification_loss: 0.2966 68/500 [===>..........................] - ETA: 1:45 - loss: 1.5854 - regression_loss: 1.2886 - classification_loss: 0.2968 69/500 [===>..........................] - ETA: 1:45 - loss: 1.5766 - regression_loss: 1.2819 - classification_loss: 0.2947 70/500 [===>..........................] - ETA: 1:45 - loss: 1.5659 - regression_loss: 1.2736 - classification_loss: 0.2922 71/500 [===>..........................] - ETA: 1:44 - loss: 1.5647 - regression_loss: 1.2730 - classification_loss: 0.2917 72/500 [===>..........................] - ETA: 1:44 - loss: 1.5690 - regression_loss: 1.2758 - classification_loss: 0.2933 73/500 [===>..........................] - ETA: 1:43 - loss: 1.5849 - regression_loss: 1.2889 - classification_loss: 0.2961 74/500 [===>..........................] - ETA: 1:43 - loss: 1.5840 - regression_loss: 1.2888 - classification_loss: 0.2952 75/500 [===>..........................] - ETA: 1:43 - loss: 1.5810 - regression_loss: 1.2859 - classification_loss: 0.2950 76/500 [===>..........................] - ETA: 1:42 - loss: 1.5788 - regression_loss: 1.2851 - classification_loss: 0.2937 77/500 [===>..........................] - ETA: 1:42 - loss: 1.5790 - regression_loss: 1.2863 - classification_loss: 0.2928 78/500 [===>..........................] 
- ETA: 1:41 - loss: 1.5829 - regression_loss: 1.2897 - classification_loss: 0.2932 79/500 [===>..........................] - ETA: 1:41 - loss: 1.5961 - regression_loss: 1.3014 - classification_loss: 0.2947 80/500 [===>..........................] - ETA: 1:41 - loss: 1.5976 - regression_loss: 1.3030 - classification_loss: 0.2946 81/500 [===>..........................] - ETA: 1:40 - loss: 1.5945 - regression_loss: 1.3009 - classification_loss: 0.2936 82/500 [===>..........................] - ETA: 1:40 - loss: 1.5984 - regression_loss: 1.3015 - classification_loss: 0.2969 83/500 [===>..........................] - ETA: 1:40 - loss: 1.5931 - regression_loss: 1.2986 - classification_loss: 0.2945 84/500 [====>.........................] - ETA: 1:39 - loss: 1.5959 - regression_loss: 1.3012 - classification_loss: 0.2947 85/500 [====>.........................] - ETA: 1:39 - loss: 1.5914 - regression_loss: 1.2984 - classification_loss: 0.2930 86/500 [====>.........................] - ETA: 1:38 - loss: 1.5929 - regression_loss: 1.3001 - classification_loss: 0.2927 87/500 [====>.........................] - ETA: 1:38 - loss: 1.5914 - regression_loss: 1.2994 - classification_loss: 0.2920 88/500 [====>.........................] - ETA: 1:38 - loss: 1.5863 - regression_loss: 1.2963 - classification_loss: 0.2900 89/500 [====>.........................] - ETA: 1:37 - loss: 1.5879 - regression_loss: 1.2947 - classification_loss: 0.2932 90/500 [====>.........................] - ETA: 1:37 - loss: 1.5905 - regression_loss: 1.2963 - classification_loss: 0.2942 91/500 [====>.........................] - ETA: 1:37 - loss: 1.5995 - regression_loss: 1.3026 - classification_loss: 0.2968 92/500 [====>.........................] - ETA: 1:36 - loss: 1.5975 - regression_loss: 1.3017 - classification_loss: 0.2958 93/500 [====>.........................] - ETA: 1:36 - loss: 1.5968 - regression_loss: 1.3020 - classification_loss: 0.2947 94/500 [====>.........................] 
- ETA: 1:36 - loss: 1.6090 - regression_loss: 1.3114 - classification_loss: 0.2976 95/500 [====>.........................] - ETA: 1:36 - loss: 1.6109 - regression_loss: 1.3136 - classification_loss: 0.2972 96/500 [====>.........................] - ETA: 1:36 - loss: 1.6092 - regression_loss: 1.3128 - classification_loss: 0.2964 97/500 [====>.........................] - ETA: 1:35 - loss: 1.6163 - regression_loss: 1.3190 - classification_loss: 0.2973 98/500 [====>.........................] - ETA: 1:35 - loss: 1.6127 - regression_loss: 1.3168 - classification_loss: 0.2959 99/500 [====>.........................] - ETA: 1:35 - loss: 1.6179 - regression_loss: 1.3205 - classification_loss: 0.2975 100/500 [=====>........................] - ETA: 1:35 - loss: 1.6139 - regression_loss: 1.3182 - classification_loss: 0.2956 101/500 [=====>........................] - ETA: 1:35 - loss: 1.6088 - regression_loss: 1.3149 - classification_loss: 0.2939 102/500 [=====>........................] - ETA: 1:35 - loss: 1.6133 - regression_loss: 1.3177 - classification_loss: 0.2956 103/500 [=====>........................] - ETA: 1:34 - loss: 1.6129 - regression_loss: 1.3174 - classification_loss: 0.2955 104/500 [=====>........................] - ETA: 1:34 - loss: 1.6224 - regression_loss: 1.3235 - classification_loss: 0.2989 105/500 [=====>........................] - ETA: 1:34 - loss: 1.6200 - regression_loss: 1.3221 - classification_loss: 0.2978 106/500 [=====>........................] - ETA: 1:34 - loss: 1.6190 - regression_loss: 1.3223 - classification_loss: 0.2968 107/500 [=====>........................] - ETA: 1:33 - loss: 1.6143 - regression_loss: 1.3189 - classification_loss: 0.2954 108/500 [=====>........................] - ETA: 1:33 - loss: 1.6130 - regression_loss: 1.3184 - classification_loss: 0.2946 109/500 [=====>........................] - ETA: 1:33 - loss: 1.6127 - regression_loss: 1.3186 - classification_loss: 0.2941 110/500 [=====>........................] 
- ETA: 1:33 - loss: 1.6087 - regression_loss: 1.3161 - classification_loss: 0.2926 111/500 [=====>........................] - ETA: 1:33 - loss: 1.6080 - regression_loss: 1.3158 - classification_loss: 0.2922 112/500 [=====>........................] - ETA: 1:32 - loss: 1.6055 - regression_loss: 1.3137 - classification_loss: 0.2918 113/500 [=====>........................] - ETA: 1:32 - loss: 1.6096 - regression_loss: 1.3184 - classification_loss: 0.2912 114/500 [=====>........................] - ETA: 1:32 - loss: 1.6057 - regression_loss: 1.3159 - classification_loss: 0.2899 115/500 [=====>........................] - ETA: 1:32 - loss: 1.6050 - regression_loss: 1.3155 - classification_loss: 0.2895 116/500 [=====>........................] - ETA: 1:32 - loss: 1.5990 - regression_loss: 1.3109 - classification_loss: 0.2881 117/500 [======>.......................] - ETA: 1:31 - loss: 1.6006 - regression_loss: 1.3122 - classification_loss: 0.2883 118/500 [======>.......................] - ETA: 1:31 - loss: 1.6047 - regression_loss: 1.3158 - classification_loss: 0.2889 119/500 [======>.......................] - ETA: 1:31 - loss: 1.6053 - regression_loss: 1.3171 - classification_loss: 0.2883 120/500 [======>.......................] - ETA: 1:31 - loss: 1.6102 - regression_loss: 1.3206 - classification_loss: 0.2896 121/500 [======>.......................] - ETA: 1:30 - loss: 1.6021 - regression_loss: 1.3137 - classification_loss: 0.2883 122/500 [======>.......................] - ETA: 1:30 - loss: 1.6014 - regression_loss: 1.3134 - classification_loss: 0.2879 123/500 [======>.......................] - ETA: 1:30 - loss: 1.6017 - regression_loss: 1.3135 - classification_loss: 0.2882 124/500 [======>.......................] - ETA: 1:30 - loss: 1.6028 - regression_loss: 1.3143 - classification_loss: 0.2885 125/500 [======>.......................] - ETA: 1:30 - loss: 1.6057 - regression_loss: 1.3169 - classification_loss: 0.2887 126/500 [======>.......................] 
- ETA: 1:29 - loss: 1.6018 - regression_loss: 1.3141 - classification_loss: 0.2877 127/500 [======>.......................] - ETA: 1:29 - loss: 1.6007 - regression_loss: 1.3137 - classification_loss: 0.2869 128/500 [======>.......................] - ETA: 1:29 - loss: 1.5980 - regression_loss: 1.3120 - classification_loss: 0.2860 129/500 [======>.......................] - ETA: 1:29 - loss: 1.6016 - regression_loss: 1.3165 - classification_loss: 0.2851 130/500 [======>.......................] - ETA: 1:28 - loss: 1.5993 - regression_loss: 1.3153 - classification_loss: 0.2840 131/500 [======>.......................] - ETA: 1:28 - loss: 1.6014 - regression_loss: 1.3176 - classification_loss: 0.2837 132/500 [======>.......................] - ETA: 1:28 - loss: 1.6019 - regression_loss: 1.3184 - classification_loss: 0.2834 133/500 [======>.......................] - ETA: 1:28 - loss: 1.6043 - regression_loss: 1.3209 - classification_loss: 0.2834 134/500 [=======>......................] - ETA: 1:28 - loss: 1.6031 - regression_loss: 1.3198 - classification_loss: 0.2833 135/500 [=======>......................] - ETA: 1:27 - loss: 1.5967 - regression_loss: 1.3147 - classification_loss: 0.2820 136/500 [=======>......................] - ETA: 1:27 - loss: 1.5924 - regression_loss: 1.3114 - classification_loss: 0.2809 137/500 [=======>......................] - ETA: 1:27 - loss: 1.5941 - regression_loss: 1.3131 - classification_loss: 0.2810 138/500 [=======>......................] - ETA: 1:27 - loss: 1.5919 - regression_loss: 1.3118 - classification_loss: 0.2801 139/500 [=======>......................] - ETA: 1:26 - loss: 1.6103 - regression_loss: 1.3196 - classification_loss: 0.2907 140/500 [=======>......................] - ETA: 1:26 - loss: 1.6079 - regression_loss: 1.3181 - classification_loss: 0.2898 141/500 [=======>......................] - ETA: 1:26 - loss: 1.6081 - regression_loss: 1.3179 - classification_loss: 0.2902 142/500 [=======>......................] 
- ETA: 1:26 - loss: 1.6049 - regression_loss: 1.3150 - classification_loss: 0.2899 143/500 [=======>......................] - ETA: 1:25 - loss: 1.6031 - regression_loss: 1.3142 - classification_loss: 0.2889 144/500 [=======>......................] - ETA: 1:25 - loss: 1.6011 - regression_loss: 1.3131 - classification_loss: 0.2879 145/500 [=======>......................] - ETA: 1:25 - loss: 1.5989 - regression_loss: 1.3119 - classification_loss: 0.2870 146/500 [=======>......................] - ETA: 1:25 - loss: 1.5987 - regression_loss: 1.3114 - classification_loss: 0.2873 147/500 [=======>......................] - ETA: 1:25 - loss: 1.5984 - regression_loss: 1.3118 - classification_loss: 0.2867 148/500 [=======>......................] - ETA: 1:24 - loss: 1.5988 - regression_loss: 1.3124 - classification_loss: 0.2864 149/500 [=======>......................] - ETA: 1:24 - loss: 1.6019 - regression_loss: 1.3147 - classification_loss: 0.2872 150/500 [========>.....................] - ETA: 1:24 - loss: 1.5992 - regression_loss: 1.3128 - classification_loss: 0.2863 151/500 [========>.....................] - ETA: 1:24 - loss: 1.5923 - regression_loss: 1.3071 - classification_loss: 0.2851 152/500 [========>.....................] - ETA: 1:23 - loss: 1.5938 - regression_loss: 1.3086 - classification_loss: 0.2852 153/500 [========>.....................] - ETA: 1:23 - loss: 1.5916 - regression_loss: 1.3074 - classification_loss: 0.2842 154/500 [========>.....................] - ETA: 1:23 - loss: 1.5907 - regression_loss: 1.3073 - classification_loss: 0.2834 155/500 [========>.....................] - ETA: 1:23 - loss: 1.5892 - regression_loss: 1.3054 - classification_loss: 0.2838 156/500 [========>.....................] - ETA: 1:22 - loss: 1.5951 - regression_loss: 1.3100 - classification_loss: 0.2851 157/500 [========>.....................] - ETA: 1:22 - loss: 1.5912 - regression_loss: 1.3073 - classification_loss: 0.2840 158/500 [========>.....................] 
- ETA: 1:22 - loss: 1.5946 - regression_loss: 1.3099 - classification_loss: 0.2847 159/500 [========>.....................] - ETA: 1:22 - loss: 1.5915 - regression_loss: 1.3072 - classification_loss: 0.2843 160/500 [========>.....................] - ETA: 1:21 - loss: 1.5841 - regression_loss: 1.3014 - classification_loss: 0.2826 161/500 [========>.....................] - ETA: 1:21 - loss: 1.5790 - regression_loss: 1.2974 - classification_loss: 0.2815 162/500 [========>.....................] - ETA: 1:21 - loss: 1.5834 - regression_loss: 1.3016 - classification_loss: 0.2819 163/500 [========>.....................] - ETA: 1:21 - loss: 1.5815 - regression_loss: 1.3001 - classification_loss: 0.2814 164/500 [========>.....................] - ETA: 1:21 - loss: 1.5762 - regression_loss: 1.2959 - classification_loss: 0.2802 165/500 [========>.....................] - ETA: 1:20 - loss: 1.5823 - regression_loss: 1.3018 - classification_loss: 0.2804 166/500 [========>.....................] - ETA: 1:20 - loss: 1.5824 - regression_loss: 1.3020 - classification_loss: 0.2804 167/500 [=========>....................] - ETA: 1:20 - loss: 1.5835 - regression_loss: 1.3032 - classification_loss: 0.2803 168/500 [=========>....................] - ETA: 1:20 - loss: 1.5830 - regression_loss: 1.3031 - classification_loss: 0.2800 169/500 [=========>....................] - ETA: 1:19 - loss: 1.5852 - regression_loss: 1.3057 - classification_loss: 0.2795 170/500 [=========>....................] - ETA: 1:19 - loss: 1.5843 - regression_loss: 1.3051 - classification_loss: 0.2792 171/500 [=========>....................] - ETA: 1:19 - loss: 1.5848 - regression_loss: 1.3061 - classification_loss: 0.2787 172/500 [=========>....................] - ETA: 1:19 - loss: 1.5850 - regression_loss: 1.3062 - classification_loss: 0.2788 173/500 [=========>....................] - ETA: 1:18 - loss: 1.5855 - regression_loss: 1.3067 - classification_loss: 0.2788 174/500 [=========>....................] 
- ETA: 1:18 - loss: 1.5852 - regression_loss: 1.3065 - classification_loss: 0.2787 175/500 [=========>....................] - ETA: 1:18 - loss: 1.5814 - regression_loss: 1.3037 - classification_loss: 0.2777 176/500 [=========>....................] - ETA: 1:18 - loss: 1.5858 - regression_loss: 1.3072 - classification_loss: 0.2786 177/500 [=========>....................] - ETA: 1:17 - loss: 1.5885 - regression_loss: 1.3094 - classification_loss: 0.2791 178/500 [=========>....................] - ETA: 1:17 - loss: 1.5897 - regression_loss: 1.3105 - classification_loss: 0.2792 179/500 [=========>....................] - ETA: 1:17 - loss: 1.5906 - regression_loss: 1.3120 - classification_loss: 0.2785 180/500 [=========>....................] - ETA: 1:17 - loss: 1.5910 - regression_loss: 1.3126 - classification_loss: 0.2785 181/500 [=========>....................] - ETA: 1:16 - loss: 1.5925 - regression_loss: 1.3142 - classification_loss: 0.2783 182/500 [=========>....................] - ETA: 1:16 - loss: 1.5931 - regression_loss: 1.3149 - classification_loss: 0.2782 183/500 [=========>....................] - ETA: 1:16 - loss: 1.5938 - regression_loss: 1.3159 - classification_loss: 0.2779 184/500 [==========>...................] - ETA: 1:16 - loss: 1.5916 - regression_loss: 1.3145 - classification_loss: 0.2771 185/500 [==========>...................] - ETA: 1:16 - loss: 1.5920 - regression_loss: 1.3147 - classification_loss: 0.2773 186/500 [==========>...................] - ETA: 1:15 - loss: 1.5879 - regression_loss: 1.3115 - classification_loss: 0.2764 187/500 [==========>...................] - ETA: 1:15 - loss: 1.5896 - regression_loss: 1.3127 - classification_loss: 0.2769 188/500 [==========>...................] - ETA: 1:15 - loss: 1.5877 - regression_loss: 1.3112 - classification_loss: 0.2765 189/500 [==========>...................] - ETA: 1:15 - loss: 1.5909 - regression_loss: 1.3138 - classification_loss: 0.2771 190/500 [==========>...................] 
- ETA: 1:14 - loss: 1.5855 - regression_loss: 1.3096 - classification_loss: 0.2759 191/500 [==========>...................] - ETA: 1:14 - loss: 1.5855 - regression_loss: 1.3098 - classification_loss: 0.2757 192/500 [==========>...................] - ETA: 1:14 - loss: 1.5834 - regression_loss: 1.3086 - classification_loss: 0.2748 193/500 [==========>...................] - ETA: 1:14 - loss: 1.5822 - regression_loss: 1.3080 - classification_loss: 0.2741 194/500 [==========>...................] - ETA: 1:14 - loss: 1.5817 - regression_loss: 1.3080 - classification_loss: 0.2736 195/500 [==========>...................] - ETA: 1:13 - loss: 1.5818 - regression_loss: 1.3085 - classification_loss: 0.2734 196/500 [==========>...................] - ETA: 1:13 - loss: 1.5898 - regression_loss: 1.3149 - classification_loss: 0.2749 197/500 [==========>...................] - ETA: 1:13 - loss: 1.5913 - regression_loss: 1.3168 - classification_loss: 0.2745 198/500 [==========>...................] - ETA: 1:13 - loss: 1.5864 - regression_loss: 1.3129 - classification_loss: 0.2735 199/500 [==========>...................] - ETA: 1:12 - loss: 1.5847 - regression_loss: 1.3119 - classification_loss: 0.2728 200/500 [===========>..................] - ETA: 1:12 - loss: 1.5863 - regression_loss: 1.3138 - classification_loss: 0.2724 201/500 [===========>..................] - ETA: 1:12 - loss: 1.5846 - regression_loss: 1.3125 - classification_loss: 0.2721 202/500 [===========>..................] - ETA: 1:12 - loss: 1.5839 - regression_loss: 1.3120 - classification_loss: 0.2719 203/500 [===========>..................] - ETA: 1:11 - loss: 1.5819 - regression_loss: 1.3106 - classification_loss: 0.2713 204/500 [===========>..................] - ETA: 1:11 - loss: 1.5803 - regression_loss: 1.3097 - classification_loss: 0.2706 205/500 [===========>..................] - ETA: 1:11 - loss: 1.5815 - regression_loss: 1.3106 - classification_loss: 0.2709 206/500 [===========>..................] 
- ETA: 1:11 - loss: 1.5818 - regression_loss: 1.3102 - classification_loss: 0.2715 207/500 [===========>..................] - ETA: 1:10 - loss: 1.5827 - regression_loss: 1.3116 - classification_loss: 0.2711 208/500 [===========>..................] - ETA: 1:10 - loss: 1.5814 - regression_loss: 1.3108 - classification_loss: 0.2706 209/500 [===========>..................] - ETA: 1:10 - loss: 1.5783 - regression_loss: 1.3085 - classification_loss: 0.2698 210/500 [===========>..................] - ETA: 1:10 - loss: 1.5780 - regression_loss: 1.3086 - classification_loss: 0.2695 211/500 [===========>..................] - ETA: 1:09 - loss: 1.5768 - regression_loss: 1.3077 - classification_loss: 0.2691 212/500 [===========>..................] - ETA: 1:09 - loss: 1.5750 - regression_loss: 1.3064 - classification_loss: 0.2686 213/500 [===========>..................] - ETA: 1:09 - loss: 1.5770 - regression_loss: 1.3085 - classification_loss: 0.2685 214/500 [===========>..................] - ETA: 1:09 - loss: 1.5773 - regression_loss: 1.3089 - classification_loss: 0.2684 215/500 [===========>..................] - ETA: 1:09 - loss: 1.5772 - regression_loss: 1.3092 - classification_loss: 0.2680 216/500 [===========>..................] - ETA: 1:08 - loss: 1.5724 - regression_loss: 1.3053 - classification_loss: 0.2671 217/500 [============>.................] - ETA: 1:08 - loss: 1.5732 - regression_loss: 1.3060 - classification_loss: 0.2672 218/500 [============>.................] - ETA: 1:08 - loss: 1.5741 - regression_loss: 1.3067 - classification_loss: 0.2674 219/500 [============>.................] - ETA: 1:08 - loss: 1.5759 - regression_loss: 1.3086 - classification_loss: 0.2673 220/500 [============>.................] - ETA: 1:07 - loss: 1.5758 - regression_loss: 1.3086 - classification_loss: 0.2672 221/500 [============>.................] - ETA: 1:07 - loss: 1.5746 - regression_loss: 1.3078 - classification_loss: 0.2669 222/500 [============>.................] 
- ETA: 1:07 - loss: 1.5718 - regression_loss: 1.3056 - classification_loss: 0.2662 223/500 [============>.................] - ETA: 1:07 - loss: 1.5742 - regression_loss: 1.3078 - classification_loss: 0.2664 224/500 [============>.................] - ETA: 1:06 - loss: 1.5744 - regression_loss: 1.3080 - classification_loss: 0.2663 225/500 [============>.................] - ETA: 1:06 - loss: 1.5727 - regression_loss: 1.3070 - classification_loss: 0.2657 226/500 [============>.................] - ETA: 1:06 - loss: 1.5746 - regression_loss: 1.3082 - classification_loss: 0.2664 227/500 [============>.................] - ETA: 1:06 - loss: 1.5778 - regression_loss: 1.3107 - classification_loss: 0.2671 228/500 [============>.................] - ETA: 1:05 - loss: 1.5752 - regression_loss: 1.3088 - classification_loss: 0.2664 229/500 [============>.................] - ETA: 1:05 - loss: 1.5732 - regression_loss: 1.3073 - classification_loss: 0.2658 230/500 [============>.................] - ETA: 1:05 - loss: 1.5765 - regression_loss: 1.3094 - classification_loss: 0.2671 231/500 [============>.................] - ETA: 1:05 - loss: 1.5778 - regression_loss: 1.3108 - classification_loss: 0.2670 232/500 [============>.................] - ETA: 1:04 - loss: 1.5763 - regression_loss: 1.3100 - classification_loss: 0.2663 233/500 [============>.................] - ETA: 1:04 - loss: 1.5769 - regression_loss: 1.3104 - classification_loss: 0.2666 234/500 [=============>................] - ETA: 1:04 - loss: 1.5748 - regression_loss: 1.3084 - classification_loss: 0.2664 235/500 [=============>................] - ETA: 1:04 - loss: 1.5737 - regression_loss: 1.3076 - classification_loss: 0.2661 236/500 [=============>................] - ETA: 1:03 - loss: 1.5750 - regression_loss: 1.3085 - classification_loss: 0.2665 237/500 [=============>................] - ETA: 1:03 - loss: 1.5713 - regression_loss: 1.3056 - classification_loss: 0.2657 238/500 [=============>................] 
- ETA: 1:03 - loss: 1.5716 - regression_loss: 1.3061 - classification_loss: 0.2655 239/500 [=============>................] - ETA: 1:03 - loss: 1.5687 - regression_loss: 1.3039 - classification_loss: 0.2648 240/500 [=============>................] - ETA: 1:03 - loss: 1.5713 - regression_loss: 1.3063 - classification_loss: 0.2650 241/500 [=============>................] - ETA: 1:02 - loss: 1.5696 - regression_loss: 1.3053 - classification_loss: 0.2644 242/500 [=============>................] - ETA: 1:02 - loss: 1.5698 - regression_loss: 1.3056 - classification_loss: 0.2643 243/500 [=============>................] - ETA: 1:02 - loss: 1.5696 - regression_loss: 1.3049 - classification_loss: 0.2647 244/500 [=============>................] - ETA: 1:02 - loss: 1.5685 - regression_loss: 1.3044 - classification_loss: 0.2641 245/500 [=============>................] - ETA: 1:01 - loss: 1.5683 - regression_loss: 1.3045 - classification_loss: 0.2638 246/500 [=============>................] - ETA: 1:01 - loss: 1.5686 - regression_loss: 1.3049 - classification_loss: 0.2637 247/500 [=============>................] - ETA: 1:01 - loss: 1.5687 - regression_loss: 1.3053 - classification_loss: 0.2634 248/500 [=============>................] - ETA: 1:01 - loss: 1.5666 - regression_loss: 1.3039 - classification_loss: 0.2627 249/500 [=============>................] - ETA: 1:00 - loss: 1.5628 - regression_loss: 1.3010 - classification_loss: 0.2618 250/500 [==============>...............] - ETA: 1:00 - loss: 1.5647 - regression_loss: 1.3030 - classification_loss: 0.2617 251/500 [==============>...............] - ETA: 1:00 - loss: 1.5642 - regression_loss: 1.3025 - classification_loss: 0.2618 252/500 [==============>...............] - ETA: 1:00 - loss: 1.5605 - regression_loss: 1.2997 - classification_loss: 0.2609 253/500 [==============>...............] - ETA: 59s - loss: 1.5633 - regression_loss: 1.3018 - classification_loss: 0.2615  254/500 [==============>...............] 
- ETA: 59s - loss: 1.5652 - regression_loss: 1.3033 - classification_loss: 0.2618 255/500 [==============>...............] - ETA: 59s - loss: 1.5652 - regression_loss: 1.3035 - classification_loss: 0.2617 256/500 [==============>...............] - ETA: 59s - loss: 1.5646 - regression_loss: 1.3031 - classification_loss: 0.2614 257/500 [==============>...............] - ETA: 58s - loss: 1.5651 - regression_loss: 1.3037 - classification_loss: 0.2614 258/500 [==============>...............] - ETA: 58s - loss: 1.5657 - regression_loss: 1.3039 - classification_loss: 0.2617 259/500 [==============>...............] - ETA: 58s - loss: 1.5643 - regression_loss: 1.3032 - classification_loss: 0.2611 260/500 [==============>...............] - ETA: 58s - loss: 1.5670 - regression_loss: 1.3046 - classification_loss: 0.2624 261/500 [==============>...............] - ETA: 58s - loss: 1.5642 - regression_loss: 1.3026 - classification_loss: 0.2616 262/500 [==============>...............] - ETA: 57s - loss: 1.5642 - regression_loss: 1.3028 - classification_loss: 0.2614 263/500 [==============>...............] - ETA: 57s - loss: 1.5608 - regression_loss: 1.3000 - classification_loss: 0.2608 264/500 [==============>...............] - ETA: 57s - loss: 1.5589 - regression_loss: 1.2984 - classification_loss: 0.2604 265/500 [==============>...............] - ETA: 57s - loss: 1.5595 - regression_loss: 1.2989 - classification_loss: 0.2605 266/500 [==============>...............] - ETA: 56s - loss: 1.5572 - regression_loss: 1.2973 - classification_loss: 0.2599 267/500 [===============>..............] - ETA: 56s - loss: 1.5560 - regression_loss: 1.2965 - classification_loss: 0.2594 268/500 [===============>..............] - ETA: 56s - loss: 1.5580 - regression_loss: 1.2987 - classification_loss: 0.2593 269/500 [===============>..............] - ETA: 56s - loss: 1.5565 - regression_loss: 1.2978 - classification_loss: 0.2588 270/500 [===============>..............] 
- ETA: 55s - loss: 1.5563 - regression_loss: 1.2978 - classification_loss: 0.2585
[per-batch progress output for epoch 67, batches 271-499, omitted; running loss declined gradually from ~1.556 to ~1.548 over this span]
500/500 [==============================] - 122s 243ms/step - loss: 1.5473 - regression_loss: 1.2917 - classification_loss: 0.2555
326 instances of class plum with average precision: 0.7833
mAP: 0.7833
Epoch 00067: saving model to ./training/snapshots/resnet50_pascal_67.h5
Epoch 68/150
[per-batch progress output for epoch 68, batches 1-8, omitted; running loss settled near ~1.39 after the first few batches]
9/500 [..............................]
- ETA: 1:59 - loss: 1.3809 - regression_loss: 1.1488 - classification_loss: 0.2321
[per-batch progress output for epoch 68, batches 9-104, omitted; running loss rose from ~1.38 to ~1.59 around batch 63 before easing back to ~1.54]
105/500 [=====>........................]
- ETA: 1:36 - loss: 1.5422 - regression_loss: 1.3047 - classification_loss: 0.2375 106/500 [=====>........................] - ETA: 1:36 - loss: 1.5392 - regression_loss: 1.3027 - classification_loss: 0.2365 107/500 [=====>........................] - ETA: 1:36 - loss: 1.5403 - regression_loss: 1.3035 - classification_loss: 0.2367 108/500 [=====>........................] - ETA: 1:36 - loss: 1.5361 - regression_loss: 1.2999 - classification_loss: 0.2362 109/500 [=====>........................] - ETA: 1:35 - loss: 1.5329 - regression_loss: 1.2979 - classification_loss: 0.2350 110/500 [=====>........................] - ETA: 1:35 - loss: 1.5305 - regression_loss: 1.2958 - classification_loss: 0.2347 111/500 [=====>........................] - ETA: 1:35 - loss: 1.5324 - regression_loss: 1.2981 - classification_loss: 0.2343 112/500 [=====>........................] - ETA: 1:35 - loss: 1.5304 - regression_loss: 1.2963 - classification_loss: 0.2340 113/500 [=====>........................] - ETA: 1:34 - loss: 1.5313 - regression_loss: 1.2969 - classification_loss: 0.2344 114/500 [=====>........................] - ETA: 1:34 - loss: 1.5383 - regression_loss: 1.3029 - classification_loss: 0.2354 115/500 [=====>........................] - ETA: 1:34 - loss: 1.5416 - regression_loss: 1.3053 - classification_loss: 0.2362 116/500 [=====>........................] - ETA: 1:34 - loss: 1.5486 - regression_loss: 1.3112 - classification_loss: 0.2373 117/500 [======>.......................] - ETA: 1:33 - loss: 1.5409 - regression_loss: 1.3051 - classification_loss: 0.2358 118/500 [======>.......................] - ETA: 1:33 - loss: 1.5415 - regression_loss: 1.3057 - classification_loss: 0.2358 119/500 [======>.......................] - ETA: 1:33 - loss: 1.5457 - regression_loss: 1.3084 - classification_loss: 0.2374 120/500 [======>.......................] - ETA: 1:32 - loss: 1.5451 - regression_loss: 1.3075 - classification_loss: 0.2376 121/500 [======>.......................] 
- ETA: 1:32 - loss: 1.5444 - regression_loss: 1.3072 - classification_loss: 0.2372 122/500 [======>.......................] - ETA: 1:32 - loss: 1.5461 - regression_loss: 1.3082 - classification_loss: 0.2379 123/500 [======>.......................] - ETA: 1:32 - loss: 1.5567 - regression_loss: 1.3165 - classification_loss: 0.2403 124/500 [======>.......................] - ETA: 1:31 - loss: 1.5585 - regression_loss: 1.3182 - classification_loss: 0.2403 125/500 [======>.......................] - ETA: 1:31 - loss: 1.5563 - regression_loss: 1.3164 - classification_loss: 0.2399 126/500 [======>.......................] - ETA: 1:31 - loss: 1.5660 - regression_loss: 1.3259 - classification_loss: 0.2401 127/500 [======>.......................] - ETA: 1:31 - loss: 1.5655 - regression_loss: 1.3259 - classification_loss: 0.2396 128/500 [======>.......................] - ETA: 1:30 - loss: 1.5660 - regression_loss: 1.3262 - classification_loss: 0.2398 129/500 [======>.......................] - ETA: 1:30 - loss: 1.5690 - regression_loss: 1.3280 - classification_loss: 0.2410 130/500 [======>.......................] - ETA: 1:30 - loss: 1.5657 - regression_loss: 1.3253 - classification_loss: 0.2403 131/500 [======>.......................] - ETA: 1:30 - loss: 1.5657 - regression_loss: 1.3253 - classification_loss: 0.2404 132/500 [======>.......................] - ETA: 1:29 - loss: 1.5649 - regression_loss: 1.3246 - classification_loss: 0.2403 133/500 [======>.......................] - ETA: 1:29 - loss: 1.5751 - regression_loss: 1.3334 - classification_loss: 0.2417 134/500 [=======>......................] - ETA: 1:29 - loss: 1.5677 - regression_loss: 1.3272 - classification_loss: 0.2404 135/500 [=======>......................] - ETA: 1:29 - loss: 1.5728 - regression_loss: 1.3312 - classification_loss: 0.2416 136/500 [=======>......................] - ETA: 1:28 - loss: 1.5759 - regression_loss: 1.3327 - classification_loss: 0.2432 137/500 [=======>......................] 
- ETA: 1:28 - loss: 1.5705 - regression_loss: 1.3282 - classification_loss: 0.2423 138/500 [=======>......................] - ETA: 1:28 - loss: 1.5763 - regression_loss: 1.3331 - classification_loss: 0.2432 139/500 [=======>......................] - ETA: 1:28 - loss: 1.5765 - regression_loss: 1.3332 - classification_loss: 0.2433 140/500 [=======>......................] - ETA: 1:27 - loss: 1.5756 - regression_loss: 1.3327 - classification_loss: 0.2429 141/500 [=======>......................] - ETA: 1:27 - loss: 1.5725 - regression_loss: 1.3304 - classification_loss: 0.2421 142/500 [=======>......................] - ETA: 1:27 - loss: 1.5720 - regression_loss: 1.3298 - classification_loss: 0.2422 143/500 [=======>......................] - ETA: 1:27 - loss: 1.5716 - regression_loss: 1.3298 - classification_loss: 0.2418 144/500 [=======>......................] - ETA: 1:26 - loss: 1.5713 - regression_loss: 1.3294 - classification_loss: 0.2419 145/500 [=======>......................] - ETA: 1:26 - loss: 1.5638 - regression_loss: 1.3232 - classification_loss: 0.2406 146/500 [=======>......................] - ETA: 1:26 - loss: 1.5626 - regression_loss: 1.3223 - classification_loss: 0.2403 147/500 [=======>......................] - ETA: 1:26 - loss: 1.5623 - regression_loss: 1.3223 - classification_loss: 0.2400 148/500 [=======>......................] - ETA: 1:25 - loss: 1.5625 - regression_loss: 1.3225 - classification_loss: 0.2401 149/500 [=======>......................] - ETA: 1:25 - loss: 1.5624 - regression_loss: 1.3225 - classification_loss: 0.2398 150/500 [========>.....................] - ETA: 1:25 - loss: 1.5651 - regression_loss: 1.3249 - classification_loss: 0.2402 151/500 [========>.....................] - ETA: 1:25 - loss: 1.5672 - regression_loss: 1.3268 - classification_loss: 0.2404 152/500 [========>.....................] - ETA: 1:24 - loss: 1.5713 - regression_loss: 1.3304 - classification_loss: 0.2409 153/500 [========>.....................] 
- ETA: 1:24 - loss: 1.5734 - regression_loss: 1.3319 - classification_loss: 0.2416 154/500 [========>.....................] - ETA: 1:24 - loss: 1.5766 - regression_loss: 1.3347 - classification_loss: 0.2419 155/500 [========>.....................] - ETA: 1:24 - loss: 1.5766 - regression_loss: 1.3349 - classification_loss: 0.2417 156/500 [========>.....................] - ETA: 1:23 - loss: 1.5776 - regression_loss: 1.3362 - classification_loss: 0.2414 157/500 [========>.....................] - ETA: 1:23 - loss: 1.5887 - regression_loss: 1.3450 - classification_loss: 0.2437 158/500 [========>.....................] - ETA: 1:23 - loss: 1.5889 - regression_loss: 1.3448 - classification_loss: 0.2441 159/500 [========>.....................] - ETA: 1:23 - loss: 1.5816 - regression_loss: 1.3387 - classification_loss: 0.2429 160/500 [========>.....................] - ETA: 1:22 - loss: 1.5915 - regression_loss: 1.3466 - classification_loss: 0.2449 161/500 [========>.....................] - ETA: 1:22 - loss: 1.5896 - regression_loss: 1.3452 - classification_loss: 0.2444 162/500 [========>.....................] - ETA: 1:22 - loss: 1.5888 - regression_loss: 1.3447 - classification_loss: 0.2441 163/500 [========>.....................] - ETA: 1:22 - loss: 1.5896 - regression_loss: 1.3456 - classification_loss: 0.2440 164/500 [========>.....................] - ETA: 1:21 - loss: 1.5871 - regression_loss: 1.3437 - classification_loss: 0.2434 165/500 [========>.....................] - ETA: 1:21 - loss: 1.5866 - regression_loss: 1.3432 - classification_loss: 0.2434 166/500 [========>.....................] - ETA: 1:21 - loss: 1.5888 - regression_loss: 1.3448 - classification_loss: 0.2440 167/500 [=========>....................] - ETA: 1:21 - loss: 1.5856 - regression_loss: 1.3425 - classification_loss: 0.2431 168/500 [=========>....................] - ETA: 1:20 - loss: 1.5864 - regression_loss: 1.3430 - classification_loss: 0.2433 169/500 [=========>....................] 
- ETA: 1:20 - loss: 1.5868 - regression_loss: 1.3438 - classification_loss: 0.2430 170/500 [=========>....................] - ETA: 1:20 - loss: 1.5802 - regression_loss: 1.3382 - classification_loss: 0.2419 171/500 [=========>....................] - ETA: 1:20 - loss: 1.5805 - regression_loss: 1.3383 - classification_loss: 0.2423 172/500 [=========>....................] - ETA: 1:19 - loss: 1.5781 - regression_loss: 1.3364 - classification_loss: 0.2417 173/500 [=========>....................] - ETA: 1:19 - loss: 1.5760 - regression_loss: 1.3343 - classification_loss: 0.2417 174/500 [=========>....................] - ETA: 1:19 - loss: 1.5735 - regression_loss: 1.3326 - classification_loss: 0.2409 175/500 [=========>....................] - ETA: 1:19 - loss: 1.5689 - regression_loss: 1.3288 - classification_loss: 0.2400 176/500 [=========>....................] - ETA: 1:18 - loss: 1.5719 - regression_loss: 1.3310 - classification_loss: 0.2410 177/500 [=========>....................] - ETA: 1:18 - loss: 1.5703 - regression_loss: 1.3293 - classification_loss: 0.2410 178/500 [=========>....................] - ETA: 1:18 - loss: 1.5665 - regression_loss: 1.3263 - classification_loss: 0.2402 179/500 [=========>....................] - ETA: 1:18 - loss: 1.5702 - regression_loss: 1.3298 - classification_loss: 0.2404 180/500 [=========>....................] - ETA: 1:17 - loss: 1.5738 - regression_loss: 1.3306 - classification_loss: 0.2432 181/500 [=========>....................] - ETA: 1:17 - loss: 1.5782 - regression_loss: 1.3336 - classification_loss: 0.2446 182/500 [=========>....................] - ETA: 1:17 - loss: 1.5779 - regression_loss: 1.3334 - classification_loss: 0.2445 183/500 [=========>....................] - ETA: 1:17 - loss: 1.5778 - regression_loss: 1.3337 - classification_loss: 0.2441 184/500 [==========>...................] - ETA: 1:17 - loss: 1.5782 - regression_loss: 1.3340 - classification_loss: 0.2442 185/500 [==========>...................] 
- ETA: 1:16 - loss: 1.5747 - regression_loss: 1.3311 - classification_loss: 0.2436 186/500 [==========>...................] - ETA: 1:16 - loss: 1.5762 - regression_loss: 1.3321 - classification_loss: 0.2441 187/500 [==========>...................] - ETA: 1:16 - loss: 1.5751 - regression_loss: 1.3310 - classification_loss: 0.2441 188/500 [==========>...................] - ETA: 1:16 - loss: 1.5760 - regression_loss: 1.3314 - classification_loss: 0.2446 189/500 [==========>...................] - ETA: 1:15 - loss: 1.5743 - regression_loss: 1.3303 - classification_loss: 0.2440 190/500 [==========>...................] - ETA: 1:15 - loss: 1.5721 - regression_loss: 1.3282 - classification_loss: 0.2439 191/500 [==========>...................] - ETA: 1:15 - loss: 1.5789 - regression_loss: 1.3341 - classification_loss: 0.2448 192/500 [==========>...................] - ETA: 1:15 - loss: 1.5797 - regression_loss: 1.3346 - classification_loss: 0.2451 193/500 [==========>...................] - ETA: 1:14 - loss: 1.5837 - regression_loss: 1.3383 - classification_loss: 0.2454 194/500 [==========>...................] - ETA: 1:14 - loss: 1.5824 - regression_loss: 1.3372 - classification_loss: 0.2451 195/500 [==========>...................] - ETA: 1:14 - loss: 1.5820 - regression_loss: 1.3368 - classification_loss: 0.2452 196/500 [==========>...................] - ETA: 1:14 - loss: 1.5849 - regression_loss: 1.3387 - classification_loss: 0.2462 197/500 [==========>...................] - ETA: 1:13 - loss: 1.5851 - regression_loss: 1.3392 - classification_loss: 0.2459 198/500 [==========>...................] - ETA: 1:13 - loss: 1.5816 - regression_loss: 1.3362 - classification_loss: 0.2453 199/500 [==========>...................] - ETA: 1:13 - loss: 1.5822 - regression_loss: 1.3369 - classification_loss: 0.2453 200/500 [===========>..................] - ETA: 1:13 - loss: 1.5845 - regression_loss: 1.3388 - classification_loss: 0.2457 201/500 [===========>..................] 
- ETA: 1:12 - loss: 1.5845 - regression_loss: 1.3389 - classification_loss: 0.2456 202/500 [===========>..................] - ETA: 1:12 - loss: 1.5834 - regression_loss: 1.3377 - classification_loss: 0.2458 203/500 [===========>..................] - ETA: 1:12 - loss: 1.5835 - regression_loss: 1.3380 - classification_loss: 0.2456 204/500 [===========>..................] - ETA: 1:11 - loss: 1.5859 - regression_loss: 1.3403 - classification_loss: 0.2457 205/500 [===========>..................] - ETA: 1:11 - loss: 1.5832 - regression_loss: 1.3382 - classification_loss: 0.2450 206/500 [===========>..................] - ETA: 1:11 - loss: 1.5825 - regression_loss: 1.3377 - classification_loss: 0.2448 207/500 [===========>..................] - ETA: 1:11 - loss: 1.5798 - regression_loss: 1.3355 - classification_loss: 0.2443 208/500 [===========>..................] - ETA: 1:10 - loss: 1.5795 - regression_loss: 1.3354 - classification_loss: 0.2441 209/500 [===========>..................] - ETA: 1:10 - loss: 1.5777 - regression_loss: 1.3340 - classification_loss: 0.2437 210/500 [===========>..................] - ETA: 1:10 - loss: 1.5770 - regression_loss: 1.3335 - classification_loss: 0.2435 211/500 [===========>..................] - ETA: 1:10 - loss: 1.5779 - regression_loss: 1.3343 - classification_loss: 0.2436 212/500 [===========>..................] - ETA: 1:09 - loss: 1.5763 - regression_loss: 1.3330 - classification_loss: 0.2433 213/500 [===========>..................] - ETA: 1:09 - loss: 1.5773 - regression_loss: 1.3341 - classification_loss: 0.2432 214/500 [===========>..................] - ETA: 1:09 - loss: 1.5737 - regression_loss: 1.3311 - classification_loss: 0.2426 215/500 [===========>..................] - ETA: 1:09 - loss: 1.5783 - regression_loss: 1.3353 - classification_loss: 0.2430 216/500 [===========>..................] - ETA: 1:08 - loss: 1.5764 - regression_loss: 1.3333 - classification_loss: 0.2430 217/500 [============>.................] 
- ETA: 1:08 - loss: 1.5729 - regression_loss: 1.3303 - classification_loss: 0.2426 218/500 [============>.................] - ETA: 1:08 - loss: 1.5703 - regression_loss: 1.3283 - classification_loss: 0.2420 219/500 [============>.................] - ETA: 1:08 - loss: 1.5687 - regression_loss: 1.3270 - classification_loss: 0.2417 220/500 [============>.................] - ETA: 1:07 - loss: 1.5683 - regression_loss: 1.3266 - classification_loss: 0.2417 221/500 [============>.................] - ETA: 1:07 - loss: 1.5723 - regression_loss: 1.3298 - classification_loss: 0.2425 222/500 [============>.................] - ETA: 1:07 - loss: 1.5714 - regression_loss: 1.3291 - classification_loss: 0.2424 223/500 [============>.................] - ETA: 1:07 - loss: 1.5686 - regression_loss: 1.3269 - classification_loss: 0.2417 224/500 [============>.................] - ETA: 1:06 - loss: 1.5682 - regression_loss: 1.3269 - classification_loss: 0.2413 225/500 [============>.................] - ETA: 1:06 - loss: 1.5661 - regression_loss: 1.3254 - classification_loss: 0.2408 226/500 [============>.................] - ETA: 1:06 - loss: 1.5645 - regression_loss: 1.3235 - classification_loss: 0.2410 227/500 [============>.................] - ETA: 1:06 - loss: 1.5635 - regression_loss: 1.3228 - classification_loss: 0.2406 228/500 [============>.................] - ETA: 1:05 - loss: 1.5640 - regression_loss: 1.3233 - classification_loss: 0.2407 229/500 [============>.................] - ETA: 1:05 - loss: 1.5650 - regression_loss: 1.3238 - classification_loss: 0.2411 230/500 [============>.................] - ETA: 1:05 - loss: 1.5668 - regression_loss: 1.3248 - classification_loss: 0.2420 231/500 [============>.................] - ETA: 1:05 - loss: 1.5685 - regression_loss: 1.3258 - classification_loss: 0.2427 232/500 [============>.................] - ETA: 1:04 - loss: 1.5688 - regression_loss: 1.3262 - classification_loss: 0.2425 233/500 [============>.................] 
- ETA: 1:04 - loss: 1.5713 - regression_loss: 1.3279 - classification_loss: 0.2433 234/500 [=============>................] - ETA: 1:04 - loss: 1.5708 - regression_loss: 1.3276 - classification_loss: 0.2432 235/500 [=============>................] - ETA: 1:04 - loss: 1.5715 - regression_loss: 1.3279 - classification_loss: 0.2436 236/500 [=============>................] - ETA: 1:03 - loss: 1.5720 - regression_loss: 1.3284 - classification_loss: 0.2436 237/500 [=============>................] - ETA: 1:03 - loss: 1.5747 - regression_loss: 1.3306 - classification_loss: 0.2441 238/500 [=============>................] - ETA: 1:03 - loss: 1.5743 - regression_loss: 1.3304 - classification_loss: 0.2438 239/500 [=============>................] - ETA: 1:03 - loss: 1.5752 - regression_loss: 1.3308 - classification_loss: 0.2443 240/500 [=============>................] - ETA: 1:02 - loss: 1.5762 - regression_loss: 1.3316 - classification_loss: 0.2445 241/500 [=============>................] - ETA: 1:02 - loss: 1.5753 - regression_loss: 1.3311 - classification_loss: 0.2443 242/500 [=============>................] - ETA: 1:02 - loss: 1.5742 - regression_loss: 1.3302 - classification_loss: 0.2441 243/500 [=============>................] - ETA: 1:02 - loss: 1.5762 - regression_loss: 1.3320 - classification_loss: 0.2442 244/500 [=============>................] - ETA: 1:01 - loss: 1.5734 - regression_loss: 1.3297 - classification_loss: 0.2437 245/500 [=============>................] - ETA: 1:01 - loss: 1.5770 - regression_loss: 1.3329 - classification_loss: 0.2440 246/500 [=============>................] - ETA: 1:01 - loss: 1.5758 - regression_loss: 1.3322 - classification_loss: 0.2437 247/500 [=============>................] - ETA: 1:01 - loss: 1.5747 - regression_loss: 1.3311 - classification_loss: 0.2436 248/500 [=============>................] - ETA: 1:00 - loss: 1.5756 - regression_loss: 1.3322 - classification_loss: 0.2434 249/500 [=============>................] 
- ETA: 1:00 - loss: 1.5748 - regression_loss: 1.3315 - classification_loss: 0.2433 250/500 [==============>...............] - ETA: 1:00 - loss: 1.5758 - regression_loss: 1.3323 - classification_loss: 0.2435 251/500 [==============>...............] - ETA: 1:00 - loss: 1.5768 - regression_loss: 1.3328 - classification_loss: 0.2440 252/500 [==============>...............] - ETA: 59s - loss: 1.5823 - regression_loss: 1.3367 - classification_loss: 0.2456  253/500 [==============>...............] - ETA: 59s - loss: 1.5824 - regression_loss: 1.3370 - classification_loss: 0.2454 254/500 [==============>...............] - ETA: 59s - loss: 1.5813 - regression_loss: 1.3362 - classification_loss: 0.2451 255/500 [==============>...............] - ETA: 59s - loss: 1.5801 - regression_loss: 1.3353 - classification_loss: 0.2448 256/500 [==============>...............] - ETA: 58s - loss: 1.5801 - regression_loss: 1.3352 - classification_loss: 0.2449 257/500 [==============>...............] - ETA: 58s - loss: 1.5803 - regression_loss: 1.3354 - classification_loss: 0.2449 258/500 [==============>...............] - ETA: 58s - loss: 1.5784 - regression_loss: 1.3340 - classification_loss: 0.2444 259/500 [==============>...............] - ETA: 58s - loss: 1.5782 - regression_loss: 1.3339 - classification_loss: 0.2443 260/500 [==============>...............] - ETA: 57s - loss: 1.5790 - regression_loss: 1.3343 - classification_loss: 0.2447 261/500 [==============>...............] - ETA: 57s - loss: 1.5791 - regression_loss: 1.3346 - classification_loss: 0.2445 262/500 [==============>...............] - ETA: 57s - loss: 1.5780 - regression_loss: 1.3336 - classification_loss: 0.2444 263/500 [==============>...............] - ETA: 57s - loss: 1.5775 - regression_loss: 1.3333 - classification_loss: 0.2442 264/500 [==============>...............] - ETA: 57s - loss: 1.5736 - regression_loss: 1.3298 - classification_loss: 0.2438 265/500 [==============>...............] 
- ETA: 56s - loss: 1.5696 - regression_loss: 1.3266 - classification_loss: 0.2430 266/500 [==============>...............] - ETA: 56s - loss: 1.5714 - regression_loss: 1.3279 - classification_loss: 0.2434 267/500 [===============>..............] - ETA: 56s - loss: 1.5733 - regression_loss: 1.3292 - classification_loss: 0.2441 268/500 [===============>..............] - ETA: 56s - loss: 1.5725 - regression_loss: 1.3289 - classification_loss: 0.2436 269/500 [===============>..............] - ETA: 55s - loss: 1.5707 - regression_loss: 1.3276 - classification_loss: 0.2431 270/500 [===============>..............] - ETA: 55s - loss: 1.5705 - regression_loss: 1.3275 - classification_loss: 0.2430 271/500 [===============>..............] - ETA: 55s - loss: 1.5717 - regression_loss: 1.3287 - classification_loss: 0.2430 272/500 [===============>..............] - ETA: 55s - loss: 1.5690 - regression_loss: 1.3266 - classification_loss: 0.2424 273/500 [===============>..............] - ETA: 54s - loss: 1.5679 - regression_loss: 1.3257 - classification_loss: 0.2421 274/500 [===============>..............] - ETA: 54s - loss: 1.5648 - regression_loss: 1.3233 - classification_loss: 0.2415 275/500 [===============>..............] - ETA: 54s - loss: 1.5637 - regression_loss: 1.3226 - classification_loss: 0.2412 276/500 [===============>..............] - ETA: 54s - loss: 1.5647 - regression_loss: 1.3233 - classification_loss: 0.2415 277/500 [===============>..............] - ETA: 53s - loss: 1.5643 - regression_loss: 1.3230 - classification_loss: 0.2412 278/500 [===============>..............] - ETA: 53s - loss: 1.5650 - regression_loss: 1.3225 - classification_loss: 0.2426 279/500 [===============>..............] - ETA: 53s - loss: 1.5646 - regression_loss: 1.3222 - classification_loss: 0.2424 280/500 [===============>..............] - ETA: 53s - loss: 1.5635 - regression_loss: 1.3216 - classification_loss: 0.2419 281/500 [===============>..............] 
- ETA: 52s - loss: 1.5624 - regression_loss: 1.3209 - classification_loss: 0.2415 282/500 [===============>..............] - ETA: 52s - loss: 1.5623 - regression_loss: 1.3208 - classification_loss: 0.2415 283/500 [===============>..............] - ETA: 52s - loss: 1.5619 - regression_loss: 1.3205 - classification_loss: 0.2414 284/500 [================>.............] - ETA: 52s - loss: 1.5599 - regression_loss: 1.3190 - classification_loss: 0.2409 285/500 [================>.............] - ETA: 51s - loss: 1.5590 - regression_loss: 1.3183 - classification_loss: 0.2406 286/500 [================>.............] - ETA: 51s - loss: 1.5584 - regression_loss: 1.3181 - classification_loss: 0.2403 287/500 [================>.............] - ETA: 51s - loss: 1.5570 - regression_loss: 1.3170 - classification_loss: 0.2401 288/500 [================>.............] - ETA: 51s - loss: 1.5549 - regression_loss: 1.3153 - classification_loss: 0.2395 289/500 [================>.............] - ETA: 50s - loss: 1.5575 - regression_loss: 1.3168 - classification_loss: 0.2407 290/500 [================>.............] - ETA: 50s - loss: 1.5571 - regression_loss: 1.3166 - classification_loss: 0.2405 291/500 [================>.............] - ETA: 50s - loss: 1.5594 - regression_loss: 1.3188 - classification_loss: 0.2406 292/500 [================>.............] - ETA: 50s - loss: 1.5591 - regression_loss: 1.3185 - classification_loss: 0.2407 293/500 [================>.............] - ETA: 49s - loss: 1.5587 - regression_loss: 1.3183 - classification_loss: 0.2404 294/500 [================>.............] - ETA: 49s - loss: 1.5583 - regression_loss: 1.3181 - classification_loss: 0.2402 295/500 [================>.............] - ETA: 49s - loss: 1.5577 - regression_loss: 1.3179 - classification_loss: 0.2398 296/500 [================>.............] - ETA: 49s - loss: 1.5596 - regression_loss: 1.3191 - classification_loss: 0.2406 297/500 [================>.............] 
- ETA: 48s - loss: 1.5579 - regression_loss: 1.3177 - classification_loss: 0.2402 298/500 [================>.............] - ETA: 48s - loss: 1.5572 - regression_loss: 1.3172 - classification_loss: 0.2400 299/500 [================>.............] - ETA: 48s - loss: 1.5557 - regression_loss: 1.3158 - classification_loss: 0.2399 300/500 [=================>............] - ETA: 48s - loss: 1.5547 - regression_loss: 1.3152 - classification_loss: 0.2396 301/500 [=================>............] - ETA: 48s - loss: 1.5572 - regression_loss: 1.3170 - classification_loss: 0.2402 302/500 [=================>............] - ETA: 47s - loss: 1.5572 - regression_loss: 1.3170 - classification_loss: 0.2402 303/500 [=================>............] - ETA: 47s - loss: 1.5545 - regression_loss: 1.3147 - classification_loss: 0.2398 304/500 [=================>............] - ETA: 47s - loss: 1.5530 - regression_loss: 1.3135 - classification_loss: 0.2395 305/500 [=================>............] - ETA: 47s - loss: 1.5523 - regression_loss: 1.3131 - classification_loss: 0.2392 306/500 [=================>............] - ETA: 46s - loss: 1.5508 - regression_loss: 1.3118 - classification_loss: 0.2389 307/500 [=================>............] - ETA: 46s - loss: 1.5524 - regression_loss: 1.3130 - classification_loss: 0.2393 308/500 [=================>............] - ETA: 46s - loss: 1.5521 - regression_loss: 1.3126 - classification_loss: 0.2394 309/500 [=================>............] - ETA: 46s - loss: 1.5528 - regression_loss: 1.3133 - classification_loss: 0.2395 310/500 [=================>............] - ETA: 45s - loss: 1.5553 - regression_loss: 1.3153 - classification_loss: 0.2400 311/500 [=================>............] - ETA: 45s - loss: 1.5524 - regression_loss: 1.3130 - classification_loss: 0.2394 312/500 [=================>............] - ETA: 45s - loss: 1.5512 - regression_loss: 1.3122 - classification_loss: 0.2391 313/500 [=================>............] 
- ETA: 45s - loss: 1.5513 - regression_loss: 1.3124 - classification_loss: 0.2390
[per-step progress output elided: steps 314–499 of epoch 68; loss held steady around 1.53–1.55]
500/500 [==============================] - 122s 243ms/step - loss: 1.5461 - regression_loss: 1.3045 - classification_loss: 0.2416
326 instances of class plum with average precision: 0.7736
mAP: 0.7736
Epoch 00068: saving model to ./training/snapshots/resnet50_pascal_68.h5
Epoch 69/150
[per-step progress output elided: steps 1–147 of epoch 69; loss fluctuated between roughly 1.16 and 1.66, near 1.59 at step 147]
148/500 [=======>......................]
- ETA: 1:27 - loss: 1.5921 - regression_loss: 1.3315 - classification_loss: 0.2606 149/500 [=======>......................] - ETA: 1:27 - loss: 1.5936 - regression_loss: 1.3327 - classification_loss: 0.2609 150/500 [========>.....................] - ETA: 1:26 - loss: 1.5892 - regression_loss: 1.3293 - classification_loss: 0.2599 151/500 [========>.....................] - ETA: 1:26 - loss: 1.5920 - regression_loss: 1.3321 - classification_loss: 0.2599 152/500 [========>.....................] - ETA: 1:26 - loss: 1.5977 - regression_loss: 1.3367 - classification_loss: 0.2610 153/500 [========>.....................] - ETA: 1:26 - loss: 1.5986 - regression_loss: 1.3381 - classification_loss: 0.2605 154/500 [========>.....................] - ETA: 1:25 - loss: 1.5977 - regression_loss: 1.3374 - classification_loss: 0.2603 155/500 [========>.....................] - ETA: 1:25 - loss: 1.6019 - regression_loss: 1.3404 - classification_loss: 0.2615 156/500 [========>.....................] - ETA: 1:25 - loss: 1.5979 - regression_loss: 1.3373 - classification_loss: 0.2607 157/500 [========>.....................] - ETA: 1:25 - loss: 1.6011 - regression_loss: 1.3399 - classification_loss: 0.2612 158/500 [========>.....................] - ETA: 1:24 - loss: 1.6032 - regression_loss: 1.3416 - classification_loss: 0.2615 159/500 [========>.....................] - ETA: 1:24 - loss: 1.6005 - regression_loss: 1.3397 - classification_loss: 0.2608 160/500 [========>.....................] - ETA: 1:24 - loss: 1.5978 - regression_loss: 1.3378 - classification_loss: 0.2600 161/500 [========>.....................] - ETA: 1:24 - loss: 1.5954 - regression_loss: 1.3352 - classification_loss: 0.2602 162/500 [========>.....................] - ETA: 1:23 - loss: 1.5947 - regression_loss: 1.3341 - classification_loss: 0.2605 163/500 [========>.....................] - ETA: 1:23 - loss: 1.5976 - regression_loss: 1.3363 - classification_loss: 0.2613 164/500 [========>.....................] 
- ETA: 1:23 - loss: 1.6013 - regression_loss: 1.3392 - classification_loss: 0.2621 165/500 [========>.....................] - ETA: 1:23 - loss: 1.5963 - regression_loss: 1.3349 - classification_loss: 0.2614 166/500 [========>.....................] - ETA: 1:22 - loss: 1.5954 - regression_loss: 1.3341 - classification_loss: 0.2614 167/500 [=========>....................] - ETA: 1:22 - loss: 1.6096 - regression_loss: 1.3445 - classification_loss: 0.2651 168/500 [=========>....................] - ETA: 1:22 - loss: 1.6103 - regression_loss: 1.3448 - classification_loss: 0.2656 169/500 [=========>....................] - ETA: 1:22 - loss: 1.6121 - regression_loss: 1.3463 - classification_loss: 0.2658 170/500 [=========>....................] - ETA: 1:21 - loss: 1.6154 - regression_loss: 1.3477 - classification_loss: 0.2678 171/500 [=========>....................] - ETA: 1:21 - loss: 1.6143 - regression_loss: 1.3467 - classification_loss: 0.2677 172/500 [=========>....................] - ETA: 1:21 - loss: 1.6122 - regression_loss: 1.3446 - classification_loss: 0.2676 173/500 [=========>....................] - ETA: 1:21 - loss: 1.6067 - regression_loss: 1.3403 - classification_loss: 0.2664 174/500 [=========>....................] - ETA: 1:20 - loss: 1.6034 - regression_loss: 1.3375 - classification_loss: 0.2659 175/500 [=========>....................] - ETA: 1:20 - loss: 1.6017 - regression_loss: 1.3363 - classification_loss: 0.2654 176/500 [=========>....................] - ETA: 1:20 - loss: 1.5981 - regression_loss: 1.3336 - classification_loss: 0.2645 177/500 [=========>....................] - ETA: 1:20 - loss: 1.5978 - regression_loss: 1.3337 - classification_loss: 0.2641 178/500 [=========>....................] - ETA: 1:19 - loss: 1.5948 - regression_loss: 1.3313 - classification_loss: 0.2636 179/500 [=========>....................] - ETA: 1:19 - loss: 1.5985 - regression_loss: 1.3341 - classification_loss: 0.2644 180/500 [=========>....................] 
- ETA: 1:19 - loss: 1.6012 - regression_loss: 1.3362 - classification_loss: 0.2650 181/500 [=========>....................] - ETA: 1:19 - loss: 1.6011 - regression_loss: 1.3360 - classification_loss: 0.2651 182/500 [=========>....................] - ETA: 1:18 - loss: 1.6019 - regression_loss: 1.3371 - classification_loss: 0.2648 183/500 [=========>....................] - ETA: 1:18 - loss: 1.6012 - regression_loss: 1.3371 - classification_loss: 0.2641 184/500 [==========>...................] - ETA: 1:18 - loss: 1.6056 - regression_loss: 1.3409 - classification_loss: 0.2647 185/500 [==========>...................] - ETA: 1:18 - loss: 1.6010 - regression_loss: 1.3369 - classification_loss: 0.2641 186/500 [==========>...................] - ETA: 1:17 - loss: 1.6046 - regression_loss: 1.3395 - classification_loss: 0.2652 187/500 [==========>...................] - ETA: 1:17 - loss: 1.5989 - regression_loss: 1.3348 - classification_loss: 0.2641 188/500 [==========>...................] - ETA: 1:17 - loss: 1.5956 - regression_loss: 1.3321 - classification_loss: 0.2635 189/500 [==========>...................] - ETA: 1:17 - loss: 1.5972 - regression_loss: 1.3336 - classification_loss: 0.2636 190/500 [==========>...................] - ETA: 1:16 - loss: 1.5999 - regression_loss: 1.3366 - classification_loss: 0.2633 191/500 [==========>...................] - ETA: 1:16 - loss: 1.6000 - regression_loss: 1.3370 - classification_loss: 0.2630 192/500 [==========>...................] - ETA: 1:16 - loss: 1.5989 - regression_loss: 1.3363 - classification_loss: 0.2626 193/500 [==========>...................] - ETA: 1:16 - loss: 1.5925 - regression_loss: 1.3311 - classification_loss: 0.2614 194/500 [==========>...................] - ETA: 1:15 - loss: 1.5931 - regression_loss: 1.3320 - classification_loss: 0.2611 195/500 [==========>...................] - ETA: 1:15 - loss: 1.5950 - regression_loss: 1.3336 - classification_loss: 0.2614 196/500 [==========>...................] 
- ETA: 1:15 - loss: 1.5972 - regression_loss: 1.3349 - classification_loss: 0.2623 197/500 [==========>...................] - ETA: 1:15 - loss: 1.5970 - regression_loss: 1.3349 - classification_loss: 0.2621 198/500 [==========>...................] - ETA: 1:14 - loss: 1.5943 - regression_loss: 1.3328 - classification_loss: 0.2615 199/500 [==========>...................] - ETA: 1:14 - loss: 1.5879 - regression_loss: 1.3276 - classification_loss: 0.2604 200/500 [===========>..................] - ETA: 1:14 - loss: 1.5879 - regression_loss: 1.3262 - classification_loss: 0.2617 201/500 [===========>..................] - ETA: 1:14 - loss: 1.5874 - regression_loss: 1.3258 - classification_loss: 0.2615 202/500 [===========>..................] - ETA: 1:13 - loss: 1.5890 - regression_loss: 1.3273 - classification_loss: 0.2618 203/500 [===========>..................] - ETA: 1:13 - loss: 1.5902 - regression_loss: 1.3285 - classification_loss: 0.2617 204/500 [===========>..................] - ETA: 1:13 - loss: 1.5916 - regression_loss: 1.3298 - classification_loss: 0.2618 205/500 [===========>..................] - ETA: 1:13 - loss: 1.5926 - regression_loss: 1.3307 - classification_loss: 0.2619 206/500 [===========>..................] - ETA: 1:12 - loss: 1.5902 - regression_loss: 1.3286 - classification_loss: 0.2616 207/500 [===========>..................] - ETA: 1:12 - loss: 1.5904 - regression_loss: 1.3285 - classification_loss: 0.2619 208/500 [===========>..................] - ETA: 1:12 - loss: 1.5898 - regression_loss: 1.3285 - classification_loss: 0.2613 209/500 [===========>..................] - ETA: 1:12 - loss: 1.5875 - regression_loss: 1.3267 - classification_loss: 0.2608 210/500 [===========>..................] - ETA: 1:11 - loss: 1.5900 - regression_loss: 1.3285 - classification_loss: 0.2615 211/500 [===========>..................] - ETA: 1:11 - loss: 1.5902 - regression_loss: 1.3288 - classification_loss: 0.2614 212/500 [===========>..................] 
- ETA: 1:11 - loss: 1.5899 - regression_loss: 1.3288 - classification_loss: 0.2611 213/500 [===========>..................] - ETA: 1:11 - loss: 1.5886 - regression_loss: 1.3279 - classification_loss: 0.2607 214/500 [===========>..................] - ETA: 1:10 - loss: 1.5937 - regression_loss: 1.3315 - classification_loss: 0.2622 215/500 [===========>..................] - ETA: 1:10 - loss: 1.5927 - regression_loss: 1.3307 - classification_loss: 0.2620 216/500 [===========>..................] - ETA: 1:10 - loss: 1.5929 - regression_loss: 1.3306 - classification_loss: 0.2623 217/500 [============>.................] - ETA: 1:10 - loss: 1.5919 - regression_loss: 1.3300 - classification_loss: 0.2619 218/500 [============>.................] - ETA: 1:09 - loss: 1.5869 - regression_loss: 1.3260 - classification_loss: 0.2609 219/500 [============>.................] - ETA: 1:09 - loss: 1.5851 - regression_loss: 1.3245 - classification_loss: 0.2605 220/500 [============>.................] - ETA: 1:09 - loss: 1.5837 - regression_loss: 1.3236 - classification_loss: 0.2601 221/500 [============>.................] - ETA: 1:09 - loss: 1.5799 - regression_loss: 1.3206 - classification_loss: 0.2593 222/500 [============>.................] - ETA: 1:08 - loss: 1.5796 - regression_loss: 1.3207 - classification_loss: 0.2590 223/500 [============>.................] - ETA: 1:08 - loss: 1.5798 - regression_loss: 1.3209 - classification_loss: 0.2589 224/500 [============>.................] - ETA: 1:08 - loss: 1.5777 - regression_loss: 1.3192 - classification_loss: 0.2584 225/500 [============>.................] - ETA: 1:08 - loss: 1.5793 - regression_loss: 1.3205 - classification_loss: 0.2587 226/500 [============>.................] - ETA: 1:07 - loss: 1.5809 - regression_loss: 1.3216 - classification_loss: 0.2593 227/500 [============>.................] - ETA: 1:07 - loss: 1.5782 - regression_loss: 1.3194 - classification_loss: 0.2587 228/500 [============>.................] 
- ETA: 1:07 - loss: 1.5787 - regression_loss: 1.3205 - classification_loss: 0.2582 229/500 [============>.................] - ETA: 1:07 - loss: 1.5800 - regression_loss: 1.3211 - classification_loss: 0.2589 230/500 [============>.................] - ETA: 1:06 - loss: 1.5812 - regression_loss: 1.3223 - classification_loss: 0.2589 231/500 [============>.................] - ETA: 1:06 - loss: 1.5816 - regression_loss: 1.3227 - classification_loss: 0.2588 232/500 [============>.................] - ETA: 1:06 - loss: 1.5798 - regression_loss: 1.3216 - classification_loss: 0.2583 233/500 [============>.................] - ETA: 1:06 - loss: 1.5805 - regression_loss: 1.3221 - classification_loss: 0.2583 234/500 [=============>................] - ETA: 1:05 - loss: 1.5789 - regression_loss: 1.3210 - classification_loss: 0.2579 235/500 [=============>................] - ETA: 1:05 - loss: 1.5807 - regression_loss: 1.3225 - classification_loss: 0.2582 236/500 [=============>................] - ETA: 1:05 - loss: 1.5834 - regression_loss: 1.3251 - classification_loss: 0.2583 237/500 [=============>................] - ETA: 1:05 - loss: 1.5840 - regression_loss: 1.3253 - classification_loss: 0.2587 238/500 [=============>................] - ETA: 1:04 - loss: 1.5819 - regression_loss: 1.3237 - classification_loss: 0.2582 239/500 [=============>................] - ETA: 1:04 - loss: 1.5809 - regression_loss: 1.3232 - classification_loss: 0.2577 240/500 [=============>................] - ETA: 1:04 - loss: 1.5788 - regression_loss: 1.3215 - classification_loss: 0.2573 241/500 [=============>................] - ETA: 1:04 - loss: 1.5759 - regression_loss: 1.3193 - classification_loss: 0.2566 242/500 [=============>................] - ETA: 1:03 - loss: 1.5753 - regression_loss: 1.3190 - classification_loss: 0.2563 243/500 [=============>................] - ETA: 1:03 - loss: 1.5735 - regression_loss: 1.3176 - classification_loss: 0.2559 244/500 [=============>................] 
- ETA: 1:03 - loss: 1.5732 - regression_loss: 1.3171 - classification_loss: 0.2561 245/500 [=============>................] - ETA: 1:03 - loss: 1.5775 - regression_loss: 1.3205 - classification_loss: 0.2571 246/500 [=============>................] - ETA: 1:02 - loss: 1.5778 - regression_loss: 1.3204 - classification_loss: 0.2575 247/500 [=============>................] - ETA: 1:02 - loss: 1.5770 - regression_loss: 1.3195 - classification_loss: 0.2574 248/500 [=============>................] - ETA: 1:02 - loss: 1.5740 - regression_loss: 1.3172 - classification_loss: 0.2568 249/500 [=============>................] - ETA: 1:02 - loss: 1.5761 - regression_loss: 1.3183 - classification_loss: 0.2577 250/500 [==============>...............] - ETA: 1:01 - loss: 1.5745 - regression_loss: 1.3173 - classification_loss: 0.2572 251/500 [==============>...............] - ETA: 1:01 - loss: 1.5766 - regression_loss: 1.3189 - classification_loss: 0.2577 252/500 [==============>...............] - ETA: 1:01 - loss: 1.5798 - regression_loss: 1.3214 - classification_loss: 0.2584 253/500 [==============>...............] - ETA: 1:01 - loss: 1.5807 - regression_loss: 1.3218 - classification_loss: 0.2589 254/500 [==============>...............] - ETA: 1:00 - loss: 1.5793 - regression_loss: 1.3210 - classification_loss: 0.2583 255/500 [==============>...............] - ETA: 1:00 - loss: 1.5793 - regression_loss: 1.3208 - classification_loss: 0.2585 256/500 [==============>...............] - ETA: 1:00 - loss: 1.5808 - regression_loss: 1.3219 - classification_loss: 0.2588 257/500 [==============>...............] - ETA: 1:00 - loss: 1.5812 - regression_loss: 1.3224 - classification_loss: 0.2589 258/500 [==============>...............] - ETA: 59s - loss: 1.5821 - regression_loss: 1.3230 - classification_loss: 0.2591  259/500 [==============>...............] - ETA: 59s - loss: 1.5802 - regression_loss: 1.3213 - classification_loss: 0.2589 260/500 [==============>...............] 
- ETA: 59s - loss: 1.5784 - regression_loss: 1.3200 - classification_loss: 0.2585 261/500 [==============>...............] - ETA: 59s - loss: 1.5782 - regression_loss: 1.3199 - classification_loss: 0.2583 262/500 [==============>...............] - ETA: 58s - loss: 1.5802 - regression_loss: 1.3211 - classification_loss: 0.2591 263/500 [==============>...............] - ETA: 58s - loss: 1.5782 - regression_loss: 1.3196 - classification_loss: 0.2586 264/500 [==============>...............] - ETA: 58s - loss: 1.5770 - regression_loss: 1.3188 - classification_loss: 0.2582 265/500 [==============>...............] - ETA: 58s - loss: 1.5774 - regression_loss: 1.3193 - classification_loss: 0.2581 266/500 [==============>...............] - ETA: 57s - loss: 1.5776 - regression_loss: 1.3199 - classification_loss: 0.2577 267/500 [===============>..............] - ETA: 57s - loss: 1.5776 - regression_loss: 1.3197 - classification_loss: 0.2579 268/500 [===============>..............] - ETA: 57s - loss: 1.5779 - regression_loss: 1.3199 - classification_loss: 0.2581 269/500 [===============>..............] - ETA: 57s - loss: 1.5744 - regression_loss: 1.3171 - classification_loss: 0.2573 270/500 [===============>..............] - ETA: 56s - loss: 1.5731 - regression_loss: 1.3163 - classification_loss: 0.2569 271/500 [===============>..............] - ETA: 56s - loss: 1.5740 - regression_loss: 1.3171 - classification_loss: 0.2570 272/500 [===============>..............] - ETA: 56s - loss: 1.5737 - regression_loss: 1.3170 - classification_loss: 0.2567 273/500 [===============>..............] - ETA: 56s - loss: 1.5708 - regression_loss: 1.3149 - classification_loss: 0.2560 274/500 [===============>..............] - ETA: 55s - loss: 1.5720 - regression_loss: 1.3158 - classification_loss: 0.2562 275/500 [===============>..............] - ETA: 55s - loss: 1.5710 - regression_loss: 1.3152 - classification_loss: 0.2557 276/500 [===============>..............] 
- ETA: 55s - loss: 1.5720 - regression_loss: 1.3158 - classification_loss: 0.2561 277/500 [===============>..............] - ETA: 55s - loss: 1.5719 - regression_loss: 1.3160 - classification_loss: 0.2560 278/500 [===============>..............] - ETA: 54s - loss: 1.5746 - regression_loss: 1.3180 - classification_loss: 0.2566 279/500 [===============>..............] - ETA: 54s - loss: 1.5739 - regression_loss: 1.3174 - classification_loss: 0.2565 280/500 [===============>..............] - ETA: 54s - loss: 1.5733 - regression_loss: 1.3170 - classification_loss: 0.2564 281/500 [===============>..............] - ETA: 54s - loss: 1.5746 - regression_loss: 1.3180 - classification_loss: 0.2566 282/500 [===============>..............] - ETA: 53s - loss: 1.5754 - regression_loss: 1.3188 - classification_loss: 0.2565 283/500 [===============>..............] - ETA: 53s - loss: 1.5720 - regression_loss: 1.3161 - classification_loss: 0.2559 284/500 [================>.............] - ETA: 53s - loss: 1.5694 - regression_loss: 1.3140 - classification_loss: 0.2554 285/500 [================>.............] - ETA: 53s - loss: 1.5700 - regression_loss: 1.3146 - classification_loss: 0.2554 286/500 [================>.............] - ETA: 52s - loss: 1.5716 - regression_loss: 1.3160 - classification_loss: 0.2557 287/500 [================>.............] - ETA: 52s - loss: 1.5702 - regression_loss: 1.3146 - classification_loss: 0.2556 288/500 [================>.............] - ETA: 52s - loss: 1.5718 - regression_loss: 1.3160 - classification_loss: 0.2559 289/500 [================>.............] - ETA: 52s - loss: 1.5721 - regression_loss: 1.3160 - classification_loss: 0.2561 290/500 [================>.............] - ETA: 51s - loss: 1.5720 - regression_loss: 1.3160 - classification_loss: 0.2560 291/500 [================>.............] - ETA: 51s - loss: 1.5725 - regression_loss: 1.3162 - classification_loss: 0.2563 292/500 [================>.............] 
- ETA: 51s - loss: 1.5708 - regression_loss: 1.3146 - classification_loss: 0.2562 293/500 [================>.............] - ETA: 51s - loss: 1.5708 - regression_loss: 1.3147 - classification_loss: 0.2561 294/500 [================>.............] - ETA: 50s - loss: 1.5734 - regression_loss: 1.3165 - classification_loss: 0.2568 295/500 [================>.............] - ETA: 50s - loss: 1.5714 - regression_loss: 1.3150 - classification_loss: 0.2563 296/500 [================>.............] - ETA: 50s - loss: 1.5713 - regression_loss: 1.3151 - classification_loss: 0.2561 297/500 [================>.............] - ETA: 50s - loss: 1.5688 - regression_loss: 1.3131 - classification_loss: 0.2557 298/500 [================>.............] - ETA: 49s - loss: 1.5685 - regression_loss: 1.3130 - classification_loss: 0.2554 299/500 [================>.............] - ETA: 49s - loss: 1.5694 - regression_loss: 1.3139 - classification_loss: 0.2556 300/500 [=================>............] - ETA: 49s - loss: 1.5679 - regression_loss: 1.3127 - classification_loss: 0.2552 301/500 [=================>............] - ETA: 49s - loss: 1.5667 - regression_loss: 1.3117 - classification_loss: 0.2550 302/500 [=================>............] - ETA: 49s - loss: 1.5658 - regression_loss: 1.3111 - classification_loss: 0.2547 303/500 [=================>............] - ETA: 48s - loss: 1.5655 - regression_loss: 1.3110 - classification_loss: 0.2545 304/500 [=================>............] - ETA: 48s - loss: 1.5659 - regression_loss: 1.3117 - classification_loss: 0.2542 305/500 [=================>............] - ETA: 48s - loss: 1.5667 - regression_loss: 1.3125 - classification_loss: 0.2542 306/500 [=================>............] - ETA: 48s - loss: 1.5671 - regression_loss: 1.3126 - classification_loss: 0.2545 307/500 [=================>............] - ETA: 47s - loss: 1.5755 - regression_loss: 1.3169 - classification_loss: 0.2587 308/500 [=================>............] 
- ETA: 47s - loss: 1.5769 - regression_loss: 1.3181 - classification_loss: 0.2588 309/500 [=================>............] - ETA: 47s - loss: 1.5772 - regression_loss: 1.3185 - classification_loss: 0.2586 310/500 [=================>............] - ETA: 47s - loss: 1.5845 - regression_loss: 1.3143 - classification_loss: 0.2702 311/500 [=================>............] - ETA: 46s - loss: 1.5834 - regression_loss: 1.3133 - classification_loss: 0.2701 312/500 [=================>............] - ETA: 46s - loss: 1.5803 - regression_loss: 1.3108 - classification_loss: 0.2695 313/500 [=================>............] - ETA: 46s - loss: 1.5799 - regression_loss: 1.3105 - classification_loss: 0.2694 314/500 [=================>............] - ETA: 46s - loss: 1.5818 - regression_loss: 1.3120 - classification_loss: 0.2698 315/500 [=================>............] - ETA: 45s - loss: 1.5804 - regression_loss: 1.3111 - classification_loss: 0.2693 316/500 [=================>............] - ETA: 45s - loss: 1.5804 - regression_loss: 1.3115 - classification_loss: 0.2689 317/500 [==================>...........] - ETA: 45s - loss: 1.5785 - regression_loss: 1.3098 - classification_loss: 0.2687 318/500 [==================>...........] - ETA: 45s - loss: 1.5794 - regression_loss: 1.3105 - classification_loss: 0.2689 319/500 [==================>...........] - ETA: 44s - loss: 1.5776 - regression_loss: 1.3090 - classification_loss: 0.2686 320/500 [==================>...........] - ETA: 44s - loss: 1.5756 - regression_loss: 1.3075 - classification_loss: 0.2680 321/500 [==================>...........] - ETA: 44s - loss: 1.5747 - regression_loss: 1.3069 - classification_loss: 0.2678 322/500 [==================>...........] - ETA: 44s - loss: 1.5737 - regression_loss: 1.3062 - classification_loss: 0.2675 323/500 [==================>...........] - ETA: 43s - loss: 1.5757 - regression_loss: 1.3078 - classification_loss: 0.2679 324/500 [==================>...........] 
- ETA: 43s - loss: 1.5737 - regression_loss: 1.3060 - classification_loss: 0.2676 325/500 [==================>...........] - ETA: 43s - loss: 1.5714 - regression_loss: 1.3045 - classification_loss: 0.2670 326/500 [==================>...........] - ETA: 42s - loss: 1.5710 - regression_loss: 1.3042 - classification_loss: 0.2668 327/500 [==================>...........] - ETA: 42s - loss: 1.5714 - regression_loss: 1.3046 - classification_loss: 0.2668 328/500 [==================>...........] - ETA: 42s - loss: 1.5728 - regression_loss: 1.3053 - classification_loss: 0.2674 329/500 [==================>...........] - ETA: 42s - loss: 1.5706 - regression_loss: 1.3035 - classification_loss: 0.2671 330/500 [==================>...........] - ETA: 41s - loss: 1.5683 - regression_loss: 1.3016 - classification_loss: 0.2666 331/500 [==================>...........] - ETA: 41s - loss: 1.5656 - regression_loss: 1.2996 - classification_loss: 0.2661 332/500 [==================>...........] - ETA: 41s - loss: 1.5647 - regression_loss: 1.2990 - classification_loss: 0.2658 333/500 [==================>...........] - ETA: 41s - loss: 1.5650 - regression_loss: 1.2993 - classification_loss: 0.2657 334/500 [===================>..........] - ETA: 40s - loss: 1.5649 - regression_loss: 1.2991 - classification_loss: 0.2657 335/500 [===================>..........] - ETA: 40s - loss: 1.5658 - regression_loss: 1.2997 - classification_loss: 0.2661 336/500 [===================>..........] - ETA: 40s - loss: 1.5666 - regression_loss: 1.3006 - classification_loss: 0.2660 337/500 [===================>..........] - ETA: 40s - loss: 1.5658 - regression_loss: 1.2999 - classification_loss: 0.2659 338/500 [===================>..........] - ETA: 39s - loss: 1.5651 - regression_loss: 1.2993 - classification_loss: 0.2658 339/500 [===================>..........] - ETA: 39s - loss: 1.5668 - regression_loss: 1.3006 - classification_loss: 0.2662 340/500 [===================>..........] 
- ETA: 39s - loss: 1.5675 - regression_loss: 1.3014 - classification_loss: 0.2661 341/500 [===================>..........] - ETA: 39s - loss: 1.5659 - regression_loss: 1.3003 - classification_loss: 0.2657 342/500 [===================>..........] - ETA: 39s - loss: 1.5668 - regression_loss: 1.3006 - classification_loss: 0.2662 343/500 [===================>..........] - ETA: 38s - loss: 1.5669 - regression_loss: 1.3010 - classification_loss: 0.2659 344/500 [===================>..........] - ETA: 38s - loss: 1.5657 - regression_loss: 1.3001 - classification_loss: 0.2657 345/500 [===================>..........] - ETA: 38s - loss: 1.5652 - regression_loss: 1.2998 - classification_loss: 0.2655 346/500 [===================>..........] - ETA: 38s - loss: 1.5649 - regression_loss: 1.2995 - classification_loss: 0.2653 347/500 [===================>..........] - ETA: 37s - loss: 1.5652 - regression_loss: 1.3000 - classification_loss: 0.2652 348/500 [===================>..........] - ETA: 37s - loss: 1.5638 - regression_loss: 1.2989 - classification_loss: 0.2649 349/500 [===================>..........] - ETA: 37s - loss: 1.5636 - regression_loss: 1.2988 - classification_loss: 0.2648 350/500 [====================>.........] - ETA: 37s - loss: 1.5671 - regression_loss: 1.3015 - classification_loss: 0.2655 351/500 [====================>.........] - ETA: 36s - loss: 1.5645 - regression_loss: 1.2992 - classification_loss: 0.2653 352/500 [====================>.........] - ETA: 36s - loss: 1.5672 - regression_loss: 1.3013 - classification_loss: 0.2660 353/500 [====================>.........] - ETA: 36s - loss: 1.5686 - regression_loss: 1.3024 - classification_loss: 0.2661 354/500 [====================>.........] - ETA: 36s - loss: 1.5669 - regression_loss: 1.3011 - classification_loss: 0.2658 355/500 [====================>.........] - ETA: 35s - loss: 1.5639 - regression_loss: 1.2982 - classification_loss: 0.2656 356/500 [====================>.........] 
- ETA: 35s - loss: 1.5634 - regression_loss: 1.2978 - classification_loss: 0.2656
[... per-batch progress updates for steps 357-499 of epoch 69 omitted ...]
500/500 [==============================] - 123s 246ms/step - loss: 1.5674 - regression_loss: 1.3072 - classification_loss: 0.2602
326 instances of class plum with average precision: 0.7996
mAP: 0.7996
Epoch 00069: saving model to ./training/snapshots/resnet50_pascal_69.h5
Epoch 00069: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-07.
Epoch 70/150
[... per-batch progress updates for steps 1-188 of epoch 70 omitted ...]
189/500 [==========>...................] 
- ETA: 1:16 - loss: 1.4707 - regression_loss: 1.2453 - classification_loss: 0.2254 190/500 [==========>...................] - ETA: 1:16 - loss: 1.4687 - regression_loss: 1.2435 - classification_loss: 0.2252 191/500 [==========>...................] - ETA: 1:15 - loss: 1.4653 - regression_loss: 1.2409 - classification_loss: 0.2244 192/500 [==========>...................] - ETA: 1:15 - loss: 1.4634 - regression_loss: 1.2390 - classification_loss: 0.2244 193/500 [==========>...................] - ETA: 1:15 - loss: 1.4662 - regression_loss: 1.2415 - classification_loss: 0.2248 194/500 [==========>...................] - ETA: 1:15 - loss: 1.4688 - regression_loss: 1.2438 - classification_loss: 0.2250 195/500 [==========>...................] - ETA: 1:14 - loss: 1.4672 - regression_loss: 1.2429 - classification_loss: 0.2244 196/500 [==========>...................] - ETA: 1:14 - loss: 1.4661 - regression_loss: 1.2418 - classification_loss: 0.2243 197/500 [==========>...................] - ETA: 1:14 - loss: 1.4652 - regression_loss: 1.2411 - classification_loss: 0.2241 198/500 [==========>...................] - ETA: 1:14 - loss: 1.4639 - regression_loss: 1.2398 - classification_loss: 0.2241 199/500 [==========>...................] - ETA: 1:13 - loss: 1.4643 - regression_loss: 1.2400 - classification_loss: 0.2243 200/500 [===========>..................] - ETA: 1:13 - loss: 1.4644 - regression_loss: 1.2403 - classification_loss: 0.2242 201/500 [===========>..................] - ETA: 1:13 - loss: 1.4667 - regression_loss: 1.2421 - classification_loss: 0.2245 202/500 [===========>..................] - ETA: 1:13 - loss: 1.4666 - regression_loss: 1.2422 - classification_loss: 0.2244 203/500 [===========>..................] - ETA: 1:12 - loss: 1.4690 - regression_loss: 1.2443 - classification_loss: 0.2247 204/500 [===========>..................] - ETA: 1:12 - loss: 1.4656 - regression_loss: 1.2417 - classification_loss: 0.2239 205/500 [===========>..................] 
- ETA: 1:12 - loss: 1.4625 - regression_loss: 1.2392 - classification_loss: 0.2233 206/500 [===========>..................] - ETA: 1:12 - loss: 1.4650 - regression_loss: 1.2413 - classification_loss: 0.2237 207/500 [===========>..................] - ETA: 1:11 - loss: 1.4658 - regression_loss: 1.2421 - classification_loss: 0.2237 208/500 [===========>..................] - ETA: 1:11 - loss: 1.4641 - regression_loss: 1.2407 - classification_loss: 0.2234 209/500 [===========>..................] - ETA: 1:11 - loss: 1.4619 - regression_loss: 1.2389 - classification_loss: 0.2230 210/500 [===========>..................] - ETA: 1:11 - loss: 1.4654 - regression_loss: 1.2418 - classification_loss: 0.2236 211/500 [===========>..................] - ETA: 1:10 - loss: 1.4658 - regression_loss: 1.2423 - classification_loss: 0.2235 212/500 [===========>..................] - ETA: 1:10 - loss: 1.4660 - regression_loss: 1.2424 - classification_loss: 0.2236 213/500 [===========>..................] - ETA: 1:10 - loss: 1.4646 - regression_loss: 1.2415 - classification_loss: 0.2231 214/500 [===========>..................] - ETA: 1:10 - loss: 1.4626 - regression_loss: 1.2399 - classification_loss: 0.2226 215/500 [===========>..................] - ETA: 1:09 - loss: 1.4614 - regression_loss: 1.2391 - classification_loss: 0.2224 216/500 [===========>..................] - ETA: 1:09 - loss: 1.4581 - regression_loss: 1.2363 - classification_loss: 0.2218 217/500 [============>.................] - ETA: 1:09 - loss: 1.4609 - regression_loss: 1.2387 - classification_loss: 0.2221 218/500 [============>.................] - ETA: 1:09 - loss: 1.4616 - regression_loss: 1.2392 - classification_loss: 0.2225 219/500 [============>.................] - ETA: 1:08 - loss: 1.4598 - regression_loss: 1.2378 - classification_loss: 0.2219 220/500 [============>.................] - ETA: 1:08 - loss: 1.4595 - regression_loss: 1.2378 - classification_loss: 0.2218 221/500 [============>.................] 
- ETA: 1:08 - loss: 1.4594 - regression_loss: 1.2377 - classification_loss: 0.2217 222/500 [============>.................] - ETA: 1:08 - loss: 1.4574 - regression_loss: 1.2360 - classification_loss: 0.2214 223/500 [============>.................] - ETA: 1:07 - loss: 1.4581 - regression_loss: 1.2365 - classification_loss: 0.2217 224/500 [============>.................] - ETA: 1:07 - loss: 1.4589 - regression_loss: 1.2374 - classification_loss: 0.2216 225/500 [============>.................] - ETA: 1:07 - loss: 1.4577 - regression_loss: 1.2366 - classification_loss: 0.2211 226/500 [============>.................] - ETA: 1:07 - loss: 1.4587 - regression_loss: 1.2376 - classification_loss: 0.2211 227/500 [============>.................] - ETA: 1:06 - loss: 1.4613 - regression_loss: 1.2396 - classification_loss: 0.2217 228/500 [============>.................] - ETA: 1:06 - loss: 1.4589 - regression_loss: 1.2376 - classification_loss: 0.2213 229/500 [============>.................] - ETA: 1:06 - loss: 1.4581 - regression_loss: 1.2371 - classification_loss: 0.2210 230/500 [============>.................] - ETA: 1:06 - loss: 1.4614 - regression_loss: 1.2395 - classification_loss: 0.2219 231/500 [============>.................] - ETA: 1:05 - loss: 1.4561 - regression_loss: 1.2351 - classification_loss: 0.2210 232/500 [============>.................] - ETA: 1:05 - loss: 1.4614 - regression_loss: 1.2388 - classification_loss: 0.2226 233/500 [============>.................] - ETA: 1:05 - loss: 1.4597 - regression_loss: 1.2373 - classification_loss: 0.2224 234/500 [=============>................] - ETA: 1:05 - loss: 1.4576 - regression_loss: 1.2351 - classification_loss: 0.2225 235/500 [=============>................] - ETA: 1:04 - loss: 1.4600 - regression_loss: 1.2369 - classification_loss: 0.2231 236/500 [=============>................] - ETA: 1:04 - loss: 1.4570 - regression_loss: 1.2346 - classification_loss: 0.2224 237/500 [=============>................] 
- ETA: 1:04 - loss: 1.4567 - regression_loss: 1.2345 - classification_loss: 0.2222 238/500 [=============>................] - ETA: 1:04 - loss: 1.4571 - regression_loss: 1.2349 - classification_loss: 0.2221 239/500 [=============>................] - ETA: 1:03 - loss: 1.4578 - regression_loss: 1.2351 - classification_loss: 0.2227 240/500 [=============>................] - ETA: 1:03 - loss: 1.4586 - regression_loss: 1.2357 - classification_loss: 0.2229 241/500 [=============>................] - ETA: 1:03 - loss: 1.4603 - regression_loss: 1.2373 - classification_loss: 0.2231 242/500 [=============>................] - ETA: 1:03 - loss: 1.4600 - regression_loss: 1.2367 - classification_loss: 0.2232 243/500 [=============>................] - ETA: 1:02 - loss: 1.4611 - regression_loss: 1.2381 - classification_loss: 0.2230 244/500 [=============>................] - ETA: 1:02 - loss: 1.4620 - regression_loss: 1.2389 - classification_loss: 0.2231 245/500 [=============>................] - ETA: 1:02 - loss: 1.4614 - regression_loss: 1.2382 - classification_loss: 0.2232 246/500 [=============>................] - ETA: 1:02 - loss: 1.4625 - regression_loss: 1.2393 - classification_loss: 0.2232 247/500 [=============>................] - ETA: 1:01 - loss: 1.4612 - regression_loss: 1.2383 - classification_loss: 0.2228 248/500 [=============>................] - ETA: 1:01 - loss: 1.4619 - regression_loss: 1.2388 - classification_loss: 0.2231 249/500 [=============>................] - ETA: 1:01 - loss: 1.4593 - regression_loss: 1.2366 - classification_loss: 0.2227 250/500 [==============>...............] - ETA: 1:01 - loss: 1.4582 - regression_loss: 1.2359 - classification_loss: 0.2223 251/500 [==============>...............] - ETA: 1:00 - loss: 1.4595 - regression_loss: 1.2371 - classification_loss: 0.2224 252/500 [==============>...............] - ETA: 1:00 - loss: 1.4581 - regression_loss: 1.2360 - classification_loss: 0.2220 253/500 [==============>...............] 
- ETA: 1:00 - loss: 1.4598 - regression_loss: 1.2374 - classification_loss: 0.2224 254/500 [==============>...............] - ETA: 1:00 - loss: 1.4613 - regression_loss: 1.2384 - classification_loss: 0.2229 255/500 [==============>...............] - ETA: 59s - loss: 1.4568 - regression_loss: 1.2345 - classification_loss: 0.2223  256/500 [==============>...............] - ETA: 59s - loss: 1.4601 - regression_loss: 1.2367 - classification_loss: 0.2233 257/500 [==============>...............] - ETA: 59s - loss: 1.4597 - regression_loss: 1.2361 - classification_loss: 0.2235 258/500 [==============>...............] - ETA: 59s - loss: 1.4585 - regression_loss: 1.2350 - classification_loss: 0.2235 259/500 [==============>...............] - ETA: 58s - loss: 1.4588 - regression_loss: 1.2349 - classification_loss: 0.2239 260/500 [==============>...............] - ETA: 58s - loss: 1.4580 - regression_loss: 1.2345 - classification_loss: 0.2235 261/500 [==============>...............] - ETA: 58s - loss: 1.4570 - regression_loss: 1.2339 - classification_loss: 0.2230 262/500 [==============>...............] - ETA: 58s - loss: 1.4592 - regression_loss: 1.2356 - classification_loss: 0.2235 263/500 [==============>...............] - ETA: 57s - loss: 1.4593 - regression_loss: 1.2356 - classification_loss: 0.2237 264/500 [==============>...............] - ETA: 57s - loss: 1.4594 - regression_loss: 1.2357 - classification_loss: 0.2237 265/500 [==============>...............] - ETA: 57s - loss: 1.4584 - regression_loss: 1.2349 - classification_loss: 0.2235 266/500 [==============>...............] - ETA: 57s - loss: 1.4579 - regression_loss: 1.2348 - classification_loss: 0.2231 267/500 [===============>..............] - ETA: 57s - loss: 1.4575 - regression_loss: 1.2346 - classification_loss: 0.2229 268/500 [===============>..............] - ETA: 56s - loss: 1.4576 - regression_loss: 1.2348 - classification_loss: 0.2228 269/500 [===============>..............] 
- ETA: 56s - loss: 1.4582 - regression_loss: 1.2355 - classification_loss: 0.2227 270/500 [===============>..............] - ETA: 56s - loss: 1.4594 - regression_loss: 1.2364 - classification_loss: 0.2230 271/500 [===============>..............] - ETA: 56s - loss: 1.4587 - regression_loss: 1.2355 - classification_loss: 0.2232 272/500 [===============>..............] - ETA: 55s - loss: 1.4606 - regression_loss: 1.2369 - classification_loss: 0.2237 273/500 [===============>..............] - ETA: 55s - loss: 1.4567 - regression_loss: 1.2337 - classification_loss: 0.2230 274/500 [===============>..............] - ETA: 55s - loss: 1.4550 - regression_loss: 1.2323 - classification_loss: 0.2228 275/500 [===============>..............] - ETA: 55s - loss: 1.4575 - regression_loss: 1.2342 - classification_loss: 0.2234 276/500 [===============>..............] - ETA: 54s - loss: 1.4588 - regression_loss: 1.2352 - classification_loss: 0.2236 277/500 [===============>..............] - ETA: 54s - loss: 1.4632 - regression_loss: 1.2374 - classification_loss: 0.2258 278/500 [===============>..............] - ETA: 54s - loss: 1.4662 - regression_loss: 1.2400 - classification_loss: 0.2263 279/500 [===============>..............] - ETA: 54s - loss: 1.4658 - regression_loss: 1.2398 - classification_loss: 0.2260 280/500 [===============>..............] - ETA: 53s - loss: 1.4655 - regression_loss: 1.2396 - classification_loss: 0.2259 281/500 [===============>..............] - ETA: 53s - loss: 1.4653 - regression_loss: 1.2393 - classification_loss: 0.2260 282/500 [===============>..............] - ETA: 53s - loss: 1.4644 - regression_loss: 1.2388 - classification_loss: 0.2256 283/500 [===============>..............] - ETA: 53s - loss: 1.4647 - regression_loss: 1.2390 - classification_loss: 0.2258 284/500 [================>.............] - ETA: 52s - loss: 1.4644 - regression_loss: 1.2387 - classification_loss: 0.2257 285/500 [================>.............] 
- ETA: 52s - loss: 1.4608 - regression_loss: 1.2357 - classification_loss: 0.2251 286/500 [================>.............] - ETA: 52s - loss: 1.4603 - regression_loss: 1.2354 - classification_loss: 0.2249 287/500 [================>.............] - ETA: 52s - loss: 1.4570 - regression_loss: 1.2325 - classification_loss: 0.2245 288/500 [================>.............] - ETA: 51s - loss: 1.4581 - regression_loss: 1.2336 - classification_loss: 0.2245 289/500 [================>.............] - ETA: 51s - loss: 1.4600 - regression_loss: 1.2346 - classification_loss: 0.2254 290/500 [================>.............] - ETA: 51s - loss: 1.4593 - regression_loss: 1.2341 - classification_loss: 0.2252 291/500 [================>.............] - ETA: 51s - loss: 1.4595 - regression_loss: 1.2343 - classification_loss: 0.2252 292/500 [================>.............] - ETA: 50s - loss: 1.4596 - regression_loss: 1.2346 - classification_loss: 0.2250 293/500 [================>.............] - ETA: 50s - loss: 1.4607 - regression_loss: 1.2353 - classification_loss: 0.2254 294/500 [================>.............] - ETA: 50s - loss: 1.4603 - regression_loss: 1.2351 - classification_loss: 0.2252 295/500 [================>.............] - ETA: 50s - loss: 1.4620 - regression_loss: 1.2364 - classification_loss: 0.2256 296/500 [================>.............] - ETA: 49s - loss: 1.4617 - regression_loss: 1.2362 - classification_loss: 0.2256 297/500 [================>.............] - ETA: 49s - loss: 1.4591 - regression_loss: 1.2339 - classification_loss: 0.2252 298/500 [================>.............] - ETA: 49s - loss: 1.4578 - regression_loss: 1.2328 - classification_loss: 0.2249 299/500 [================>.............] - ETA: 49s - loss: 1.4578 - regression_loss: 1.2329 - classification_loss: 0.2249 300/500 [=================>............] - ETA: 48s - loss: 1.4557 - regression_loss: 1.2311 - classification_loss: 0.2246 301/500 [=================>............] 
- ETA: 48s - loss: 1.4591 - regression_loss: 1.2339 - classification_loss: 0.2252 302/500 [=================>............] - ETA: 48s - loss: 1.4591 - regression_loss: 1.2340 - classification_loss: 0.2251 303/500 [=================>............] - ETA: 48s - loss: 1.4581 - regression_loss: 1.2332 - classification_loss: 0.2248 304/500 [=================>............] - ETA: 47s - loss: 1.4594 - regression_loss: 1.2343 - classification_loss: 0.2251 305/500 [=================>............] - ETA: 47s - loss: 1.4611 - regression_loss: 1.2358 - classification_loss: 0.2253 306/500 [=================>............] - ETA: 47s - loss: 1.4673 - regression_loss: 1.2409 - classification_loss: 0.2263 307/500 [=================>............] - ETA: 47s - loss: 1.4671 - regression_loss: 1.2409 - classification_loss: 0.2262 308/500 [=================>............] - ETA: 46s - loss: 1.4673 - regression_loss: 1.2410 - classification_loss: 0.2263 309/500 [=================>............] - ETA: 46s - loss: 1.4663 - regression_loss: 1.2403 - classification_loss: 0.2261 310/500 [=================>............] - ETA: 46s - loss: 1.4653 - regression_loss: 1.2395 - classification_loss: 0.2259 311/500 [=================>............] - ETA: 46s - loss: 1.4647 - regression_loss: 1.2389 - classification_loss: 0.2258 312/500 [=================>............] - ETA: 45s - loss: 1.4657 - regression_loss: 1.2396 - classification_loss: 0.2261 313/500 [=================>............] - ETA: 45s - loss: 1.4648 - regression_loss: 1.2388 - classification_loss: 0.2259 314/500 [=================>............] - ETA: 45s - loss: 1.4660 - regression_loss: 1.2402 - classification_loss: 0.2258 315/500 [=================>............] - ETA: 45s - loss: 1.4698 - regression_loss: 1.2428 - classification_loss: 0.2270 316/500 [=================>............] - ETA: 45s - loss: 1.4699 - regression_loss: 1.2429 - classification_loss: 0.2270 317/500 [==================>...........] 
- ETA: 44s - loss: 1.4715 - regression_loss: 1.2441 - classification_loss: 0.2274 318/500 [==================>...........] - ETA: 44s - loss: 1.4717 - regression_loss: 1.2441 - classification_loss: 0.2276 319/500 [==================>...........] - ETA: 44s - loss: 1.4707 - regression_loss: 1.2434 - classification_loss: 0.2273 320/500 [==================>...........] - ETA: 44s - loss: 1.4711 - regression_loss: 1.2439 - classification_loss: 0.2271 321/500 [==================>...........] - ETA: 43s - loss: 1.4711 - regression_loss: 1.2440 - classification_loss: 0.2271 322/500 [==================>...........] - ETA: 43s - loss: 1.4721 - regression_loss: 1.2450 - classification_loss: 0.2270 323/500 [==================>...........] - ETA: 43s - loss: 1.4720 - regression_loss: 1.2451 - classification_loss: 0.2269 324/500 [==================>...........] - ETA: 43s - loss: 1.4738 - regression_loss: 1.2462 - classification_loss: 0.2276 325/500 [==================>...........] - ETA: 42s - loss: 1.4737 - regression_loss: 1.2461 - classification_loss: 0.2276 326/500 [==================>...........] - ETA: 42s - loss: 1.4735 - regression_loss: 1.2460 - classification_loss: 0.2275 327/500 [==================>...........] - ETA: 42s - loss: 1.4737 - regression_loss: 1.2460 - classification_loss: 0.2278 328/500 [==================>...........] - ETA: 42s - loss: 1.4739 - regression_loss: 1.2463 - classification_loss: 0.2276 329/500 [==================>...........] - ETA: 41s - loss: 1.4705 - regression_loss: 1.2434 - classification_loss: 0.2272 330/500 [==================>...........] - ETA: 41s - loss: 1.4707 - regression_loss: 1.2435 - classification_loss: 0.2272 331/500 [==================>...........] - ETA: 41s - loss: 1.4709 - regression_loss: 1.2434 - classification_loss: 0.2275 332/500 [==================>...........] - ETA: 41s - loss: 1.4729 - regression_loss: 1.2448 - classification_loss: 0.2281 333/500 [==================>...........] 
- ETA: 40s - loss: 1.4718 - regression_loss: 1.2439 - classification_loss: 0.2279 334/500 [===================>..........] - ETA: 40s - loss: 1.4754 - regression_loss: 1.2472 - classification_loss: 0.2283 335/500 [===================>..........] - ETA: 40s - loss: 1.4765 - regression_loss: 1.2479 - classification_loss: 0.2285 336/500 [===================>..........] - ETA: 40s - loss: 1.4770 - regression_loss: 1.2483 - classification_loss: 0.2287 337/500 [===================>..........] - ETA: 39s - loss: 1.4760 - regression_loss: 1.2475 - classification_loss: 0.2286 338/500 [===================>..........] - ETA: 39s - loss: 1.4760 - regression_loss: 1.2474 - classification_loss: 0.2286 339/500 [===================>..........] - ETA: 39s - loss: 1.4755 - regression_loss: 1.2470 - classification_loss: 0.2285 340/500 [===================>..........] - ETA: 39s - loss: 1.4752 - regression_loss: 1.2468 - classification_loss: 0.2285 341/500 [===================>..........] - ETA: 38s - loss: 1.4737 - regression_loss: 1.2452 - classification_loss: 0.2284 342/500 [===================>..........] - ETA: 38s - loss: 1.4731 - regression_loss: 1.2416 - classification_loss: 0.2315 343/500 [===================>..........] - ETA: 38s - loss: 1.4737 - regression_loss: 1.2420 - classification_loss: 0.2317 344/500 [===================>..........] - ETA: 38s - loss: 1.4754 - regression_loss: 1.2435 - classification_loss: 0.2319 345/500 [===================>..........] - ETA: 37s - loss: 1.4732 - regression_loss: 1.2417 - classification_loss: 0.2315 346/500 [===================>..........] - ETA: 37s - loss: 1.4743 - regression_loss: 1.2424 - classification_loss: 0.2318 347/500 [===================>..........] - ETA: 37s - loss: 1.4747 - regression_loss: 1.2429 - classification_loss: 0.2318 348/500 [===================>..........] - ETA: 37s - loss: 1.4738 - regression_loss: 1.2422 - classification_loss: 0.2316 349/500 [===================>..........] 
- ETA: 36s - loss: 1.4736 - regression_loss: 1.2421 - classification_loss: 0.2315 350/500 [====================>.........] - ETA: 36s - loss: 1.4726 - regression_loss: 1.2413 - classification_loss: 0.2312 351/500 [====================>.........] - ETA: 36s - loss: 1.4720 - regression_loss: 1.2411 - classification_loss: 0.2309 352/500 [====================>.........] - ETA: 36s - loss: 1.4723 - regression_loss: 1.2414 - classification_loss: 0.2308 353/500 [====================>.........] - ETA: 35s - loss: 1.4719 - regression_loss: 1.2411 - classification_loss: 0.2308 354/500 [====================>.........] - ETA: 35s - loss: 1.4735 - regression_loss: 1.2426 - classification_loss: 0.2309 355/500 [====================>.........] - ETA: 35s - loss: 1.4738 - regression_loss: 1.2429 - classification_loss: 0.2309 356/500 [====================>.........] - ETA: 35s - loss: 1.4734 - regression_loss: 1.2425 - classification_loss: 0.2309 357/500 [====================>.........] - ETA: 34s - loss: 1.4729 - regression_loss: 1.2421 - classification_loss: 0.2308 358/500 [====================>.........] - ETA: 34s - loss: 1.4749 - regression_loss: 1.2427 - classification_loss: 0.2323 359/500 [====================>.........] - ETA: 34s - loss: 1.4734 - regression_loss: 1.2414 - classification_loss: 0.2320 360/500 [====================>.........] - ETA: 34s - loss: 1.4736 - regression_loss: 1.2415 - classification_loss: 0.2321 361/500 [====================>.........] - ETA: 33s - loss: 1.4734 - regression_loss: 1.2415 - classification_loss: 0.2319 362/500 [====================>.........] - ETA: 33s - loss: 1.4738 - regression_loss: 1.2417 - classification_loss: 0.2321 363/500 [====================>.........] - ETA: 33s - loss: 1.4723 - regression_loss: 1.2404 - classification_loss: 0.2318 364/500 [====================>.........] - ETA: 33s - loss: 1.4746 - regression_loss: 1.2422 - classification_loss: 0.2324 365/500 [====================>.........] 
- ETA: 32s - loss: 1.4748 - regression_loss: 1.2423 - classification_loss: 0.2325 366/500 [====================>.........] - ETA: 32s - loss: 1.4750 - regression_loss: 1.2427 - classification_loss: 0.2324 367/500 [=====================>........] - ETA: 32s - loss: 1.4750 - regression_loss: 1.2426 - classification_loss: 0.2324 368/500 [=====================>........] - ETA: 32s - loss: 1.4759 - regression_loss: 1.2432 - classification_loss: 0.2327 369/500 [=====================>........] - ETA: 31s - loss: 1.4759 - regression_loss: 1.2432 - classification_loss: 0.2327 370/500 [=====================>........] - ETA: 31s - loss: 1.4757 - regression_loss: 1.2432 - classification_loss: 0.2324 371/500 [=====================>........] - ETA: 31s - loss: 1.4771 - regression_loss: 1.2444 - classification_loss: 0.2327 372/500 [=====================>........] - ETA: 31s - loss: 1.4796 - regression_loss: 1.2462 - classification_loss: 0.2334 373/500 [=====================>........] - ETA: 30s - loss: 1.4786 - regression_loss: 1.2455 - classification_loss: 0.2331 374/500 [=====================>........] - ETA: 30s - loss: 1.4776 - regression_loss: 1.2449 - classification_loss: 0.2327 375/500 [=====================>........] - ETA: 30s - loss: 1.4756 - regression_loss: 1.2433 - classification_loss: 0.2323 376/500 [=====================>........] - ETA: 30s - loss: 1.4769 - regression_loss: 1.2441 - classification_loss: 0.2328 377/500 [=====================>........] - ETA: 29s - loss: 1.4775 - regression_loss: 1.2444 - classification_loss: 0.2331 378/500 [=====================>........] - ETA: 29s - loss: 1.4785 - regression_loss: 1.2453 - classification_loss: 0.2332 379/500 [=====================>........] - ETA: 29s - loss: 1.4783 - regression_loss: 1.2453 - classification_loss: 0.2330 380/500 [=====================>........] - ETA: 29s - loss: 1.4777 - regression_loss: 1.2448 - classification_loss: 0.2328 381/500 [=====================>........] 
- ETA: 28s - loss: 1.4750 - regression_loss: 1.2426 - classification_loss: 0.2324 382/500 [=====================>........] - ETA: 28s - loss: 1.4749 - regression_loss: 1.2425 - classification_loss: 0.2323 383/500 [=====================>........] - ETA: 28s - loss: 1.4749 - regression_loss: 1.2427 - classification_loss: 0.2322 384/500 [======================>.......] - ETA: 28s - loss: 1.4745 - regression_loss: 1.2423 - classification_loss: 0.2322 385/500 [======================>.......] - ETA: 27s - loss: 1.4750 - regression_loss: 1.2428 - classification_loss: 0.2321 386/500 [======================>.......] - ETA: 27s - loss: 1.4763 - regression_loss: 1.2438 - classification_loss: 0.2325 387/500 [======================>.......] - ETA: 27s - loss: 1.4768 - regression_loss: 1.2442 - classification_loss: 0.2326 388/500 [======================>.......] - ETA: 27s - loss: 1.4775 - regression_loss: 1.2447 - classification_loss: 0.2328 389/500 [======================>.......] - ETA: 26s - loss: 1.4784 - regression_loss: 1.2455 - classification_loss: 0.2329 390/500 [======================>.......] - ETA: 26s - loss: 1.4775 - regression_loss: 1.2447 - classification_loss: 0.2328 391/500 [======================>.......] - ETA: 26s - loss: 1.4799 - regression_loss: 1.2464 - classification_loss: 0.2336 392/500 [======================>.......] - ETA: 26s - loss: 1.4770 - regression_loss: 1.2439 - classification_loss: 0.2332 393/500 [======================>.......] - ETA: 26s - loss: 1.4783 - regression_loss: 1.2448 - classification_loss: 0.2335 394/500 [======================>.......] - ETA: 25s - loss: 1.4789 - regression_loss: 1.2457 - classification_loss: 0.2332 395/500 [======================>.......] - ETA: 25s - loss: 1.4782 - regression_loss: 1.2452 - classification_loss: 0.2330 396/500 [======================>.......] - ETA: 25s - loss: 1.4801 - regression_loss: 1.2465 - classification_loss: 0.2335 397/500 [======================>.......] 
[epoch 70: per-step progress updates (steps 398–499) omitted]
500/500 [==============================] - 121s 242ms/step - loss: 1.4640 - regression_loss: 1.2332 - classification_loss: 0.2308
326 instances of class plum with average precision: 0.8105
mAP: 0.8105
Epoch 00070: saving model to ./training/snapshots/resnet50_pascal_70.h5
Epoch 71/150
[epoch 71 in progress: per-step updates through step 232/500 omitted; last reported values - loss: 1.3927 - regression_loss: 1.1768 - classification_loss: 0.2160]
- ETA: 1:03 - loss: 1.3924 - regression_loss: 1.1765 - classification_loss: 0.2159 233/500 [============>.................] - ETA: 1:02 - loss: 1.3916 - regression_loss: 1.1760 - classification_loss: 0.2156 234/500 [=============>................] - ETA: 1:02 - loss: 1.3906 - regression_loss: 1.1753 - classification_loss: 0.2153 235/500 [=============>................] - ETA: 1:02 - loss: 1.3914 - regression_loss: 1.1760 - classification_loss: 0.2153 236/500 [=============>................] - ETA: 1:02 - loss: 1.3928 - regression_loss: 1.1768 - classification_loss: 0.2159 237/500 [=============>................] - ETA: 1:01 - loss: 1.3936 - regression_loss: 1.1776 - classification_loss: 0.2160 238/500 [=============>................] - ETA: 1:01 - loss: 1.3965 - regression_loss: 1.1799 - classification_loss: 0.2166 239/500 [=============>................] - ETA: 1:01 - loss: 1.3944 - regression_loss: 1.1782 - classification_loss: 0.2162 240/500 [=============>................] - ETA: 1:01 - loss: 1.3934 - regression_loss: 1.1771 - classification_loss: 0.2163 241/500 [=============>................] - ETA: 1:01 - loss: 1.3940 - regression_loss: 1.1777 - classification_loss: 0.2163 242/500 [=============>................] - ETA: 1:00 - loss: 1.3960 - regression_loss: 1.1785 - classification_loss: 0.2174 243/500 [=============>................] - ETA: 1:00 - loss: 1.3981 - regression_loss: 1.1810 - classification_loss: 0.2172 244/500 [=============>................] - ETA: 1:00 - loss: 1.3990 - regression_loss: 1.1815 - classification_loss: 0.2175 245/500 [=============>................] - ETA: 1:00 - loss: 1.3981 - regression_loss: 1.1809 - classification_loss: 0.2172 246/500 [=============>................] - ETA: 59s - loss: 1.4014 - regression_loss: 1.1836 - classification_loss: 0.2178  247/500 [=============>................] - ETA: 59s - loss: 1.4023 - regression_loss: 1.1844 - classification_loss: 0.2179 248/500 [=============>................] 
- ETA: 59s - loss: 1.4063 - regression_loss: 1.1873 - classification_loss: 0.2189 249/500 [=============>................] - ETA: 59s - loss: 1.4088 - regression_loss: 1.1897 - classification_loss: 0.2191 250/500 [==============>...............] - ETA: 58s - loss: 1.4087 - regression_loss: 1.1898 - classification_loss: 0.2189 251/500 [==============>...............] - ETA: 58s - loss: 1.4083 - regression_loss: 1.1897 - classification_loss: 0.2186 252/500 [==============>...............] - ETA: 58s - loss: 1.4114 - regression_loss: 1.1922 - classification_loss: 0.2192 253/500 [==============>...............] - ETA: 58s - loss: 1.4100 - regression_loss: 1.1910 - classification_loss: 0.2190 254/500 [==============>...............] - ETA: 58s - loss: 1.4102 - regression_loss: 1.1914 - classification_loss: 0.2188 255/500 [==============>...............] - ETA: 57s - loss: 1.4119 - regression_loss: 1.1925 - classification_loss: 0.2194 256/500 [==============>...............] - ETA: 57s - loss: 1.4096 - regression_loss: 1.1908 - classification_loss: 0.2188 257/500 [==============>...............] - ETA: 57s - loss: 1.4105 - regression_loss: 1.1918 - classification_loss: 0.2187 258/500 [==============>...............] - ETA: 57s - loss: 1.4069 - regression_loss: 1.1888 - classification_loss: 0.2181 259/500 [==============>...............] - ETA: 56s - loss: 1.4062 - regression_loss: 1.1882 - classification_loss: 0.2179 260/500 [==============>...............] - ETA: 56s - loss: 1.4050 - regression_loss: 1.1875 - classification_loss: 0.2175 261/500 [==============>...............] - ETA: 56s - loss: 1.4067 - regression_loss: 1.1889 - classification_loss: 0.2178 262/500 [==============>...............] - ETA: 56s - loss: 1.4075 - regression_loss: 1.1896 - classification_loss: 0.2179 263/500 [==============>...............] - ETA: 55s - loss: 1.4067 - regression_loss: 1.1888 - classification_loss: 0.2179 264/500 [==============>...............] 
- ETA: 55s - loss: 1.4055 - regression_loss: 1.1878 - classification_loss: 0.2177 265/500 [==============>...............] - ETA: 55s - loss: 1.4052 - regression_loss: 1.1878 - classification_loss: 0.2174 266/500 [==============>...............] - ETA: 55s - loss: 1.4036 - regression_loss: 1.1864 - classification_loss: 0.2172 267/500 [===============>..............] - ETA: 55s - loss: 1.4006 - regression_loss: 1.1838 - classification_loss: 0.2168 268/500 [===============>..............] - ETA: 54s - loss: 1.4030 - regression_loss: 1.1854 - classification_loss: 0.2177 269/500 [===============>..............] - ETA: 54s - loss: 1.4040 - regression_loss: 1.1864 - classification_loss: 0.2176 270/500 [===============>..............] - ETA: 54s - loss: 1.4054 - regression_loss: 1.1875 - classification_loss: 0.2178 271/500 [===============>..............] - ETA: 54s - loss: 1.4034 - regression_loss: 1.1860 - classification_loss: 0.2174 272/500 [===============>..............] - ETA: 53s - loss: 1.4057 - regression_loss: 1.1878 - classification_loss: 0.2179 273/500 [===============>..............] - ETA: 53s - loss: 1.4065 - regression_loss: 1.1887 - classification_loss: 0.2177 274/500 [===============>..............] - ETA: 53s - loss: 1.4051 - regression_loss: 1.1878 - classification_loss: 0.2173 275/500 [===============>..............] - ETA: 53s - loss: 1.4078 - regression_loss: 1.1901 - classification_loss: 0.2176 276/500 [===============>..............] - ETA: 52s - loss: 1.4089 - regression_loss: 1.1908 - classification_loss: 0.2180 277/500 [===============>..............] - ETA: 52s - loss: 1.4119 - regression_loss: 1.1935 - classification_loss: 0.2184 278/500 [===============>..............] - ETA: 52s - loss: 1.4113 - regression_loss: 1.1930 - classification_loss: 0.2183 279/500 [===============>..............] - ETA: 52s - loss: 1.4132 - regression_loss: 1.1943 - classification_loss: 0.2189 280/500 [===============>..............] 
- ETA: 52s - loss: 1.4149 - regression_loss: 1.1960 - classification_loss: 0.2189 281/500 [===============>..............] - ETA: 51s - loss: 1.4148 - regression_loss: 1.1961 - classification_loss: 0.2187 282/500 [===============>..............] - ETA: 51s - loss: 1.4144 - regression_loss: 1.1959 - classification_loss: 0.2185 283/500 [===============>..............] - ETA: 51s - loss: 1.4145 - regression_loss: 1.1961 - classification_loss: 0.2184 284/500 [================>.............] - ETA: 51s - loss: 1.4145 - regression_loss: 1.1961 - classification_loss: 0.2184 285/500 [================>.............] - ETA: 50s - loss: 1.4129 - regression_loss: 1.1947 - classification_loss: 0.2182 286/500 [================>.............] - ETA: 50s - loss: 1.4134 - regression_loss: 1.1952 - classification_loss: 0.2182 287/500 [================>.............] - ETA: 50s - loss: 1.4144 - regression_loss: 1.1962 - classification_loss: 0.2182 288/500 [================>.............] - ETA: 50s - loss: 1.4154 - regression_loss: 1.1970 - classification_loss: 0.2184 289/500 [================>.............] - ETA: 50s - loss: 1.4134 - regression_loss: 1.1955 - classification_loss: 0.2179 290/500 [================>.............] - ETA: 49s - loss: 1.4132 - regression_loss: 1.1952 - classification_loss: 0.2180 291/500 [================>.............] - ETA: 49s - loss: 1.4127 - regression_loss: 1.1948 - classification_loss: 0.2180 292/500 [================>.............] - ETA: 49s - loss: 1.4123 - regression_loss: 1.1945 - classification_loss: 0.2178 293/500 [================>.............] - ETA: 49s - loss: 1.4130 - regression_loss: 1.1954 - classification_loss: 0.2176 294/500 [================>.............] - ETA: 48s - loss: 1.4126 - regression_loss: 1.1952 - classification_loss: 0.2174 295/500 [================>.............] - ETA: 48s - loss: 1.4150 - regression_loss: 1.1966 - classification_loss: 0.2184 296/500 [================>.............] 
- ETA: 48s - loss: 1.4158 - regression_loss: 1.1975 - classification_loss: 0.2183 297/500 [================>.............] - ETA: 48s - loss: 1.4159 - regression_loss: 1.1978 - classification_loss: 0.2181 298/500 [================>.............] - ETA: 47s - loss: 1.4152 - regression_loss: 1.1974 - classification_loss: 0.2178 299/500 [================>.............] - ETA: 47s - loss: 1.4152 - regression_loss: 1.1973 - classification_loss: 0.2179 300/500 [=================>............] - ETA: 47s - loss: 1.4170 - regression_loss: 1.1983 - classification_loss: 0.2187 301/500 [=================>............] - ETA: 47s - loss: 1.4177 - regression_loss: 1.1988 - classification_loss: 0.2189 302/500 [=================>............] - ETA: 46s - loss: 1.4173 - regression_loss: 1.1987 - classification_loss: 0.2186 303/500 [=================>............] - ETA: 46s - loss: 1.4186 - regression_loss: 1.1998 - classification_loss: 0.2188 304/500 [=================>............] - ETA: 46s - loss: 1.4163 - regression_loss: 1.1979 - classification_loss: 0.2184 305/500 [=================>............] - ETA: 46s - loss: 1.4208 - regression_loss: 1.2017 - classification_loss: 0.2191 306/500 [=================>............] - ETA: 46s - loss: 1.4221 - regression_loss: 1.2027 - classification_loss: 0.2194 307/500 [=================>............] - ETA: 45s - loss: 1.4235 - regression_loss: 1.2040 - classification_loss: 0.2195 308/500 [=================>............] - ETA: 45s - loss: 1.4254 - regression_loss: 1.2057 - classification_loss: 0.2197 309/500 [=================>............] - ETA: 45s - loss: 1.4259 - regression_loss: 1.2061 - classification_loss: 0.2199 310/500 [=================>............] - ETA: 45s - loss: 1.4268 - regression_loss: 1.2068 - classification_loss: 0.2200 311/500 [=================>............] - ETA: 44s - loss: 1.4251 - regression_loss: 1.2054 - classification_loss: 0.2197 312/500 [=================>............] 
- ETA: 44s - loss: 1.4224 - regression_loss: 1.2032 - classification_loss: 0.2192 313/500 [=================>............] - ETA: 44s - loss: 1.4213 - regression_loss: 1.2024 - classification_loss: 0.2189 314/500 [=================>............] - ETA: 44s - loss: 1.4242 - regression_loss: 1.2045 - classification_loss: 0.2197 315/500 [=================>............] - ETA: 43s - loss: 1.4240 - regression_loss: 1.2045 - classification_loss: 0.2195 316/500 [=================>............] - ETA: 43s - loss: 1.4232 - regression_loss: 1.2039 - classification_loss: 0.2193 317/500 [==================>...........] - ETA: 43s - loss: 1.4227 - regression_loss: 1.2035 - classification_loss: 0.2192 318/500 [==================>...........] - ETA: 43s - loss: 1.4207 - regression_loss: 1.2020 - classification_loss: 0.2187 319/500 [==================>...........] - ETA: 43s - loss: 1.4246 - regression_loss: 1.2052 - classification_loss: 0.2194 320/500 [==================>...........] - ETA: 42s - loss: 1.4249 - regression_loss: 1.2054 - classification_loss: 0.2195 321/500 [==================>...........] - ETA: 42s - loss: 1.4266 - regression_loss: 1.2071 - classification_loss: 0.2195 322/500 [==================>...........] - ETA: 42s - loss: 1.4270 - regression_loss: 1.2077 - classification_loss: 0.2193 323/500 [==================>...........] - ETA: 42s - loss: 1.4295 - regression_loss: 1.2096 - classification_loss: 0.2199 324/500 [==================>...........] - ETA: 41s - loss: 1.4265 - regression_loss: 1.2071 - classification_loss: 0.2194 325/500 [==================>...........] - ETA: 41s - loss: 1.4270 - regression_loss: 1.2074 - classification_loss: 0.2196 326/500 [==================>...........] - ETA: 41s - loss: 1.4276 - regression_loss: 1.2078 - classification_loss: 0.2198 327/500 [==================>...........] - ETA: 41s - loss: 1.4302 - regression_loss: 1.2099 - classification_loss: 0.2203 328/500 [==================>...........] 
- ETA: 40s - loss: 1.4286 - regression_loss: 1.2087 - classification_loss: 0.2199 329/500 [==================>...........] - ETA: 40s - loss: 1.4291 - regression_loss: 1.2089 - classification_loss: 0.2202 330/500 [==================>...........] - ETA: 40s - loss: 1.4300 - regression_loss: 1.2098 - classification_loss: 0.2201 331/500 [==================>...........] - ETA: 40s - loss: 1.4296 - regression_loss: 1.2091 - classification_loss: 0.2204 332/500 [==================>...........] - ETA: 39s - loss: 1.4287 - regression_loss: 1.2084 - classification_loss: 0.2203 333/500 [==================>...........] - ETA: 39s - loss: 1.4301 - regression_loss: 1.2098 - classification_loss: 0.2203 334/500 [===================>..........] - ETA: 39s - loss: 1.4285 - regression_loss: 1.2085 - classification_loss: 0.2200 335/500 [===================>..........] - ETA: 39s - loss: 1.4281 - regression_loss: 1.2084 - classification_loss: 0.2197 336/500 [===================>..........] - ETA: 39s - loss: 1.4258 - regression_loss: 1.2066 - classification_loss: 0.2192 337/500 [===================>..........] - ETA: 38s - loss: 1.4246 - regression_loss: 1.2056 - classification_loss: 0.2190 338/500 [===================>..........] - ETA: 38s - loss: 1.4245 - regression_loss: 1.2056 - classification_loss: 0.2189 339/500 [===================>..........] - ETA: 38s - loss: 1.4246 - regression_loss: 1.2058 - classification_loss: 0.2188 340/500 [===================>..........] - ETA: 38s - loss: 1.4254 - regression_loss: 1.2065 - classification_loss: 0.2189 341/500 [===================>..........] - ETA: 37s - loss: 1.4268 - regression_loss: 1.2076 - classification_loss: 0.2192 342/500 [===================>..........] - ETA: 37s - loss: 1.4264 - regression_loss: 1.2068 - classification_loss: 0.2195 343/500 [===================>..........] - ETA: 37s - loss: 1.4262 - regression_loss: 1.2069 - classification_loss: 0.2193 344/500 [===================>..........] 
- ETA: 37s - loss: 1.4253 - regression_loss: 1.2061 - classification_loss: 0.2192 345/500 [===================>..........] - ETA: 36s - loss: 1.4250 - regression_loss: 1.2059 - classification_loss: 0.2190 346/500 [===================>..........] - ETA: 36s - loss: 1.4253 - regression_loss: 1.2061 - classification_loss: 0.2192 347/500 [===================>..........] - ETA: 36s - loss: 1.4249 - regression_loss: 1.2060 - classification_loss: 0.2189 348/500 [===================>..........] - ETA: 36s - loss: 1.4248 - regression_loss: 1.2061 - classification_loss: 0.2187 349/500 [===================>..........] - ETA: 35s - loss: 1.4260 - regression_loss: 1.2069 - classification_loss: 0.2191 350/500 [====================>.........] - ETA: 35s - loss: 1.4263 - regression_loss: 1.2072 - classification_loss: 0.2192 351/500 [====================>.........] - ETA: 35s - loss: 1.4281 - regression_loss: 1.2085 - classification_loss: 0.2196 352/500 [====================>.........] - ETA: 35s - loss: 1.4260 - regression_loss: 1.2068 - classification_loss: 0.2193 353/500 [====================>.........] - ETA: 34s - loss: 1.4273 - regression_loss: 1.2077 - classification_loss: 0.2196 354/500 [====================>.........] - ETA: 34s - loss: 1.4270 - regression_loss: 1.2075 - classification_loss: 0.2195 355/500 [====================>.........] - ETA: 34s - loss: 1.4295 - regression_loss: 1.2094 - classification_loss: 0.2200 356/500 [====================>.........] - ETA: 34s - loss: 1.4282 - regression_loss: 1.2084 - classification_loss: 0.2198 357/500 [====================>.........] - ETA: 34s - loss: 1.4270 - regression_loss: 1.2076 - classification_loss: 0.2195 358/500 [====================>.........] - ETA: 33s - loss: 1.4280 - regression_loss: 1.2083 - classification_loss: 0.2198 359/500 [====================>.........] - ETA: 33s - loss: 1.4290 - regression_loss: 1.2089 - classification_loss: 0.2201 360/500 [====================>.........] 
- ETA: 33s - loss: 1.4285 - regression_loss: 1.2085 - classification_loss: 0.2200 361/500 [====================>.........] - ETA: 33s - loss: 1.4256 - regression_loss: 1.2052 - classification_loss: 0.2204 362/500 [====================>.........] - ETA: 32s - loss: 1.4265 - regression_loss: 1.2058 - classification_loss: 0.2206 363/500 [====================>.........] - ETA: 32s - loss: 1.4255 - regression_loss: 1.2048 - classification_loss: 0.2207 364/500 [====================>.........] - ETA: 32s - loss: 1.4259 - regression_loss: 1.2050 - classification_loss: 0.2208 365/500 [====================>.........] - ETA: 32s - loss: 1.4233 - regression_loss: 1.2029 - classification_loss: 0.2204 366/500 [====================>.........] - ETA: 31s - loss: 1.4222 - regression_loss: 1.2021 - classification_loss: 0.2201 367/500 [=====================>........] - ETA: 31s - loss: 1.4261 - regression_loss: 1.1988 - classification_loss: 0.2272 368/500 [=====================>........] - ETA: 31s - loss: 1.4257 - regression_loss: 1.1985 - classification_loss: 0.2272 369/500 [=====================>........] - ETA: 31s - loss: 1.4250 - regression_loss: 1.1981 - classification_loss: 0.2269 370/500 [=====================>........] - ETA: 31s - loss: 1.4239 - regression_loss: 1.1972 - classification_loss: 0.2267 371/500 [=====================>........] - ETA: 30s - loss: 1.4235 - regression_loss: 1.1970 - classification_loss: 0.2265 372/500 [=====================>........] - ETA: 30s - loss: 1.4242 - regression_loss: 1.1976 - classification_loss: 0.2267 373/500 [=====================>........] - ETA: 30s - loss: 1.4225 - regression_loss: 1.1962 - classification_loss: 0.2264 374/500 [=====================>........] - ETA: 30s - loss: 1.4234 - regression_loss: 1.1968 - classification_loss: 0.2266 375/500 [=====================>........] - ETA: 29s - loss: 1.4257 - regression_loss: 1.1984 - classification_loss: 0.2273 376/500 [=====================>........] 
- ETA: 29s - loss: 1.4238 - regression_loss: 1.1968 - classification_loss: 0.2270 377/500 [=====================>........] - ETA: 29s - loss: 1.4253 - regression_loss: 1.1981 - classification_loss: 0.2273 378/500 [=====================>........] - ETA: 29s - loss: 1.4230 - regression_loss: 1.1960 - classification_loss: 0.2270 379/500 [=====================>........] - ETA: 28s - loss: 1.4237 - regression_loss: 1.1967 - classification_loss: 0.2270 380/500 [=====================>........] - ETA: 28s - loss: 1.4245 - regression_loss: 1.1975 - classification_loss: 0.2270 381/500 [=====================>........] - ETA: 28s - loss: 1.4219 - regression_loss: 1.1955 - classification_loss: 0.2265 382/500 [=====================>........] - ETA: 28s - loss: 1.4226 - regression_loss: 1.1960 - classification_loss: 0.2266 383/500 [=====================>........] - ETA: 27s - loss: 1.4222 - regression_loss: 1.1957 - classification_loss: 0.2265 384/500 [======================>.......] - ETA: 27s - loss: 1.4226 - regression_loss: 1.1963 - classification_loss: 0.2263 385/500 [======================>.......] - ETA: 27s - loss: 1.4229 - regression_loss: 1.1964 - classification_loss: 0.2266 386/500 [======================>.......] - ETA: 27s - loss: 1.4234 - regression_loss: 1.1968 - classification_loss: 0.2266 387/500 [======================>.......] - ETA: 26s - loss: 1.4229 - regression_loss: 1.1965 - classification_loss: 0.2264 388/500 [======================>.......] - ETA: 26s - loss: 1.4237 - regression_loss: 1.1971 - classification_loss: 0.2266 389/500 [======================>.......] - ETA: 26s - loss: 1.4229 - regression_loss: 1.1965 - classification_loss: 0.2263 390/500 [======================>.......] - ETA: 26s - loss: 1.4208 - regression_loss: 1.1947 - classification_loss: 0.2261 391/500 [======================>.......] - ETA: 26s - loss: 1.4202 - regression_loss: 1.1942 - classification_loss: 0.2260 392/500 [======================>.......] 
- ETA: 25s - loss: 1.4207 - regression_loss: 1.1945 - classification_loss: 0.2262 393/500 [======================>.......] - ETA: 25s - loss: 1.4211 - regression_loss: 1.1949 - classification_loss: 0.2262 394/500 [======================>.......] - ETA: 25s - loss: 1.4223 - regression_loss: 1.1960 - classification_loss: 0.2263 395/500 [======================>.......] - ETA: 25s - loss: 1.4236 - regression_loss: 1.1970 - classification_loss: 0.2265 396/500 [======================>.......] - ETA: 24s - loss: 1.4249 - regression_loss: 1.1982 - classification_loss: 0.2266 397/500 [======================>.......] - ETA: 24s - loss: 1.4250 - regression_loss: 1.1983 - classification_loss: 0.2267 398/500 [======================>.......] - ETA: 24s - loss: 1.4256 - regression_loss: 1.1991 - classification_loss: 0.2265 399/500 [======================>.......] - ETA: 24s - loss: 1.4275 - regression_loss: 1.2006 - classification_loss: 0.2269 400/500 [=======================>......] - ETA: 23s - loss: 1.4267 - regression_loss: 1.2001 - classification_loss: 0.2266 401/500 [=======================>......] - ETA: 23s - loss: 1.4257 - regression_loss: 1.1994 - classification_loss: 0.2263 402/500 [=======================>......] - ETA: 23s - loss: 1.4258 - regression_loss: 1.1995 - classification_loss: 0.2263 403/500 [=======================>......] - ETA: 23s - loss: 1.4271 - regression_loss: 1.2007 - classification_loss: 0.2263 404/500 [=======================>......] - ETA: 22s - loss: 1.4262 - regression_loss: 1.2001 - classification_loss: 0.2261 405/500 [=======================>......] - ETA: 22s - loss: 1.4264 - regression_loss: 1.2000 - classification_loss: 0.2264 406/500 [=======================>......] - ETA: 22s - loss: 1.4254 - regression_loss: 1.1993 - classification_loss: 0.2261 407/500 [=======================>......] - ETA: 22s - loss: 1.4292 - regression_loss: 1.2022 - classification_loss: 0.2270 408/500 [=======================>......] 
- ETA: 21s - loss: 1.4281 - regression_loss: 1.2014 - classification_loss: 0.2267 409/500 [=======================>......] - ETA: 21s - loss: 1.4272 - regression_loss: 1.2007 - classification_loss: 0.2265 410/500 [=======================>......] - ETA: 21s - loss: 1.4278 - regression_loss: 1.2013 - classification_loss: 0.2265 411/500 [=======================>......] - ETA: 21s - loss: 1.4281 - regression_loss: 1.2014 - classification_loss: 0.2267 412/500 [=======================>......] - ETA: 21s - loss: 1.4299 - regression_loss: 1.2030 - classification_loss: 0.2269 413/500 [=======================>......] - ETA: 20s - loss: 1.4302 - regression_loss: 1.2035 - classification_loss: 0.2267 414/500 [=======================>......] - ETA: 20s - loss: 1.4313 - regression_loss: 1.2042 - classification_loss: 0.2271 415/500 [=======================>......] - ETA: 20s - loss: 1.4313 - regression_loss: 1.2043 - classification_loss: 0.2270 416/500 [=======================>......] - ETA: 20s - loss: 1.4311 - regression_loss: 1.2043 - classification_loss: 0.2268 417/500 [========================>.....] - ETA: 19s - loss: 1.4304 - regression_loss: 1.2037 - classification_loss: 0.2267 418/500 [========================>.....] - ETA: 19s - loss: 1.4304 - regression_loss: 1.2036 - classification_loss: 0.2268 419/500 [========================>.....] - ETA: 19s - loss: 1.4312 - regression_loss: 1.2041 - classification_loss: 0.2271 420/500 [========================>.....] - ETA: 19s - loss: 1.4334 - regression_loss: 1.2058 - classification_loss: 0.2276 421/500 [========================>.....] - ETA: 18s - loss: 1.4352 - regression_loss: 1.2073 - classification_loss: 0.2279 422/500 [========================>.....] - ETA: 18s - loss: 1.4355 - regression_loss: 1.2076 - classification_loss: 0.2279 423/500 [========================>.....] - ETA: 18s - loss: 1.4377 - regression_loss: 1.2092 - classification_loss: 0.2284 424/500 [========================>.....] 
- ETA: 18s - loss: 1.4374 - regression_loss: 1.2091 - classification_loss: 0.2283 425/500 [========================>.....] - ETA: 17s - loss: 1.4375 - regression_loss: 1.2094 - classification_loss: 0.2281 426/500 [========================>.....] - ETA: 17s - loss: 1.4377 - regression_loss: 1.2097 - classification_loss: 0.2280 427/500 [========================>.....] - ETA: 17s - loss: 1.4356 - regression_loss: 1.2079 - classification_loss: 0.2277 428/500 [========================>.....] - ETA: 17s - loss: 1.4368 - regression_loss: 1.2087 - classification_loss: 0.2281 429/500 [========================>.....] - ETA: 16s - loss: 1.4369 - regression_loss: 1.2088 - classification_loss: 0.2280 430/500 [========================>.....] - ETA: 16s - loss: 1.4350 - regression_loss: 1.2073 - classification_loss: 0.2276 431/500 [========================>.....] - ETA: 16s - loss: 1.4340 - regression_loss: 1.2066 - classification_loss: 0.2274 432/500 [========================>.....] - ETA: 16s - loss: 1.4348 - regression_loss: 1.2072 - classification_loss: 0.2276 433/500 [========================>.....] - ETA: 16s - loss: 1.4350 - regression_loss: 1.2074 - classification_loss: 0.2276 434/500 [=========================>....] - ETA: 15s - loss: 1.4369 - regression_loss: 1.2091 - classification_loss: 0.2279 435/500 [=========================>....] - ETA: 15s - loss: 1.4369 - regression_loss: 1.2090 - classification_loss: 0.2279 436/500 [=========================>....] - ETA: 15s - loss: 1.4375 - regression_loss: 1.2094 - classification_loss: 0.2281 437/500 [=========================>....] - ETA: 15s - loss: 1.4367 - regression_loss: 1.2088 - classification_loss: 0.2279 438/500 [=========================>....] - ETA: 14s - loss: 1.4369 - regression_loss: 1.2088 - classification_loss: 0.2281 439/500 [=========================>....] - ETA: 14s - loss: 1.4387 - regression_loss: 1.2100 - classification_loss: 0.2287 440/500 [=========================>....] 
[… per-batch progress output truncated …]
500/500 [==============================] - 120s 239ms/step - loss: 1.4347 - regression_loss: 1.2068 - classification_loss: 0.2279
326 instances of class plum with average precision: 0.8250
mAP: 0.8250
Epoch 00071: saving model to ./training/snapshots/resnet50_pascal_71.h5
Epoch 72/150
[… per-batch progress output truncated …]
- ETA: 54s - loss: 1.4132 - regression_loss: 1.1862 - classification_loss: 0.2269 276/500 [===============>..............] - ETA: 54s - loss: 1.4145 - regression_loss: 1.1879 - classification_loss: 0.2267 277/500 [===============>..............] - ETA: 53s - loss: 1.4152 - regression_loss: 1.1883 - classification_loss: 0.2268 278/500 [===============>..............] - ETA: 53s - loss: 1.4141 - regression_loss: 1.1877 - classification_loss: 0.2264 279/500 [===============>..............] - ETA: 53s - loss: 1.4138 - regression_loss: 1.1876 - classification_loss: 0.2262 280/500 [===============>..............] - ETA: 53s - loss: 1.4200 - regression_loss: 1.1927 - classification_loss: 0.2273 281/500 [===============>..............] - ETA: 52s - loss: 1.4203 - regression_loss: 1.1927 - classification_loss: 0.2276 282/500 [===============>..............] - ETA: 52s - loss: 1.4209 - regression_loss: 1.1934 - classification_loss: 0.2275 283/500 [===============>..............] - ETA: 52s - loss: 1.4189 - regression_loss: 1.1919 - classification_loss: 0.2271 284/500 [================>.............] - ETA: 52s - loss: 1.4189 - regression_loss: 1.1922 - classification_loss: 0.2267 285/500 [================>.............] - ETA: 51s - loss: 1.4189 - regression_loss: 1.1923 - classification_loss: 0.2267 286/500 [================>.............] - ETA: 51s - loss: 1.4155 - regression_loss: 1.1895 - classification_loss: 0.2261 287/500 [================>.............] - ETA: 51s - loss: 1.4166 - regression_loss: 1.1904 - classification_loss: 0.2262 288/500 [================>.............] - ETA: 51s - loss: 1.4135 - regression_loss: 1.1879 - classification_loss: 0.2257 289/500 [================>.............] - ETA: 50s - loss: 1.4120 - regression_loss: 1.1867 - classification_loss: 0.2254 290/500 [================>.............] - ETA: 50s - loss: 1.4122 - regression_loss: 1.1867 - classification_loss: 0.2255 291/500 [================>.............] 
- ETA: 50s - loss: 1.4137 - regression_loss: 1.1878 - classification_loss: 0.2258 292/500 [================>.............] - ETA: 50s - loss: 1.4097 - regression_loss: 1.1845 - classification_loss: 0.2252 293/500 [================>.............] - ETA: 49s - loss: 1.4110 - regression_loss: 1.1853 - classification_loss: 0.2257 294/500 [================>.............] - ETA: 49s - loss: 1.4145 - regression_loss: 1.1884 - classification_loss: 0.2261 295/500 [================>.............] - ETA: 49s - loss: 1.4123 - regression_loss: 1.1866 - classification_loss: 0.2256 296/500 [================>.............] - ETA: 49s - loss: 1.4140 - regression_loss: 1.1885 - classification_loss: 0.2255 297/500 [================>.............] - ETA: 48s - loss: 1.4117 - regression_loss: 1.1866 - classification_loss: 0.2251 298/500 [================>.............] - ETA: 48s - loss: 1.4105 - regression_loss: 1.1858 - classification_loss: 0.2248 299/500 [================>.............] - ETA: 48s - loss: 1.4078 - regression_loss: 1.1835 - classification_loss: 0.2243 300/500 [=================>............] - ETA: 48s - loss: 1.4114 - regression_loss: 1.1863 - classification_loss: 0.2251 301/500 [=================>............] - ETA: 47s - loss: 1.4101 - regression_loss: 1.1853 - classification_loss: 0.2248 302/500 [=================>............] - ETA: 47s - loss: 1.4088 - regression_loss: 1.1844 - classification_loss: 0.2245 303/500 [=================>............] - ETA: 47s - loss: 1.4087 - regression_loss: 1.1843 - classification_loss: 0.2244 304/500 [=================>............] - ETA: 47s - loss: 1.4103 - regression_loss: 1.1855 - classification_loss: 0.2247 305/500 [=================>............] - ETA: 46s - loss: 1.4090 - regression_loss: 1.1847 - classification_loss: 0.2243 306/500 [=================>............] - ETA: 46s - loss: 1.4074 - regression_loss: 1.1835 - classification_loss: 0.2239 307/500 [=================>............] 
- ETA: 46s - loss: 1.4086 - regression_loss: 1.1846 - classification_loss: 0.2240 308/500 [=================>............] - ETA: 46s - loss: 1.4086 - regression_loss: 1.1847 - classification_loss: 0.2239 309/500 [=================>............] - ETA: 45s - loss: 1.4059 - regression_loss: 1.1826 - classification_loss: 0.2233 310/500 [=================>............] - ETA: 45s - loss: 1.4050 - regression_loss: 1.1819 - classification_loss: 0.2231 311/500 [=================>............] - ETA: 45s - loss: 1.4035 - regression_loss: 1.1807 - classification_loss: 0.2228 312/500 [=================>............] - ETA: 45s - loss: 1.4061 - regression_loss: 1.1826 - classification_loss: 0.2235 313/500 [=================>............] - ETA: 44s - loss: 1.4059 - regression_loss: 1.1825 - classification_loss: 0.2234 314/500 [=================>............] - ETA: 44s - loss: 1.4111 - regression_loss: 1.1868 - classification_loss: 0.2243 315/500 [=================>............] - ETA: 44s - loss: 1.4127 - regression_loss: 1.1881 - classification_loss: 0.2246 316/500 [=================>............] - ETA: 44s - loss: 1.4135 - regression_loss: 1.1888 - classification_loss: 0.2247 317/500 [==================>...........] - ETA: 43s - loss: 1.4122 - regression_loss: 1.1878 - classification_loss: 0.2244 318/500 [==================>...........] - ETA: 43s - loss: 1.4124 - regression_loss: 1.1880 - classification_loss: 0.2245 319/500 [==================>...........] - ETA: 43s - loss: 1.4128 - regression_loss: 1.1884 - classification_loss: 0.2244 320/500 [==================>...........] - ETA: 43s - loss: 1.4121 - regression_loss: 1.1878 - classification_loss: 0.2243 321/500 [==================>...........] - ETA: 42s - loss: 1.4132 - regression_loss: 1.1888 - classification_loss: 0.2244 322/500 [==================>...........] - ETA: 42s - loss: 1.4127 - regression_loss: 1.1884 - classification_loss: 0.2243 323/500 [==================>...........] 
- ETA: 42s - loss: 1.4141 - regression_loss: 1.1897 - classification_loss: 0.2243 324/500 [==================>...........] - ETA: 42s - loss: 1.4110 - regression_loss: 1.1872 - classification_loss: 0.2238 325/500 [==================>...........] - ETA: 41s - loss: 1.4116 - regression_loss: 1.1877 - classification_loss: 0.2238 326/500 [==================>...........] - ETA: 41s - loss: 1.4129 - regression_loss: 1.1887 - classification_loss: 0.2243 327/500 [==================>...........] - ETA: 41s - loss: 1.4154 - regression_loss: 1.1905 - classification_loss: 0.2249 328/500 [==================>...........] - ETA: 41s - loss: 1.4155 - regression_loss: 1.1907 - classification_loss: 0.2247 329/500 [==================>...........] - ETA: 40s - loss: 1.4127 - regression_loss: 1.1884 - classification_loss: 0.2243 330/500 [==================>...........] - ETA: 40s - loss: 1.4126 - regression_loss: 1.1884 - classification_loss: 0.2242 331/500 [==================>...........] - ETA: 40s - loss: 1.4124 - regression_loss: 1.1884 - classification_loss: 0.2240 332/500 [==================>...........] - ETA: 40s - loss: 1.4116 - regression_loss: 1.1879 - classification_loss: 0.2237 333/500 [==================>...........] - ETA: 39s - loss: 1.4139 - regression_loss: 1.1898 - classification_loss: 0.2241 334/500 [===================>..........] - ETA: 39s - loss: 1.4163 - regression_loss: 1.1921 - classification_loss: 0.2242 335/500 [===================>..........] - ETA: 39s - loss: 1.4193 - regression_loss: 1.1944 - classification_loss: 0.2249 336/500 [===================>..........] - ETA: 39s - loss: 1.4191 - regression_loss: 1.1942 - classification_loss: 0.2249 337/500 [===================>..........] - ETA: 38s - loss: 1.4192 - regression_loss: 1.1942 - classification_loss: 0.2250 338/500 [===================>..........] - ETA: 38s - loss: 1.4166 - regression_loss: 1.1919 - classification_loss: 0.2247 339/500 [===================>..........] 
- ETA: 38s - loss: 1.4162 - regression_loss: 1.1917 - classification_loss: 0.2244 340/500 [===================>..........] - ETA: 38s - loss: 1.4154 - regression_loss: 1.1913 - classification_loss: 0.2241 341/500 [===================>..........] - ETA: 38s - loss: 1.4150 - regression_loss: 1.1910 - classification_loss: 0.2241 342/500 [===================>..........] - ETA: 37s - loss: 1.4127 - regression_loss: 1.1889 - classification_loss: 0.2237 343/500 [===================>..........] - ETA: 37s - loss: 1.4124 - regression_loss: 1.1889 - classification_loss: 0.2235 344/500 [===================>..........] - ETA: 37s - loss: 1.4126 - regression_loss: 1.1891 - classification_loss: 0.2235 345/500 [===================>..........] - ETA: 37s - loss: 1.4127 - regression_loss: 1.1893 - classification_loss: 0.2234 346/500 [===================>..........] - ETA: 36s - loss: 1.4134 - regression_loss: 1.1899 - classification_loss: 0.2235 347/500 [===================>..........] - ETA: 36s - loss: 1.4133 - regression_loss: 1.1897 - classification_loss: 0.2236 348/500 [===================>..........] - ETA: 36s - loss: 1.4159 - regression_loss: 1.1915 - classification_loss: 0.2244 349/500 [===================>..........] - ETA: 36s - loss: 1.4145 - regression_loss: 1.1905 - classification_loss: 0.2240 350/500 [====================>.........] - ETA: 35s - loss: 1.4137 - regression_loss: 1.1899 - classification_loss: 0.2238 351/500 [====================>.........] - ETA: 35s - loss: 1.4151 - regression_loss: 1.1913 - classification_loss: 0.2238 352/500 [====================>.........] - ETA: 35s - loss: 1.4149 - regression_loss: 1.1911 - classification_loss: 0.2238 353/500 [====================>.........] - ETA: 35s - loss: 1.4140 - regression_loss: 1.1906 - classification_loss: 0.2234 354/500 [====================>.........] - ETA: 34s - loss: 1.4134 - regression_loss: 1.1902 - classification_loss: 0.2232 355/500 [====================>.........] 
- ETA: 34s - loss: 1.4139 - regression_loss: 1.1906 - classification_loss: 0.2234 356/500 [====================>.........] - ETA: 34s - loss: 1.4124 - regression_loss: 1.1891 - classification_loss: 0.2233 357/500 [====================>.........] - ETA: 34s - loss: 1.4106 - regression_loss: 1.1876 - classification_loss: 0.2230 358/500 [====================>.........] - ETA: 33s - loss: 1.4119 - regression_loss: 1.1887 - classification_loss: 0.2232 359/500 [====================>.........] - ETA: 33s - loss: 1.4142 - regression_loss: 1.1906 - classification_loss: 0.2236 360/500 [====================>.........] - ETA: 33s - loss: 1.4147 - regression_loss: 1.1909 - classification_loss: 0.2238 361/500 [====================>.........] - ETA: 33s - loss: 1.4165 - regression_loss: 1.1921 - classification_loss: 0.2244 362/500 [====================>.........] - ETA: 32s - loss: 1.4163 - regression_loss: 1.1920 - classification_loss: 0.2243 363/500 [====================>.........] - ETA: 32s - loss: 1.4146 - regression_loss: 1.1906 - classification_loss: 0.2240 364/500 [====================>.........] - ETA: 32s - loss: 1.4148 - regression_loss: 1.1909 - classification_loss: 0.2239 365/500 [====================>.........] - ETA: 32s - loss: 1.4142 - regression_loss: 1.1904 - classification_loss: 0.2238 366/500 [====================>.........] - ETA: 32s - loss: 1.4134 - regression_loss: 1.1899 - classification_loss: 0.2235 367/500 [=====================>........] - ETA: 31s - loss: 1.4114 - regression_loss: 1.1884 - classification_loss: 0.2230 368/500 [=====================>........] - ETA: 31s - loss: 1.4119 - regression_loss: 1.1890 - classification_loss: 0.2230 369/500 [=====================>........] - ETA: 31s - loss: 1.4120 - regression_loss: 1.1893 - classification_loss: 0.2227 370/500 [=====================>........] - ETA: 31s - loss: 1.4128 - regression_loss: 1.1899 - classification_loss: 0.2228 371/500 [=====================>........] 
- ETA: 30s - loss: 1.4127 - regression_loss: 1.1898 - classification_loss: 0.2229 372/500 [=====================>........] - ETA: 30s - loss: 1.4135 - regression_loss: 1.1903 - classification_loss: 0.2232 373/500 [=====================>........] - ETA: 30s - loss: 1.4138 - regression_loss: 1.1906 - classification_loss: 0.2232 374/500 [=====================>........] - ETA: 30s - loss: 1.4154 - regression_loss: 1.1922 - classification_loss: 0.2232 375/500 [=====================>........] - ETA: 29s - loss: 1.4173 - regression_loss: 1.1936 - classification_loss: 0.2237 376/500 [=====================>........] - ETA: 29s - loss: 1.4166 - regression_loss: 1.1930 - classification_loss: 0.2236 377/500 [=====================>........] - ETA: 29s - loss: 1.4145 - regression_loss: 1.1912 - classification_loss: 0.2233 378/500 [=====================>........] - ETA: 29s - loss: 1.4129 - regression_loss: 1.1899 - classification_loss: 0.2230 379/500 [=====================>........] - ETA: 28s - loss: 1.4104 - regression_loss: 1.1867 - classification_loss: 0.2236 380/500 [=====================>........] - ETA: 28s - loss: 1.4090 - regression_loss: 1.1856 - classification_loss: 0.2233 381/500 [=====================>........] - ETA: 28s - loss: 1.4091 - regression_loss: 1.1859 - classification_loss: 0.2232 382/500 [=====================>........] - ETA: 28s - loss: 1.4085 - regression_loss: 1.1854 - classification_loss: 0.2231 383/500 [=====================>........] - ETA: 27s - loss: 1.4064 - regression_loss: 1.1836 - classification_loss: 0.2228 384/500 [======================>.......] - ETA: 27s - loss: 1.4071 - regression_loss: 1.1842 - classification_loss: 0.2229 385/500 [======================>.......] - ETA: 27s - loss: 1.4081 - regression_loss: 1.1850 - classification_loss: 0.2232 386/500 [======================>.......] - ETA: 27s - loss: 1.4090 - regression_loss: 1.1857 - classification_loss: 0.2233 387/500 [======================>.......] 
- ETA: 26s - loss: 1.4104 - regression_loss: 1.1872 - classification_loss: 0.2232 388/500 [======================>.......] - ETA: 26s - loss: 1.4110 - regression_loss: 1.1877 - classification_loss: 0.2233 389/500 [======================>.......] - ETA: 26s - loss: 1.4098 - regression_loss: 1.1868 - classification_loss: 0.2230 390/500 [======================>.......] - ETA: 26s - loss: 1.4107 - regression_loss: 1.1873 - classification_loss: 0.2234 391/500 [======================>.......] - ETA: 26s - loss: 1.4107 - regression_loss: 1.1873 - classification_loss: 0.2234 392/500 [======================>.......] - ETA: 25s - loss: 1.4104 - regression_loss: 1.1870 - classification_loss: 0.2234 393/500 [======================>.......] - ETA: 25s - loss: 1.4099 - regression_loss: 1.1865 - classification_loss: 0.2234 394/500 [======================>.......] - ETA: 25s - loss: 1.4160 - regression_loss: 1.1908 - classification_loss: 0.2252 395/500 [======================>.......] - ETA: 25s - loss: 1.4157 - regression_loss: 1.1906 - classification_loss: 0.2251 396/500 [======================>.......] - ETA: 24s - loss: 1.4151 - regression_loss: 1.1900 - classification_loss: 0.2251 397/500 [======================>.......] - ETA: 24s - loss: 1.4152 - regression_loss: 1.1901 - classification_loss: 0.2251 398/500 [======================>.......] - ETA: 24s - loss: 1.4150 - regression_loss: 1.1901 - classification_loss: 0.2250 399/500 [======================>.......] - ETA: 24s - loss: 1.4138 - regression_loss: 1.1890 - classification_loss: 0.2248 400/500 [=======================>......] - ETA: 23s - loss: 1.4125 - regression_loss: 1.1878 - classification_loss: 0.2247 401/500 [=======================>......] - ETA: 23s - loss: 1.4131 - regression_loss: 1.1884 - classification_loss: 0.2248 402/500 [=======================>......] - ETA: 23s - loss: 1.4120 - regression_loss: 1.1875 - classification_loss: 0.2245 403/500 [=======================>......] 
- ETA: 23s - loss: 1.4129 - regression_loss: 1.1883 - classification_loss: 0.2247 404/500 [=======================>......] - ETA: 22s - loss: 1.4114 - regression_loss: 1.1870 - classification_loss: 0.2244 405/500 [=======================>......] - ETA: 22s - loss: 1.4101 - regression_loss: 1.1861 - classification_loss: 0.2241 406/500 [=======================>......] - ETA: 22s - loss: 1.4076 - regression_loss: 1.1840 - classification_loss: 0.2236 407/500 [=======================>......] - ETA: 22s - loss: 1.4094 - regression_loss: 1.1854 - classification_loss: 0.2241 408/500 [=======================>......] - ETA: 21s - loss: 1.4087 - regression_loss: 1.1847 - classification_loss: 0.2239 409/500 [=======================>......] - ETA: 21s - loss: 1.4088 - regression_loss: 1.1849 - classification_loss: 0.2240 410/500 [=======================>......] - ETA: 21s - loss: 1.4106 - regression_loss: 1.1862 - classification_loss: 0.2243 411/500 [=======================>......] - ETA: 21s - loss: 1.4108 - regression_loss: 1.1864 - classification_loss: 0.2244 412/500 [=======================>......] - ETA: 20s - loss: 1.4111 - regression_loss: 1.1866 - classification_loss: 0.2244 413/500 [=======================>......] - ETA: 20s - loss: 1.4105 - regression_loss: 1.1864 - classification_loss: 0.2241 414/500 [=======================>......] - ETA: 20s - loss: 1.4105 - regression_loss: 1.1863 - classification_loss: 0.2241 415/500 [=======================>......] - ETA: 20s - loss: 1.4107 - regression_loss: 1.1864 - classification_loss: 0.2243 416/500 [=======================>......] - ETA: 20s - loss: 1.4111 - regression_loss: 1.1868 - classification_loss: 0.2243 417/500 [========================>.....] - ETA: 19s - loss: 1.4108 - regression_loss: 1.1866 - classification_loss: 0.2242 418/500 [========================>.....] - ETA: 19s - loss: 1.4105 - regression_loss: 1.1864 - classification_loss: 0.2241 419/500 [========================>.....] 
- ETA: 19s - loss: 1.4121 - regression_loss: 1.1877 - classification_loss: 0.2243 420/500 [========================>.....] - ETA: 19s - loss: 1.4114 - regression_loss: 1.1874 - classification_loss: 0.2240 421/500 [========================>.....] - ETA: 18s - loss: 1.4095 - regression_loss: 1.1858 - classification_loss: 0.2237 422/500 [========================>.....] - ETA: 18s - loss: 1.4115 - regression_loss: 1.1874 - classification_loss: 0.2241 423/500 [========================>.....] - ETA: 18s - loss: 1.4132 - regression_loss: 1.1886 - classification_loss: 0.2246 424/500 [========================>.....] - ETA: 18s - loss: 1.4137 - regression_loss: 1.1890 - classification_loss: 0.2247 425/500 [========================>.....] - ETA: 17s - loss: 1.4130 - regression_loss: 1.1884 - classification_loss: 0.2246 426/500 [========================>.....] - ETA: 17s - loss: 1.4125 - regression_loss: 1.1880 - classification_loss: 0.2244 427/500 [========================>.....] - ETA: 17s - loss: 1.4113 - regression_loss: 1.1872 - classification_loss: 0.2241 428/500 [========================>.....] - ETA: 17s - loss: 1.4125 - regression_loss: 1.1881 - classification_loss: 0.2244 429/500 [========================>.....] - ETA: 16s - loss: 1.4116 - regression_loss: 1.1875 - classification_loss: 0.2242 430/500 [========================>.....] - ETA: 16s - loss: 1.4097 - regression_loss: 1.1859 - classification_loss: 0.2238 431/500 [========================>.....] - ETA: 16s - loss: 1.4093 - regression_loss: 1.1857 - classification_loss: 0.2236 432/500 [========================>.....] - ETA: 16s - loss: 1.4093 - regression_loss: 1.1858 - classification_loss: 0.2235 433/500 [========================>.....] - ETA: 15s - loss: 1.4093 - regression_loss: 1.1860 - classification_loss: 0.2233 434/500 [=========================>....] - ETA: 15s - loss: 1.4079 - regression_loss: 1.1849 - classification_loss: 0.2230 435/500 [=========================>....] 
- ETA: 15s - loss: 1.4078 - regression_loss: 1.1848 - classification_loss: 0.2231 436/500 [=========================>....] - ETA: 15s - loss: 1.4074 - regression_loss: 1.1843 - classification_loss: 0.2231 437/500 [=========================>....] - ETA: 15s - loss: 1.4073 - regression_loss: 1.1841 - classification_loss: 0.2232 438/500 [=========================>....] - ETA: 14s - loss: 1.4072 - regression_loss: 1.1840 - classification_loss: 0.2232 439/500 [=========================>....] - ETA: 14s - loss: 1.4068 - regression_loss: 1.1837 - classification_loss: 0.2231 440/500 [=========================>....] - ETA: 14s - loss: 1.4061 - regression_loss: 1.1832 - classification_loss: 0.2229 441/500 [=========================>....] - ETA: 14s - loss: 1.4080 - regression_loss: 1.1844 - classification_loss: 0.2236 442/500 [=========================>....] - ETA: 13s - loss: 1.4084 - regression_loss: 1.1847 - classification_loss: 0.2237 443/500 [=========================>....] - ETA: 13s - loss: 1.4092 - regression_loss: 1.1851 - classification_loss: 0.2241 444/500 [=========================>....] - ETA: 13s - loss: 1.4094 - regression_loss: 1.1853 - classification_loss: 0.2241 445/500 [=========================>....] - ETA: 13s - loss: 1.4090 - regression_loss: 1.1851 - classification_loss: 0.2239 446/500 [=========================>....] - ETA: 12s - loss: 1.4089 - regression_loss: 1.1853 - classification_loss: 0.2236 447/500 [=========================>....] - ETA: 12s - loss: 1.4098 - regression_loss: 1.1863 - classification_loss: 0.2235 448/500 [=========================>....] - ETA: 12s - loss: 1.4102 - regression_loss: 1.1868 - classification_loss: 0.2235 449/500 [=========================>....] - ETA: 12s - loss: 1.4108 - regression_loss: 1.1872 - classification_loss: 0.2237 450/500 [==========================>...] - ETA: 11s - loss: 1.4128 - regression_loss: 1.1890 - classification_loss: 0.2238 451/500 [==========================>...] 
- ETA: 11s - loss: 1.4125 - regression_loss: 1.1888 - classification_loss: 0.2237 452/500 [==========================>...] - ETA: 11s - loss: 1.4130 - regression_loss: 1.1889 - classification_loss: 0.2240 453/500 [==========================>...] - ETA: 11s - loss: 1.4117 - regression_loss: 1.1878 - classification_loss: 0.2239 454/500 [==========================>...] - ETA: 10s - loss: 1.4111 - regression_loss: 1.1873 - classification_loss: 0.2238 455/500 [==========================>...] - ETA: 10s - loss: 1.4107 - regression_loss: 1.1871 - classification_loss: 0.2236 456/500 [==========================>...] - ETA: 10s - loss: 1.4119 - regression_loss: 1.1880 - classification_loss: 0.2239 457/500 [==========================>...] - ETA: 10s - loss: 1.4111 - regression_loss: 1.1873 - classification_loss: 0.2238 458/500 [==========================>...] - ETA: 10s - loss: 1.4111 - regression_loss: 1.1873 - classification_loss: 0.2238 459/500 [==========================>...] - ETA: 9s - loss: 1.4109 - regression_loss: 1.1872 - classification_loss: 0.2237  460/500 [==========================>...] - ETA: 9s - loss: 1.4107 - regression_loss: 1.1872 - classification_loss: 0.2235 461/500 [==========================>...] - ETA: 9s - loss: 1.4121 - regression_loss: 1.1885 - classification_loss: 0.2236 462/500 [==========================>...] - ETA: 9s - loss: 1.4125 - regression_loss: 1.1889 - classification_loss: 0.2236 463/500 [==========================>...] - ETA: 8s - loss: 1.4134 - regression_loss: 1.1897 - classification_loss: 0.2236 464/500 [==========================>...] - ETA: 8s - loss: 1.4141 - regression_loss: 1.1906 - classification_loss: 0.2235 465/500 [==========================>...] - ETA: 8s - loss: 1.4152 - regression_loss: 1.1915 - classification_loss: 0.2237 466/500 [==========================>...] - ETA: 8s - loss: 1.4148 - regression_loss: 1.1913 - classification_loss: 0.2235 467/500 [===========================>..] 
- ETA: 7s - loss: 1.4186 - regression_loss: 1.1939 - classification_loss: 0.2246 468/500 [===========================>..] - ETA: 7s - loss: 1.4173 - regression_loss: 1.1930 - classification_loss: 0.2243 469/500 [===========================>..] - ETA: 7s - loss: 1.4169 - regression_loss: 1.1927 - classification_loss: 0.2242 470/500 [===========================>..] - ETA: 7s - loss: 1.4147 - regression_loss: 1.1909 - classification_loss: 0.2238 471/500 [===========================>..] - ETA: 6s - loss: 1.4153 - regression_loss: 1.1914 - classification_loss: 0.2239 472/500 [===========================>..] - ETA: 6s - loss: 1.4283 - regression_loss: 1.2017 - classification_loss: 0.2266 473/500 [===========================>..] - ETA: 6s - loss: 1.4298 - regression_loss: 1.2030 - classification_loss: 0.2268 474/500 [===========================>..] - ETA: 6s - loss: 1.4287 - regression_loss: 1.2020 - classification_loss: 0.2266 475/500 [===========================>..] - ETA: 5s - loss: 1.4280 - regression_loss: 1.2016 - classification_loss: 0.2264 476/500 [===========================>..] - ETA: 5s - loss: 1.4268 - regression_loss: 1.2006 - classification_loss: 0.2262 477/500 [===========================>..] - ETA: 5s - loss: 1.4271 - regression_loss: 1.2007 - classification_loss: 0.2264 478/500 [===========================>..] - ETA: 5s - loss: 1.4271 - regression_loss: 1.2007 - classification_loss: 0.2264 479/500 [===========================>..] - ETA: 5s - loss: 1.4266 - regression_loss: 1.2003 - classification_loss: 0.2262 480/500 [===========================>..] - ETA: 4s - loss: 1.4273 - regression_loss: 1.2008 - classification_loss: 0.2266 481/500 [===========================>..] - ETA: 4s - loss: 1.4251 - regression_loss: 1.1989 - classification_loss: 0.2262 482/500 [===========================>..] - ETA: 4s - loss: 1.4263 - regression_loss: 1.2001 - classification_loss: 0.2262 483/500 [===========================>..] 
[per-batch progress lines for the remainder of Epoch 72 omitted]
500/500 [==============================] - 120s 239ms/step - loss: 1.4285 - regression_loss: 1.2026 - classification_loss: 0.2258
326 instances of class plum with average precision: 0.8149
mAP: 0.8149
Epoch 00072: saving model to ./training/snapshots/resnet50_pascal_72.h5
Epoch 73/150
[per-batch progress lines omitted; at step 317/500 the running averages were loss: 1.4358 - regression_loss: 1.2192 - classification_loss: 0.2166]
318/500 [==================>...........]
- ETA: 44s - loss: 1.4378 - regression_loss: 1.2210 - classification_loss: 0.2168 319/500 [==================>...........] - ETA: 43s - loss: 1.4377 - regression_loss: 1.2210 - classification_loss: 0.2166 320/500 [==================>...........] - ETA: 43s - loss: 1.4394 - regression_loss: 1.2221 - classification_loss: 0.2173 321/500 [==================>...........] - ETA: 43s - loss: 1.4375 - regression_loss: 1.2206 - classification_loss: 0.2169 322/500 [==================>...........] - ETA: 43s - loss: 1.4369 - regression_loss: 1.2202 - classification_loss: 0.2168 323/500 [==================>...........] - ETA: 42s - loss: 1.4380 - regression_loss: 1.2209 - classification_loss: 0.2171 324/500 [==================>...........] - ETA: 42s - loss: 1.4404 - regression_loss: 1.2226 - classification_loss: 0.2178 325/500 [==================>...........] - ETA: 42s - loss: 1.4424 - regression_loss: 1.2244 - classification_loss: 0.2180 326/500 [==================>...........] - ETA: 42s - loss: 1.4429 - regression_loss: 1.2247 - classification_loss: 0.2182 327/500 [==================>...........] - ETA: 41s - loss: 1.4495 - regression_loss: 1.2297 - classification_loss: 0.2198 328/500 [==================>...........] - ETA: 41s - loss: 1.4530 - regression_loss: 1.2329 - classification_loss: 0.2202 329/500 [==================>...........] - ETA: 41s - loss: 1.4521 - regression_loss: 1.2322 - classification_loss: 0.2199 330/500 [==================>...........] - ETA: 41s - loss: 1.4540 - regression_loss: 1.2335 - classification_loss: 0.2205 331/500 [==================>...........] - ETA: 40s - loss: 1.4568 - regression_loss: 1.2354 - classification_loss: 0.2214 332/500 [==================>...........] - ETA: 40s - loss: 1.4568 - regression_loss: 1.2356 - classification_loss: 0.2212 333/500 [==================>...........] - ETA: 40s - loss: 1.4572 - regression_loss: 1.2361 - classification_loss: 0.2211 334/500 [===================>..........] 
- ETA: 40s - loss: 1.4563 - regression_loss: 1.2353 - classification_loss: 0.2210 335/500 [===================>..........] - ETA: 39s - loss: 1.4571 - regression_loss: 1.2363 - classification_loss: 0.2208 336/500 [===================>..........] - ETA: 39s - loss: 1.4567 - regression_loss: 1.2360 - classification_loss: 0.2207 337/500 [===================>..........] - ETA: 39s - loss: 1.4555 - regression_loss: 1.2351 - classification_loss: 0.2204 338/500 [===================>..........] - ETA: 39s - loss: 1.4540 - regression_loss: 1.2337 - classification_loss: 0.2203 339/500 [===================>..........] - ETA: 39s - loss: 1.4533 - regression_loss: 1.2332 - classification_loss: 0.2201 340/500 [===================>..........] - ETA: 38s - loss: 1.4530 - regression_loss: 1.2327 - classification_loss: 0.2203 341/500 [===================>..........] - ETA: 38s - loss: 1.4525 - regression_loss: 1.2320 - classification_loss: 0.2205 342/500 [===================>..........] - ETA: 38s - loss: 1.4521 - regression_loss: 1.2317 - classification_loss: 0.2203 343/500 [===================>..........] - ETA: 38s - loss: 1.4504 - regression_loss: 1.2305 - classification_loss: 0.2199 344/500 [===================>..........] - ETA: 37s - loss: 1.4490 - regression_loss: 1.2294 - classification_loss: 0.2196 345/500 [===================>..........] - ETA: 37s - loss: 1.4484 - regression_loss: 1.2290 - classification_loss: 0.2194 346/500 [===================>..........] - ETA: 37s - loss: 1.4487 - regression_loss: 1.2293 - classification_loss: 0.2195 347/500 [===================>..........] - ETA: 37s - loss: 1.4511 - regression_loss: 1.2310 - classification_loss: 0.2201 348/500 [===================>..........] - ETA: 36s - loss: 1.4512 - regression_loss: 1.2313 - classification_loss: 0.2200 349/500 [===================>..........] - ETA: 36s - loss: 1.4513 - regression_loss: 1.2314 - classification_loss: 0.2199 350/500 [====================>.........] 
- ETA: 36s - loss: 1.4526 - regression_loss: 1.2325 - classification_loss: 0.2202 351/500 [====================>.........] - ETA: 36s - loss: 1.4512 - regression_loss: 1.2313 - classification_loss: 0.2198 352/500 [====================>.........] - ETA: 35s - loss: 1.4482 - regression_loss: 1.2288 - classification_loss: 0.2194 353/500 [====================>.........] - ETA: 35s - loss: 1.4474 - regression_loss: 1.2281 - classification_loss: 0.2193 354/500 [====================>.........] - ETA: 35s - loss: 1.4477 - regression_loss: 1.2284 - classification_loss: 0.2192 355/500 [====================>.........] - ETA: 35s - loss: 1.4451 - regression_loss: 1.2264 - classification_loss: 0.2187 356/500 [====================>.........] - ETA: 34s - loss: 1.4455 - regression_loss: 1.2265 - classification_loss: 0.2190 357/500 [====================>.........] - ETA: 34s - loss: 1.4435 - regression_loss: 1.2249 - classification_loss: 0.2186 358/500 [====================>.........] - ETA: 34s - loss: 1.4466 - regression_loss: 1.2273 - classification_loss: 0.2193 359/500 [====================>.........] - ETA: 34s - loss: 1.4444 - regression_loss: 1.2254 - classification_loss: 0.2189 360/500 [====================>.........] - ETA: 33s - loss: 1.4459 - regression_loss: 1.2268 - classification_loss: 0.2191 361/500 [====================>.........] - ETA: 33s - loss: 1.4432 - regression_loss: 1.2245 - classification_loss: 0.2187 362/500 [====================>.........] - ETA: 33s - loss: 1.4417 - regression_loss: 1.2233 - classification_loss: 0.2184 363/500 [====================>.........] - ETA: 33s - loss: 1.4405 - regression_loss: 1.2224 - classification_loss: 0.2182 364/500 [====================>.........] - ETA: 32s - loss: 1.4409 - regression_loss: 1.2228 - classification_loss: 0.2181 365/500 [====================>.........] - ETA: 32s - loss: 1.4404 - regression_loss: 1.2224 - classification_loss: 0.2180 366/500 [====================>.........] 
- ETA: 32s - loss: 1.4396 - regression_loss: 1.2218 - classification_loss: 0.2178 367/500 [=====================>........] - ETA: 32s - loss: 1.4414 - regression_loss: 1.2230 - classification_loss: 0.2184 368/500 [=====================>........] - ETA: 32s - loss: 1.4421 - regression_loss: 1.2233 - classification_loss: 0.2188 369/500 [=====================>........] - ETA: 31s - loss: 1.4430 - regression_loss: 1.2200 - classification_loss: 0.2230 370/500 [=====================>........] - ETA: 31s - loss: 1.4431 - regression_loss: 1.2201 - classification_loss: 0.2230 371/500 [=====================>........] - ETA: 31s - loss: 1.4429 - regression_loss: 1.2201 - classification_loss: 0.2228 372/500 [=====================>........] - ETA: 31s - loss: 1.4427 - regression_loss: 1.2201 - classification_loss: 0.2226 373/500 [=====================>........] - ETA: 30s - loss: 1.4426 - regression_loss: 1.2202 - classification_loss: 0.2224 374/500 [=====================>........] - ETA: 30s - loss: 1.4425 - regression_loss: 1.2202 - classification_loss: 0.2223 375/500 [=====================>........] - ETA: 30s - loss: 1.4424 - regression_loss: 1.2201 - classification_loss: 0.2223 376/500 [=====================>........] - ETA: 30s - loss: 1.4421 - regression_loss: 1.2200 - classification_loss: 0.2221 377/500 [=====================>........] - ETA: 29s - loss: 1.4421 - regression_loss: 1.2198 - classification_loss: 0.2223 378/500 [=====================>........] - ETA: 29s - loss: 1.4416 - regression_loss: 1.2195 - classification_loss: 0.2221 379/500 [=====================>........] - ETA: 29s - loss: 1.4427 - regression_loss: 1.2204 - classification_loss: 0.2223 380/500 [=====================>........] - ETA: 29s - loss: 1.4440 - regression_loss: 1.2215 - classification_loss: 0.2225 381/500 [=====================>........] - ETA: 28s - loss: 1.4440 - regression_loss: 1.2215 - classification_loss: 0.2225 382/500 [=====================>........] 
- ETA: 28s - loss: 1.4419 - regression_loss: 1.2197 - classification_loss: 0.2222 383/500 [=====================>........] - ETA: 28s - loss: 1.4413 - regression_loss: 1.2192 - classification_loss: 0.2220 384/500 [======================>.......] - ETA: 28s - loss: 1.4400 - regression_loss: 1.2183 - classification_loss: 0.2218 385/500 [======================>.......] - ETA: 27s - loss: 1.4412 - regression_loss: 1.2151 - classification_loss: 0.2261 386/500 [======================>.......] - ETA: 27s - loss: 1.4450 - regression_loss: 1.2182 - classification_loss: 0.2268 387/500 [======================>.......] - ETA: 27s - loss: 1.4462 - regression_loss: 1.2194 - classification_loss: 0.2268 388/500 [======================>.......] - ETA: 27s - loss: 1.4459 - regression_loss: 1.2192 - classification_loss: 0.2267 389/500 [======================>.......] - ETA: 26s - loss: 1.4437 - regression_loss: 1.2174 - classification_loss: 0.2263 390/500 [======================>.......] - ETA: 26s - loss: 1.4434 - regression_loss: 1.2171 - classification_loss: 0.2263 391/500 [======================>.......] - ETA: 26s - loss: 1.4431 - regression_loss: 1.2169 - classification_loss: 0.2262 392/500 [======================>.......] - ETA: 26s - loss: 1.4440 - regression_loss: 1.2177 - classification_loss: 0.2264 393/500 [======================>.......] - ETA: 25s - loss: 1.4441 - regression_loss: 1.2176 - classification_loss: 0.2265 394/500 [======================>.......] - ETA: 25s - loss: 1.4453 - regression_loss: 1.2184 - classification_loss: 0.2269 395/500 [======================>.......] - ETA: 25s - loss: 1.4452 - regression_loss: 1.2183 - classification_loss: 0.2269 396/500 [======================>.......] - ETA: 25s - loss: 1.4449 - regression_loss: 1.2180 - classification_loss: 0.2269 397/500 [======================>.......] - ETA: 25s - loss: 1.4455 - regression_loss: 1.2186 - classification_loss: 0.2269 398/500 [======================>.......] 
- ETA: 24s - loss: 1.4457 - regression_loss: 1.2188 - classification_loss: 0.2268 399/500 [======================>.......] - ETA: 24s - loss: 1.4437 - regression_loss: 1.2173 - classification_loss: 0.2264 400/500 [=======================>......] - ETA: 24s - loss: 1.4441 - regression_loss: 1.2174 - classification_loss: 0.2267 401/500 [=======================>......] - ETA: 24s - loss: 1.4443 - regression_loss: 1.2177 - classification_loss: 0.2266 402/500 [=======================>......] - ETA: 23s - loss: 1.4442 - regression_loss: 1.2177 - classification_loss: 0.2265 403/500 [=======================>......] - ETA: 23s - loss: 1.4439 - regression_loss: 1.2175 - classification_loss: 0.2265 404/500 [=======================>......] - ETA: 23s - loss: 1.4435 - regression_loss: 1.2172 - classification_loss: 0.2263 405/500 [=======================>......] - ETA: 23s - loss: 1.4432 - regression_loss: 1.2171 - classification_loss: 0.2261 406/500 [=======================>......] - ETA: 22s - loss: 1.4432 - regression_loss: 1.2170 - classification_loss: 0.2261 407/500 [=======================>......] - ETA: 22s - loss: 1.4416 - regression_loss: 1.2158 - classification_loss: 0.2258 408/500 [=======================>......] - ETA: 22s - loss: 1.4416 - regression_loss: 1.2159 - classification_loss: 0.2257 409/500 [=======================>......] - ETA: 22s - loss: 1.4416 - regression_loss: 1.2162 - classification_loss: 0.2254 410/500 [=======================>......] - ETA: 21s - loss: 1.4434 - regression_loss: 1.2180 - classification_loss: 0.2254 411/500 [=======================>......] - ETA: 21s - loss: 1.4413 - regression_loss: 1.2162 - classification_loss: 0.2252 412/500 [=======================>......] - ETA: 21s - loss: 1.4392 - regression_loss: 1.2144 - classification_loss: 0.2248 413/500 [=======================>......] - ETA: 21s - loss: 1.4403 - regression_loss: 1.2152 - classification_loss: 0.2251 414/500 [=======================>......] 
- ETA: 20s - loss: 1.4386 - regression_loss: 1.2139 - classification_loss: 0.2247 415/500 [=======================>......] - ETA: 20s - loss: 1.4377 - regression_loss: 1.2132 - classification_loss: 0.2245 416/500 [=======================>......] - ETA: 20s - loss: 1.4368 - regression_loss: 1.2126 - classification_loss: 0.2243 417/500 [========================>.....] - ETA: 20s - loss: 1.4361 - regression_loss: 1.2121 - classification_loss: 0.2240 418/500 [========================>.....] - ETA: 19s - loss: 1.4364 - regression_loss: 1.2119 - classification_loss: 0.2244 419/500 [========================>.....] - ETA: 19s - loss: 1.4380 - regression_loss: 1.2132 - classification_loss: 0.2248 420/500 [========================>.....] - ETA: 19s - loss: 1.4380 - regression_loss: 1.2132 - classification_loss: 0.2248 421/500 [========================>.....] - ETA: 19s - loss: 1.4382 - regression_loss: 1.2134 - classification_loss: 0.2248 422/500 [========================>.....] - ETA: 18s - loss: 1.4377 - regression_loss: 1.2130 - classification_loss: 0.2246 423/500 [========================>.....] - ETA: 18s - loss: 1.4376 - regression_loss: 1.2128 - classification_loss: 0.2248 424/500 [========================>.....] - ETA: 18s - loss: 1.4371 - regression_loss: 1.2125 - classification_loss: 0.2246 425/500 [========================>.....] - ETA: 18s - loss: 1.4357 - regression_loss: 1.2114 - classification_loss: 0.2243 426/500 [========================>.....] - ETA: 17s - loss: 1.4340 - regression_loss: 1.2101 - classification_loss: 0.2239 427/500 [========================>.....] - ETA: 17s - loss: 1.4329 - regression_loss: 1.2090 - classification_loss: 0.2239 428/500 [========================>.....] - ETA: 17s - loss: 1.4312 - regression_loss: 1.2077 - classification_loss: 0.2236 429/500 [========================>.....] - ETA: 17s - loss: 1.4326 - regression_loss: 1.2087 - classification_loss: 0.2238 430/500 [========================>.....] 
- ETA: 16s - loss: 1.4319 - regression_loss: 1.2083 - classification_loss: 0.2236 431/500 [========================>.....] - ETA: 16s - loss: 1.4333 - regression_loss: 1.2095 - classification_loss: 0.2238 432/500 [========================>.....] - ETA: 16s - loss: 1.4334 - regression_loss: 1.2095 - classification_loss: 0.2239 433/500 [========================>.....] - ETA: 16s - loss: 1.4350 - regression_loss: 1.2108 - classification_loss: 0.2242 434/500 [=========================>....] - ETA: 16s - loss: 1.4340 - regression_loss: 1.2099 - classification_loss: 0.2240 435/500 [=========================>....] - ETA: 15s - loss: 1.4325 - regression_loss: 1.2087 - classification_loss: 0.2238 436/500 [=========================>....] - ETA: 15s - loss: 1.4321 - regression_loss: 1.2085 - classification_loss: 0.2237 437/500 [=========================>....] - ETA: 15s - loss: 1.4320 - regression_loss: 1.2083 - classification_loss: 0.2236 438/500 [=========================>....] - ETA: 15s - loss: 1.4325 - regression_loss: 1.2088 - classification_loss: 0.2236 439/500 [=========================>....] - ETA: 14s - loss: 1.4315 - regression_loss: 1.2080 - classification_loss: 0.2235 440/500 [=========================>....] - ETA: 14s - loss: 1.4300 - regression_loss: 1.2069 - classification_loss: 0.2232 441/500 [=========================>....] - ETA: 14s - loss: 1.4302 - regression_loss: 1.2072 - classification_loss: 0.2229 442/500 [=========================>....] - ETA: 14s - loss: 1.4319 - regression_loss: 1.2084 - classification_loss: 0.2235 443/500 [=========================>....] - ETA: 13s - loss: 1.4333 - regression_loss: 1.2100 - classification_loss: 0.2233 444/500 [=========================>....] - ETA: 13s - loss: 1.4332 - regression_loss: 1.2100 - classification_loss: 0.2232 445/500 [=========================>....] - ETA: 13s - loss: 1.4335 - regression_loss: 1.2102 - classification_loss: 0.2233 446/500 [=========================>....] 
- ETA: 13s - loss: 1.4342 - regression_loss: 1.2107 - classification_loss: 0.2235 447/500 [=========================>....] - ETA: 12s - loss: 1.4346 - regression_loss: 1.2112 - classification_loss: 0.2234 448/500 [=========================>....] - ETA: 12s - loss: 1.4348 - regression_loss: 1.2113 - classification_loss: 0.2235 449/500 [=========================>....] - ETA: 12s - loss: 1.4359 - regression_loss: 1.2121 - classification_loss: 0.2238 450/500 [==========================>...] - ETA: 12s - loss: 1.4365 - regression_loss: 1.2127 - classification_loss: 0.2238 451/500 [==========================>...] - ETA: 11s - loss: 1.4348 - regression_loss: 1.2114 - classification_loss: 0.2234 452/500 [==========================>...] - ETA: 11s - loss: 1.4355 - regression_loss: 1.2119 - classification_loss: 0.2236 453/500 [==========================>...] - ETA: 11s - loss: 1.4357 - regression_loss: 1.2121 - classification_loss: 0.2236 454/500 [==========================>...] - ETA: 11s - loss: 1.4362 - regression_loss: 1.2126 - classification_loss: 0.2236 455/500 [==========================>...] - ETA: 10s - loss: 1.4353 - regression_loss: 1.2119 - classification_loss: 0.2234 456/500 [==========================>...] - ETA: 10s - loss: 1.4350 - regression_loss: 1.2117 - classification_loss: 0.2233 457/500 [==========================>...] - ETA: 10s - loss: 1.4352 - regression_loss: 1.2119 - classification_loss: 0.2233 458/500 [==========================>...] - ETA: 10s - loss: 1.4363 - regression_loss: 1.2128 - classification_loss: 0.2234 459/500 [==========================>...] - ETA: 9s - loss: 1.4364 - regression_loss: 1.2130 - classification_loss: 0.2234  460/500 [==========================>...] - ETA: 9s - loss: 1.4374 - regression_loss: 1.2137 - classification_loss: 0.2237 461/500 [==========================>...] - ETA: 9s - loss: 1.4368 - regression_loss: 1.2133 - classification_loss: 0.2235 462/500 [==========================>...] 
- ETA: 9s - loss: 1.4361 - regression_loss: 1.2127 - classification_loss: 0.2233 463/500 [==========================>...] - ETA: 8s - loss: 1.4357 - regression_loss: 1.2125 - classification_loss: 0.2232 464/500 [==========================>...] - ETA: 8s - loss: 1.4354 - regression_loss: 1.2123 - classification_loss: 0.2230 465/500 [==========================>...] - ETA: 8s - loss: 1.4349 - regression_loss: 1.2120 - classification_loss: 0.2229 466/500 [==========================>...] - ETA: 8s - loss: 1.4342 - regression_loss: 1.2114 - classification_loss: 0.2228 467/500 [===========================>..] - ETA: 8s - loss: 1.4361 - regression_loss: 1.2131 - classification_loss: 0.2230 468/500 [===========================>..] - ETA: 7s - loss: 1.4369 - regression_loss: 1.2138 - classification_loss: 0.2232 469/500 [===========================>..] - ETA: 7s - loss: 1.4358 - regression_loss: 1.2127 - classification_loss: 0.2231 470/500 [===========================>..] - ETA: 7s - loss: 1.4368 - regression_loss: 1.2133 - classification_loss: 0.2235 471/500 [===========================>..] - ETA: 7s - loss: 1.4364 - regression_loss: 1.2130 - classification_loss: 0.2233 472/500 [===========================>..] - ETA: 6s - loss: 1.4379 - regression_loss: 1.2144 - classification_loss: 0.2235 473/500 [===========================>..] - ETA: 6s - loss: 1.4388 - regression_loss: 1.2151 - classification_loss: 0.2237 474/500 [===========================>..] - ETA: 6s - loss: 1.4396 - regression_loss: 1.2151 - classification_loss: 0.2244 475/500 [===========================>..] - ETA: 6s - loss: 1.4389 - regression_loss: 1.2147 - classification_loss: 0.2242 476/500 [===========================>..] - ETA: 5s - loss: 1.4391 - regression_loss: 1.2149 - classification_loss: 0.2242 477/500 [===========================>..] - ETA: 5s - loss: 1.4366 - regression_loss: 1.2128 - classification_loss: 0.2238 478/500 [===========================>..] 
- ETA: 5s - loss: 1.4358 - regression_loss: 1.2121 - classification_loss: 0.2237 479/500 [===========================>..] - ETA: 5s - loss: 1.4366 - regression_loss: 1.2127 - classification_loss: 0.2239 480/500 [===========================>..] - ETA: 4s - loss: 1.4369 - regression_loss: 1.2129 - classification_loss: 0.2240 481/500 [===========================>..] - ETA: 4s - loss: 1.4356 - regression_loss: 1.2119 - classification_loss: 0.2237 482/500 [===========================>..] - ETA: 4s - loss: 1.4336 - regression_loss: 1.2101 - classification_loss: 0.2235 483/500 [===========================>..] - ETA: 4s - loss: 1.4326 - regression_loss: 1.2093 - classification_loss: 0.2233 484/500 [============================>.] - ETA: 3s - loss: 1.4316 - regression_loss: 1.2085 - classification_loss: 0.2231 485/500 [============================>.] - ETA: 3s - loss: 1.4330 - regression_loss: 1.2096 - classification_loss: 0.2235 486/500 [============================>.] - ETA: 3s - loss: 1.4343 - regression_loss: 1.2107 - classification_loss: 0.2237 487/500 [============================>.] - ETA: 3s - loss: 1.4349 - regression_loss: 1.2110 - classification_loss: 0.2239 488/500 [============================>.] - ETA: 2s - loss: 1.4350 - regression_loss: 1.2111 - classification_loss: 0.2239 489/500 [============================>.] - ETA: 2s - loss: 1.4342 - regression_loss: 1.2105 - classification_loss: 0.2237 490/500 [============================>.] - ETA: 2s - loss: 1.4342 - regression_loss: 1.2104 - classification_loss: 0.2238 491/500 [============================>.] - ETA: 2s - loss: 1.4352 - regression_loss: 1.2111 - classification_loss: 0.2241 492/500 [============================>.] - ETA: 1s - loss: 1.4374 - regression_loss: 1.2130 - classification_loss: 0.2244 493/500 [============================>.] - ETA: 1s - loss: 1.4368 - regression_loss: 1.2127 - classification_loss: 0.2241 494/500 [============================>.] 
- ETA: 1s - loss: 1.4366 - regression_loss: 1.2126 - classification_loss: 0.2240 495/500 [============================>.] - ETA: 1s - loss: 1.4369 - regression_loss: 1.2131 - classification_loss: 0.2238 496/500 [============================>.] - ETA: 0s - loss: 1.4368 - regression_loss: 1.2131 - classification_loss: 0.2237 497/500 [============================>.] - ETA: 0s - loss: 1.4365 - regression_loss: 1.2127 - classification_loss: 0.2238 498/500 [============================>.] - ETA: 0s - loss: 1.4367 - regression_loss: 1.2129 - classification_loss: 0.2239 499/500 [============================>.] - ETA: 0s - loss: 1.4371 - regression_loss: 1.2133 - classification_loss: 0.2238 500/500 [==============================] - 122s 243ms/step - loss: 1.4379 - regression_loss: 1.2139 - classification_loss: 0.2239 326 instances of class plum with average precision: 0.7926 mAP: 0.7926 Epoch 00073: saving model to ./training/snapshots/resnet50_pascal_73.h5 Epoch 74/150 1/500 [..............................] - ETA: 1:58 - loss: 2.1800 - regression_loss: 1.7105 - classification_loss: 0.4695 2/500 [..............................] - ETA: 2:00 - loss: 1.9309 - regression_loss: 1.5424 - classification_loss: 0.3885 3/500 [..............................] - ETA: 2:01 - loss: 1.6242 - regression_loss: 1.3203 - classification_loss: 0.3039 4/500 [..............................] - ETA: 2:01 - loss: 1.4542 - regression_loss: 1.1997 - classification_loss: 0.2546 5/500 [..............................] - ETA: 2:00 - loss: 1.4153 - regression_loss: 1.1791 - classification_loss: 0.2362 6/500 [..............................] - ETA: 2:00 - loss: 1.5718 - regression_loss: 1.3403 - classification_loss: 0.2316 7/500 [..............................] - ETA: 1:59 - loss: 1.5049 - regression_loss: 1.2823 - classification_loss: 0.2226 8/500 [..............................] - ETA: 1:59 - loss: 1.4669 - regression_loss: 1.2544 - classification_loss: 0.2125 9/500 [..............................] 
- ETA: 1:59 - loss: 1.4442 - regression_loss: 1.2262 - classification_loss: 0.2180 10/500 [..............................] - ETA: 1:59 - loss: 1.4002 - regression_loss: 1.1942 - classification_loss: 0.2061 11/500 [..............................] - ETA: 1:59 - loss: 1.3261 - regression_loss: 1.0856 - classification_loss: 0.2405 12/500 [..............................] - ETA: 1:58 - loss: 1.3264 - regression_loss: 1.0908 - classification_loss: 0.2356 13/500 [..............................] - ETA: 1:57 - loss: 1.2725 - regression_loss: 1.0508 - classification_loss: 0.2217 14/500 [..............................] - ETA: 1:57 - loss: 1.2887 - regression_loss: 1.0663 - classification_loss: 0.2224 15/500 [..............................] - ETA: 1:57 - loss: 1.2977 - regression_loss: 1.0741 - classification_loss: 0.2235 16/500 [..............................] - ETA: 1:57 - loss: 1.3148 - regression_loss: 1.0855 - classification_loss: 0.2294 17/500 [>.............................] - ETA: 1:57 - loss: 1.3422 - regression_loss: 1.1105 - classification_loss: 0.2317 18/500 [>.............................] - ETA: 1:56 - loss: 1.3508 - regression_loss: 1.1217 - classification_loss: 0.2292 19/500 [>.............................] - ETA: 1:56 - loss: 1.3334 - regression_loss: 1.1094 - classification_loss: 0.2240 20/500 [>.............................] - ETA: 1:56 - loss: 1.3643 - regression_loss: 1.1369 - classification_loss: 0.2274 21/500 [>.............................] - ETA: 1:56 - loss: 1.3184 - regression_loss: 1.0997 - classification_loss: 0.2188 22/500 [>.............................] - ETA: 1:56 - loss: 1.3032 - regression_loss: 1.0869 - classification_loss: 0.2163 23/500 [>.............................] - ETA: 1:56 - loss: 1.3182 - regression_loss: 1.0979 - classification_loss: 0.2203 24/500 [>.............................] - ETA: 1:55 - loss: 1.3291 - regression_loss: 1.1099 - classification_loss: 0.2192 25/500 [>.............................] 
- ETA: 1:55 - loss: 1.3215 - regression_loss: 1.1075 - classification_loss: 0.2139
 26/500 [>.............................] - ETA: 1:55 - loss: 1.2781 - regression_loss: 1.0709 - classification_loss: 0.2072
[... per-step progress-bar updates for steps 27-360 elided (carriage-return output flattened in capture); representative checkpoints below ...]
 50/500 [==>...........................] - ETA: 1:49 - loss: 1.2793 - regression_loss: 1.0801 - classification_loss: 0.1992
100/500 [=====>........................] - ETA: 1:37 - loss: 1.3258 - regression_loss: 1.1225 - classification_loss: 0.2033
150/500 [========>.....................] - ETA: 1:25 - loss: 1.3288 - regression_loss: 1.1184 - classification_loss: 0.2104
200/500 [===========>..................] - ETA: 1:13 - loss: 1.3535 - regression_loss: 1.1406 - classification_loss: 0.2129
250/500 [==============>...............] - ETA: 1:01 - loss: 1.3718 - regression_loss: 1.1577 - classification_loss: 0.2140
300/500 [=================>............] - ETA: 48s - loss: 1.3886 - regression_loss: 1.1702 - classification_loss: 0.2184
350/500 [====================>.........] - ETA: 36s - loss: 1.3898 - regression_loss: 1.1737 - classification_loss: 0.2161
361/500 [====================>.........] 
- ETA: 33s - loss: 1.3826 - regression_loss: 1.1678 - classification_loss: 0.2148 362/500 [====================>.........] - ETA: 33s - loss: 1.3825 - regression_loss: 1.1680 - classification_loss: 0.2145 363/500 [====================>.........] - ETA: 33s - loss: 1.3819 - regression_loss: 1.1675 - classification_loss: 0.2143 364/500 [====================>.........] - ETA: 33s - loss: 1.3810 - regression_loss: 1.1670 - classification_loss: 0.2140 365/500 [====================>.........] - ETA: 32s - loss: 1.3814 - regression_loss: 1.1675 - classification_loss: 0.2140 366/500 [====================>.........] - ETA: 32s - loss: 1.3825 - regression_loss: 1.1684 - classification_loss: 0.2141 367/500 [=====================>........] - ETA: 32s - loss: 1.3836 - regression_loss: 1.1689 - classification_loss: 0.2147 368/500 [=====================>........] - ETA: 32s - loss: 1.3844 - regression_loss: 1.1697 - classification_loss: 0.2147 369/500 [=====================>........] - ETA: 31s - loss: 1.3849 - regression_loss: 1.1702 - classification_loss: 0.2147 370/500 [=====================>........] - ETA: 31s - loss: 1.3887 - regression_loss: 1.1721 - classification_loss: 0.2167 371/500 [=====================>........] - ETA: 31s - loss: 1.3898 - regression_loss: 1.1729 - classification_loss: 0.2168 372/500 [=====================>........] - ETA: 31s - loss: 1.3912 - regression_loss: 1.1741 - classification_loss: 0.2171 373/500 [=====================>........] - ETA: 30s - loss: 1.3911 - regression_loss: 1.1740 - classification_loss: 0.2171 374/500 [=====================>........] - ETA: 30s - loss: 1.3911 - regression_loss: 1.1739 - classification_loss: 0.2172 375/500 [=====================>........] - ETA: 30s - loss: 1.3903 - regression_loss: 1.1733 - classification_loss: 0.2170 376/500 [=====================>........] - ETA: 30s - loss: 1.3894 - regression_loss: 1.1727 - classification_loss: 0.2167 377/500 [=====================>........] 
- ETA: 29s - loss: 1.3887 - regression_loss: 1.1722 - classification_loss: 0.2166 378/500 [=====================>........] - ETA: 29s - loss: 1.3891 - regression_loss: 1.1725 - classification_loss: 0.2166 379/500 [=====================>........] - ETA: 29s - loss: 1.3904 - regression_loss: 1.1732 - classification_loss: 0.2172 380/500 [=====================>........] - ETA: 29s - loss: 1.3916 - regression_loss: 1.1739 - classification_loss: 0.2177 381/500 [=====================>........] - ETA: 28s - loss: 1.3908 - regression_loss: 1.1733 - classification_loss: 0.2175 382/500 [=====================>........] - ETA: 28s - loss: 1.3904 - regression_loss: 1.1728 - classification_loss: 0.2176 383/500 [=====================>........] - ETA: 28s - loss: 1.3893 - regression_loss: 1.1720 - classification_loss: 0.2174 384/500 [======================>.......] - ETA: 28s - loss: 1.3897 - regression_loss: 1.1722 - classification_loss: 0.2175 385/500 [======================>.......] - ETA: 27s - loss: 1.3903 - regression_loss: 1.1728 - classification_loss: 0.2175 386/500 [======================>.......] - ETA: 27s - loss: 1.3893 - regression_loss: 1.1721 - classification_loss: 0.2172 387/500 [======================>.......] - ETA: 27s - loss: 1.3881 - regression_loss: 1.1712 - classification_loss: 0.2169 388/500 [======================>.......] - ETA: 27s - loss: 1.3880 - regression_loss: 1.1711 - classification_loss: 0.2169 389/500 [======================>.......] - ETA: 26s - loss: 1.3873 - regression_loss: 1.1707 - classification_loss: 0.2166 390/500 [======================>.......] - ETA: 26s - loss: 1.3869 - regression_loss: 1.1704 - classification_loss: 0.2165 391/500 [======================>.......] - ETA: 26s - loss: 1.3877 - regression_loss: 1.1711 - classification_loss: 0.2165 392/500 [======================>.......] - ETA: 26s - loss: 1.3880 - regression_loss: 1.1713 - classification_loss: 0.2166 393/500 [======================>.......] 
- ETA: 26s - loss: 1.3877 - regression_loss: 1.1712 - classification_loss: 0.2165 394/500 [======================>.......] - ETA: 25s - loss: 1.3854 - regression_loss: 1.1693 - classification_loss: 0.2161 395/500 [======================>.......] - ETA: 25s - loss: 1.3856 - regression_loss: 1.1694 - classification_loss: 0.2162 396/500 [======================>.......] - ETA: 25s - loss: 1.3858 - regression_loss: 1.1697 - classification_loss: 0.2162 397/500 [======================>.......] - ETA: 25s - loss: 1.3851 - regression_loss: 1.1690 - classification_loss: 0.2161 398/500 [======================>.......] - ETA: 24s - loss: 1.3844 - regression_loss: 1.1686 - classification_loss: 0.2159 399/500 [======================>.......] - ETA: 24s - loss: 1.3843 - regression_loss: 1.1685 - classification_loss: 0.2158 400/500 [=======================>......] - ETA: 24s - loss: 1.3827 - regression_loss: 1.1673 - classification_loss: 0.2154 401/500 [=======================>......] - ETA: 24s - loss: 1.3827 - regression_loss: 1.1671 - classification_loss: 0.2156 402/500 [=======================>......] - ETA: 23s - loss: 1.3820 - regression_loss: 1.1667 - classification_loss: 0.2153 403/500 [=======================>......] - ETA: 23s - loss: 1.3806 - regression_loss: 1.1656 - classification_loss: 0.2150 404/500 [=======================>......] - ETA: 23s - loss: 1.3787 - regression_loss: 1.1641 - classification_loss: 0.2146 405/500 [=======================>......] - ETA: 23s - loss: 1.3788 - regression_loss: 1.1642 - classification_loss: 0.2146 406/500 [=======================>......] - ETA: 22s - loss: 1.3805 - regression_loss: 1.1656 - classification_loss: 0.2149 407/500 [=======================>......] - ETA: 22s - loss: 1.3809 - regression_loss: 1.1659 - classification_loss: 0.2150 408/500 [=======================>......] - ETA: 22s - loss: 1.3807 - regression_loss: 1.1658 - classification_loss: 0.2149 409/500 [=======================>......] 
- ETA: 22s - loss: 1.3813 - regression_loss: 1.1663 - classification_loss: 0.2150 410/500 [=======================>......] - ETA: 21s - loss: 1.3797 - regression_loss: 1.1647 - classification_loss: 0.2150 411/500 [=======================>......] - ETA: 21s - loss: 1.3799 - regression_loss: 1.1648 - classification_loss: 0.2150 412/500 [=======================>......] - ETA: 21s - loss: 1.3816 - regression_loss: 1.1662 - classification_loss: 0.2155 413/500 [=======================>......] - ETA: 21s - loss: 1.3821 - regression_loss: 1.1666 - classification_loss: 0.2155 414/500 [=======================>......] - ETA: 20s - loss: 1.3834 - regression_loss: 1.1675 - classification_loss: 0.2159 415/500 [=======================>......] - ETA: 20s - loss: 1.3834 - regression_loss: 1.1677 - classification_loss: 0.2157 416/500 [=======================>......] - ETA: 20s - loss: 1.3845 - regression_loss: 1.1685 - classification_loss: 0.2160 417/500 [========================>.....] - ETA: 20s - loss: 1.3841 - regression_loss: 1.1682 - classification_loss: 0.2159 418/500 [========================>.....] - ETA: 19s - loss: 1.3819 - regression_loss: 1.1664 - classification_loss: 0.2155 419/500 [========================>.....] - ETA: 19s - loss: 1.3809 - regression_loss: 1.1656 - classification_loss: 0.2153 420/500 [========================>.....] - ETA: 19s - loss: 1.3815 - regression_loss: 1.1662 - classification_loss: 0.2152 421/500 [========================>.....] - ETA: 19s - loss: 1.3812 - regression_loss: 1.1661 - classification_loss: 0.2151 422/500 [========================>.....] - ETA: 18s - loss: 1.3825 - regression_loss: 1.1673 - classification_loss: 0.2152 423/500 [========================>.....] - ETA: 18s - loss: 1.3812 - regression_loss: 1.1663 - classification_loss: 0.2149 424/500 [========================>.....] - ETA: 18s - loss: 1.3808 - regression_loss: 1.1659 - classification_loss: 0.2149 425/500 [========================>.....] 
- ETA: 18s - loss: 1.3807 - regression_loss: 1.1659 - classification_loss: 0.2147 426/500 [========================>.....] - ETA: 17s - loss: 1.3822 - regression_loss: 1.1670 - classification_loss: 0.2152 427/500 [========================>.....] - ETA: 17s - loss: 1.3819 - regression_loss: 1.1669 - classification_loss: 0.2151 428/500 [========================>.....] - ETA: 17s - loss: 1.3831 - regression_loss: 1.1679 - classification_loss: 0.2152 429/500 [========================>.....] - ETA: 17s - loss: 1.3837 - regression_loss: 1.1683 - classification_loss: 0.2153 430/500 [========================>.....] - ETA: 17s - loss: 1.3836 - regression_loss: 1.1682 - classification_loss: 0.2154 431/500 [========================>.....] - ETA: 16s - loss: 1.3826 - regression_loss: 1.1674 - classification_loss: 0.2152 432/500 [========================>.....] - ETA: 16s - loss: 1.3823 - regression_loss: 1.1673 - classification_loss: 0.2150 433/500 [========================>.....] - ETA: 16s - loss: 1.3824 - regression_loss: 1.1675 - classification_loss: 0.2148 434/500 [=========================>....] - ETA: 16s - loss: 1.3838 - regression_loss: 1.1688 - classification_loss: 0.2151 435/500 [=========================>....] - ETA: 15s - loss: 1.3832 - regression_loss: 1.1683 - classification_loss: 0.2150 436/500 [=========================>....] - ETA: 15s - loss: 1.3837 - regression_loss: 1.1687 - classification_loss: 0.2150 437/500 [=========================>....] - ETA: 15s - loss: 1.3852 - regression_loss: 1.1696 - classification_loss: 0.2156 438/500 [=========================>....] - ETA: 15s - loss: 1.3835 - regression_loss: 1.1682 - classification_loss: 0.2153 439/500 [=========================>....] - ETA: 14s - loss: 1.3812 - regression_loss: 1.1663 - classification_loss: 0.2149 440/500 [=========================>....] - ETA: 14s - loss: 1.3817 - regression_loss: 1.1669 - classification_loss: 0.2149 441/500 [=========================>....] 
- ETA: 14s - loss: 1.3827 - regression_loss: 1.1677 - classification_loss: 0.2150 442/500 [=========================>....] - ETA: 14s - loss: 1.3836 - regression_loss: 1.1685 - classification_loss: 0.2151 443/500 [=========================>....] - ETA: 13s - loss: 1.3835 - regression_loss: 1.1685 - classification_loss: 0.2150 444/500 [=========================>....] - ETA: 13s - loss: 1.3850 - regression_loss: 1.1698 - classification_loss: 0.2152 445/500 [=========================>....] - ETA: 13s - loss: 1.3860 - regression_loss: 1.1706 - classification_loss: 0.2154 446/500 [=========================>....] - ETA: 13s - loss: 1.3859 - regression_loss: 1.1705 - classification_loss: 0.2154 447/500 [=========================>....] - ETA: 12s - loss: 1.3869 - regression_loss: 1.1711 - classification_loss: 0.2158 448/500 [=========================>....] - ETA: 12s - loss: 1.3870 - regression_loss: 1.1710 - classification_loss: 0.2160 449/500 [=========================>....] - ETA: 12s - loss: 1.3871 - regression_loss: 1.1713 - classification_loss: 0.2159 450/500 [==========================>...] - ETA: 12s - loss: 1.3874 - regression_loss: 1.1716 - classification_loss: 0.2158 451/500 [==========================>...] - ETA: 11s - loss: 1.3860 - regression_loss: 1.1704 - classification_loss: 0.2156 452/500 [==========================>...] - ETA: 11s - loss: 1.3859 - regression_loss: 1.1705 - classification_loss: 0.2154 453/500 [==========================>...] - ETA: 11s - loss: 1.3860 - regression_loss: 1.1707 - classification_loss: 0.2153 454/500 [==========================>...] - ETA: 11s - loss: 1.3854 - regression_loss: 1.1703 - classification_loss: 0.2151 455/500 [==========================>...] - ETA: 10s - loss: 1.3857 - regression_loss: 1.1706 - classification_loss: 0.2151 456/500 [==========================>...] - ETA: 10s - loss: 1.3835 - regression_loss: 1.1687 - classification_loss: 0.2148 457/500 [==========================>...] 
- ETA: 10s - loss: 1.3839 - regression_loss: 1.1690 - classification_loss: 0.2149 458/500 [==========================>...] - ETA: 10s - loss: 1.3841 - regression_loss: 1.1691 - classification_loss: 0.2150 459/500 [==========================>...] - ETA: 9s - loss: 1.3852 - regression_loss: 1.1695 - classification_loss: 0.2157  460/500 [==========================>...] - ETA: 9s - loss: 1.3864 - regression_loss: 1.1706 - classification_loss: 0.2158 461/500 [==========================>...] - ETA: 9s - loss: 1.3868 - regression_loss: 1.1710 - classification_loss: 0.2158 462/500 [==========================>...] - ETA: 9s - loss: 1.3863 - regression_loss: 1.1707 - classification_loss: 0.2156 463/500 [==========================>...] - ETA: 9s - loss: 1.3861 - regression_loss: 1.1704 - classification_loss: 0.2157 464/500 [==========================>...] - ETA: 8s - loss: 1.3869 - regression_loss: 1.1711 - classification_loss: 0.2158 465/500 [==========================>...] - ETA: 8s - loss: 1.3871 - regression_loss: 1.1713 - classification_loss: 0.2158 466/500 [==========================>...] - ETA: 8s - loss: 1.3866 - regression_loss: 1.1710 - classification_loss: 0.2156 467/500 [===========================>..] - ETA: 8s - loss: 1.3876 - regression_loss: 1.1715 - classification_loss: 0.2161 468/500 [===========================>..] - ETA: 7s - loss: 1.3876 - regression_loss: 1.1715 - classification_loss: 0.2161 469/500 [===========================>..] - ETA: 7s - loss: 1.3882 - regression_loss: 1.1720 - classification_loss: 0.2162 470/500 [===========================>..] - ETA: 7s - loss: 1.3862 - regression_loss: 1.1704 - classification_loss: 0.2158 471/500 [===========================>..] - ETA: 7s - loss: 1.3848 - regression_loss: 1.1692 - classification_loss: 0.2156 472/500 [===========================>..] - ETA: 6s - loss: 1.3839 - regression_loss: 1.1685 - classification_loss: 0.2154 473/500 [===========================>..] 
- ETA: 6s - loss: 1.3844 - regression_loss: 1.1690 - classification_loss: 0.2154 474/500 [===========================>..] - ETA: 6s - loss: 1.3831 - regression_loss: 1.1679 - classification_loss: 0.2151 475/500 [===========================>..] - ETA: 6s - loss: 1.3834 - regression_loss: 1.1684 - classification_loss: 0.2150 476/500 [===========================>..] - ETA: 5s - loss: 1.3843 - regression_loss: 1.1691 - classification_loss: 0.2152 477/500 [===========================>..] - ETA: 5s - loss: 1.3846 - regression_loss: 1.1695 - classification_loss: 0.2152 478/500 [===========================>..] - ETA: 5s - loss: 1.3853 - regression_loss: 1.1700 - classification_loss: 0.2152 479/500 [===========================>..] - ETA: 5s - loss: 1.3862 - regression_loss: 1.1708 - classification_loss: 0.2153 480/500 [===========================>..] - ETA: 4s - loss: 1.3860 - regression_loss: 1.1708 - classification_loss: 0.2153 481/500 [===========================>..] - ETA: 4s - loss: 1.3845 - regression_loss: 1.1694 - classification_loss: 0.2150 482/500 [===========================>..] - ETA: 4s - loss: 1.3841 - regression_loss: 1.1692 - classification_loss: 0.2149 483/500 [===========================>..] - ETA: 4s - loss: 1.3835 - regression_loss: 1.1688 - classification_loss: 0.2147 484/500 [============================>.] - ETA: 3s - loss: 1.3841 - regression_loss: 1.1695 - classification_loss: 0.2146 485/500 [============================>.] - ETA: 3s - loss: 1.3852 - regression_loss: 1.1703 - classification_loss: 0.2149 486/500 [============================>.] - ETA: 3s - loss: 1.3838 - regression_loss: 1.1692 - classification_loss: 0.2145 487/500 [============================>.] - ETA: 3s - loss: 1.3843 - regression_loss: 1.1698 - classification_loss: 0.2145 488/500 [============================>.] - ETA: 2s - loss: 1.3836 - regression_loss: 1.1690 - classification_loss: 0.2145 489/500 [============================>.] 
- ETA: 2s - loss: 1.3850 - regression_loss: 1.1703 - classification_loss: 0.2148 490/500 [============================>.] - ETA: 2s - loss: 1.3863 - regression_loss: 1.1714 - classification_loss: 0.2149 491/500 [============================>.] - ETA: 2s - loss: 1.3885 - regression_loss: 1.1733 - classification_loss: 0.2152 492/500 [============================>.] - ETA: 1s - loss: 1.3889 - regression_loss: 1.1736 - classification_loss: 0.2153 493/500 [============================>.] - ETA: 1s - loss: 1.3892 - regression_loss: 1.1738 - classification_loss: 0.2154 494/500 [============================>.] - ETA: 1s - loss: 1.3891 - regression_loss: 1.1738 - classification_loss: 0.2153 495/500 [============================>.] - ETA: 1s - loss: 1.3884 - regression_loss: 1.1733 - classification_loss: 0.2151 496/500 [============================>.] - ETA: 0s - loss: 1.3907 - regression_loss: 1.1748 - classification_loss: 0.2159 497/500 [============================>.] - ETA: 0s - loss: 1.3948 - regression_loss: 1.1782 - classification_loss: 0.2167 498/500 [============================>.] - ETA: 0s - loss: 1.3957 - regression_loss: 1.1789 - classification_loss: 0.2168 499/500 [============================>.] - ETA: 0s - loss: 1.3955 - regression_loss: 1.1788 - classification_loss: 0.2166 500/500 [==============================] - 122s 244ms/step - loss: 1.3957 - regression_loss: 1.1790 - classification_loss: 0.2167 326 instances of class plum with average precision: 0.8010 mAP: 0.8010 Epoch 00074: saving model to ./training/snapshots/resnet50_pascal_74.h5 Epoch 75/150 1/500 [..............................] - ETA: 1:58 - loss: 0.7665 - regression_loss: 0.6503 - classification_loss: 0.1162 2/500 [..............................] - ETA: 2:01 - loss: 0.8242 - regression_loss: 0.7205 - classification_loss: 0.1038 3/500 [..............................] - ETA: 2:02 - loss: 1.0338 - regression_loss: 0.8778 - classification_loss: 0.1561 4/500 [..............................] 
- ETA: 2:02 - loss: 1.1830 - regression_loss: 1.0101 - classification_loss: 0.1730 5/500 [..............................] - ETA: 2:02 - loss: 1.2744 - regression_loss: 1.0696 - classification_loss: 0.2047 6/500 [..............................] - ETA: 2:02 - loss: 1.3006 - regression_loss: 1.0965 - classification_loss: 0.2041 7/500 [..............................] - ETA: 2:02 - loss: 1.2739 - regression_loss: 1.0841 - classification_loss: 0.1898 8/500 [..............................] - ETA: 2:02 - loss: 1.1799 - regression_loss: 0.9972 - classification_loss: 0.1828 9/500 [..............................] - ETA: 2:01 - loss: 1.2138 - regression_loss: 1.0159 - classification_loss: 0.1979 10/500 [..............................] - ETA: 2:01 - loss: 1.2299 - regression_loss: 1.0336 - classification_loss: 0.1962 11/500 [..............................] - ETA: 2:01 - loss: 1.2675 - regression_loss: 1.0797 - classification_loss: 0.1878 12/500 [..............................] - ETA: 2:01 - loss: 1.2508 - regression_loss: 1.0666 - classification_loss: 0.1841 13/500 [..............................] - ETA: 2:26 - loss: 1.1891 - regression_loss: 1.0118 - classification_loss: 0.1773 14/500 [..............................] - ETA: 2:24 - loss: 1.2472 - regression_loss: 1.0580 - classification_loss: 0.1893 15/500 [..............................] - ETA: 2:22 - loss: 1.2710 - regression_loss: 1.0805 - classification_loss: 0.1905 16/500 [..............................] - ETA: 2:20 - loss: 1.2664 - regression_loss: 1.0791 - classification_loss: 0.1873 17/500 [>.............................] - ETA: 2:19 - loss: 1.2897 - regression_loss: 1.1001 - classification_loss: 0.1896 18/500 [>.............................] - ETA: 2:18 - loss: 1.3172 - regression_loss: 1.1244 - classification_loss: 0.1928 19/500 [>.............................] - ETA: 2:16 - loss: 1.3191 - regression_loss: 1.1301 - classification_loss: 0.1890 20/500 [>.............................] 
- ETA: 2:15 - loss: 1.3646 - regression_loss: 1.1677 - classification_loss: 0.1969 21/500 [>.............................] - ETA: 2:14 - loss: 1.3569 - regression_loss: 1.1644 - classification_loss: 0.1925 22/500 [>.............................] - ETA: 2:13 - loss: 1.3393 - regression_loss: 1.1510 - classification_loss: 0.1883 23/500 [>.............................] - ETA: 2:12 - loss: 1.3568 - regression_loss: 1.1659 - classification_loss: 0.1909 24/500 [>.............................] - ETA: 2:11 - loss: 1.3726 - regression_loss: 1.1786 - classification_loss: 0.1940 25/500 [>.............................] - ETA: 2:10 - loss: 1.3841 - regression_loss: 1.1868 - classification_loss: 0.1972 26/500 [>.............................] - ETA: 2:10 - loss: 1.3890 - regression_loss: 1.1881 - classification_loss: 0.2009 27/500 [>.............................] - ETA: 2:09 - loss: 1.4123 - regression_loss: 1.2118 - classification_loss: 0.2005 28/500 [>.............................] - ETA: 2:07 - loss: 1.4234 - regression_loss: 1.2206 - classification_loss: 0.2028 29/500 [>.............................] - ETA: 2:07 - loss: 1.4117 - regression_loss: 1.2121 - classification_loss: 0.1996 30/500 [>.............................] - ETA: 2:06 - loss: 1.4250 - regression_loss: 1.2242 - classification_loss: 0.2007 31/500 [>.............................] - ETA: 2:05 - loss: 1.4164 - regression_loss: 1.2171 - classification_loss: 0.1992 32/500 [>.............................] - ETA: 2:05 - loss: 1.4047 - regression_loss: 1.2078 - classification_loss: 0.1969 33/500 [>.............................] - ETA: 2:04 - loss: 1.4212 - regression_loss: 1.2187 - classification_loss: 0.2025 34/500 [=>............................] - ETA: 2:04 - loss: 1.3981 - regression_loss: 1.1997 - classification_loss: 0.1983 35/500 [=>............................] - ETA: 2:03 - loss: 1.4300 - regression_loss: 1.2231 - classification_loss: 0.2068 36/500 [=>............................] 
- ETA: 2:03 - loss: 1.4421 - regression_loss: 1.2300 - classification_loss: 0.2120 37/500 [=>............................] - ETA: 2:02 - loss: 1.4337 - regression_loss: 1.2208 - classification_loss: 0.2130 38/500 [=>............................] - ETA: 2:02 - loss: 1.4151 - regression_loss: 1.2051 - classification_loss: 0.2100 39/500 [=>............................] - ETA: 2:01 - loss: 1.4307 - regression_loss: 1.2160 - classification_loss: 0.2147 40/500 [=>............................] - ETA: 2:01 - loss: 1.4376 - regression_loss: 1.2221 - classification_loss: 0.2155 41/500 [=>............................] - ETA: 2:00 - loss: 1.4139 - regression_loss: 1.2017 - classification_loss: 0.2122 42/500 [=>............................] - ETA: 2:00 - loss: 1.4103 - regression_loss: 1.1996 - classification_loss: 0.2107 43/500 [=>............................] - ETA: 1:59 - loss: 1.4179 - regression_loss: 1.2050 - classification_loss: 0.2129 44/500 [=>............................] - ETA: 1:59 - loss: 1.4273 - regression_loss: 1.2111 - classification_loss: 0.2162 45/500 [=>............................] - ETA: 1:59 - loss: 1.4202 - regression_loss: 1.2049 - classification_loss: 0.2154 46/500 [=>............................] - ETA: 1:58 - loss: 1.4280 - regression_loss: 1.2108 - classification_loss: 0.2171 47/500 [=>............................] - ETA: 1:58 - loss: 1.4243 - regression_loss: 1.2079 - classification_loss: 0.2164 48/500 [=>............................] - ETA: 1:57 - loss: 1.4207 - regression_loss: 1.2060 - classification_loss: 0.2147 49/500 [=>............................] - ETA: 1:57 - loss: 1.4107 - regression_loss: 1.1986 - classification_loss: 0.2122 50/500 [==>...........................] - ETA: 1:56 - loss: 1.4226 - regression_loss: 1.2092 - classification_loss: 0.2135 51/500 [==>...........................] - ETA: 1:56 - loss: 1.4383 - regression_loss: 1.2226 - classification_loss: 0.2157 52/500 [==>...........................] 
- ETA: 1:56 - loss: 1.4415 - regression_loss: 1.2245 - classification_loss: 0.2170 53/500 [==>...........................] - ETA: 1:55 - loss: 1.4265 - regression_loss: 1.2115 - classification_loss: 0.2150 54/500 [==>...........................] - ETA: 1:55 - loss: 1.4239 - regression_loss: 1.2098 - classification_loss: 0.2140 55/500 [==>...........................] - ETA: 1:55 - loss: 1.4410 - regression_loss: 1.2215 - classification_loss: 0.2195 56/500 [==>...........................] - ETA: 1:54 - loss: 1.4343 - regression_loss: 1.2168 - classification_loss: 0.2174 57/500 [==>...........................] - ETA: 1:54 - loss: 1.4330 - regression_loss: 1.2155 - classification_loss: 0.2175 58/500 [==>...........................] - ETA: 1:54 - loss: 1.4430 - regression_loss: 1.2221 - classification_loss: 0.2209 59/500 [==>...........................] - ETA: 1:53 - loss: 1.4482 - regression_loss: 1.2273 - classification_loss: 0.2208 60/500 [==>...........................] - ETA: 1:53 - loss: 1.4559 - regression_loss: 1.2329 - classification_loss: 0.2230 61/500 [==>...........................] - ETA: 1:53 - loss: 1.4682 - regression_loss: 1.2421 - classification_loss: 0.2261 62/500 [==>...........................] - ETA: 1:52 - loss: 1.4622 - regression_loss: 1.2382 - classification_loss: 0.2240 63/500 [==>...........................] - ETA: 1:52 - loss: 1.4577 - regression_loss: 1.2347 - classification_loss: 0.2230 64/500 [==>...........................] - ETA: 1:52 - loss: 1.4629 - regression_loss: 1.2391 - classification_loss: 0.2238 65/500 [==>...........................] - ETA: 1:51 - loss: 1.4756 - regression_loss: 1.2486 - classification_loss: 0.2271 66/500 [==>...........................] - ETA: 1:51 - loss: 1.4658 - regression_loss: 1.2411 - classification_loss: 0.2246 67/500 [===>..........................] - ETA: 1:51 - loss: 1.4674 - regression_loss: 1.2424 - classification_loss: 0.2250 68/500 [===>..........................] 
 68/500 [===>..........................] - ETA: 1:50 - loss: 1.4647 - regression_loss: 1.2400 - classification_loss: 0.2247
[... per-step progress updates for steps 69-402 condensed; loss fluctuated between ~1.38 and ~1.47, trending down ...]
403/500 [=======================>......] - ETA: 23s - loss: 1.3999 - regression_loss: 1.1785 - classification_loss: 0.2213
- ETA: 23s - loss: 1.3990 - regression_loss: 1.1779 - classification_loss: 0.2211 405/500 [=======================>......] - ETA: 23s - loss: 1.3968 - regression_loss: 1.1761 - classification_loss: 0.2207 406/500 [=======================>......] - ETA: 23s - loss: 1.3949 - regression_loss: 1.1747 - classification_loss: 0.2203 407/500 [=======================>......] - ETA: 22s - loss: 1.3964 - regression_loss: 1.1758 - classification_loss: 0.2206 408/500 [=======================>......] - ETA: 22s - loss: 1.3974 - regression_loss: 1.1766 - classification_loss: 0.2208 409/500 [=======================>......] - ETA: 22s - loss: 1.3962 - regression_loss: 1.1754 - classification_loss: 0.2207 410/500 [=======================>......] - ETA: 22s - loss: 1.3961 - regression_loss: 1.1755 - classification_loss: 0.2206 411/500 [=======================>......] - ETA: 21s - loss: 1.3941 - regression_loss: 1.1737 - classification_loss: 0.2204 412/500 [=======================>......] - ETA: 21s - loss: 1.3932 - regression_loss: 1.1731 - classification_loss: 0.2201 413/500 [=======================>......] - ETA: 21s - loss: 1.3905 - regression_loss: 1.1708 - classification_loss: 0.2197 414/500 [=======================>......] - ETA: 21s - loss: 1.3902 - regression_loss: 1.1702 - classification_loss: 0.2199 415/500 [=======================>......] - ETA: 20s - loss: 1.3901 - regression_loss: 1.1703 - classification_loss: 0.2198 416/500 [=======================>......] - ETA: 20s - loss: 1.3904 - regression_loss: 1.1706 - classification_loss: 0.2198 417/500 [========================>.....] - ETA: 20s - loss: 1.3919 - regression_loss: 1.1718 - classification_loss: 0.2202 418/500 [========================>.....] - ETA: 20s - loss: 1.3932 - regression_loss: 1.1725 - classification_loss: 0.2207 419/500 [========================>.....] - ETA: 19s - loss: 1.3935 - regression_loss: 1.1728 - classification_loss: 0.2207 420/500 [========================>.....] 
- ETA: 19s - loss: 1.3932 - regression_loss: 1.1726 - classification_loss: 0.2206 421/500 [========================>.....] - ETA: 19s - loss: 1.3928 - regression_loss: 1.1724 - classification_loss: 0.2204 422/500 [========================>.....] - ETA: 19s - loss: 1.3941 - regression_loss: 1.1736 - classification_loss: 0.2205 423/500 [========================>.....] - ETA: 19s - loss: 1.3936 - regression_loss: 1.1733 - classification_loss: 0.2203 424/500 [========================>.....] - ETA: 18s - loss: 1.3935 - regression_loss: 1.1731 - classification_loss: 0.2204 425/500 [========================>.....] - ETA: 18s - loss: 1.3934 - regression_loss: 1.1732 - classification_loss: 0.2203 426/500 [========================>.....] - ETA: 18s - loss: 1.3922 - regression_loss: 1.1721 - classification_loss: 0.2200 427/500 [========================>.....] - ETA: 18s - loss: 1.3936 - regression_loss: 1.1733 - classification_loss: 0.2203 428/500 [========================>.....] - ETA: 17s - loss: 1.3920 - regression_loss: 1.1718 - classification_loss: 0.2202 429/500 [========================>.....] - ETA: 17s - loss: 1.3912 - regression_loss: 1.1711 - classification_loss: 0.2201 430/500 [========================>.....] - ETA: 17s - loss: 1.3896 - regression_loss: 1.1697 - classification_loss: 0.2199 431/500 [========================>.....] - ETA: 17s - loss: 1.3900 - regression_loss: 1.1700 - classification_loss: 0.2200 432/500 [========================>.....] - ETA: 16s - loss: 1.3896 - regression_loss: 1.1698 - classification_loss: 0.2198 433/500 [========================>.....] - ETA: 16s - loss: 1.3894 - regression_loss: 1.1697 - classification_loss: 0.2197 434/500 [=========================>....] - ETA: 16s - loss: 1.3892 - regression_loss: 1.1697 - classification_loss: 0.2195 435/500 [=========================>....] - ETA: 16s - loss: 1.3890 - regression_loss: 1.1695 - classification_loss: 0.2195 436/500 [=========================>....] 
- ETA: 15s - loss: 1.3900 - regression_loss: 1.1701 - classification_loss: 0.2199 437/500 [=========================>....] - ETA: 15s - loss: 1.3889 - regression_loss: 1.1692 - classification_loss: 0.2197 438/500 [=========================>....] - ETA: 15s - loss: 1.3884 - regression_loss: 1.1690 - classification_loss: 0.2194 439/500 [=========================>....] - ETA: 15s - loss: 1.3896 - regression_loss: 1.1701 - classification_loss: 0.2196 440/500 [=========================>....] - ETA: 14s - loss: 1.3883 - regression_loss: 1.1689 - classification_loss: 0.2193 441/500 [=========================>....] - ETA: 14s - loss: 1.3894 - regression_loss: 1.1700 - classification_loss: 0.2193 442/500 [=========================>....] - ETA: 14s - loss: 1.3901 - regression_loss: 1.1706 - classification_loss: 0.2195 443/500 [=========================>....] - ETA: 14s - loss: 1.3894 - regression_loss: 1.1700 - classification_loss: 0.2194 444/500 [=========================>....] - ETA: 13s - loss: 1.3895 - regression_loss: 1.1702 - classification_loss: 0.2194 445/500 [=========================>....] - ETA: 13s - loss: 1.3882 - regression_loss: 1.1691 - classification_loss: 0.2191 446/500 [=========================>....] - ETA: 13s - loss: 1.3871 - regression_loss: 1.1682 - classification_loss: 0.2189 447/500 [=========================>....] - ETA: 13s - loss: 1.3857 - regression_loss: 1.1671 - classification_loss: 0.2187 448/500 [=========================>....] - ETA: 12s - loss: 1.3866 - regression_loss: 1.1678 - classification_loss: 0.2188 449/500 [=========================>....] - ETA: 12s - loss: 1.3866 - regression_loss: 1.1679 - classification_loss: 0.2186 450/500 [==========================>...] - ETA: 12s - loss: 1.3866 - regression_loss: 1.1680 - classification_loss: 0.2186 451/500 [==========================>...] - ETA: 12s - loss: 1.3876 - regression_loss: 1.1687 - classification_loss: 0.2188 452/500 [==========================>...] 
- ETA: 11s - loss: 1.3875 - regression_loss: 1.1688 - classification_loss: 0.2188 453/500 [==========================>...] - ETA: 11s - loss: 1.3876 - regression_loss: 1.1687 - classification_loss: 0.2188 454/500 [==========================>...] - ETA: 11s - loss: 1.3894 - regression_loss: 1.1699 - classification_loss: 0.2195 455/500 [==========================>...] - ETA: 11s - loss: 1.3885 - regression_loss: 1.1692 - classification_loss: 0.2193 456/500 [==========================>...] - ETA: 10s - loss: 1.3878 - regression_loss: 1.1688 - classification_loss: 0.2190 457/500 [==========================>...] - ETA: 10s - loss: 1.3887 - regression_loss: 1.1692 - classification_loss: 0.2195 458/500 [==========================>...] - ETA: 10s - loss: 1.3882 - regression_loss: 1.1688 - classification_loss: 0.2194 459/500 [==========================>...] - ETA: 10s - loss: 1.3873 - regression_loss: 1.1680 - classification_loss: 0.2192 460/500 [==========================>...] - ETA: 9s - loss: 1.3875 - regression_loss: 1.1682 - classification_loss: 0.2194  461/500 [==========================>...] - ETA: 9s - loss: 1.3881 - regression_loss: 1.1688 - classification_loss: 0.2193 462/500 [==========================>...] - ETA: 9s - loss: 1.3874 - regression_loss: 1.1682 - classification_loss: 0.2192 463/500 [==========================>...] - ETA: 9s - loss: 1.3874 - regression_loss: 1.1681 - classification_loss: 0.2193 464/500 [==========================>...] - ETA: 8s - loss: 1.3883 - regression_loss: 1.1690 - classification_loss: 0.2193 465/500 [==========================>...] - ETA: 8s - loss: 1.3892 - regression_loss: 1.1698 - classification_loss: 0.2194 466/500 [==========================>...] - ETA: 8s - loss: 1.3909 - regression_loss: 1.1708 - classification_loss: 0.2201 467/500 [===========================>..] - ETA: 8s - loss: 1.3900 - regression_loss: 1.1701 - classification_loss: 0.2199 468/500 [===========================>..] 
- ETA: 7s - loss: 1.3899 - regression_loss: 1.1702 - classification_loss: 0.2198 469/500 [===========================>..] - ETA: 7s - loss: 1.3917 - regression_loss: 1.1715 - classification_loss: 0.2201 470/500 [===========================>..] - ETA: 7s - loss: 1.3922 - regression_loss: 1.1721 - classification_loss: 0.2202 471/500 [===========================>..] - ETA: 7s - loss: 1.3929 - regression_loss: 1.1725 - classification_loss: 0.2204 472/500 [===========================>..] - ETA: 6s - loss: 1.3945 - regression_loss: 1.1737 - classification_loss: 0.2208 473/500 [===========================>..] - ETA: 6s - loss: 1.3936 - regression_loss: 1.1730 - classification_loss: 0.2207 474/500 [===========================>..] - ETA: 6s - loss: 1.3933 - regression_loss: 1.1727 - classification_loss: 0.2206 475/500 [===========================>..] - ETA: 6s - loss: 1.3949 - regression_loss: 1.1742 - classification_loss: 0.2208 476/500 [===========================>..] - ETA: 5s - loss: 1.3949 - regression_loss: 1.1743 - classification_loss: 0.2206 477/500 [===========================>..] - ETA: 5s - loss: 1.3951 - regression_loss: 1.1745 - classification_loss: 0.2206 478/500 [===========================>..] - ETA: 5s - loss: 1.3971 - regression_loss: 1.1764 - classification_loss: 0.2207 479/500 [===========================>..] - ETA: 5s - loss: 1.3970 - regression_loss: 1.1763 - classification_loss: 0.2207 480/500 [===========================>..] - ETA: 4s - loss: 1.3968 - regression_loss: 1.1762 - classification_loss: 0.2206 481/500 [===========================>..] - ETA: 4s - loss: 1.3966 - regression_loss: 1.1762 - classification_loss: 0.2204 482/500 [===========================>..] - ETA: 4s - loss: 1.3945 - regression_loss: 1.1744 - classification_loss: 0.2201 483/500 [===========================>..] - ETA: 4s - loss: 1.3937 - regression_loss: 1.1737 - classification_loss: 0.2200 484/500 [============================>.] 
- ETA: 3s - loss: 1.3951 - regression_loss: 1.1748 - classification_loss: 0.2203 485/500 [============================>.] - ETA: 3s - loss: 1.3952 - regression_loss: 1.1747 - classification_loss: 0.2205 486/500 [============================>.] - ETA: 3s - loss: 1.3957 - regression_loss: 1.1752 - classification_loss: 0.2205 487/500 [============================>.] - ETA: 3s - loss: 1.3976 - regression_loss: 1.1768 - classification_loss: 0.2208 488/500 [============================>.] - ETA: 2s - loss: 1.3960 - regression_loss: 1.1755 - classification_loss: 0.2205 489/500 [============================>.] - ETA: 2s - loss: 1.3966 - regression_loss: 1.1759 - classification_loss: 0.2207 490/500 [============================>.] - ETA: 2s - loss: 1.3964 - regression_loss: 1.1757 - classification_loss: 0.2207 491/500 [============================>.] - ETA: 2s - loss: 1.3958 - regression_loss: 1.1752 - classification_loss: 0.2205 492/500 [============================>.] - ETA: 1s - loss: 1.3960 - regression_loss: 1.1755 - classification_loss: 0.2205 493/500 [============================>.] - ETA: 1s - loss: 1.3943 - regression_loss: 1.1741 - classification_loss: 0.2202 494/500 [============================>.] - ETA: 1s - loss: 1.3943 - regression_loss: 1.1740 - classification_loss: 0.2203 495/500 [============================>.] - ETA: 1s - loss: 1.3930 - regression_loss: 1.1730 - classification_loss: 0.2200 496/500 [============================>.] - ETA: 0s - loss: 1.3928 - regression_loss: 1.1730 - classification_loss: 0.2199 497/500 [============================>.] - ETA: 0s - loss: 1.3947 - regression_loss: 1.1748 - classification_loss: 0.2200 498/500 [============================>.] - ETA: 0s - loss: 1.3942 - regression_loss: 1.1743 - classification_loss: 0.2198 499/500 [============================>.] 
500/500 [==============================] - 123s 246ms/step - loss: 1.3936 - regression_loss: 1.1738 - classification_loss: 0.2198
326 instances of class plum with average precision: 0.8048
mAP: 0.8048
Epoch 00075: saving model to ./training/snapshots/resnet50_pascal_75.h5
Epoch 76/150
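The epoch summary above reports the total loss alongside its two components; in RetinaNet the training loss is the sum of the box-regression term and the classification (focal) term, which the log rounds to four decimals. A minimal parsing sketch (not part of the original training tooling; the line below is copied from this log) that extracts the metrics and checks that decomposition:

```python
import re

# Epoch-end line as printed by the Keras progress bar (copied from this log).
line = ("500/500 [==============================] - 123s 246ms/step"
        " - loss: 1.3936 - regression_loss: 1.1738 - classification_loss: 0.2198")

def parse_losses(text):
    """Return {metric_name: value} for every '<name>: <float>' loss in the line."""
    return {name: float(val)
            for name, val in re.findall(r"(\w+_loss|loss): (\d+\.\d+)", text)}

losses = parse_losses(line)
# Total loss should equal regression_loss + classification_loss up to rounding.
assert abs(losses["loss"]
           - (losses["regression_loss"] + losses["classification_loss"])) < 1e-3
```

The same pattern works on any of the per-batch frames, since they use the identical `name: value` format.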
- ETA: 1:22 - loss: 1.4166 - regression_loss: 1.2000 - classification_loss: 0.2167 159/500 [========>.....................] - ETA: 1:21 - loss: 1.4135 - regression_loss: 1.1974 - classification_loss: 0.2161 160/500 [========>.....................] - ETA: 1:21 - loss: 1.4161 - regression_loss: 1.1994 - classification_loss: 0.2167 161/500 [========>.....................] - ETA: 1:21 - loss: 1.4155 - regression_loss: 1.1990 - classification_loss: 0.2165 162/500 [========>.....................] - ETA: 1:21 - loss: 1.4134 - regression_loss: 1.1974 - classification_loss: 0.2160 163/500 [========>.....................] - ETA: 1:21 - loss: 1.4122 - regression_loss: 1.1965 - classification_loss: 0.2157 164/500 [========>.....................] - ETA: 1:20 - loss: 1.4110 - regression_loss: 1.1955 - classification_loss: 0.2155 165/500 [========>.....................] - ETA: 1:20 - loss: 1.4105 - regression_loss: 1.1953 - classification_loss: 0.2152 166/500 [========>.....................] - ETA: 1:20 - loss: 1.4106 - regression_loss: 1.1955 - classification_loss: 0.2150 167/500 [=========>....................] - ETA: 1:19 - loss: 1.4117 - regression_loss: 1.1963 - classification_loss: 0.2154 168/500 [=========>....................] - ETA: 1:19 - loss: 1.4137 - regression_loss: 1.1975 - classification_loss: 0.2162 169/500 [=========>....................] - ETA: 1:19 - loss: 1.4189 - regression_loss: 1.2012 - classification_loss: 0.2177 170/500 [=========>....................] - ETA: 1:19 - loss: 1.4141 - regression_loss: 1.1970 - classification_loss: 0.2171 171/500 [=========>....................] - ETA: 1:18 - loss: 1.4087 - regression_loss: 1.1925 - classification_loss: 0.2162 172/500 [=========>....................] - ETA: 1:18 - loss: 1.4112 - regression_loss: 1.1946 - classification_loss: 0.2166 173/500 [=========>....................] - ETA: 1:18 - loss: 1.4129 - regression_loss: 1.1963 - classification_loss: 0.2166 174/500 [=========>....................] 
- ETA: 1:18 - loss: 1.4092 - regression_loss: 1.1931 - classification_loss: 0.2161 175/500 [=========>....................] - ETA: 1:17 - loss: 1.4177 - regression_loss: 1.2002 - classification_loss: 0.2175 176/500 [=========>....................] - ETA: 1:17 - loss: 1.4214 - regression_loss: 1.2026 - classification_loss: 0.2188 177/500 [=========>....................] - ETA: 1:17 - loss: 1.4229 - regression_loss: 1.2042 - classification_loss: 0.2187 178/500 [=========>....................] - ETA: 1:17 - loss: 1.4252 - regression_loss: 1.2064 - classification_loss: 0.2188 179/500 [=========>....................] - ETA: 1:16 - loss: 1.4288 - regression_loss: 1.2093 - classification_loss: 0.2196 180/500 [=========>....................] - ETA: 1:16 - loss: 1.4255 - regression_loss: 1.2067 - classification_loss: 0.2188 181/500 [=========>....................] - ETA: 1:16 - loss: 1.4214 - regression_loss: 1.2032 - classification_loss: 0.2182 182/500 [=========>....................] - ETA: 1:16 - loss: 1.4217 - regression_loss: 1.2033 - classification_loss: 0.2184 183/500 [=========>....................] - ETA: 1:15 - loss: 1.4211 - regression_loss: 1.2031 - classification_loss: 0.2180 184/500 [==========>...................] - ETA: 1:15 - loss: 1.4202 - regression_loss: 1.2025 - classification_loss: 0.2178 185/500 [==========>...................] - ETA: 1:15 - loss: 1.4241 - regression_loss: 1.2051 - classification_loss: 0.2190 186/500 [==========>...................] - ETA: 1:15 - loss: 1.4252 - regression_loss: 1.2065 - classification_loss: 0.2187 187/500 [==========>...................] - ETA: 1:14 - loss: 1.4269 - regression_loss: 1.2075 - classification_loss: 0.2193 188/500 [==========>...................] - ETA: 1:14 - loss: 1.4287 - regression_loss: 1.2090 - classification_loss: 0.2197 189/500 [==========>...................] - ETA: 1:14 - loss: 1.4276 - regression_loss: 1.2083 - classification_loss: 0.2193 190/500 [==========>...................] 
- ETA: 1:14 - loss: 1.4277 - regression_loss: 1.2082 - classification_loss: 0.2195 191/500 [==========>...................] - ETA: 1:13 - loss: 1.4275 - regression_loss: 1.2081 - classification_loss: 0.2194 192/500 [==========>...................] - ETA: 1:13 - loss: 1.4286 - regression_loss: 1.2087 - classification_loss: 0.2199 193/500 [==========>...................] - ETA: 1:13 - loss: 1.4286 - regression_loss: 1.2088 - classification_loss: 0.2198 194/500 [==========>...................] - ETA: 1:13 - loss: 1.4280 - regression_loss: 1.2086 - classification_loss: 0.2195 195/500 [==========>...................] - ETA: 1:13 - loss: 1.4290 - regression_loss: 1.2089 - classification_loss: 0.2201 196/500 [==========>...................] - ETA: 1:12 - loss: 1.4303 - regression_loss: 1.2098 - classification_loss: 0.2206 197/500 [==========>...................] - ETA: 1:12 - loss: 1.4297 - regression_loss: 1.2094 - classification_loss: 0.2203 198/500 [==========>...................] - ETA: 1:12 - loss: 1.4281 - regression_loss: 1.2083 - classification_loss: 0.2198 199/500 [==========>...................] - ETA: 1:12 - loss: 1.4322 - regression_loss: 1.2115 - classification_loss: 0.2207 200/500 [===========>..................] - ETA: 1:11 - loss: 1.4284 - regression_loss: 1.2084 - classification_loss: 0.2199 201/500 [===========>..................] - ETA: 1:11 - loss: 1.4308 - regression_loss: 1.2102 - classification_loss: 0.2206 202/500 [===========>..................] - ETA: 1:11 - loss: 1.4300 - regression_loss: 1.2090 - classification_loss: 0.2210 203/500 [===========>..................] - ETA: 1:11 - loss: 1.4328 - regression_loss: 1.2112 - classification_loss: 0.2216 204/500 [===========>..................] - ETA: 1:10 - loss: 1.4319 - regression_loss: 1.2104 - classification_loss: 0.2215 205/500 [===========>..................] - ETA: 1:10 - loss: 1.4354 - regression_loss: 1.2136 - classification_loss: 0.2218 206/500 [===========>..................] 
- ETA: 1:10 - loss: 1.4297 - regression_loss: 1.2088 - classification_loss: 0.2209 207/500 [===========>..................] - ETA: 1:10 - loss: 1.4323 - regression_loss: 1.2106 - classification_loss: 0.2217 208/500 [===========>..................] - ETA: 1:09 - loss: 1.4278 - regression_loss: 1.2069 - classification_loss: 0.2209 209/500 [===========>..................] - ETA: 1:09 - loss: 1.4244 - regression_loss: 1.2042 - classification_loss: 0.2203 210/500 [===========>..................] - ETA: 1:09 - loss: 1.4224 - regression_loss: 1.2027 - classification_loss: 0.2197 211/500 [===========>..................] - ETA: 1:09 - loss: 1.4259 - regression_loss: 1.2050 - classification_loss: 0.2209 212/500 [===========>..................] - ETA: 1:09 - loss: 1.4286 - regression_loss: 1.2073 - classification_loss: 0.2213 213/500 [===========>..................] - ETA: 1:08 - loss: 1.4285 - regression_loss: 1.2072 - classification_loss: 0.2213 214/500 [===========>..................] - ETA: 1:08 - loss: 1.4261 - regression_loss: 1.2054 - classification_loss: 0.2207 215/500 [===========>..................] - ETA: 1:08 - loss: 1.4251 - regression_loss: 1.2045 - classification_loss: 0.2205 216/500 [===========>..................] - ETA: 1:08 - loss: 1.4230 - regression_loss: 1.2026 - classification_loss: 0.2204 217/500 [============>.................] - ETA: 1:07 - loss: 1.4214 - regression_loss: 1.2015 - classification_loss: 0.2199 218/500 [============>.................] - ETA: 1:07 - loss: 1.4223 - regression_loss: 1.2026 - classification_loss: 0.2197 219/500 [============>.................] - ETA: 1:07 - loss: 1.4256 - regression_loss: 1.2039 - classification_loss: 0.2217 220/500 [============>.................] - ETA: 1:07 - loss: 1.4312 - regression_loss: 1.2081 - classification_loss: 0.2231 221/500 [============>.................] - ETA: 1:07 - loss: 1.4329 - regression_loss: 1.2095 - classification_loss: 0.2234 222/500 [============>.................] 
- ETA: 1:06 - loss: 1.4321 - regression_loss: 1.2091 - classification_loss: 0.2230 223/500 [============>.................] - ETA: 1:06 - loss: 1.4329 - regression_loss: 1.2099 - classification_loss: 0.2230 224/500 [============>.................] - ETA: 1:06 - loss: 1.4317 - regression_loss: 1.2089 - classification_loss: 0.2228 225/500 [============>.................] - ETA: 1:06 - loss: 1.4294 - regression_loss: 1.2070 - classification_loss: 0.2224 226/500 [============>.................] - ETA: 1:05 - loss: 1.4274 - regression_loss: 1.2056 - classification_loss: 0.2219 227/500 [============>.................] - ETA: 1:05 - loss: 1.4282 - regression_loss: 1.2063 - classification_loss: 0.2220 228/500 [============>.................] - ETA: 1:05 - loss: 1.4256 - regression_loss: 1.2042 - classification_loss: 0.2215 229/500 [============>.................] - ETA: 1:05 - loss: 1.4219 - regression_loss: 1.2012 - classification_loss: 0.2208 230/500 [============>.................] - ETA: 1:04 - loss: 1.4221 - regression_loss: 1.2010 - classification_loss: 0.2210 231/500 [============>.................] - ETA: 1:04 - loss: 1.4221 - regression_loss: 1.2012 - classification_loss: 0.2209 232/500 [============>.................] - ETA: 1:04 - loss: 1.4229 - regression_loss: 1.2021 - classification_loss: 0.2208 233/500 [============>.................] - ETA: 1:04 - loss: 1.4219 - regression_loss: 1.2013 - classification_loss: 0.2206 234/500 [=============>................] - ETA: 1:04 - loss: 1.4220 - regression_loss: 1.2015 - classification_loss: 0.2205 235/500 [=============>................] - ETA: 1:03 - loss: 1.4230 - regression_loss: 1.2023 - classification_loss: 0.2207 236/500 [=============>................] - ETA: 1:03 - loss: 1.4214 - regression_loss: 1.2010 - classification_loss: 0.2204 237/500 [=============>................] - ETA: 1:03 - loss: 1.4304 - regression_loss: 1.2086 - classification_loss: 0.2219 238/500 [=============>................] 
- ETA: 1:03 - loss: 1.4278 - regression_loss: 1.2064 - classification_loss: 0.2213 239/500 [=============>................] - ETA: 1:02 - loss: 1.4243 - regression_loss: 1.2036 - classification_loss: 0.2207 240/500 [=============>................] - ETA: 1:02 - loss: 1.4203 - regression_loss: 1.2003 - classification_loss: 0.2200 241/500 [=============>................] - ETA: 1:02 - loss: 1.4242 - regression_loss: 1.2031 - classification_loss: 0.2211 242/500 [=============>................] - ETA: 1:02 - loss: 1.4260 - regression_loss: 1.2044 - classification_loss: 0.2217 243/500 [=============>................] - ETA: 1:01 - loss: 1.4272 - regression_loss: 1.2051 - classification_loss: 0.2221 244/500 [=============>................] - ETA: 1:01 - loss: 1.4280 - regression_loss: 1.2056 - classification_loss: 0.2224 245/500 [=============>................] - ETA: 1:01 - loss: 1.4277 - regression_loss: 1.2057 - classification_loss: 0.2220 246/500 [=============>................] - ETA: 1:01 - loss: 1.4279 - regression_loss: 1.2060 - classification_loss: 0.2219 247/500 [=============>................] - ETA: 1:00 - loss: 1.4283 - regression_loss: 1.2064 - classification_loss: 0.2219 248/500 [=============>................] - ETA: 1:00 - loss: 1.4273 - regression_loss: 1.2057 - classification_loss: 0.2216 249/500 [=============>................] - ETA: 1:00 - loss: 1.4285 - regression_loss: 1.2066 - classification_loss: 0.2218 250/500 [==============>...............] - ETA: 1:00 - loss: 1.4310 - regression_loss: 1.2084 - classification_loss: 0.2226 251/500 [==============>...............] - ETA: 59s - loss: 1.4294 - regression_loss: 1.2071 - classification_loss: 0.2223  252/500 [==============>...............] - ETA: 59s - loss: 1.4285 - regression_loss: 1.2064 - classification_loss: 0.2221 253/500 [==============>...............] - ETA: 59s - loss: 1.4266 - regression_loss: 1.2050 - classification_loss: 0.2216 254/500 [==============>...............] 
- ETA: 59s - loss: 1.4263 - regression_loss: 1.2050 - classification_loss: 0.2214 255/500 [==============>...............] - ETA: 58s - loss: 1.4249 - regression_loss: 1.2038 - classification_loss: 0.2211 256/500 [==============>...............] - ETA: 58s - loss: 1.4233 - regression_loss: 1.2024 - classification_loss: 0.2209 257/500 [==============>...............] - ETA: 58s - loss: 1.4244 - regression_loss: 1.2032 - classification_loss: 0.2212 258/500 [==============>...............] - ETA: 58s - loss: 1.4238 - regression_loss: 1.2029 - classification_loss: 0.2209 259/500 [==============>...............] - ETA: 58s - loss: 1.4233 - regression_loss: 1.2021 - classification_loss: 0.2212 260/500 [==============>...............] - ETA: 57s - loss: 1.4205 - regression_loss: 1.1998 - classification_loss: 0.2206 261/500 [==============>...............] - ETA: 57s - loss: 1.4201 - regression_loss: 1.1996 - classification_loss: 0.2205 262/500 [==============>...............] - ETA: 57s - loss: 1.4202 - regression_loss: 1.1995 - classification_loss: 0.2207 263/500 [==============>...............] - ETA: 57s - loss: 1.4185 - regression_loss: 1.1982 - classification_loss: 0.2203 264/500 [==============>...............] - ETA: 56s - loss: 1.4190 - regression_loss: 1.1991 - classification_loss: 0.2199 265/500 [==============>...............] - ETA: 56s - loss: 1.4180 - regression_loss: 1.1985 - classification_loss: 0.2195 266/500 [==============>...............] - ETA: 56s - loss: 1.4189 - regression_loss: 1.1990 - classification_loss: 0.2198 267/500 [===============>..............] - ETA: 56s - loss: 1.4186 - regression_loss: 1.1990 - classification_loss: 0.2196 268/500 [===============>..............] - ETA: 55s - loss: 1.4186 - regression_loss: 1.1989 - classification_loss: 0.2196 269/500 [===============>..............] - ETA: 55s - loss: 1.4191 - regression_loss: 1.1994 - classification_loss: 0.2197 270/500 [===============>..............] 
- ETA: 55s - loss: 1.4156 - regression_loss: 1.1965 - classification_loss: 0.2192 271/500 [===============>..............] - ETA: 55s - loss: 1.4165 - regression_loss: 1.1973 - classification_loss: 0.2193 272/500 [===============>..............] - ETA: 54s - loss: 1.4188 - regression_loss: 1.1990 - classification_loss: 0.2198 273/500 [===============>..............] - ETA: 54s - loss: 1.4189 - regression_loss: 1.1993 - classification_loss: 0.2196 274/500 [===============>..............] - ETA: 54s - loss: 1.4153 - regression_loss: 1.1961 - classification_loss: 0.2192 275/500 [===============>..............] - ETA: 54s - loss: 1.4137 - regression_loss: 1.1948 - classification_loss: 0.2189 276/500 [===============>..............] - ETA: 53s - loss: 1.4133 - regression_loss: 1.1943 - classification_loss: 0.2191 277/500 [===============>..............] - ETA: 53s - loss: 1.4129 - regression_loss: 1.1939 - classification_loss: 0.2190 278/500 [===============>..............] - ETA: 53s - loss: 1.4129 - regression_loss: 1.1939 - classification_loss: 0.2189 279/500 [===============>..............] - ETA: 53s - loss: 1.4139 - regression_loss: 1.1949 - classification_loss: 0.2190 280/500 [===============>..............] - ETA: 53s - loss: 1.4138 - regression_loss: 1.1948 - classification_loss: 0.2190 281/500 [===============>..............] - ETA: 52s - loss: 1.4147 - regression_loss: 1.1955 - classification_loss: 0.2191 282/500 [===============>..............] - ETA: 52s - loss: 1.4155 - regression_loss: 1.1965 - classification_loss: 0.2190 283/500 [===============>..............] - ETA: 52s - loss: 1.4163 - regression_loss: 1.1968 - classification_loss: 0.2195 284/500 [================>.............] - ETA: 52s - loss: 1.4156 - regression_loss: 1.1963 - classification_loss: 0.2193 285/500 [================>.............] - ETA: 51s - loss: 1.4136 - regression_loss: 1.1949 - classification_loss: 0.2187 286/500 [================>.............] 
- ETA: 51s - loss: 1.4103 - regression_loss: 1.1920 - classification_loss: 0.2184 287/500 [================>.............] - ETA: 51s - loss: 1.4121 - regression_loss: 1.1937 - classification_loss: 0.2185 288/500 [================>.............] - ETA: 51s - loss: 1.4151 - regression_loss: 1.1956 - classification_loss: 0.2195 289/500 [================>.............] - ETA: 50s - loss: 1.4162 - regression_loss: 1.1964 - classification_loss: 0.2197 290/500 [================>.............] - ETA: 50s - loss: 1.4170 - regression_loss: 1.1970 - classification_loss: 0.2200 291/500 [================>.............] - ETA: 50s - loss: 1.4171 - regression_loss: 1.1972 - classification_loss: 0.2199 292/500 [================>.............] - ETA: 50s - loss: 1.4172 - regression_loss: 1.1975 - classification_loss: 0.2198 293/500 [================>.............] - ETA: 49s - loss: 1.4160 - regression_loss: 1.1966 - classification_loss: 0.2194 294/500 [================>.............] - ETA: 49s - loss: 1.4165 - regression_loss: 1.1969 - classification_loss: 0.2196 295/500 [================>.............] - ETA: 49s - loss: 1.4173 - regression_loss: 1.1979 - classification_loss: 0.2195 296/500 [================>.............] - ETA: 49s - loss: 1.4180 - regression_loss: 1.1982 - classification_loss: 0.2198 297/500 [================>.............] - ETA: 48s - loss: 1.4178 - regression_loss: 1.1983 - classification_loss: 0.2195 298/500 [================>.............] - ETA: 48s - loss: 1.4182 - regression_loss: 1.1985 - classification_loss: 0.2197 299/500 [================>.............] - ETA: 48s - loss: 1.4188 - regression_loss: 1.1991 - classification_loss: 0.2197 300/500 [=================>............] - ETA: 48s - loss: 1.4215 - regression_loss: 1.2012 - classification_loss: 0.2202 301/500 [=================>............] - ETA: 48s - loss: 1.4215 - regression_loss: 1.2013 - classification_loss: 0.2202 302/500 [=================>............] 
- ETA: 47s - loss: 1.4201 - regression_loss: 1.2003 - classification_loss: 0.2199 303/500 [=================>............] - ETA: 47s - loss: 1.4206 - regression_loss: 1.2008 - classification_loss: 0.2198 304/500 [=================>............] - ETA: 47s - loss: 1.4190 - regression_loss: 1.1995 - classification_loss: 0.2195 305/500 [=================>............] - ETA: 47s - loss: 1.4182 - regression_loss: 1.1989 - classification_loss: 0.2193 306/500 [=================>............] - ETA: 46s - loss: 1.4181 - regression_loss: 1.1989 - classification_loss: 0.2192 307/500 [=================>............] - ETA: 46s - loss: 1.4191 - regression_loss: 1.2000 - classification_loss: 0.2191 308/500 [=================>............] - ETA: 46s - loss: 1.4191 - regression_loss: 1.2002 - classification_loss: 0.2189 309/500 [=================>............] - ETA: 46s - loss: 1.4230 - regression_loss: 1.2036 - classification_loss: 0.2194 310/500 [=================>............] - ETA: 45s - loss: 1.4227 - regression_loss: 1.2033 - classification_loss: 0.2194 311/500 [=================>............] - ETA: 45s - loss: 1.4246 - regression_loss: 1.2048 - classification_loss: 0.2198 312/500 [=================>............] - ETA: 45s - loss: 1.4246 - regression_loss: 1.2050 - classification_loss: 0.2196 313/500 [=================>............] - ETA: 45s - loss: 1.4246 - regression_loss: 1.2050 - classification_loss: 0.2195 314/500 [=================>............] - ETA: 44s - loss: 1.4235 - regression_loss: 1.2043 - classification_loss: 0.2192 315/500 [=================>............] - ETA: 44s - loss: 1.4225 - regression_loss: 1.2036 - classification_loss: 0.2189 316/500 [=================>............] - ETA: 44s - loss: 1.4210 - regression_loss: 1.2023 - classification_loss: 0.2187 317/500 [==================>...........] - ETA: 44s - loss: 1.4214 - regression_loss: 1.2028 - classification_loss: 0.2187 318/500 [==================>...........] 
- ETA: 43s - loss: 1.4191 - regression_loss: 1.2007 - classification_loss: 0.2184 319/500 [==================>...........] - ETA: 43s - loss: 1.4215 - regression_loss: 1.2027 - classification_loss: 0.2188 320/500 [==================>...........] - ETA: 43s - loss: 1.4195 - regression_loss: 1.2012 - classification_loss: 0.2184 321/500 [==================>...........] - ETA: 43s - loss: 1.4180 - regression_loss: 1.1999 - classification_loss: 0.2181 322/500 [==================>...........] - ETA: 42s - loss: 1.4174 - regression_loss: 1.1991 - classification_loss: 0.2183 323/500 [==================>...........] - ETA: 42s - loss: 1.4166 - regression_loss: 1.1984 - classification_loss: 0.2182 324/500 [==================>...........] - ETA: 42s - loss: 1.4178 - regression_loss: 1.1996 - classification_loss: 0.2182 325/500 [==================>...........] - ETA: 42s - loss: 1.4173 - regression_loss: 1.1992 - classification_loss: 0.2181 326/500 [==================>...........] - ETA: 42s - loss: 1.4155 - regression_loss: 1.1976 - classification_loss: 0.2179 327/500 [==================>...........] - ETA: 41s - loss: 1.4147 - regression_loss: 1.1971 - classification_loss: 0.2176 328/500 [==================>...........] - ETA: 41s - loss: 1.4147 - regression_loss: 1.1973 - classification_loss: 0.2174 329/500 [==================>...........] - ETA: 41s - loss: 1.4136 - regression_loss: 1.1964 - classification_loss: 0.2172 330/500 [==================>...........] - ETA: 41s - loss: 1.4154 - regression_loss: 1.1976 - classification_loss: 0.2178 331/500 [==================>...........] - ETA: 40s - loss: 1.4146 - regression_loss: 1.1967 - classification_loss: 0.2179 332/500 [==================>...........] - ETA: 40s - loss: 1.4147 - regression_loss: 1.1969 - classification_loss: 0.2178 333/500 [==================>...........] - ETA: 40s - loss: 1.4144 - regression_loss: 1.1965 - classification_loss: 0.2179 334/500 [===================>..........] 
- ETA: 40s - loss: 1.4139 - regression_loss: 1.1962 - classification_loss: 0.2177 335/500 [===================>..........] - ETA: 39s - loss: 1.4146 - regression_loss: 1.1969 - classification_loss: 0.2177 336/500 [===================>..........] - ETA: 39s - loss: 1.4154 - regression_loss: 1.1976 - classification_loss: 0.2178 337/500 [===================>..........] - ETA: 39s - loss: 1.4173 - regression_loss: 1.1990 - classification_loss: 0.2183 338/500 [===================>..........] - ETA: 39s - loss: 1.4170 - regression_loss: 1.1989 - classification_loss: 0.2181 339/500 [===================>..........] - ETA: 38s - loss: 1.4168 - regression_loss: 1.1987 - classification_loss: 0.2181 340/500 [===================>..........] - ETA: 38s - loss: 1.4180 - regression_loss: 1.1997 - classification_loss: 0.2183 341/500 [===================>..........] - ETA: 38s - loss: 1.4174 - regression_loss: 1.1993 - classification_loss: 0.2181 342/500 [===================>..........] - ETA: 38s - loss: 1.4170 - regression_loss: 1.1991 - classification_loss: 0.2179 343/500 [===================>..........] - ETA: 37s - loss: 1.4168 - regression_loss: 1.1991 - classification_loss: 0.2177 344/500 [===================>..........] - ETA: 37s - loss: 1.4196 - regression_loss: 1.2015 - classification_loss: 0.2182 345/500 [===================>..........] - ETA: 37s - loss: 1.4191 - regression_loss: 1.2012 - classification_loss: 0.2179 346/500 [===================>..........] - ETA: 37s - loss: 1.4194 - regression_loss: 1.2013 - classification_loss: 0.2180 347/500 [===================>..........] - ETA: 36s - loss: 1.4195 - regression_loss: 1.2014 - classification_loss: 0.2181 348/500 [===================>..........] - ETA: 36s - loss: 1.4186 - regression_loss: 1.2007 - classification_loss: 0.2180 349/500 [===================>..........] - ETA: 36s - loss: 1.4196 - regression_loss: 1.2016 - classification_loss: 0.2180 350/500 [====================>.........] 
- ETA: 36s - loss: 1.4192 - regression_loss: 1.2014 - classification_loss: 0.2179 351/500 [====================>.........] - ETA: 35s - loss: 1.4184 - regression_loss: 1.2005 - classification_loss: 0.2179 352/500 [====================>.........] - ETA: 35s - loss: 1.4188 - regression_loss: 1.2009 - classification_loss: 0.2179 353/500 [====================>.........] - ETA: 35s - loss: 1.4176 - regression_loss: 1.2000 - classification_loss: 0.2176 354/500 [====================>.........] - ETA: 35s - loss: 1.4148 - regression_loss: 1.1977 - classification_loss: 0.2171 355/500 [====================>.........] - ETA: 35s - loss: 1.4168 - regression_loss: 1.1992 - classification_loss: 0.2176 356/500 [====================>.........] - ETA: 34s - loss: 1.4169 - regression_loss: 1.1993 - classification_loss: 0.2176 357/500 [====================>.........] - ETA: 34s - loss: 1.4158 - regression_loss: 1.1985 - classification_loss: 0.2173 358/500 [====================>.........] - ETA: 34s - loss: 1.4152 - regression_loss: 1.1980 - classification_loss: 0.2173 359/500 [====================>.........] - ETA: 34s - loss: 1.4146 - regression_loss: 1.1974 - classification_loss: 0.2172 360/500 [====================>.........] - ETA: 33s - loss: 1.4153 - regression_loss: 1.1976 - classification_loss: 0.2176 361/500 [====================>.........] - ETA: 33s - loss: 1.4146 - regression_loss: 1.1972 - classification_loss: 0.2174 362/500 [====================>.........] - ETA: 33s - loss: 1.4149 - regression_loss: 1.1977 - classification_loss: 0.2173 363/500 [====================>.........] - ETA: 33s - loss: 1.4162 - regression_loss: 1.1983 - classification_loss: 0.2179 364/500 [====================>.........] - ETA: 32s - loss: 1.4137 - regression_loss: 1.1962 - classification_loss: 0.2174 365/500 [====================>.........] - ETA: 32s - loss: 1.4138 - regression_loss: 1.1964 - classification_loss: 0.2175 366/500 [====================>.........] 
- ETA: 32s - loss: 1.4136 - regression_loss: 1.1962 - classification_loss: 0.2174 367/500 [=====================>........] - ETA: 32s - loss: 1.4136 - regression_loss: 1.1961 - classification_loss: 0.2174 368/500 [=====================>........] - ETA: 31s - loss: 1.4159 - regression_loss: 1.1978 - classification_loss: 0.2181 369/500 [=====================>........] - ETA: 31s - loss: 1.4178 - regression_loss: 1.1992 - classification_loss: 0.2185 370/500 [=====================>........] - ETA: 31s - loss: 1.4178 - regression_loss: 1.1993 - classification_loss: 0.2184 371/500 [=====================>........] - ETA: 31s - loss: 1.4179 - regression_loss: 1.1992 - classification_loss: 0.2187 372/500 [=====================>........] - ETA: 30s - loss: 1.4167 - regression_loss: 1.1983 - classification_loss: 0.2184 373/500 [=====================>........] - ETA: 30s - loss: 1.4160 - regression_loss: 1.1977 - classification_loss: 0.2183 374/500 [=====================>........] - ETA: 30s - loss: 1.4159 - regression_loss: 1.1978 - classification_loss: 0.2181 375/500 [=====================>........] - ETA: 30s - loss: 1.4156 - regression_loss: 1.1976 - classification_loss: 0.2180 376/500 [=====================>........] - ETA: 29s - loss: 1.4137 - regression_loss: 1.1960 - classification_loss: 0.2177 377/500 [=====================>........] - ETA: 29s - loss: 1.4120 - regression_loss: 1.1947 - classification_loss: 0.2173 378/500 [=====================>........] - ETA: 29s - loss: 1.4111 - regression_loss: 1.1940 - classification_loss: 0.2172 379/500 [=====================>........] - ETA: 29s - loss: 1.4111 - regression_loss: 1.1939 - classification_loss: 0.2172 380/500 [=====================>........] - ETA: 29s - loss: 1.4085 - regression_loss: 1.1917 - classification_loss: 0.2168 381/500 [=====================>........] - ETA: 28s - loss: 1.4088 - regression_loss: 1.1920 - classification_loss: 0.2168 382/500 [=====================>........] 
[Epoch 76 per-batch progress (steps 383-499, ETA 28s → 0s, loss ≈ 1.41) omitted]
500/500 [==============================] - 121s 242ms/step - loss: 1.4127 - regression_loss: 1.1875 - classification_loss: 0.2252
326 instances of class plum with average precision: 0.8089
mAP: 0.8089
Epoch 00076: saving model to ./training/snapshots/resnet50_pascal_76.h5
Epoch 77/150
[Epoch 77 per-batch progress (steps 1-216/500, loss ≈ 1.36 with classification_loss ≈ 0.22 by step 216) omitted]
- ETA: 1:09 - loss: 1.3542 - regression_loss: 1.1331 - classification_loss: 0.2211 218/500 [============>.................] - ETA: 1:09 - loss: 1.3545 - regression_loss: 1.1331 - classification_loss: 0.2214 219/500 [============>.................] - ETA: 1:08 - loss: 1.3535 - regression_loss: 1.1318 - classification_loss: 0.2217 220/500 [============>.................] - ETA: 1:08 - loss: 1.3513 - regression_loss: 1.1302 - classification_loss: 0.2212 221/500 [============>.................] - ETA: 1:08 - loss: 1.3510 - regression_loss: 1.1301 - classification_loss: 0.2208 222/500 [============>.................] - ETA: 1:08 - loss: 1.3536 - regression_loss: 1.1321 - classification_loss: 0.2215 223/500 [============>.................] - ETA: 1:07 - loss: 1.3572 - regression_loss: 1.1351 - classification_loss: 0.2221 224/500 [============>.................] - ETA: 1:07 - loss: 1.3565 - regression_loss: 1.1349 - classification_loss: 0.2216 225/500 [============>.................] - ETA: 1:07 - loss: 1.3537 - regression_loss: 1.1328 - classification_loss: 0.2209 226/500 [============>.................] - ETA: 1:07 - loss: 1.3575 - regression_loss: 1.1358 - classification_loss: 0.2217 227/500 [============>.................] - ETA: 1:06 - loss: 1.3575 - regression_loss: 1.1363 - classification_loss: 0.2212 228/500 [============>.................] - ETA: 1:06 - loss: 1.3542 - regression_loss: 1.1336 - classification_loss: 0.2206 229/500 [============>.................] - ETA: 1:06 - loss: 1.3567 - regression_loss: 1.1356 - classification_loss: 0.2211 230/500 [============>.................] - ETA: 1:06 - loss: 1.3565 - regression_loss: 1.1356 - classification_loss: 0.2209 231/500 [============>.................] - ETA: 1:05 - loss: 1.3559 - regression_loss: 1.1352 - classification_loss: 0.2207 232/500 [============>.................] - ETA: 1:05 - loss: 1.3569 - regression_loss: 1.1361 - classification_loss: 0.2208 233/500 [============>.................] 
- ETA: 1:05 - loss: 1.3572 - regression_loss: 1.1365 - classification_loss: 0.2207 234/500 [=============>................] - ETA: 1:05 - loss: 1.3574 - regression_loss: 1.1369 - classification_loss: 0.2205 235/500 [=============>................] - ETA: 1:05 - loss: 1.3573 - regression_loss: 1.1370 - classification_loss: 0.2204 236/500 [=============>................] - ETA: 1:04 - loss: 1.3563 - regression_loss: 1.1362 - classification_loss: 0.2201 237/500 [=============>................] - ETA: 1:04 - loss: 1.3574 - regression_loss: 1.1374 - classification_loss: 0.2200 238/500 [=============>................] - ETA: 1:04 - loss: 1.3564 - regression_loss: 1.1368 - classification_loss: 0.2196 239/500 [=============>................] - ETA: 1:04 - loss: 1.3522 - regression_loss: 1.1335 - classification_loss: 0.2188 240/500 [=============>................] - ETA: 1:03 - loss: 1.3525 - regression_loss: 1.1338 - classification_loss: 0.2188 241/500 [=============>................] - ETA: 1:03 - loss: 1.3499 - regression_loss: 1.1318 - classification_loss: 0.2181 242/500 [=============>................] - ETA: 1:03 - loss: 1.3458 - regression_loss: 1.1283 - classification_loss: 0.2175 243/500 [=============>................] - ETA: 1:03 - loss: 1.3450 - regression_loss: 1.1275 - classification_loss: 0.2174 244/500 [=============>................] - ETA: 1:02 - loss: 1.3470 - regression_loss: 1.1295 - classification_loss: 0.2175 245/500 [=============>................] - ETA: 1:02 - loss: 1.3470 - regression_loss: 1.1297 - classification_loss: 0.2173 246/500 [=============>................] - ETA: 1:02 - loss: 1.3513 - regression_loss: 1.1328 - classification_loss: 0.2185 247/500 [=============>................] - ETA: 1:02 - loss: 1.3538 - regression_loss: 1.1351 - classification_loss: 0.2187 248/500 [=============>................] - ETA: 1:01 - loss: 1.3553 - regression_loss: 1.1365 - classification_loss: 0.2188 249/500 [=============>................] 
- ETA: 1:01 - loss: 1.3558 - regression_loss: 1.1367 - classification_loss: 0.2191 250/500 [==============>...............] - ETA: 1:01 - loss: 1.3591 - regression_loss: 1.1391 - classification_loss: 0.2199 251/500 [==============>...............] - ETA: 1:01 - loss: 1.3631 - regression_loss: 1.1421 - classification_loss: 0.2210 252/500 [==============>...............] - ETA: 1:00 - loss: 1.3607 - regression_loss: 1.1403 - classification_loss: 0.2205 253/500 [==============>...............] - ETA: 1:00 - loss: 1.3621 - regression_loss: 1.1415 - classification_loss: 0.2205 254/500 [==============>...............] - ETA: 1:00 - loss: 1.3624 - regression_loss: 1.1418 - classification_loss: 0.2206 255/500 [==============>...............] - ETA: 1:00 - loss: 1.3652 - regression_loss: 1.1437 - classification_loss: 0.2215 256/500 [==============>...............] - ETA: 59s - loss: 1.3666 - regression_loss: 1.1451 - classification_loss: 0.2215  257/500 [==============>...............] - ETA: 59s - loss: 1.3723 - regression_loss: 1.1488 - classification_loss: 0.2235 258/500 [==============>...............] - ETA: 59s - loss: 1.3719 - regression_loss: 1.1486 - classification_loss: 0.2233 259/500 [==============>...............] - ETA: 59s - loss: 1.3746 - regression_loss: 1.1509 - classification_loss: 0.2236 260/500 [==============>...............] - ETA: 58s - loss: 1.3760 - regression_loss: 1.1523 - classification_loss: 0.2237 261/500 [==============>...............] - ETA: 58s - loss: 1.3776 - regression_loss: 1.1538 - classification_loss: 0.2238 262/500 [==============>...............] - ETA: 58s - loss: 1.3780 - regression_loss: 1.1543 - classification_loss: 0.2237 263/500 [==============>...............] - ETA: 58s - loss: 1.3772 - regression_loss: 1.1539 - classification_loss: 0.2234 264/500 [==============>...............] - ETA: 57s - loss: 1.3823 - regression_loss: 1.1584 - classification_loss: 0.2240 265/500 [==============>...............] 
- ETA: 57s - loss: 1.3840 - regression_loss: 1.1599 - classification_loss: 0.2241 266/500 [==============>...............] - ETA: 57s - loss: 1.3837 - regression_loss: 1.1599 - classification_loss: 0.2238 267/500 [===============>..............] - ETA: 57s - loss: 1.3831 - regression_loss: 1.1594 - classification_loss: 0.2237 268/500 [===============>..............] - ETA: 56s - loss: 1.3812 - regression_loss: 1.1578 - classification_loss: 0.2233 269/500 [===============>..............] - ETA: 56s - loss: 1.3784 - regression_loss: 1.1556 - classification_loss: 0.2228 270/500 [===============>..............] - ETA: 56s - loss: 1.3788 - regression_loss: 1.1560 - classification_loss: 0.2228 271/500 [===============>..............] - ETA: 56s - loss: 1.3760 - regression_loss: 1.1537 - classification_loss: 0.2223 272/500 [===============>..............] - ETA: 55s - loss: 1.3776 - regression_loss: 1.1551 - classification_loss: 0.2224 273/500 [===============>..............] - ETA: 55s - loss: 1.3765 - regression_loss: 1.1543 - classification_loss: 0.2222 274/500 [===============>..............] - ETA: 55s - loss: 1.3774 - regression_loss: 1.1550 - classification_loss: 0.2224 275/500 [===============>..............] - ETA: 55s - loss: 1.3772 - regression_loss: 1.1549 - classification_loss: 0.2223 276/500 [===============>..............] - ETA: 54s - loss: 1.3785 - regression_loss: 1.1559 - classification_loss: 0.2226 277/500 [===============>..............] - ETA: 54s - loss: 1.3763 - regression_loss: 1.1542 - classification_loss: 0.2221 278/500 [===============>..............] - ETA: 54s - loss: 1.3750 - regression_loss: 1.1530 - classification_loss: 0.2219 279/500 [===============>..............] - ETA: 54s - loss: 1.3760 - regression_loss: 1.1541 - classification_loss: 0.2220 280/500 [===============>..............] - ETA: 53s - loss: 1.3753 - regression_loss: 1.1535 - classification_loss: 0.2218 281/500 [===============>..............] 
- ETA: 53s - loss: 1.3760 - regression_loss: 1.1542 - classification_loss: 0.2218 282/500 [===============>..............] - ETA: 53s - loss: 1.3769 - regression_loss: 1.1548 - classification_loss: 0.2220 283/500 [===============>..............] - ETA: 53s - loss: 1.3784 - regression_loss: 1.1560 - classification_loss: 0.2224 284/500 [================>.............] - ETA: 52s - loss: 1.3801 - regression_loss: 1.1574 - classification_loss: 0.2227 285/500 [================>.............] - ETA: 52s - loss: 1.3804 - regression_loss: 1.1574 - classification_loss: 0.2230 286/500 [================>.............] - ETA: 52s - loss: 1.3814 - regression_loss: 1.1580 - classification_loss: 0.2234 287/500 [================>.............] - ETA: 52s - loss: 1.3806 - regression_loss: 1.1576 - classification_loss: 0.2230 288/500 [================>.............] - ETA: 51s - loss: 1.3807 - regression_loss: 1.1577 - classification_loss: 0.2230 289/500 [================>.............] - ETA: 51s - loss: 1.3801 - regression_loss: 1.1574 - classification_loss: 0.2227 290/500 [================>.............] - ETA: 51s - loss: 1.3826 - regression_loss: 1.1595 - classification_loss: 0.2231 291/500 [================>.............] - ETA: 51s - loss: 1.3815 - regression_loss: 1.1588 - classification_loss: 0.2227 292/500 [================>.............] - ETA: 50s - loss: 1.3817 - regression_loss: 1.1593 - classification_loss: 0.2223 293/500 [================>.............] - ETA: 50s - loss: 1.3832 - regression_loss: 1.1608 - classification_loss: 0.2224 294/500 [================>.............] - ETA: 50s - loss: 1.3809 - regression_loss: 1.1589 - classification_loss: 0.2219 295/500 [================>.............] - ETA: 50s - loss: 1.3814 - regression_loss: 1.1595 - classification_loss: 0.2219 296/500 [================>.............] - ETA: 49s - loss: 1.3826 - regression_loss: 1.1604 - classification_loss: 0.2221 297/500 [================>.............] 
- ETA: 49s - loss: 1.3824 - regression_loss: 1.1601 - classification_loss: 0.2222 298/500 [================>.............] - ETA: 49s - loss: 1.3828 - regression_loss: 1.1604 - classification_loss: 0.2224 299/500 [================>.............] - ETA: 49s - loss: 1.3822 - regression_loss: 1.1601 - classification_loss: 0.2221 300/500 [=================>............] - ETA: 48s - loss: 1.3830 - regression_loss: 1.1607 - classification_loss: 0.2223 301/500 [=================>............] - ETA: 48s - loss: 1.3826 - regression_loss: 1.1607 - classification_loss: 0.2220 302/500 [=================>............] - ETA: 48s - loss: 1.3840 - regression_loss: 1.1619 - classification_loss: 0.2221 303/500 [=================>............] - ETA: 48s - loss: 1.3834 - regression_loss: 1.1616 - classification_loss: 0.2218 304/500 [=================>............] - ETA: 47s - loss: 1.3838 - regression_loss: 1.1619 - classification_loss: 0.2219 305/500 [=================>............] - ETA: 47s - loss: 1.3839 - regression_loss: 1.1621 - classification_loss: 0.2218 306/500 [=================>............] - ETA: 47s - loss: 1.3865 - regression_loss: 1.1641 - classification_loss: 0.2224 307/500 [=================>............] - ETA: 47s - loss: 1.3876 - regression_loss: 1.1652 - classification_loss: 0.2224 308/500 [=================>............] - ETA: 46s - loss: 1.3846 - regression_loss: 1.1627 - classification_loss: 0.2219 309/500 [=================>............] - ETA: 46s - loss: 1.3833 - regression_loss: 1.1617 - classification_loss: 0.2215 310/500 [=================>............] - ETA: 46s - loss: 1.3837 - regression_loss: 1.1621 - classification_loss: 0.2215 311/500 [=================>............] - ETA: 46s - loss: 1.3854 - regression_loss: 1.1639 - classification_loss: 0.2215 312/500 [=================>............] - ETA: 45s - loss: 1.3847 - regression_loss: 1.1633 - classification_loss: 0.2214 313/500 [=================>............] 
- ETA: 45s - loss: 1.3854 - regression_loss: 1.1638 - classification_loss: 0.2215 314/500 [=================>............] - ETA: 45s - loss: 1.3836 - regression_loss: 1.1624 - classification_loss: 0.2212 315/500 [=================>............] - ETA: 45s - loss: 1.3853 - regression_loss: 1.1633 - classification_loss: 0.2221 316/500 [=================>............] - ETA: 44s - loss: 1.3858 - regression_loss: 1.1638 - classification_loss: 0.2220 317/500 [==================>...........] - ETA: 44s - loss: 1.3861 - regression_loss: 1.1642 - classification_loss: 0.2220 318/500 [==================>...........] - ETA: 44s - loss: 1.3847 - regression_loss: 1.1631 - classification_loss: 0.2216 319/500 [==================>...........] - ETA: 44s - loss: 1.3842 - regression_loss: 1.1628 - classification_loss: 0.2214 320/500 [==================>...........] - ETA: 43s - loss: 1.3844 - regression_loss: 1.1630 - classification_loss: 0.2214 321/500 [==================>...........] - ETA: 43s - loss: 1.3852 - regression_loss: 1.1637 - classification_loss: 0.2214 322/500 [==================>...........] - ETA: 43s - loss: 1.3849 - regression_loss: 1.1636 - classification_loss: 0.2213 323/500 [==================>...........] - ETA: 43s - loss: 1.3872 - regression_loss: 1.1659 - classification_loss: 0.2213 324/500 [==================>...........] - ETA: 42s - loss: 1.3876 - regression_loss: 1.1663 - classification_loss: 0.2212 325/500 [==================>...........] - ETA: 42s - loss: 1.3865 - regression_loss: 1.1655 - classification_loss: 0.2209 326/500 [==================>...........] - ETA: 42s - loss: 1.3879 - regression_loss: 1.1663 - classification_loss: 0.2217 327/500 [==================>...........] - ETA: 42s - loss: 1.3857 - regression_loss: 1.1645 - classification_loss: 0.2212 328/500 [==================>...........] - ETA: 41s - loss: 1.3887 - regression_loss: 1.1666 - classification_loss: 0.2220 329/500 [==================>...........] 
- ETA: 41s - loss: 1.3887 - regression_loss: 1.1669 - classification_loss: 0.2218 330/500 [==================>...........] - ETA: 41s - loss: 1.3856 - regression_loss: 1.1643 - classification_loss: 0.2213 331/500 [==================>...........] - ETA: 41s - loss: 1.3843 - regression_loss: 1.1628 - classification_loss: 0.2215 332/500 [==================>...........] - ETA: 40s - loss: 1.3844 - regression_loss: 1.1632 - classification_loss: 0.2212 333/500 [==================>...........] - ETA: 40s - loss: 1.3840 - regression_loss: 1.1628 - classification_loss: 0.2212 334/500 [===================>..........] - ETA: 40s - loss: 1.3851 - regression_loss: 1.1639 - classification_loss: 0.2213 335/500 [===================>..........] - ETA: 40s - loss: 1.3861 - regression_loss: 1.1647 - classification_loss: 0.2214 336/500 [===================>..........] - ETA: 39s - loss: 1.3844 - regression_loss: 1.1634 - classification_loss: 0.2210 337/500 [===================>..........] - ETA: 39s - loss: 1.3832 - regression_loss: 1.1621 - classification_loss: 0.2211 338/500 [===================>..........] - ETA: 39s - loss: 1.3824 - regression_loss: 1.1614 - classification_loss: 0.2210 339/500 [===================>..........] - ETA: 39s - loss: 1.3832 - regression_loss: 1.1621 - classification_loss: 0.2212 340/500 [===================>..........] - ETA: 38s - loss: 1.3821 - regression_loss: 1.1607 - classification_loss: 0.2213 341/500 [===================>..........] - ETA: 38s - loss: 1.3841 - regression_loss: 1.1625 - classification_loss: 0.2216 342/500 [===================>..........] - ETA: 38s - loss: 1.3836 - regression_loss: 1.1623 - classification_loss: 0.2213 343/500 [===================>..........] - ETA: 38s - loss: 1.3828 - regression_loss: 1.1617 - classification_loss: 0.2211 344/500 [===================>..........] - ETA: 37s - loss: 1.3815 - regression_loss: 1.1608 - classification_loss: 0.2208 345/500 [===================>..........] 
- ETA: 37s - loss: 1.3802 - regression_loss: 1.1597 - classification_loss: 0.2205 346/500 [===================>..........] - ETA: 37s - loss: 1.3797 - regression_loss: 1.1594 - classification_loss: 0.2203 347/500 [===================>..........] - ETA: 37s - loss: 1.3797 - regression_loss: 1.1595 - classification_loss: 0.2202 348/500 [===================>..........] - ETA: 36s - loss: 1.3821 - regression_loss: 1.1615 - classification_loss: 0.2206 349/500 [===================>..........] - ETA: 36s - loss: 1.3840 - regression_loss: 1.1629 - classification_loss: 0.2211 350/500 [====================>.........] - ETA: 36s - loss: 1.3855 - regression_loss: 1.1642 - classification_loss: 0.2213 351/500 [====================>.........] - ETA: 36s - loss: 1.3841 - regression_loss: 1.1631 - classification_loss: 0.2210 352/500 [====================>.........] - ETA: 36s - loss: 1.3838 - regression_loss: 1.1629 - classification_loss: 0.2209 353/500 [====================>.........] - ETA: 35s - loss: 1.3841 - regression_loss: 1.1631 - classification_loss: 0.2210 354/500 [====================>.........] - ETA: 35s - loss: 1.3844 - regression_loss: 1.1632 - classification_loss: 0.2212 355/500 [====================>.........] - ETA: 35s - loss: 1.3846 - regression_loss: 1.1636 - classification_loss: 0.2210 356/500 [====================>.........] - ETA: 35s - loss: 1.3873 - regression_loss: 1.1657 - classification_loss: 0.2216 357/500 [====================>.........] - ETA: 34s - loss: 1.3883 - regression_loss: 1.1666 - classification_loss: 0.2217 358/500 [====================>.........] - ETA: 34s - loss: 1.3879 - regression_loss: 1.1662 - classification_loss: 0.2217 359/500 [====================>.........] - ETA: 34s - loss: 1.3855 - regression_loss: 1.1643 - classification_loss: 0.2212 360/500 [====================>.........] - ETA: 34s - loss: 1.3847 - regression_loss: 1.1637 - classification_loss: 0.2211 361/500 [====================>.........] 
- ETA: 33s - loss: 1.3825 - regression_loss: 1.1619 - classification_loss: 0.2206 362/500 [====================>.........] - ETA: 33s - loss: 1.3845 - regression_loss: 1.1633 - classification_loss: 0.2212 363/500 [====================>.........] - ETA: 33s - loss: 1.3845 - regression_loss: 1.1633 - classification_loss: 0.2212 364/500 [====================>.........] - ETA: 33s - loss: 1.3858 - regression_loss: 1.1644 - classification_loss: 0.2214 365/500 [====================>.........] - ETA: 32s - loss: 1.3863 - regression_loss: 1.1649 - classification_loss: 0.2215 366/500 [====================>.........] - ETA: 32s - loss: 1.3876 - regression_loss: 1.1659 - classification_loss: 0.2218 367/500 [=====================>........] - ETA: 32s - loss: 1.3886 - regression_loss: 1.1668 - classification_loss: 0.2218 368/500 [=====================>........] - ETA: 32s - loss: 1.3907 - regression_loss: 1.1684 - classification_loss: 0.2223 369/500 [=====================>........] - ETA: 31s - loss: 1.3913 - regression_loss: 1.1691 - classification_loss: 0.2222 370/500 [=====================>........] - ETA: 31s - loss: 1.3916 - regression_loss: 1.1694 - classification_loss: 0.2221 371/500 [=====================>........] - ETA: 31s - loss: 1.3929 - regression_loss: 1.1707 - classification_loss: 0.2223 372/500 [=====================>........] - ETA: 31s - loss: 1.3921 - regression_loss: 1.1701 - classification_loss: 0.2220 373/500 [=====================>........] - ETA: 30s - loss: 1.3913 - regression_loss: 1.1693 - classification_loss: 0.2220 374/500 [=====================>........] - ETA: 30s - loss: 1.3914 - regression_loss: 1.1694 - classification_loss: 0.2220 375/500 [=====================>........] - ETA: 30s - loss: 1.3904 - regression_loss: 1.1687 - classification_loss: 0.2217 376/500 [=====================>........] - ETA: 30s - loss: 1.3896 - regression_loss: 1.1680 - classification_loss: 0.2215 377/500 [=====================>........] 
- ETA: 29s - loss: 1.3891 - regression_loss: 1.1678 - classification_loss: 0.2213 378/500 [=====================>........] - ETA: 29s - loss: 1.3893 - regression_loss: 1.1681 - classification_loss: 0.2212 379/500 [=====================>........] - ETA: 29s - loss: 1.3897 - regression_loss: 1.1682 - classification_loss: 0.2215 380/500 [=====================>........] - ETA: 29s - loss: 1.3880 - regression_loss: 1.1669 - classification_loss: 0.2211 381/500 [=====================>........] - ETA: 28s - loss: 1.3888 - regression_loss: 1.1674 - classification_loss: 0.2215 382/500 [=====================>........] - ETA: 28s - loss: 1.3894 - regression_loss: 1.1678 - classification_loss: 0.2215 383/500 [=====================>........] - ETA: 28s - loss: 1.3906 - regression_loss: 1.1688 - classification_loss: 0.2218 384/500 [======================>.......] - ETA: 28s - loss: 1.3906 - regression_loss: 1.1689 - classification_loss: 0.2216 385/500 [======================>.......] - ETA: 27s - loss: 1.3905 - regression_loss: 1.1690 - classification_loss: 0.2215 386/500 [======================>.......] - ETA: 27s - loss: 1.3898 - regression_loss: 1.1685 - classification_loss: 0.2213 387/500 [======================>.......] - ETA: 27s - loss: 1.3909 - regression_loss: 1.1699 - classification_loss: 0.2210 388/500 [======================>.......] - ETA: 27s - loss: 1.3911 - regression_loss: 1.1700 - classification_loss: 0.2211 389/500 [======================>.......] - ETA: 26s - loss: 1.3908 - regression_loss: 1.1699 - classification_loss: 0.2210 390/500 [======================>.......] - ETA: 26s - loss: 1.3899 - regression_loss: 1.1692 - classification_loss: 0.2207 391/500 [======================>.......] - ETA: 26s - loss: 1.3893 - regression_loss: 1.1686 - classification_loss: 0.2207 392/500 [======================>.......] - ETA: 26s - loss: 1.3892 - regression_loss: 1.1686 - classification_loss: 0.2206 393/500 [======================>.......] 
- ETA: 26s - loss: 1.3892 - regression_loss: 1.1688 - classification_loss: 0.2204 394/500 [======================>.......] - ETA: 25s - loss: 1.3907 - regression_loss: 1.1702 - classification_loss: 0.2205 395/500 [======================>.......] - ETA: 25s - loss: 1.3893 - regression_loss: 1.1691 - classification_loss: 0.2203 396/500 [======================>.......] - ETA: 25s - loss: 1.3881 - regression_loss: 1.1680 - classification_loss: 0.2201 397/500 [======================>.......] - ETA: 25s - loss: 1.3881 - regression_loss: 1.1680 - classification_loss: 0.2201 398/500 [======================>.......] - ETA: 24s - loss: 1.3885 - regression_loss: 1.1685 - classification_loss: 0.2199 399/500 [======================>.......] - ETA: 24s - loss: 1.3881 - regression_loss: 1.1682 - classification_loss: 0.2199 400/500 [=======================>......] - ETA: 24s - loss: 1.3870 - regression_loss: 1.1674 - classification_loss: 0.2196 401/500 [=======================>......] - ETA: 24s - loss: 1.3868 - regression_loss: 1.1675 - classification_loss: 0.2193 402/500 [=======================>......] - ETA: 23s - loss: 1.3843 - regression_loss: 1.1653 - classification_loss: 0.2190 403/500 [=======================>......] - ETA: 23s - loss: 1.3827 - regression_loss: 1.1640 - classification_loss: 0.2188 404/500 [=======================>......] - ETA: 23s - loss: 1.3821 - regression_loss: 1.1637 - classification_loss: 0.2185 405/500 [=======================>......] - ETA: 23s - loss: 1.3823 - regression_loss: 1.1639 - classification_loss: 0.2184 406/500 [=======================>......] - ETA: 22s - loss: 1.3830 - regression_loss: 1.1646 - classification_loss: 0.2183 407/500 [=======================>......] - ETA: 22s - loss: 1.3843 - regression_loss: 1.1657 - classification_loss: 0.2186 408/500 [=======================>......] - ETA: 22s - loss: 1.3841 - regression_loss: 1.1655 - classification_loss: 0.2185 409/500 [=======================>......] 
- ETA: 22s - loss: 1.3840 - regression_loss: 1.1655 - classification_loss: 0.2184 410/500 [=======================>......] - ETA: 21s - loss: 1.3837 - regression_loss: 1.1653 - classification_loss: 0.2184 411/500 [=======================>......] - ETA: 21s - loss: 1.3834 - regression_loss: 1.1651 - classification_loss: 0.2183 412/500 [=======================>......] - ETA: 21s - loss: 1.3821 - regression_loss: 1.1639 - classification_loss: 0.2182 413/500 [=======================>......] - ETA: 21s - loss: 1.3823 - regression_loss: 1.1642 - classification_loss: 0.2181 414/500 [=======================>......] - ETA: 20s - loss: 1.3826 - regression_loss: 1.1645 - classification_loss: 0.2181 415/500 [=======================>......] - ETA: 20s - loss: 1.3842 - regression_loss: 1.1641 - classification_loss: 0.2200 416/500 [=======================>......] - ETA: 20s - loss: 1.3852 - regression_loss: 1.1651 - classification_loss: 0.2201 417/500 [========================>.....] - ETA: 20s - loss: 1.3853 - regression_loss: 1.1652 - classification_loss: 0.2200 418/500 [========================>.....] - ETA: 19s - loss: 1.3845 - regression_loss: 1.1647 - classification_loss: 0.2198 419/500 [========================>.....] - ETA: 19s - loss: 1.3859 - regression_loss: 1.1661 - classification_loss: 0.2198 420/500 [========================>.....] - ETA: 19s - loss: 1.3858 - regression_loss: 1.1661 - classification_loss: 0.2197 421/500 [========================>.....] - ETA: 19s - loss: 1.3878 - regression_loss: 1.1679 - classification_loss: 0.2198 422/500 [========================>.....] - ETA: 18s - loss: 1.3874 - regression_loss: 1.1677 - classification_loss: 0.2196 423/500 [========================>.....] - ETA: 18s - loss: 1.3869 - regression_loss: 1.1673 - classification_loss: 0.2196 424/500 [========================>.....] - ETA: 18s - loss: 1.3874 - regression_loss: 1.1677 - classification_loss: 0.2197 425/500 [========================>.....] 
- ETA: 18s - loss: 1.3887 - regression_loss: 1.1688 - classification_loss: 0.2199
[... per-batch progress updates for batches 426-499 of epoch 77 elided; loss held steady around 1.38-1.39 ...]
500/500 [==============================] - 122s 244ms/step - loss: 1.3876 - regression_loss: 1.1679 - classification_loss: 0.2197
326 instances of class plum with average precision: 0.8174
mAP: 0.8174
Epoch 00077: saving model to ./training/snapshots/resnet50_pascal_77.h5
Epoch 78/150
[... per-batch progress updates for batches 1-259 of epoch 78 elided; running loss fluctuated between roughly 0.69 (first batch) and 1.42, settling near 1.37-1.40 ...]
260/500 [==============>...............] 
- ETA: 57s - loss: 1.3739 - regression_loss: 1.1677 - classification_loss: 0.2062 261/500 [==============>...............] - ETA: 57s - loss: 1.3730 - regression_loss: 1.1671 - classification_loss: 0.2059 262/500 [==============>...............] - ETA: 57s - loss: 1.3755 - regression_loss: 1.1688 - classification_loss: 0.2067 263/500 [==============>...............] - ETA: 57s - loss: 1.3758 - regression_loss: 1.1689 - classification_loss: 0.2069 264/500 [==============>...............] - ETA: 56s - loss: 1.3767 - regression_loss: 1.1697 - classification_loss: 0.2070 265/500 [==============>...............] - ETA: 56s - loss: 1.3765 - regression_loss: 1.1698 - classification_loss: 0.2068 266/500 [==============>...............] - ETA: 56s - loss: 1.3758 - regression_loss: 1.1692 - classification_loss: 0.2066 267/500 [===============>..............] - ETA: 56s - loss: 1.3763 - regression_loss: 1.1696 - classification_loss: 0.2067 268/500 [===============>..............] - ETA: 55s - loss: 1.3774 - regression_loss: 1.1703 - classification_loss: 0.2070 269/500 [===============>..............] - ETA: 55s - loss: 1.3756 - regression_loss: 1.1686 - classification_loss: 0.2069 270/500 [===============>..............] - ETA: 55s - loss: 1.3737 - regression_loss: 1.1670 - classification_loss: 0.2067 271/500 [===============>..............] - ETA: 55s - loss: 1.3725 - regression_loss: 1.1660 - classification_loss: 0.2065 272/500 [===============>..............] - ETA: 54s - loss: 1.3722 - regression_loss: 1.1656 - classification_loss: 0.2066 273/500 [===============>..............] - ETA: 54s - loss: 1.3687 - regression_loss: 1.1627 - classification_loss: 0.2060 274/500 [===============>..............] - ETA: 54s - loss: 1.3697 - regression_loss: 1.1635 - classification_loss: 0.2063 275/500 [===============>..............] - ETA: 54s - loss: 1.3688 - regression_loss: 1.1628 - classification_loss: 0.2060 276/500 [===============>..............] 
- ETA: 53s - loss: 1.3656 - regression_loss: 1.1600 - classification_loss: 0.2056 277/500 [===============>..............] - ETA: 53s - loss: 1.3662 - regression_loss: 1.1604 - classification_loss: 0.2058 278/500 [===============>..............] - ETA: 53s - loss: 1.3656 - regression_loss: 1.1601 - classification_loss: 0.2055 279/500 [===============>..............] - ETA: 53s - loss: 1.3635 - regression_loss: 1.1584 - classification_loss: 0.2051 280/500 [===============>..............] - ETA: 52s - loss: 1.3617 - regression_loss: 1.1570 - classification_loss: 0.2047 281/500 [===============>..............] - ETA: 52s - loss: 1.3640 - regression_loss: 1.1586 - classification_loss: 0.2054 282/500 [===============>..............] - ETA: 52s - loss: 1.3646 - regression_loss: 1.1588 - classification_loss: 0.2057 283/500 [===============>..............] - ETA: 52s - loss: 1.3625 - regression_loss: 1.1571 - classification_loss: 0.2054 284/500 [================>.............] - ETA: 51s - loss: 1.3595 - regression_loss: 1.1546 - classification_loss: 0.2050 285/500 [================>.............] - ETA: 51s - loss: 1.3620 - regression_loss: 1.1565 - classification_loss: 0.2055 286/500 [================>.............] - ETA: 51s - loss: 1.3631 - regression_loss: 1.1573 - classification_loss: 0.2059 287/500 [================>.............] - ETA: 51s - loss: 1.3637 - regression_loss: 1.1578 - classification_loss: 0.2059 288/500 [================>.............] - ETA: 50s - loss: 1.3646 - regression_loss: 1.1582 - classification_loss: 0.2064 289/500 [================>.............] - ETA: 50s - loss: 1.3645 - regression_loss: 1.1582 - classification_loss: 0.2063 290/500 [================>.............] - ETA: 50s - loss: 1.3628 - regression_loss: 1.1569 - classification_loss: 0.2059 291/500 [================>.............] - ETA: 50s - loss: 1.3632 - regression_loss: 1.1574 - classification_loss: 0.2058 292/500 [================>.............] 
- ETA: 49s - loss: 1.3646 - regression_loss: 1.1585 - classification_loss: 0.2060 293/500 [================>.............] - ETA: 49s - loss: 1.3634 - regression_loss: 1.1573 - classification_loss: 0.2061 294/500 [================>.............] - ETA: 49s - loss: 1.3653 - regression_loss: 1.1586 - classification_loss: 0.2068 295/500 [================>.............] - ETA: 49s - loss: 1.3678 - regression_loss: 1.1603 - classification_loss: 0.2074 296/500 [================>.............] - ETA: 48s - loss: 1.3691 - regression_loss: 1.1614 - classification_loss: 0.2077 297/500 [================>.............] - ETA: 48s - loss: 1.3700 - regression_loss: 1.1621 - classification_loss: 0.2079 298/500 [================>.............] - ETA: 48s - loss: 1.3696 - regression_loss: 1.1619 - classification_loss: 0.2077 299/500 [================>.............] - ETA: 48s - loss: 1.3694 - regression_loss: 1.1619 - classification_loss: 0.2075 300/500 [=================>............] - ETA: 47s - loss: 1.3706 - regression_loss: 1.1630 - classification_loss: 0.2076 301/500 [=================>............] - ETA: 47s - loss: 1.3681 - regression_loss: 1.1607 - classification_loss: 0.2075 302/500 [=================>............] - ETA: 47s - loss: 1.3670 - regression_loss: 1.1597 - classification_loss: 0.2073 303/500 [=================>............] - ETA: 47s - loss: 1.3669 - regression_loss: 1.1598 - classification_loss: 0.2071 304/500 [=================>............] - ETA: 46s - loss: 1.3668 - regression_loss: 1.1597 - classification_loss: 0.2071 305/500 [=================>............] - ETA: 46s - loss: 1.3665 - regression_loss: 1.1596 - classification_loss: 0.2069 306/500 [=================>............] - ETA: 46s - loss: 1.3669 - regression_loss: 1.1598 - classification_loss: 0.2070 307/500 [=================>............] - ETA: 46s - loss: 1.3662 - regression_loss: 1.1593 - classification_loss: 0.2070 308/500 [=================>............] 
- ETA: 45s - loss: 1.3675 - regression_loss: 1.1604 - classification_loss: 0.2071 309/500 [=================>............] - ETA: 45s - loss: 1.3667 - regression_loss: 1.1601 - classification_loss: 0.2067 310/500 [=================>............] - ETA: 45s - loss: 1.3667 - regression_loss: 1.1601 - classification_loss: 0.2066 311/500 [=================>............] - ETA: 45s - loss: 1.3676 - regression_loss: 1.1607 - classification_loss: 0.2068 312/500 [=================>............] - ETA: 44s - loss: 1.3669 - regression_loss: 1.1603 - classification_loss: 0.2066 313/500 [=================>............] - ETA: 44s - loss: 1.3689 - regression_loss: 1.1618 - classification_loss: 0.2070 314/500 [=================>............] - ETA: 44s - loss: 1.3677 - regression_loss: 1.1609 - classification_loss: 0.2068 315/500 [=================>............] - ETA: 44s - loss: 1.3686 - regression_loss: 1.1616 - classification_loss: 0.2070 316/500 [=================>............] - ETA: 43s - loss: 1.3804 - regression_loss: 1.1635 - classification_loss: 0.2169 317/500 [==================>...........] - ETA: 43s - loss: 1.3794 - regression_loss: 1.1628 - classification_loss: 0.2166 318/500 [==================>...........] - ETA: 43s - loss: 1.3774 - regression_loss: 1.1612 - classification_loss: 0.2162 319/500 [==================>...........] - ETA: 43s - loss: 1.3778 - regression_loss: 1.1615 - classification_loss: 0.2163 320/500 [==================>...........] - ETA: 42s - loss: 1.3799 - regression_loss: 1.1630 - classification_loss: 0.2169 321/500 [==================>...........] - ETA: 42s - loss: 1.3778 - regression_loss: 1.1614 - classification_loss: 0.2164 322/500 [==================>...........] - ETA: 42s - loss: 1.3778 - regression_loss: 1.1611 - classification_loss: 0.2167 323/500 [==================>...........] - ETA: 42s - loss: 1.3780 - regression_loss: 1.1615 - classification_loss: 0.2166 324/500 [==================>...........] 
- ETA: 41s - loss: 1.3779 - regression_loss: 1.1613 - classification_loss: 0.2166 325/500 [==================>...........] - ETA: 41s - loss: 1.3752 - regression_loss: 1.1591 - classification_loss: 0.2161 326/500 [==================>...........] - ETA: 41s - loss: 1.3754 - regression_loss: 1.1593 - classification_loss: 0.2161 327/500 [==================>...........] - ETA: 41s - loss: 1.3750 - regression_loss: 1.1591 - classification_loss: 0.2159 328/500 [==================>...........] - ETA: 41s - loss: 1.3740 - regression_loss: 1.1585 - classification_loss: 0.2156 329/500 [==================>...........] - ETA: 40s - loss: 1.3744 - regression_loss: 1.1590 - classification_loss: 0.2154 330/500 [==================>...........] - ETA: 40s - loss: 1.3753 - regression_loss: 1.1597 - classification_loss: 0.2156 331/500 [==================>...........] - ETA: 40s - loss: 1.3746 - regression_loss: 1.1593 - classification_loss: 0.2153 332/500 [==================>...........] - ETA: 40s - loss: 1.3757 - regression_loss: 1.1601 - classification_loss: 0.2156 333/500 [==================>...........] - ETA: 39s - loss: 1.3774 - regression_loss: 1.1617 - classification_loss: 0.2156 334/500 [===================>..........] - ETA: 39s - loss: 1.3752 - regression_loss: 1.1599 - classification_loss: 0.2153 335/500 [===================>..........] - ETA: 39s - loss: 1.3726 - regression_loss: 1.1577 - classification_loss: 0.2149 336/500 [===================>..........] - ETA: 39s - loss: 1.3713 - regression_loss: 1.1567 - classification_loss: 0.2146 337/500 [===================>..........] - ETA: 38s - loss: 1.3702 - regression_loss: 1.1559 - classification_loss: 0.2143 338/500 [===================>..........] - ETA: 38s - loss: 1.3688 - regression_loss: 1.1545 - classification_loss: 0.2143 339/500 [===================>..........] - ETA: 38s - loss: 1.3682 - regression_loss: 1.1541 - classification_loss: 0.2141 340/500 [===================>..........] 
- ETA: 38s - loss: 1.3697 - regression_loss: 1.1550 - classification_loss: 0.2147 341/500 [===================>..........] - ETA: 37s - loss: 1.3708 - regression_loss: 1.1559 - classification_loss: 0.2149 342/500 [===================>..........] - ETA: 37s - loss: 1.3692 - regression_loss: 1.1546 - classification_loss: 0.2146 343/500 [===================>..........] - ETA: 37s - loss: 1.3718 - regression_loss: 1.1568 - classification_loss: 0.2150 344/500 [===================>..........] - ETA: 37s - loss: 1.3740 - regression_loss: 1.1586 - classification_loss: 0.2154 345/500 [===================>..........] - ETA: 37s - loss: 1.3742 - regression_loss: 1.1588 - classification_loss: 0.2154 346/500 [===================>..........] - ETA: 36s - loss: 1.3728 - regression_loss: 1.1576 - classification_loss: 0.2151 347/500 [===================>..........] - ETA: 36s - loss: 1.3731 - regression_loss: 1.1581 - classification_loss: 0.2150 348/500 [===================>..........] - ETA: 36s - loss: 1.3731 - regression_loss: 1.1581 - classification_loss: 0.2150 349/500 [===================>..........] - ETA: 36s - loss: 1.3739 - regression_loss: 1.1588 - classification_loss: 0.2151 350/500 [====================>.........] - ETA: 35s - loss: 1.3755 - regression_loss: 1.1600 - classification_loss: 0.2156 351/500 [====================>.........] - ETA: 35s - loss: 1.3749 - regression_loss: 1.1596 - classification_loss: 0.2153 352/500 [====================>.........] - ETA: 35s - loss: 1.3742 - regression_loss: 1.1591 - classification_loss: 0.2151 353/500 [====================>.........] - ETA: 35s - loss: 1.3737 - regression_loss: 1.1586 - classification_loss: 0.2151 354/500 [====================>.........] - ETA: 34s - loss: 1.3739 - regression_loss: 1.1587 - classification_loss: 0.2152 355/500 [====================>.........] - ETA: 34s - loss: 1.3717 - regression_loss: 1.1569 - classification_loss: 0.2148 356/500 [====================>.........] 
- ETA: 34s - loss: 1.3708 - regression_loss: 1.1563 - classification_loss: 0.2146 357/500 [====================>.........] - ETA: 34s - loss: 1.3718 - regression_loss: 1.1572 - classification_loss: 0.2146 358/500 [====================>.........] - ETA: 33s - loss: 1.3716 - regression_loss: 1.1573 - classification_loss: 0.2143 359/500 [====================>.........] - ETA: 33s - loss: 1.3738 - regression_loss: 1.1594 - classification_loss: 0.2144 360/500 [====================>.........] - ETA: 33s - loss: 1.3735 - regression_loss: 1.1591 - classification_loss: 0.2144 361/500 [====================>.........] - ETA: 33s - loss: 1.3745 - regression_loss: 1.1600 - classification_loss: 0.2145 362/500 [====================>.........] - ETA: 33s - loss: 1.3766 - regression_loss: 1.1619 - classification_loss: 0.2147 363/500 [====================>.........] - ETA: 32s - loss: 1.3760 - regression_loss: 1.1612 - classification_loss: 0.2148 364/500 [====================>.........] - ETA: 32s - loss: 1.3754 - regression_loss: 1.1603 - classification_loss: 0.2151 365/500 [====================>.........] - ETA: 32s - loss: 1.3761 - regression_loss: 1.1608 - classification_loss: 0.2153 366/500 [====================>.........] - ETA: 32s - loss: 1.3760 - regression_loss: 1.1606 - classification_loss: 0.2153 367/500 [=====================>........] - ETA: 31s - loss: 1.3754 - regression_loss: 1.1603 - classification_loss: 0.2151 368/500 [=====================>........] - ETA: 31s - loss: 1.3740 - regression_loss: 1.1592 - classification_loss: 0.2148 369/500 [=====================>........] - ETA: 31s - loss: 1.3721 - regression_loss: 1.1576 - classification_loss: 0.2145 370/500 [=====================>........] - ETA: 31s - loss: 1.3703 - regression_loss: 1.1561 - classification_loss: 0.2142 371/500 [=====================>........] - ETA: 30s - loss: 1.3688 - regression_loss: 1.1549 - classification_loss: 0.2139 372/500 [=====================>........] 
- ETA: 30s - loss: 1.3690 - regression_loss: 1.1552 - classification_loss: 0.2138 373/500 [=====================>........] - ETA: 30s - loss: 1.3697 - regression_loss: 1.1560 - classification_loss: 0.2137 374/500 [=====================>........] - ETA: 30s - loss: 1.3694 - regression_loss: 1.1558 - classification_loss: 0.2137 375/500 [=====================>........] - ETA: 29s - loss: 1.3696 - regression_loss: 1.1557 - classification_loss: 0.2139 376/500 [=====================>........] - ETA: 29s - loss: 1.3711 - regression_loss: 1.1571 - classification_loss: 0.2140 377/500 [=====================>........] - ETA: 29s - loss: 1.3723 - regression_loss: 1.1580 - classification_loss: 0.2142 378/500 [=====================>........] - ETA: 29s - loss: 1.3703 - regression_loss: 1.1565 - classification_loss: 0.2138 379/500 [=====================>........] - ETA: 29s - loss: 1.3701 - regression_loss: 1.1565 - classification_loss: 0.2136 380/500 [=====================>........] - ETA: 28s - loss: 1.3709 - regression_loss: 1.1571 - classification_loss: 0.2138 381/500 [=====================>........] - ETA: 28s - loss: 1.3713 - regression_loss: 1.1574 - classification_loss: 0.2139 382/500 [=====================>........] - ETA: 28s - loss: 1.3719 - regression_loss: 1.1580 - classification_loss: 0.2139 383/500 [=====================>........] - ETA: 28s - loss: 1.3728 - regression_loss: 1.1586 - classification_loss: 0.2142 384/500 [======================>.......] - ETA: 27s - loss: 1.3745 - regression_loss: 1.1598 - classification_loss: 0.2146 385/500 [======================>.......] - ETA: 27s - loss: 1.3742 - regression_loss: 1.1598 - classification_loss: 0.2144 386/500 [======================>.......] - ETA: 27s - loss: 1.3772 - regression_loss: 1.1623 - classification_loss: 0.2149 387/500 [======================>.......] - ETA: 27s - loss: 1.3779 - regression_loss: 1.1629 - classification_loss: 0.2150 388/500 [======================>.......] 
- ETA: 26s - loss: 1.3789 - regression_loss: 1.1637 - classification_loss: 0.2153 389/500 [======================>.......] - ETA: 26s - loss: 1.3796 - regression_loss: 1.1641 - classification_loss: 0.2154 390/500 [======================>.......] - ETA: 26s - loss: 1.3782 - regression_loss: 1.1630 - classification_loss: 0.2153 391/500 [======================>.......] - ETA: 26s - loss: 1.3780 - regression_loss: 1.1629 - classification_loss: 0.2152 392/500 [======================>.......] - ETA: 25s - loss: 1.3774 - regression_loss: 1.1624 - classification_loss: 0.2150 393/500 [======================>.......] - ETA: 25s - loss: 1.3757 - regression_loss: 1.1610 - classification_loss: 0.2147 394/500 [======================>.......] - ETA: 25s - loss: 1.3741 - regression_loss: 1.1598 - classification_loss: 0.2143 395/500 [======================>.......] - ETA: 25s - loss: 1.3728 - regression_loss: 1.1569 - classification_loss: 0.2159 396/500 [======================>.......] - ETA: 24s - loss: 1.3729 - regression_loss: 1.1569 - classification_loss: 0.2160 397/500 [======================>.......] - ETA: 24s - loss: 1.3726 - regression_loss: 1.1567 - classification_loss: 0.2159 398/500 [======================>.......] - ETA: 24s - loss: 1.3740 - regression_loss: 1.1576 - classification_loss: 0.2163 399/500 [======================>.......] - ETA: 24s - loss: 1.3735 - regression_loss: 1.1572 - classification_loss: 0.2163 400/500 [=======================>......] - ETA: 23s - loss: 1.3738 - regression_loss: 1.1574 - classification_loss: 0.2164 401/500 [=======================>......] - ETA: 23s - loss: 1.3734 - regression_loss: 1.1571 - classification_loss: 0.2163 402/500 [=======================>......] - ETA: 23s - loss: 1.3753 - regression_loss: 1.1587 - classification_loss: 0.2166 403/500 [=======================>......] - ETA: 23s - loss: 1.3766 - regression_loss: 1.1598 - classification_loss: 0.2167 404/500 [=======================>......] 
- ETA: 23s - loss: 1.3764 - regression_loss: 1.1597 - classification_loss: 0.2167 405/500 [=======================>......] - ETA: 22s - loss: 1.3763 - regression_loss: 1.1597 - classification_loss: 0.2166 406/500 [=======================>......] - ETA: 22s - loss: 1.3754 - regression_loss: 1.1590 - classification_loss: 0.2165 407/500 [=======================>......] - ETA: 22s - loss: 1.3754 - regression_loss: 1.1587 - classification_loss: 0.2167 408/500 [=======================>......] - ETA: 22s - loss: 1.3743 - regression_loss: 1.1578 - classification_loss: 0.2165 409/500 [=======================>......] - ETA: 21s - loss: 1.3736 - regression_loss: 1.1573 - classification_loss: 0.2162 410/500 [=======================>......] - ETA: 21s - loss: 1.3725 - regression_loss: 1.1566 - classification_loss: 0.2159 411/500 [=======================>......] - ETA: 21s - loss: 1.3732 - regression_loss: 1.1571 - classification_loss: 0.2161 412/500 [=======================>......] - ETA: 21s - loss: 1.3749 - regression_loss: 1.1585 - classification_loss: 0.2164 413/500 [=======================>......] - ETA: 20s - loss: 1.3754 - regression_loss: 1.1590 - classification_loss: 0.2164 414/500 [=======================>......] - ETA: 20s - loss: 1.3760 - regression_loss: 1.1597 - classification_loss: 0.2163 415/500 [=======================>......] - ETA: 20s - loss: 1.3748 - regression_loss: 1.1588 - classification_loss: 0.2160 416/500 [=======================>......] - ETA: 20s - loss: 1.3749 - regression_loss: 1.1589 - classification_loss: 0.2160 417/500 [========================>.....] - ETA: 19s - loss: 1.3739 - regression_loss: 1.1581 - classification_loss: 0.2158 418/500 [========================>.....] - ETA: 19s - loss: 1.3733 - regression_loss: 1.1577 - classification_loss: 0.2156 419/500 [========================>.....] - ETA: 19s - loss: 1.3733 - regression_loss: 1.1578 - classification_loss: 0.2155 420/500 [========================>.....] 
- ETA: 19s - loss: 1.3746 - regression_loss: 1.1589 - classification_loss: 0.2157 421/500 [========================>.....] - ETA: 18s - loss: 1.3722 - regression_loss: 1.1569 - classification_loss: 0.2153 422/500 [========================>.....] - ETA: 18s - loss: 1.3726 - regression_loss: 1.1573 - classification_loss: 0.2153 423/500 [========================>.....] - ETA: 18s - loss: 1.3716 - regression_loss: 1.1564 - classification_loss: 0.2152 424/500 [========================>.....] - ETA: 18s - loss: 1.3693 - regression_loss: 1.1545 - classification_loss: 0.2148 425/500 [========================>.....] - ETA: 18s - loss: 1.3675 - regression_loss: 1.1529 - classification_loss: 0.2146 426/500 [========================>.....] - ETA: 17s - loss: 1.3679 - regression_loss: 1.1531 - classification_loss: 0.2148 427/500 [========================>.....] - ETA: 17s - loss: 1.3679 - regression_loss: 1.1532 - classification_loss: 0.2147 428/500 [========================>.....] - ETA: 17s - loss: 1.3662 - regression_loss: 1.1517 - classification_loss: 0.2144 429/500 [========================>.....] - ETA: 17s - loss: 1.3671 - regression_loss: 1.1524 - classification_loss: 0.2147 430/500 [========================>.....] - ETA: 16s - loss: 1.3649 - regression_loss: 1.1505 - classification_loss: 0.2144 431/500 [========================>.....] - ETA: 16s - loss: 1.3652 - regression_loss: 1.1506 - classification_loss: 0.2146 432/500 [========================>.....] - ETA: 16s - loss: 1.3634 - regression_loss: 1.1491 - classification_loss: 0.2143 433/500 [========================>.....] - ETA: 16s - loss: 1.3636 - regression_loss: 1.1495 - classification_loss: 0.2141 434/500 [=========================>....] - ETA: 15s - loss: 1.3629 - regression_loss: 1.1490 - classification_loss: 0.2139 435/500 [=========================>....] - ETA: 15s - loss: 1.3612 - regression_loss: 1.1475 - classification_loss: 0.2137 436/500 [=========================>....] 
- ETA: 15s - loss: 1.3625 - regression_loss: 1.1485 - classification_loss: 0.2140 437/500 [=========================>....] - ETA: 15s - loss: 1.3639 - regression_loss: 1.1498 - classification_loss: 0.2141 438/500 [=========================>....] - ETA: 14s - loss: 1.3632 - regression_loss: 1.1493 - classification_loss: 0.2139 439/500 [=========================>....] - ETA: 14s - loss: 1.3624 - regression_loss: 1.1488 - classification_loss: 0.2136 440/500 [=========================>....] - ETA: 14s - loss: 1.3636 - regression_loss: 1.1498 - classification_loss: 0.2138 441/500 [=========================>....] - ETA: 14s - loss: 1.3647 - regression_loss: 1.1507 - classification_loss: 0.2140 442/500 [=========================>....] - ETA: 13s - loss: 1.3653 - regression_loss: 1.1512 - classification_loss: 0.2142 443/500 [=========================>....] - ETA: 13s - loss: 1.3690 - regression_loss: 1.1543 - classification_loss: 0.2147 444/500 [=========================>....] - ETA: 13s - loss: 1.3692 - regression_loss: 1.1546 - classification_loss: 0.2146 445/500 [=========================>....] - ETA: 13s - loss: 1.3709 - regression_loss: 1.1560 - classification_loss: 0.2148 446/500 [=========================>....] - ETA: 12s - loss: 1.3717 - regression_loss: 1.1567 - classification_loss: 0.2150 447/500 [=========================>....] - ETA: 12s - loss: 1.3723 - regression_loss: 1.1572 - classification_loss: 0.2151 448/500 [=========================>....] - ETA: 12s - loss: 1.3731 - regression_loss: 1.1579 - classification_loss: 0.2152 449/500 [=========================>....] - ETA: 12s - loss: 1.3728 - regression_loss: 1.1576 - classification_loss: 0.2152 450/500 [==========================>...] - ETA: 12s - loss: 1.3730 - regression_loss: 1.1578 - classification_loss: 0.2151 451/500 [==========================>...] - ETA: 11s - loss: 1.3724 - regression_loss: 1.1574 - classification_loss: 0.2150 452/500 [==========================>...] 
- ETA: 11s - loss: 1.3729 - regression_loss: 1.1578 - classification_loss: 0.2151 453/500 [==========================>...] - ETA: 11s - loss: 1.3752 - regression_loss: 1.1593 - classification_loss: 0.2158 454/500 [==========================>...] - ETA: 11s - loss: 1.3754 - regression_loss: 1.1598 - classification_loss: 0.2157 455/500 [==========================>...] - ETA: 10s - loss: 1.3777 - regression_loss: 1.1613 - classification_loss: 0.2164 456/500 [==========================>...] - ETA: 10s - loss: 1.3769 - regression_loss: 1.1607 - classification_loss: 0.2162 457/500 [==========================>...] - ETA: 10s - loss: 1.3760 - regression_loss: 1.1600 - classification_loss: 0.2160 458/500 [==========================>...] - ETA: 10s - loss: 1.3761 - regression_loss: 1.1600 - classification_loss: 0.2161 459/500 [==========================>...] - ETA: 9s - loss: 1.3750 - regression_loss: 1.1589 - classification_loss: 0.2160  460/500 [==========================>...] - ETA: 9s - loss: 1.3743 - regression_loss: 1.1584 - classification_loss: 0.2159 461/500 [==========================>...] - ETA: 9s - loss: 1.3739 - regression_loss: 1.1581 - classification_loss: 0.2158 462/500 [==========================>...] - ETA: 9s - loss: 1.3752 - regression_loss: 1.1590 - classification_loss: 0.2162 463/500 [==========================>...] - ETA: 8s - loss: 1.3754 - regression_loss: 1.1593 - classification_loss: 0.2161 464/500 [==========================>...] - ETA: 8s - loss: 1.3747 - regression_loss: 1.1588 - classification_loss: 0.2159 465/500 [==========================>...] - ETA: 8s - loss: 1.3740 - regression_loss: 1.1583 - classification_loss: 0.2158 466/500 [==========================>...] - ETA: 8s - loss: 1.3742 - regression_loss: 1.1586 - classification_loss: 0.2157 467/500 [===========================>..] - ETA: 7s - loss: 1.3729 - regression_loss: 1.1575 - classification_loss: 0.2154 468/500 [===========================>..] 
- ETA: 7s - loss: 1.3744 - regression_loss: 1.1588 - classification_loss: 0.2157 469/500 [===========================>..] - ETA: 7s - loss: 1.3742 - regression_loss: 1.1588 - classification_loss: 0.2154 470/500 [===========================>..] - ETA: 7s - loss: 1.3739 - regression_loss: 1.1587 - classification_loss: 0.2153 471/500 [===========================>..] - ETA: 6s - loss: 1.3725 - regression_loss: 1.1576 - classification_loss: 0.2149 472/500 [===========================>..] - ETA: 6s - loss: 1.3727 - regression_loss: 1.1579 - classification_loss: 0.2148 473/500 [===========================>..] - ETA: 6s - loss: 1.3718 - regression_loss: 1.1572 - classification_loss: 0.2146 474/500 [===========================>..] - ETA: 6s - loss: 1.3719 - regression_loss: 1.1572 - classification_loss: 0.2147 475/500 [===========================>..] - ETA: 6s - loss: 1.3714 - regression_loss: 1.1569 - classification_loss: 0.2145 476/500 [===========================>..] - ETA: 5s - loss: 1.3700 - regression_loss: 1.1558 - classification_loss: 0.2142 477/500 [===========================>..] - ETA: 5s - loss: 1.3700 - regression_loss: 1.1560 - classification_loss: 0.2140 478/500 [===========================>..] - ETA: 5s - loss: 1.3700 - regression_loss: 1.1562 - classification_loss: 0.2138 479/500 [===========================>..] - ETA: 5s - loss: 1.3685 - regression_loss: 1.1550 - classification_loss: 0.2136 480/500 [===========================>..] - ETA: 4s - loss: 1.3700 - regression_loss: 1.1563 - classification_loss: 0.2137 481/500 [===========================>..] - ETA: 4s - loss: 1.3687 - regression_loss: 1.1552 - classification_loss: 0.2135 482/500 [===========================>..] - ETA: 4s - loss: 1.3693 - regression_loss: 1.1557 - classification_loss: 0.2136 483/500 [===========================>..] - ETA: 4s - loss: 1.3704 - regression_loss: 1.1566 - classification_loss: 0.2138 484/500 [============================>.] 
- ETA: 3s - loss: 1.3703 - regression_loss: 1.1566 - classification_loss: 0.2137 485/500 [============================>.] - ETA: 3s - loss: 1.3709 - regression_loss: 1.1570 - classification_loss: 0.2139 486/500 [============================>.] - ETA: 3s - loss: 1.3704 - regression_loss: 1.1566 - classification_loss: 0.2138 487/500 [============================>.] - ETA: 3s - loss: 1.3707 - regression_loss: 1.1569 - classification_loss: 0.2138 488/500 [============================>.] - ETA: 2s - loss: 1.3702 - regression_loss: 1.1566 - classification_loss: 0.2136 489/500 [============================>.] - ETA: 2s - loss: 1.3703 - regression_loss: 1.1569 - classification_loss: 0.2134 490/500 [============================>.] - ETA: 2s - loss: 1.3715 - regression_loss: 1.1577 - classification_loss: 0.2138 491/500 [============================>.] - ETA: 2s - loss: 1.3725 - regression_loss: 1.1584 - classification_loss: 0.2141 492/500 [============================>.] - ETA: 1s - loss: 1.3728 - regression_loss: 1.1588 - classification_loss: 0.2139 493/500 [============================>.] - ETA: 1s - loss: 1.3735 - regression_loss: 1.1594 - classification_loss: 0.2141 494/500 [============================>.] - ETA: 1s - loss: 1.3739 - regression_loss: 1.1599 - classification_loss: 0.2140 495/500 [============================>.] - ETA: 1s - loss: 1.3751 - regression_loss: 1.1607 - classification_loss: 0.2143 496/500 [============================>.] - ETA: 0s - loss: 1.3753 - regression_loss: 1.1609 - classification_loss: 0.2144 497/500 [============================>.] - ETA: 0s - loss: 1.3751 - regression_loss: 1.1608 - classification_loss: 0.2143 498/500 [============================>.] - ETA: 0s - loss: 1.3754 - regression_loss: 1.1609 - classification_loss: 0.2145 499/500 [============================>.] 
500/500 [==============================] - 121s 241ms/step - loss: 1.3773 - regression_loss: 1.1609 - classification_loss: 0.2164
326 instances of class plum with average precision: 0.8131
mAP: 0.8131
Epoch 00078: saving model to ./training/snapshots/resnet50_pascal_78.h5
Epoch 79/150
[per-batch progress output elided: over batches 1-301 of epoch 79, loss settled from ~2.00 to ~1.35 (regression_loss ~1.14, classification_loss ~0.21), ETA ~1:49 at batch 1 down to ~47s at batch 301]
- ETA: 47s - loss: 1.3528 - regression_loss: 1.1420 - classification_loss: 0.2108 303/500 [=================>............] - ETA: 47s - loss: 1.3523 - regression_loss: 1.1417 - classification_loss: 0.2106 304/500 [=================>............] - ETA: 47s - loss: 1.3526 - regression_loss: 1.1420 - classification_loss: 0.2106 305/500 [=================>............] - ETA: 46s - loss: 1.3514 - regression_loss: 1.1411 - classification_loss: 0.2103 306/500 [=================>............] - ETA: 46s - loss: 1.3515 - regression_loss: 1.1412 - classification_loss: 0.2102 307/500 [=================>............] - ETA: 46s - loss: 1.3557 - regression_loss: 1.1436 - classification_loss: 0.2122 308/500 [=================>............] - ETA: 46s - loss: 1.3550 - regression_loss: 1.1431 - classification_loss: 0.2120 309/500 [=================>............] - ETA: 45s - loss: 1.3533 - regression_loss: 1.1418 - classification_loss: 0.2115 310/500 [=================>............] - ETA: 45s - loss: 1.3522 - regression_loss: 1.1409 - classification_loss: 0.2113 311/500 [=================>............] - ETA: 45s - loss: 1.3512 - regression_loss: 1.1401 - classification_loss: 0.2111 312/500 [=================>............] - ETA: 45s - loss: 1.3526 - regression_loss: 1.1408 - classification_loss: 0.2118 313/500 [=================>............] - ETA: 44s - loss: 1.3534 - regression_loss: 1.1416 - classification_loss: 0.2118 314/500 [=================>............] - ETA: 44s - loss: 1.3525 - regression_loss: 1.1409 - classification_loss: 0.2116 315/500 [=================>............] - ETA: 44s - loss: 1.3558 - regression_loss: 1.1435 - classification_loss: 0.2123 316/500 [=================>............] - ETA: 44s - loss: 1.3548 - regression_loss: 1.1428 - classification_loss: 0.2121 317/500 [==================>...........] - ETA: 44s - loss: 1.3543 - regression_loss: 1.1424 - classification_loss: 0.2118 318/500 [==================>...........] 
- ETA: 43s - loss: 1.3557 - regression_loss: 1.1434 - classification_loss: 0.2124 319/500 [==================>...........] - ETA: 43s - loss: 1.3586 - regression_loss: 1.1457 - classification_loss: 0.2129 320/500 [==================>...........] - ETA: 43s - loss: 1.3581 - regression_loss: 1.1456 - classification_loss: 0.2125 321/500 [==================>...........] - ETA: 43s - loss: 1.3590 - regression_loss: 1.1463 - classification_loss: 0.2127 322/500 [==================>...........] - ETA: 42s - loss: 1.3595 - regression_loss: 1.1467 - classification_loss: 0.2128 323/500 [==================>...........] - ETA: 42s - loss: 1.3601 - regression_loss: 1.1473 - classification_loss: 0.2128 324/500 [==================>...........] - ETA: 42s - loss: 1.3624 - regression_loss: 1.1488 - classification_loss: 0.2136 325/500 [==================>...........] - ETA: 42s - loss: 1.3617 - regression_loss: 1.1484 - classification_loss: 0.2133 326/500 [==================>...........] - ETA: 41s - loss: 1.3607 - regression_loss: 1.1472 - classification_loss: 0.2135 327/500 [==================>...........] - ETA: 41s - loss: 1.3614 - regression_loss: 1.1480 - classification_loss: 0.2135 328/500 [==================>...........] - ETA: 41s - loss: 1.3629 - regression_loss: 1.1494 - classification_loss: 0.2135 329/500 [==================>...........] - ETA: 41s - loss: 1.3634 - regression_loss: 1.1497 - classification_loss: 0.2137 330/500 [==================>...........] - ETA: 40s - loss: 1.3648 - regression_loss: 1.1508 - classification_loss: 0.2140 331/500 [==================>...........] - ETA: 40s - loss: 1.3666 - regression_loss: 1.1522 - classification_loss: 0.2145 332/500 [==================>...........] - ETA: 40s - loss: 1.3656 - regression_loss: 1.1515 - classification_loss: 0.2142 333/500 [==================>...........] - ETA: 40s - loss: 1.3665 - regression_loss: 1.1522 - classification_loss: 0.2143 334/500 [===================>..........] 
- ETA: 39s - loss: 1.3659 - regression_loss: 1.1518 - classification_loss: 0.2141 335/500 [===================>..........] - ETA: 39s - loss: 1.3646 - regression_loss: 1.1507 - classification_loss: 0.2139 336/500 [===================>..........] - ETA: 39s - loss: 1.3638 - regression_loss: 1.1501 - classification_loss: 0.2137 337/500 [===================>..........] - ETA: 39s - loss: 1.3629 - regression_loss: 1.1495 - classification_loss: 0.2134 338/500 [===================>..........] - ETA: 38s - loss: 1.3614 - regression_loss: 1.1483 - classification_loss: 0.2130 339/500 [===================>..........] - ETA: 38s - loss: 1.3645 - regression_loss: 1.1509 - classification_loss: 0.2136 340/500 [===================>..........] - ETA: 38s - loss: 1.3647 - regression_loss: 1.1512 - classification_loss: 0.2135 341/500 [===================>..........] - ETA: 38s - loss: 1.3628 - regression_loss: 1.1498 - classification_loss: 0.2131 342/500 [===================>..........] - ETA: 37s - loss: 1.3634 - regression_loss: 1.1503 - classification_loss: 0.2131 343/500 [===================>..........] - ETA: 37s - loss: 1.3663 - regression_loss: 1.1527 - classification_loss: 0.2136 344/500 [===================>..........] - ETA: 37s - loss: 1.3677 - regression_loss: 1.1539 - classification_loss: 0.2138 345/500 [===================>..........] - ETA: 37s - loss: 1.3676 - regression_loss: 1.1538 - classification_loss: 0.2138 346/500 [===================>..........] - ETA: 37s - loss: 1.3667 - regression_loss: 1.1532 - classification_loss: 0.2135 347/500 [===================>..........] - ETA: 36s - loss: 1.3664 - regression_loss: 1.1531 - classification_loss: 0.2133 348/500 [===================>..........] - ETA: 36s - loss: 1.3675 - regression_loss: 1.1542 - classification_loss: 0.2133 349/500 [===================>..........] - ETA: 36s - loss: 1.3708 - regression_loss: 1.1574 - classification_loss: 0.2134 350/500 [====================>.........] 
- ETA: 36s - loss: 1.3704 - regression_loss: 1.1572 - classification_loss: 0.2132 351/500 [====================>.........] - ETA: 35s - loss: 1.3721 - regression_loss: 1.1588 - classification_loss: 0.2133 352/500 [====================>.........] - ETA: 35s - loss: 1.3706 - regression_loss: 1.1577 - classification_loss: 0.2129 353/500 [====================>.........] - ETA: 35s - loss: 1.3692 - regression_loss: 1.1566 - classification_loss: 0.2127 354/500 [====================>.........] - ETA: 35s - loss: 1.3697 - regression_loss: 1.1569 - classification_loss: 0.2129 355/500 [====================>.........] - ETA: 34s - loss: 1.3703 - regression_loss: 1.1573 - classification_loss: 0.2130 356/500 [====================>.........] - ETA: 34s - loss: 1.3695 - regression_loss: 1.1566 - classification_loss: 0.2129 357/500 [====================>.........] - ETA: 34s - loss: 1.3678 - regression_loss: 1.1554 - classification_loss: 0.2125 358/500 [====================>.........] - ETA: 34s - loss: 1.3691 - regression_loss: 1.1564 - classification_loss: 0.2127 359/500 [====================>.........] - ETA: 33s - loss: 1.3703 - regression_loss: 1.1571 - classification_loss: 0.2132 360/500 [====================>.........] - ETA: 33s - loss: 1.3692 - regression_loss: 1.1562 - classification_loss: 0.2129 361/500 [====================>.........] - ETA: 33s - loss: 1.3685 - regression_loss: 1.1558 - classification_loss: 0.2126 362/500 [====================>.........] - ETA: 33s - loss: 1.3676 - regression_loss: 1.1552 - classification_loss: 0.2124 363/500 [====================>.........] - ETA: 32s - loss: 1.3699 - regression_loss: 1.1569 - classification_loss: 0.2129 364/500 [====================>.........] - ETA: 32s - loss: 1.3696 - regression_loss: 1.1569 - classification_loss: 0.2127 365/500 [====================>.........] - ETA: 32s - loss: 1.3697 - regression_loss: 1.1571 - classification_loss: 0.2126 366/500 [====================>.........] 
- ETA: 32s - loss: 1.3715 - regression_loss: 1.1588 - classification_loss: 0.2127 367/500 [=====================>........] - ETA: 31s - loss: 1.3717 - regression_loss: 1.1590 - classification_loss: 0.2127 368/500 [=====================>........] - ETA: 31s - loss: 1.3694 - regression_loss: 1.1571 - classification_loss: 0.2122 369/500 [=====================>........] - ETA: 31s - loss: 1.3711 - regression_loss: 1.1583 - classification_loss: 0.2128 370/500 [=====================>........] - ETA: 31s - loss: 1.3717 - regression_loss: 1.1587 - classification_loss: 0.2130 371/500 [=====================>........] - ETA: 30s - loss: 1.3733 - regression_loss: 1.1599 - classification_loss: 0.2134 372/500 [=====================>........] - ETA: 30s - loss: 1.3732 - regression_loss: 1.1598 - classification_loss: 0.2133 373/500 [=====================>........] - ETA: 30s - loss: 1.3726 - regression_loss: 1.1595 - classification_loss: 0.2131 374/500 [=====================>........] - ETA: 30s - loss: 1.3736 - regression_loss: 1.1602 - classification_loss: 0.2134 375/500 [=====================>........] - ETA: 29s - loss: 1.3741 - regression_loss: 1.1605 - classification_loss: 0.2136 376/500 [=====================>........] - ETA: 29s - loss: 1.3736 - regression_loss: 1.1602 - classification_loss: 0.2134 377/500 [=====================>........] - ETA: 29s - loss: 1.3757 - regression_loss: 1.1617 - classification_loss: 0.2140 378/500 [=====================>........] - ETA: 29s - loss: 1.3751 - regression_loss: 1.1612 - classification_loss: 0.2140 379/500 [=====================>........] - ETA: 29s - loss: 1.3745 - regression_loss: 1.1606 - classification_loss: 0.2139 380/500 [=====================>........] - ETA: 28s - loss: 1.3744 - regression_loss: 1.1606 - classification_loss: 0.2138 381/500 [=====================>........] - ETA: 28s - loss: 1.3739 - regression_loss: 1.1603 - classification_loss: 0.2136 382/500 [=====================>........] 
- ETA: 28s - loss: 1.3738 - regression_loss: 1.1601 - classification_loss: 0.2137 383/500 [=====================>........] - ETA: 28s - loss: 1.3749 - regression_loss: 1.1610 - classification_loss: 0.2139 384/500 [======================>.......] - ETA: 27s - loss: 1.3748 - regression_loss: 1.1609 - classification_loss: 0.2139 385/500 [======================>.......] - ETA: 27s - loss: 1.3742 - regression_loss: 1.1605 - classification_loss: 0.2138 386/500 [======================>.......] - ETA: 27s - loss: 1.3747 - regression_loss: 1.1608 - classification_loss: 0.2139 387/500 [======================>.......] - ETA: 27s - loss: 1.3761 - regression_loss: 1.1621 - classification_loss: 0.2140 388/500 [======================>.......] - ETA: 26s - loss: 1.3774 - regression_loss: 1.1630 - classification_loss: 0.2143 389/500 [======================>.......] - ETA: 26s - loss: 1.3748 - regression_loss: 1.1609 - classification_loss: 0.2139 390/500 [======================>.......] - ETA: 26s - loss: 1.3735 - regression_loss: 1.1597 - classification_loss: 0.2139 391/500 [======================>.......] - ETA: 26s - loss: 1.3709 - regression_loss: 1.1575 - classification_loss: 0.2134 392/500 [======================>.......] - ETA: 25s - loss: 1.3709 - regression_loss: 1.1572 - classification_loss: 0.2137 393/500 [======================>.......] - ETA: 25s - loss: 1.3722 - regression_loss: 1.1583 - classification_loss: 0.2139 394/500 [======================>.......] - ETA: 25s - loss: 1.3728 - regression_loss: 1.1588 - classification_loss: 0.2140 395/500 [======================>.......] - ETA: 25s - loss: 1.3731 - regression_loss: 1.1590 - classification_loss: 0.2142 396/500 [======================>.......] - ETA: 24s - loss: 1.3736 - regression_loss: 1.1593 - classification_loss: 0.2143 397/500 [======================>.......] - ETA: 24s - loss: 1.3719 - regression_loss: 1.1578 - classification_loss: 0.2141 398/500 [======================>.......] 
- ETA: 24s - loss: 1.3745 - regression_loss: 1.1595 - classification_loss: 0.2150 399/500 [======================>.......] - ETA: 24s - loss: 1.3722 - regression_loss: 1.1577 - classification_loss: 0.2146 400/500 [=======================>......] - ETA: 23s - loss: 1.3721 - regression_loss: 1.1575 - classification_loss: 0.2146 401/500 [=======================>......] - ETA: 23s - loss: 1.3709 - regression_loss: 1.1566 - classification_loss: 0.2143 402/500 [=======================>......] - ETA: 23s - loss: 1.3706 - regression_loss: 1.1563 - classification_loss: 0.2143 403/500 [=======================>......] - ETA: 23s - loss: 1.3695 - regression_loss: 1.1555 - classification_loss: 0.2140 404/500 [=======================>......] - ETA: 23s - loss: 1.3704 - regression_loss: 1.1561 - classification_loss: 0.2143 405/500 [=======================>......] - ETA: 22s - loss: 1.3714 - regression_loss: 1.1570 - classification_loss: 0.2144 406/500 [=======================>......] - ETA: 22s - loss: 1.3706 - regression_loss: 1.1560 - classification_loss: 0.2146 407/500 [=======================>......] - ETA: 22s - loss: 1.3701 - regression_loss: 1.1556 - classification_loss: 0.2144 408/500 [=======================>......] - ETA: 22s - loss: 1.3725 - regression_loss: 1.1574 - classification_loss: 0.2151 409/500 [=======================>......] - ETA: 21s - loss: 1.3719 - regression_loss: 1.1569 - classification_loss: 0.2150 410/500 [=======================>......] - ETA: 21s - loss: 1.3712 - regression_loss: 1.1563 - classification_loss: 0.2148 411/500 [=======================>......] - ETA: 21s - loss: 1.3733 - regression_loss: 1.1580 - classification_loss: 0.2154 412/500 [=======================>......] - ETA: 21s - loss: 1.3736 - regression_loss: 1.1582 - classification_loss: 0.2155 413/500 [=======================>......] - ETA: 20s - loss: 1.3735 - regression_loss: 1.1580 - classification_loss: 0.2155 414/500 [=======================>......] 
- ETA: 20s - loss: 1.3732 - regression_loss: 1.1578 - classification_loss: 0.2153 415/500 [=======================>......] - ETA: 20s - loss: 1.3726 - regression_loss: 1.1575 - classification_loss: 0.2152 416/500 [=======================>......] - ETA: 20s - loss: 1.3725 - regression_loss: 1.1575 - classification_loss: 0.2150 417/500 [========================>.....] - ETA: 19s - loss: 1.3718 - regression_loss: 1.1571 - classification_loss: 0.2147 418/500 [========================>.....] - ETA: 19s - loss: 1.3722 - regression_loss: 1.1576 - classification_loss: 0.2146 419/500 [========================>.....] - ETA: 19s - loss: 1.3731 - regression_loss: 1.1582 - classification_loss: 0.2149 420/500 [========================>.....] - ETA: 19s - loss: 1.3731 - regression_loss: 1.1581 - classification_loss: 0.2151 421/500 [========================>.....] - ETA: 18s - loss: 1.3720 - regression_loss: 1.1572 - classification_loss: 0.2148 422/500 [========================>.....] - ETA: 18s - loss: 1.3716 - regression_loss: 1.1568 - classification_loss: 0.2147 423/500 [========================>.....] - ETA: 18s - loss: 1.3715 - regression_loss: 1.1569 - classification_loss: 0.2146 424/500 [========================>.....] - ETA: 18s - loss: 1.3721 - regression_loss: 1.1574 - classification_loss: 0.2147 425/500 [========================>.....] - ETA: 17s - loss: 1.3719 - regression_loss: 1.1574 - classification_loss: 0.2145 426/500 [========================>.....] - ETA: 17s - loss: 1.3709 - regression_loss: 1.1566 - classification_loss: 0.2142 427/500 [========================>.....] - ETA: 17s - loss: 1.3709 - regression_loss: 1.1567 - classification_loss: 0.2143 428/500 [========================>.....] - ETA: 17s - loss: 1.3725 - regression_loss: 1.1580 - classification_loss: 0.2145 429/500 [========================>.....] - ETA: 17s - loss: 1.3749 - regression_loss: 1.1601 - classification_loss: 0.2148 430/500 [========================>.....] 
- ETA: 16s - loss: 1.3730 - regression_loss: 1.1586 - classification_loss: 0.2144 431/500 [========================>.....] - ETA: 16s - loss: 1.3714 - regression_loss: 1.1572 - classification_loss: 0.2142 432/500 [========================>.....] - ETA: 16s - loss: 1.3689 - regression_loss: 1.1551 - classification_loss: 0.2138 433/500 [========================>.....] - ETA: 16s - loss: 1.3671 - regression_loss: 1.1535 - classification_loss: 0.2135 434/500 [=========================>....] - ETA: 15s - loss: 1.3681 - regression_loss: 1.1545 - classification_loss: 0.2136 435/500 [=========================>....] - ETA: 15s - loss: 1.3681 - regression_loss: 1.1546 - classification_loss: 0.2135 436/500 [=========================>....] - ETA: 15s - loss: 1.3690 - regression_loss: 1.1554 - classification_loss: 0.2136 437/500 [=========================>....] - ETA: 15s - loss: 1.3679 - regression_loss: 1.1543 - classification_loss: 0.2136 438/500 [=========================>....] - ETA: 14s - loss: 1.3684 - regression_loss: 1.1548 - classification_loss: 0.2137 439/500 [=========================>....] - ETA: 14s - loss: 1.3678 - regression_loss: 1.1544 - classification_loss: 0.2134 440/500 [=========================>....] - ETA: 14s - loss: 1.3672 - regression_loss: 1.1540 - classification_loss: 0.2132 441/500 [=========================>....] - ETA: 14s - loss: 1.3681 - regression_loss: 1.1547 - classification_loss: 0.2134 442/500 [=========================>....] - ETA: 13s - loss: 1.3670 - regression_loss: 1.1538 - classification_loss: 0.2132 443/500 [=========================>....] - ETA: 13s - loss: 1.3679 - regression_loss: 1.1546 - classification_loss: 0.2133 444/500 [=========================>....] - ETA: 13s - loss: 1.3686 - regression_loss: 1.1552 - classification_loss: 0.2133 445/500 [=========================>....] - ETA: 13s - loss: 1.3695 - regression_loss: 1.1562 - classification_loss: 0.2133 446/500 [=========================>....] 
- ETA: 12s - loss: 1.3693 - regression_loss: 1.1561 - classification_loss: 0.2132 447/500 [=========================>....] - ETA: 12s - loss: 1.3681 - regression_loss: 1.1552 - classification_loss: 0.2129 448/500 [=========================>....] - ETA: 12s - loss: 1.3674 - regression_loss: 1.1545 - classification_loss: 0.2129 449/500 [=========================>....] - ETA: 12s - loss: 1.3677 - regression_loss: 1.1547 - classification_loss: 0.2130 450/500 [==========================>...] - ETA: 11s - loss: 1.3667 - regression_loss: 1.1540 - classification_loss: 0.2127 451/500 [==========================>...] - ETA: 11s - loss: 1.3660 - regression_loss: 1.1535 - classification_loss: 0.2126 452/500 [==========================>...] - ETA: 11s - loss: 1.3657 - regression_loss: 1.1533 - classification_loss: 0.2124 453/500 [==========================>...] - ETA: 11s - loss: 1.3660 - regression_loss: 1.1534 - classification_loss: 0.2125 454/500 [==========================>...] - ETA: 11s - loss: 1.3685 - regression_loss: 1.1555 - classification_loss: 0.2130 455/500 [==========================>...] - ETA: 10s - loss: 1.3679 - regression_loss: 1.1551 - classification_loss: 0.2128 456/500 [==========================>...] - ETA: 10s - loss: 1.3684 - regression_loss: 1.1556 - classification_loss: 0.2128 457/500 [==========================>...] - ETA: 10s - loss: 1.3679 - regression_loss: 1.1550 - classification_loss: 0.2129 458/500 [==========================>...] - ETA: 10s - loss: 1.3664 - regression_loss: 1.1537 - classification_loss: 0.2128 459/500 [==========================>...] - ETA: 9s - loss: 1.3668 - regression_loss: 1.1539 - classification_loss: 0.2129  460/500 [==========================>...] - ETA: 9s - loss: 1.3682 - regression_loss: 1.1550 - classification_loss: 0.2132 461/500 [==========================>...] - ETA: 9s - loss: 1.3671 - regression_loss: 1.1541 - classification_loss: 0.2130 462/500 [==========================>...] 
- ETA: 9s - loss: 1.3671 - regression_loss: 1.1540 - classification_loss: 0.2130 463/500 [==========================>...] - ETA: 8s - loss: 1.3666 - regression_loss: 1.1537 - classification_loss: 0.2129 464/500 [==========================>...] - ETA: 8s - loss: 1.3668 - regression_loss: 1.1539 - classification_loss: 0.2129 465/500 [==========================>...] - ETA: 8s - loss: 1.3654 - regression_loss: 1.1528 - classification_loss: 0.2126 466/500 [==========================>...] - ETA: 8s - loss: 1.3663 - regression_loss: 1.1534 - classification_loss: 0.2129 467/500 [===========================>..] - ETA: 7s - loss: 1.3682 - regression_loss: 1.1549 - classification_loss: 0.2132 468/500 [===========================>..] - ETA: 7s - loss: 1.3681 - regression_loss: 1.1550 - classification_loss: 0.2132 469/500 [===========================>..] - ETA: 7s - loss: 1.3689 - regression_loss: 1.1554 - classification_loss: 0.2135 470/500 [===========================>..] - ETA: 7s - loss: 1.3675 - regression_loss: 1.1543 - classification_loss: 0.2132 471/500 [===========================>..] - ETA: 6s - loss: 1.3660 - regression_loss: 1.1531 - classification_loss: 0.2129 472/500 [===========================>..] - ETA: 6s - loss: 1.3679 - regression_loss: 1.1544 - classification_loss: 0.2134 473/500 [===========================>..] - ETA: 6s - loss: 1.3673 - regression_loss: 1.1540 - classification_loss: 0.2133 474/500 [===========================>..] - ETA: 6s - loss: 1.3676 - regression_loss: 1.1541 - classification_loss: 0.2135 475/500 [===========================>..] - ETA: 6s - loss: 1.3673 - regression_loss: 1.1539 - classification_loss: 0.2134 476/500 [===========================>..] - ETA: 5s - loss: 1.3668 - regression_loss: 1.1536 - classification_loss: 0.2133 477/500 [===========================>..] - ETA: 5s - loss: 1.3683 - regression_loss: 1.1547 - classification_loss: 0.2135 478/500 [===========================>..] 
- ETA: 5s - loss: 1.3692 - regression_loss: 1.1554 - classification_loss: 0.2139 479/500 [===========================>..] - ETA: 5s - loss: 1.3684 - regression_loss: 1.1547 - classification_loss: 0.2137 480/500 [===========================>..] - ETA: 4s - loss: 1.3685 - regression_loss: 1.1547 - classification_loss: 0.2137 481/500 [===========================>..] - ETA: 4s - loss: 1.3694 - regression_loss: 1.1556 - classification_loss: 0.2138 482/500 [===========================>..] - ETA: 4s - loss: 1.3689 - regression_loss: 1.1551 - classification_loss: 0.2139 483/500 [===========================>..] - ETA: 4s - loss: 1.3691 - regression_loss: 1.1553 - classification_loss: 0.2138 484/500 [============================>.] - ETA: 3s - loss: 1.3684 - regression_loss: 1.1547 - classification_loss: 0.2137 485/500 [============================>.] - ETA: 3s - loss: 1.3678 - regression_loss: 1.1543 - classification_loss: 0.2135 486/500 [============================>.] - ETA: 3s - loss: 1.3690 - regression_loss: 1.1551 - classification_loss: 0.2139 487/500 [============================>.] - ETA: 3s - loss: 1.3691 - regression_loss: 1.1550 - classification_loss: 0.2141 488/500 [============================>.] - ETA: 2s - loss: 1.3697 - regression_loss: 1.1551 - classification_loss: 0.2145 489/500 [============================>.] - ETA: 2s - loss: 1.3697 - regression_loss: 1.1552 - classification_loss: 0.2145 490/500 [============================>.] - ETA: 2s - loss: 1.3703 - regression_loss: 1.1558 - classification_loss: 0.2145 491/500 [============================>.] - ETA: 2s - loss: 1.3705 - regression_loss: 1.1559 - classification_loss: 0.2146 492/500 [============================>.] - ETA: 1s - loss: 1.3700 - regression_loss: 1.1554 - classification_loss: 0.2146 493/500 [============================>.] - ETA: 1s - loss: 1.3692 - regression_loss: 1.1547 - classification_loss: 0.2145 494/500 [============================>.] 
- ETA: 1s - loss: 1.3671 - regression_loss: 1.1529 - classification_loss: 0.2142 495/500 [============================>.] - ETA: 1s - loss: 1.3668 - regression_loss: 1.1527 - classification_loss: 0.2141 496/500 [============================>.] - ETA: 0s - loss: 1.3672 - regression_loss: 1.1529 - classification_loss: 0.2143 497/500 [============================>.] - ETA: 0s - loss: 1.3683 - regression_loss: 1.1537 - classification_loss: 0.2146 498/500 [============================>.] - ETA: 0s - loss: 1.3689 - regression_loss: 1.1543 - classification_loss: 0.2147 499/500 [============================>.] - ETA: 0s - loss: 1.3702 - regression_loss: 1.1552 - classification_loss: 0.2151 500/500 [==============================] - 120s 241ms/step - loss: 1.3697 - regression_loss: 1.1548 - classification_loss: 0.2149 326 instances of class plum with average precision: 0.8026 mAP: 0.8026 Epoch 00079: saving model to ./training/snapshots/resnet50_pascal_79.h5 Epoch 80/150 1/500 [..............................] - ETA: 1:56 - loss: 1.9161 - regression_loss: 1.5553 - classification_loss: 0.3608 2/500 [..............................] - ETA: 1:57 - loss: 1.8260 - regression_loss: 1.4960 - classification_loss: 0.3300 3/500 [..............................] - ETA: 1:57 - loss: 1.6533 - regression_loss: 1.3648 - classification_loss: 0.2885 4/500 [..............................] - ETA: 1:57 - loss: 1.7060 - regression_loss: 1.4304 - classification_loss: 0.2757 5/500 [..............................] - ETA: 1:58 - loss: 1.6971 - regression_loss: 1.4201 - classification_loss: 0.2770 6/500 [..............................] - ETA: 1:58 - loss: 1.5708 - regression_loss: 1.3259 - classification_loss: 0.2449 7/500 [..............................] - ETA: 1:59 - loss: 1.6245 - regression_loss: 1.3562 - classification_loss: 0.2683 8/500 [..............................] - ETA: 1:59 - loss: 1.5023 - regression_loss: 1.1867 - classification_loss: 0.3156 9/500 [..............................] 
- ETA: 1:58 - loss: 1.4588 - regression_loss: 1.1626 - classification_loss: 0.2961
[per-batch progress-bar updates for steps 10-344 condensed; representative checkpoints:]
 10/500 [..............................] - ETA: 1:57 - loss: 1.4733 - regression_loss: 1.1758 - classification_loss: 0.2975
 50/500 [==>...........................] - ETA: 1:48 - loss: 1.3362 - regression_loss: 1.1300 - classification_loss: 0.2062
100/500 [=====>........................] - ETA: 1:36 - loss: 1.3862 - regression_loss: 1.1727 - classification_loss: 0.2135
150/500 [========>.....................] - ETA: 1:23 - loss: 1.3744 - regression_loss: 1.1666 - classification_loss: 0.2078
200/500 [===========>..................] - ETA: 1:11 - loss: 1.3842 - regression_loss: 1.1757 - classification_loss: 0.2085
250/500 [==============>...............] - ETA: 59s - loss: 1.3870 - regression_loss: 1.1793 - classification_loss: 0.2077
300/500 [=================>............] - ETA: 47s - loss: 1.3839 - regression_loss: 1.1751 - classification_loss: 0.2088
344/500 [===================>..........] - ETA: 37s - loss: 1.3936 - regression_loss: 1.1806 - classification_loss: 0.2131
345/500 [===================>..........] 
- ETA: 37s - loss: 1.3931 - regression_loss: 1.1802 - classification_loss: 0.2130 346/500 [===================>..........] - ETA: 36s - loss: 1.3941 - regression_loss: 1.1811 - classification_loss: 0.2130 347/500 [===================>..........] - ETA: 36s - loss: 1.3945 - regression_loss: 1.1813 - classification_loss: 0.2132 348/500 [===================>..........] - ETA: 36s - loss: 1.3935 - regression_loss: 1.1806 - classification_loss: 0.2129 349/500 [===================>..........] - ETA: 36s - loss: 1.3951 - regression_loss: 1.1819 - classification_loss: 0.2133 350/500 [====================>.........] - ETA: 35s - loss: 1.3943 - regression_loss: 1.1815 - classification_loss: 0.2128 351/500 [====================>.........] - ETA: 35s - loss: 1.3969 - regression_loss: 1.1836 - classification_loss: 0.2133 352/500 [====================>.........] - ETA: 35s - loss: 1.3991 - regression_loss: 1.1854 - classification_loss: 0.2137 353/500 [====================>.........] - ETA: 35s - loss: 1.3986 - regression_loss: 1.1849 - classification_loss: 0.2137 354/500 [====================>.........] - ETA: 34s - loss: 1.3986 - regression_loss: 1.1849 - classification_loss: 0.2137 355/500 [====================>.........] - ETA: 34s - loss: 1.3983 - regression_loss: 1.1845 - classification_loss: 0.2138 356/500 [====================>.........] - ETA: 34s - loss: 1.3990 - regression_loss: 1.1852 - classification_loss: 0.2138 357/500 [====================>.........] - ETA: 34s - loss: 1.3981 - regression_loss: 1.1845 - classification_loss: 0.2136 358/500 [====================>.........] - ETA: 34s - loss: 1.3995 - regression_loss: 1.1859 - classification_loss: 0.2136 359/500 [====================>.........] - ETA: 33s - loss: 1.3986 - regression_loss: 1.1853 - classification_loss: 0.2133 360/500 [====================>.........] - ETA: 33s - loss: 1.3986 - regression_loss: 1.1851 - classification_loss: 0.2134 361/500 [====================>.........] 
- ETA: 33s - loss: 1.3978 - regression_loss: 1.1846 - classification_loss: 0.2133 362/500 [====================>.........] - ETA: 33s - loss: 1.3992 - regression_loss: 1.1859 - classification_loss: 0.2133 363/500 [====================>.........] - ETA: 32s - loss: 1.3996 - regression_loss: 1.1862 - classification_loss: 0.2134 364/500 [====================>.........] - ETA: 32s - loss: 1.3987 - regression_loss: 1.1855 - classification_loss: 0.2132 365/500 [====================>.........] - ETA: 32s - loss: 1.3958 - regression_loss: 1.1831 - classification_loss: 0.2127 366/500 [====================>.........] - ETA: 32s - loss: 1.3948 - regression_loss: 1.1822 - classification_loss: 0.2126 367/500 [=====================>........] - ETA: 31s - loss: 1.3943 - regression_loss: 1.1817 - classification_loss: 0.2126 368/500 [=====================>........] - ETA: 31s - loss: 1.3936 - regression_loss: 1.1813 - classification_loss: 0.2124 369/500 [=====================>........] - ETA: 31s - loss: 1.3937 - regression_loss: 1.1814 - classification_loss: 0.2123 370/500 [=====================>........] - ETA: 31s - loss: 1.3927 - regression_loss: 1.1804 - classification_loss: 0.2123 371/500 [=====================>........] - ETA: 30s - loss: 1.3931 - regression_loss: 1.1808 - classification_loss: 0.2122 372/500 [=====================>........] - ETA: 30s - loss: 1.3932 - regression_loss: 1.1811 - classification_loss: 0.2121 373/500 [=====================>........] - ETA: 30s - loss: 1.3926 - regression_loss: 1.1807 - classification_loss: 0.2119 374/500 [=====================>........] - ETA: 30s - loss: 1.3931 - regression_loss: 1.1811 - classification_loss: 0.2120 375/500 [=====================>........] - ETA: 29s - loss: 1.3938 - regression_loss: 1.1817 - classification_loss: 0.2121 376/500 [=====================>........] - ETA: 29s - loss: 1.3936 - regression_loss: 1.1815 - classification_loss: 0.2121 377/500 [=====================>........] 
- ETA: 29s - loss: 1.3934 - regression_loss: 1.1812 - classification_loss: 0.2122 378/500 [=====================>........] - ETA: 29s - loss: 1.3919 - regression_loss: 1.1799 - classification_loss: 0.2120 379/500 [=====================>........] - ETA: 28s - loss: 1.3917 - regression_loss: 1.1798 - classification_loss: 0.2119 380/500 [=====================>........] - ETA: 28s - loss: 1.3909 - regression_loss: 1.1793 - classification_loss: 0.2117 381/500 [=====================>........] - ETA: 28s - loss: 1.3914 - regression_loss: 1.1800 - classification_loss: 0.2114 382/500 [=====================>........] - ETA: 28s - loss: 1.3927 - regression_loss: 1.1808 - classification_loss: 0.2119 383/500 [=====================>........] - ETA: 28s - loss: 1.3899 - regression_loss: 1.1784 - classification_loss: 0.2115 384/500 [======================>.......] - ETA: 27s - loss: 1.3910 - regression_loss: 1.1793 - classification_loss: 0.2117 385/500 [======================>.......] - ETA: 27s - loss: 1.3909 - regression_loss: 1.1793 - classification_loss: 0.2116 386/500 [======================>.......] - ETA: 27s - loss: 1.3920 - regression_loss: 1.1802 - classification_loss: 0.2117 387/500 [======================>.......] - ETA: 27s - loss: 1.3912 - regression_loss: 1.1796 - classification_loss: 0.2116 388/500 [======================>.......] - ETA: 26s - loss: 1.3910 - regression_loss: 1.1793 - classification_loss: 0.2116 389/500 [======================>.......] - ETA: 26s - loss: 1.3898 - regression_loss: 1.1784 - classification_loss: 0.2114 390/500 [======================>.......] - ETA: 26s - loss: 1.3889 - regression_loss: 1.1776 - classification_loss: 0.2112 391/500 [======================>.......] - ETA: 26s - loss: 1.3895 - regression_loss: 1.1782 - classification_loss: 0.2113 392/500 [======================>.......] - ETA: 25s - loss: 1.3893 - regression_loss: 1.1781 - classification_loss: 0.2112 393/500 [======================>.......] 
- ETA: 25s - loss: 1.3874 - regression_loss: 1.1764 - classification_loss: 0.2111 394/500 [======================>.......] - ETA: 25s - loss: 1.3870 - regression_loss: 1.1762 - classification_loss: 0.2108 395/500 [======================>.......] - ETA: 25s - loss: 1.3860 - regression_loss: 1.1755 - classification_loss: 0.2105 396/500 [======================>.......] - ETA: 24s - loss: 1.3859 - regression_loss: 1.1754 - classification_loss: 0.2104 397/500 [======================>.......] - ETA: 24s - loss: 1.3844 - regression_loss: 1.1742 - classification_loss: 0.2102 398/500 [======================>.......] - ETA: 24s - loss: 1.3861 - regression_loss: 1.1755 - classification_loss: 0.2107 399/500 [======================>.......] - ETA: 24s - loss: 1.3841 - regression_loss: 1.1738 - classification_loss: 0.2103 400/500 [=======================>......] - ETA: 23s - loss: 1.3847 - regression_loss: 1.1742 - classification_loss: 0.2105 401/500 [=======================>......] - ETA: 23s - loss: 1.3821 - regression_loss: 1.1719 - classification_loss: 0.2102 402/500 [=======================>......] - ETA: 23s - loss: 1.3820 - regression_loss: 1.1718 - classification_loss: 0.2101 403/500 [=======================>......] - ETA: 23s - loss: 1.3807 - regression_loss: 1.1708 - classification_loss: 0.2099 404/500 [=======================>......] - ETA: 23s - loss: 1.3804 - regression_loss: 1.1707 - classification_loss: 0.2098 405/500 [=======================>......] - ETA: 22s - loss: 1.3792 - regression_loss: 1.1696 - classification_loss: 0.2096 406/500 [=======================>......] - ETA: 22s - loss: 1.3823 - regression_loss: 1.1721 - classification_loss: 0.2101 407/500 [=======================>......] - ETA: 22s - loss: 1.3825 - regression_loss: 1.1724 - classification_loss: 0.2101 408/500 [=======================>......] - ETA: 22s - loss: 1.3813 - regression_loss: 1.1714 - classification_loss: 0.2099 409/500 [=======================>......] 
- ETA: 21s - loss: 1.3798 - regression_loss: 1.1701 - classification_loss: 0.2096 410/500 [=======================>......] - ETA: 21s - loss: 1.3787 - regression_loss: 1.1693 - classification_loss: 0.2094 411/500 [=======================>......] - ETA: 21s - loss: 1.3792 - regression_loss: 1.1698 - classification_loss: 0.2094 412/500 [=======================>......] - ETA: 21s - loss: 1.3786 - regression_loss: 1.1691 - classification_loss: 0.2095 413/500 [=======================>......] - ETA: 20s - loss: 1.3793 - regression_loss: 1.1696 - classification_loss: 0.2097 414/500 [=======================>......] - ETA: 20s - loss: 1.3794 - regression_loss: 1.1668 - classification_loss: 0.2126 415/500 [=======================>......] - ETA: 20s - loss: 1.3803 - regression_loss: 1.1675 - classification_loss: 0.2128 416/500 [=======================>......] - ETA: 20s - loss: 1.3804 - regression_loss: 1.1676 - classification_loss: 0.2127 417/500 [========================>.....] - ETA: 19s - loss: 1.3778 - regression_loss: 1.1655 - classification_loss: 0.2123 418/500 [========================>.....] - ETA: 19s - loss: 1.3783 - regression_loss: 1.1658 - classification_loss: 0.2124 419/500 [========================>.....] - ETA: 19s - loss: 1.3773 - regression_loss: 1.1649 - classification_loss: 0.2124 420/500 [========================>.....] - ETA: 19s - loss: 1.3784 - regression_loss: 1.1656 - classification_loss: 0.2127 421/500 [========================>.....] - ETA: 18s - loss: 1.3803 - regression_loss: 1.1663 - classification_loss: 0.2139 422/500 [========================>.....] - ETA: 18s - loss: 1.3802 - regression_loss: 1.1664 - classification_loss: 0.2138 423/500 [========================>.....] - ETA: 18s - loss: 1.3792 - regression_loss: 1.1656 - classification_loss: 0.2136 424/500 [========================>.....] - ETA: 18s - loss: 1.3789 - regression_loss: 1.1656 - classification_loss: 0.2134 425/500 [========================>.....] 
- ETA: 17s - loss: 1.3772 - regression_loss: 1.1642 - classification_loss: 0.2130 426/500 [========================>.....] - ETA: 17s - loss: 1.3779 - regression_loss: 1.1650 - classification_loss: 0.2129 427/500 [========================>.....] - ETA: 17s - loss: 1.3779 - regression_loss: 1.1650 - classification_loss: 0.2129 428/500 [========================>.....] - ETA: 17s - loss: 1.3791 - regression_loss: 1.1658 - classification_loss: 0.2134 429/500 [========================>.....] - ETA: 17s - loss: 1.3796 - regression_loss: 1.1664 - classification_loss: 0.2132 430/500 [========================>.....] - ETA: 16s - loss: 1.3790 - regression_loss: 1.1660 - classification_loss: 0.2130 431/500 [========================>.....] - ETA: 16s - loss: 1.3784 - regression_loss: 1.1656 - classification_loss: 0.2129 432/500 [========================>.....] - ETA: 16s - loss: 1.3770 - regression_loss: 1.1643 - classification_loss: 0.2126 433/500 [========================>.....] - ETA: 16s - loss: 1.3761 - regression_loss: 1.1637 - classification_loss: 0.2124 434/500 [=========================>....] - ETA: 15s - loss: 1.3767 - regression_loss: 1.1642 - classification_loss: 0.2125 435/500 [=========================>....] - ETA: 15s - loss: 1.3748 - regression_loss: 1.1627 - classification_loss: 0.2122 436/500 [=========================>....] - ETA: 15s - loss: 1.3740 - regression_loss: 1.1621 - classification_loss: 0.2119 437/500 [=========================>....] - ETA: 15s - loss: 1.3728 - regression_loss: 1.1612 - classification_loss: 0.2116 438/500 [=========================>....] - ETA: 14s - loss: 1.3753 - regression_loss: 1.1631 - classification_loss: 0.2122 439/500 [=========================>....] - ETA: 14s - loss: 1.3746 - regression_loss: 1.1625 - classification_loss: 0.2121 440/500 [=========================>....] - ETA: 14s - loss: 1.3752 - regression_loss: 1.1630 - classification_loss: 0.2122 441/500 [=========================>....] 
- ETA: 14s - loss: 1.3743 - regression_loss: 1.1623 - classification_loss: 0.2120 442/500 [=========================>....] - ETA: 13s - loss: 1.3746 - regression_loss: 1.1626 - classification_loss: 0.2120 443/500 [=========================>....] - ETA: 13s - loss: 1.3765 - regression_loss: 1.1640 - classification_loss: 0.2125 444/500 [=========================>....] - ETA: 13s - loss: 1.3761 - regression_loss: 1.1638 - classification_loss: 0.2123 445/500 [=========================>....] - ETA: 13s - loss: 1.3742 - regression_loss: 1.1622 - classification_loss: 0.2120 446/500 [=========================>....] - ETA: 12s - loss: 1.3746 - regression_loss: 1.1625 - classification_loss: 0.2120 447/500 [=========================>....] - ETA: 12s - loss: 1.3740 - regression_loss: 1.1622 - classification_loss: 0.2118 448/500 [=========================>....] - ETA: 12s - loss: 1.3735 - regression_loss: 1.1618 - classification_loss: 0.2117 449/500 [=========================>....] - ETA: 12s - loss: 1.3737 - regression_loss: 1.1618 - classification_loss: 0.2119 450/500 [==========================>...] - ETA: 11s - loss: 1.3735 - regression_loss: 1.1618 - classification_loss: 0.2117 451/500 [==========================>...] - ETA: 11s - loss: 1.3715 - regression_loss: 1.1602 - classification_loss: 0.2113 452/500 [==========================>...] - ETA: 11s - loss: 1.3720 - regression_loss: 1.1605 - classification_loss: 0.2114 453/500 [==========================>...] - ETA: 11s - loss: 1.3718 - regression_loss: 1.1605 - classification_loss: 0.2113 454/500 [==========================>...] - ETA: 11s - loss: 1.3716 - regression_loss: 1.1603 - classification_loss: 0.2113 455/500 [==========================>...] - ETA: 10s - loss: 1.3721 - regression_loss: 1.1606 - classification_loss: 0.2115 456/500 [==========================>...] - ETA: 10s - loss: 1.3702 - regression_loss: 1.1590 - classification_loss: 0.2112 457/500 [==========================>...] 
- ETA: 10s - loss: 1.3721 - regression_loss: 1.1605 - classification_loss: 0.2116 458/500 [==========================>...] - ETA: 10s - loss: 1.3741 - regression_loss: 1.1622 - classification_loss: 0.2119 459/500 [==========================>...] - ETA: 9s - loss: 1.3743 - regression_loss: 1.1621 - classification_loss: 0.2122  460/500 [==========================>...] - ETA: 9s - loss: 1.3754 - regression_loss: 1.1631 - classification_loss: 0.2123 461/500 [==========================>...] - ETA: 9s - loss: 1.3742 - regression_loss: 1.1622 - classification_loss: 0.2120 462/500 [==========================>...] - ETA: 9s - loss: 1.3753 - regression_loss: 1.1631 - classification_loss: 0.2122 463/500 [==========================>...] - ETA: 8s - loss: 1.3763 - regression_loss: 1.1638 - classification_loss: 0.2125 464/500 [==========================>...] - ETA: 8s - loss: 1.3759 - regression_loss: 1.1635 - classification_loss: 0.2125 465/500 [==========================>...] - ETA: 8s - loss: 1.3764 - regression_loss: 1.1639 - classification_loss: 0.2125 466/500 [==========================>...] - ETA: 8s - loss: 1.3758 - regression_loss: 1.1633 - classification_loss: 0.2125 467/500 [===========================>..] - ETA: 7s - loss: 1.3759 - regression_loss: 1.1634 - classification_loss: 0.2124 468/500 [===========================>..] - ETA: 7s - loss: 1.3747 - regression_loss: 1.1626 - classification_loss: 0.2122 469/500 [===========================>..] - ETA: 7s - loss: 1.3749 - regression_loss: 1.1628 - classification_loss: 0.2120 470/500 [===========================>..] - ETA: 7s - loss: 1.3759 - regression_loss: 1.1635 - classification_loss: 0.2124 471/500 [===========================>..] - ETA: 6s - loss: 1.3744 - regression_loss: 1.1623 - classification_loss: 0.2121 472/500 [===========================>..] - ETA: 6s - loss: 1.3744 - regression_loss: 1.1624 - classification_loss: 0.2120 473/500 [===========================>..] 
- ETA: 6s - loss: 1.3751 - regression_loss: 1.1629 - classification_loss: 0.2122 474/500 [===========================>..] - ETA: 6s - loss: 1.3740 - regression_loss: 1.1619 - classification_loss: 0.2120 475/500 [===========================>..] - ETA: 5s - loss: 1.3751 - regression_loss: 1.1629 - classification_loss: 0.2121 476/500 [===========================>..] - ETA: 5s - loss: 1.3756 - regression_loss: 1.1635 - classification_loss: 0.2121 477/500 [===========================>..] - ETA: 5s - loss: 1.3763 - regression_loss: 1.1640 - classification_loss: 0.2123 478/500 [===========================>..] - ETA: 5s - loss: 1.3755 - regression_loss: 1.1632 - classification_loss: 0.2122 479/500 [===========================>..] - ETA: 5s - loss: 1.3753 - regression_loss: 1.1632 - classification_loss: 0.2121 480/500 [===========================>..] - ETA: 4s - loss: 1.3748 - regression_loss: 1.1628 - classification_loss: 0.2120 481/500 [===========================>..] - ETA: 4s - loss: 1.3753 - regression_loss: 1.1631 - classification_loss: 0.2123 482/500 [===========================>..] - ETA: 4s - loss: 1.3769 - regression_loss: 1.1643 - classification_loss: 0.2127 483/500 [===========================>..] - ETA: 4s - loss: 1.3777 - regression_loss: 1.1650 - classification_loss: 0.2127 484/500 [============================>.] - ETA: 3s - loss: 1.3769 - regression_loss: 1.1644 - classification_loss: 0.2124 485/500 [============================>.] - ETA: 3s - loss: 1.3775 - regression_loss: 1.1649 - classification_loss: 0.2126 486/500 [============================>.] - ETA: 3s - loss: 1.3783 - regression_loss: 1.1656 - classification_loss: 0.2126 487/500 [============================>.] - ETA: 3s - loss: 1.3784 - regression_loss: 1.1657 - classification_loss: 0.2127 488/500 [============================>.] - ETA: 2s - loss: 1.3773 - regression_loss: 1.1646 - classification_loss: 0.2127 489/500 [============================>.] 
- ETA: 2s - loss: 1.3763 - regression_loss: 1.1638 - classification_loss: 0.2125 490/500 [============================>.] - ETA: 2s - loss: 1.3745 - regression_loss: 1.1623 - classification_loss: 0.2123 491/500 [============================>.] - ETA: 2s - loss: 1.3735 - regression_loss: 1.1614 - classification_loss: 0.2121 492/500 [============================>.] - ETA: 1s - loss: 1.3727 - regression_loss: 1.1607 - classification_loss: 0.2120 493/500 [============================>.] - ETA: 1s - loss: 1.3733 - regression_loss: 1.1611 - classification_loss: 0.2122 494/500 [============================>.] - ETA: 1s - loss: 1.3735 - regression_loss: 1.1614 - classification_loss: 0.2122 495/500 [============================>.] - ETA: 1s - loss: 1.3734 - regression_loss: 1.1613 - classification_loss: 0.2120 496/500 [============================>.] - ETA: 0s - loss: 1.3724 - regression_loss: 1.1606 - classification_loss: 0.2118 497/500 [============================>.] - ETA: 0s - loss: 1.3719 - regression_loss: 1.1602 - classification_loss: 0.2117 498/500 [============================>.] - ETA: 0s - loss: 1.3719 - regression_loss: 1.1603 - classification_loss: 0.2116 499/500 [============================>.] - ETA: 0s - loss: 1.3721 - regression_loss: 1.1606 - classification_loss: 0.2115 500/500 [==============================] - 120s 240ms/step - loss: 1.3727 - regression_loss: 1.1613 - classification_loss: 0.2114 326 instances of class plum with average precision: 0.8057 mAP: 0.8057 Epoch 00080: saving model to ./training/snapshots/resnet50_pascal_80.h5 Epoch 81/150 1/500 [..............................] - ETA: 1:56 - loss: 1.3855 - regression_loss: 1.1942 - classification_loss: 0.1914 2/500 [..............................] - ETA: 1:55 - loss: 1.2580 - regression_loss: 1.0560 - classification_loss: 0.2020 3/500 [..............................] - ETA: 1:57 - loss: 1.2045 - regression_loss: 1.0139 - classification_loss: 0.1906 4/500 [..............................] 
- ETA: 1:58 - loss: 1.1973 - regression_loss: 1.0230 - classification_loss: 0.1744 5/500 [..............................] - ETA: 1:58 - loss: 1.4140 - regression_loss: 1.1961 - classification_loss: 0.2179 6/500 [..............................] - ETA: 1:57 - loss: 1.3661 - regression_loss: 1.1401 - classification_loss: 0.2260 7/500 [..............................] - ETA: 1:57 - loss: 1.4039 - regression_loss: 1.1913 - classification_loss: 0.2126 8/500 [..............................] - ETA: 1:55 - loss: 1.3866 - regression_loss: 1.1839 - classification_loss: 0.2027 9/500 [..............................] - ETA: 1:54 - loss: 1.5178 - regression_loss: 1.2589 - classification_loss: 0.2588 10/500 [..............................] - ETA: 1:54 - loss: 1.5004 - regression_loss: 1.2523 - classification_loss: 0.2481 11/500 [..............................] - ETA: 1:55 - loss: 1.4370 - regression_loss: 1.2050 - classification_loss: 0.2320 12/500 [..............................] - ETA: 1:55 - loss: 1.4644 - regression_loss: 1.2253 - classification_loss: 0.2391 13/500 [..............................] - ETA: 1:55 - loss: 1.4989 - regression_loss: 1.2566 - classification_loss: 0.2423 14/500 [..............................] - ETA: 1:55 - loss: 1.4757 - regression_loss: 1.2359 - classification_loss: 0.2398 15/500 [..............................] - ETA: 1:56 - loss: 1.5291 - regression_loss: 1.2721 - classification_loss: 0.2569 16/500 [..............................] - ETA: 1:55 - loss: 1.4801 - regression_loss: 1.2329 - classification_loss: 0.2472 17/500 [>.............................] - ETA: 1:55 - loss: 1.4735 - regression_loss: 1.2308 - classification_loss: 0.2427 18/500 [>.............................] - ETA: 1:55 - loss: 1.4758 - regression_loss: 1.2322 - classification_loss: 0.2436 19/500 [>.............................] - ETA: 1:55 - loss: 1.4740 - regression_loss: 1.2305 - classification_loss: 0.2435 20/500 [>.............................] 
- ETA: 1:55 - loss: 1.4894 - regression_loss: 1.2468 - classification_loss: 0.2426 21/500 [>.............................] - ETA: 1:55 - loss: 1.4349 - regression_loss: 1.2011 - classification_loss: 0.2339 22/500 [>.............................] - ETA: 1:54 - loss: 1.4608 - regression_loss: 1.2226 - classification_loss: 0.2382 23/500 [>.............................] - ETA: 1:54 - loss: 1.4740 - regression_loss: 1.2336 - classification_loss: 0.2404 24/500 [>.............................] - ETA: 1:53 - loss: 1.4476 - regression_loss: 1.2119 - classification_loss: 0.2357 25/500 [>.............................] - ETA: 1:53 - loss: 1.4744 - regression_loss: 1.2310 - classification_loss: 0.2434 26/500 [>.............................] - ETA: 1:53 - loss: 1.4642 - regression_loss: 1.2250 - classification_loss: 0.2393 27/500 [>.............................] - ETA: 1:53 - loss: 1.4473 - regression_loss: 1.2122 - classification_loss: 0.2351 28/500 [>.............................] - ETA: 1:52 - loss: 1.4312 - regression_loss: 1.2000 - classification_loss: 0.2312 29/500 [>.............................] - ETA: 1:52 - loss: 1.4255 - regression_loss: 1.1966 - classification_loss: 0.2289 30/500 [>.............................] - ETA: 1:52 - loss: 1.4391 - regression_loss: 1.2052 - classification_loss: 0.2339 31/500 [>.............................] - ETA: 1:52 - loss: 1.4304 - regression_loss: 1.1991 - classification_loss: 0.2313 32/500 [>.............................] - ETA: 1:52 - loss: 1.4277 - regression_loss: 1.1981 - classification_loss: 0.2297 33/500 [>.............................] - ETA: 1:52 - loss: 1.4452 - regression_loss: 1.2129 - classification_loss: 0.2323 34/500 [=>............................] - ETA: 1:51 - loss: 1.4465 - regression_loss: 1.2146 - classification_loss: 0.2319 35/500 [=>............................] - ETA: 1:51 - loss: 1.4637 - regression_loss: 1.2260 - classification_loss: 0.2377 36/500 [=>............................] 
- ETA: 1:51 - loss: 1.4709 - regression_loss: 1.2306 - classification_loss: 0.2402 37/500 [=>............................] - ETA: 1:50 - loss: 1.4633 - regression_loss: 1.2268 - classification_loss: 0.2365 38/500 [=>............................] - ETA: 1:50 - loss: 1.4660 - regression_loss: 1.2286 - classification_loss: 0.2374 39/500 [=>............................] - ETA: 1:50 - loss: 1.4413 - regression_loss: 1.2069 - classification_loss: 0.2343 40/500 [=>............................] - ETA: 1:50 - loss: 1.4457 - regression_loss: 1.2087 - classification_loss: 0.2370 41/500 [=>............................] - ETA: 1:50 - loss: 1.4407 - regression_loss: 1.2054 - classification_loss: 0.2353 42/500 [=>............................] - ETA: 1:50 - loss: 1.4419 - regression_loss: 1.2072 - classification_loss: 0.2346 43/500 [=>............................] - ETA: 1:49 - loss: 1.4427 - regression_loss: 1.2085 - classification_loss: 0.2342 44/500 [=>............................] - ETA: 1:49 - loss: 1.4290 - regression_loss: 1.1967 - classification_loss: 0.2323 45/500 [=>............................] - ETA: 1:49 - loss: 1.4167 - regression_loss: 1.1877 - classification_loss: 0.2290 46/500 [=>............................] - ETA: 1:49 - loss: 1.4255 - regression_loss: 1.1947 - classification_loss: 0.2309 47/500 [=>............................] - ETA: 1:49 - loss: 1.4179 - regression_loss: 1.1897 - classification_loss: 0.2282 48/500 [=>............................] - ETA: 1:48 - loss: 1.3993 - regression_loss: 1.1751 - classification_loss: 0.2242 49/500 [=>............................] - ETA: 1:48 - loss: 1.3997 - regression_loss: 1.1755 - classification_loss: 0.2242 50/500 [==>...........................] - ETA: 1:48 - loss: 1.3957 - regression_loss: 1.1725 - classification_loss: 0.2233 51/500 [==>...........................] - ETA: 1:47 - loss: 1.4027 - regression_loss: 1.1801 - classification_loss: 0.2225 52/500 [==>...........................] 
- ETA: 1:47 - loss: 1.4036 - regression_loss: 1.1802 - classification_loss: 0.2235
 53/500 [==>...........................] - ETA: 1:47 - loss: 1.4102 - regression_loss: 1.1857 - classification_loss: 0.2246
 54/500 [==>...........................] - ETA: 1:46 - loss: 1.4250 - regression_loss: 1.1972 - classification_loss: 0.2278
 55/500 [==>...........................] - ETA: 1:46 - loss: 1.4269 - regression_loss: 1.1996 - classification_loss: 0.2273
 56/500 [==>...........................] - ETA: 1:46 - loss: 1.4235 - regression_loss: 1.1970 - classification_loss: 0.2266
 57/500 [==>...........................] - ETA: 1:46 - loss: 1.4214 - regression_loss: 1.1954 - classification_loss: 0.2261
 58/500 [==>...........................] - ETA: 1:46 - loss: 1.4442 - regression_loss: 1.2145 - classification_loss: 0.2298
 59/500 [==>...........................] - ETA: 1:46 - loss: 1.4425 - regression_loss: 1.2143 - classification_loss: 0.2281
 60/500 [==>...........................] - ETA: 1:45 - loss: 1.4423 - regression_loss: 1.2145 - classification_loss: 0.2278
 61/500 [==>...........................] - ETA: 1:45 - loss: 1.4389 - regression_loss: 1.2120 - classification_loss: 0.2268
 62/500 [==>...........................] - ETA: 1:45 - loss: 1.4339 - regression_loss: 1.2082 - classification_loss: 0.2257
 63/500 [==>...........................] - ETA: 1:45 - loss: 1.4209 - regression_loss: 1.1969 - classification_loss: 0.2240
 64/500 [==>...........................] - ETA: 1:44 - loss: 1.4151 - regression_loss: 1.1898 - classification_loss: 0.2253
 65/500 [==>...........................] - ETA: 1:44 - loss: 1.4146 - regression_loss: 1.1896 - classification_loss: 0.2250
 66/500 [==>...........................] - ETA: 1:43 - loss: 1.4042 - regression_loss: 1.1810 - classification_loss: 0.2231
 67/500 [===>..........................] - ETA: 1:43 - loss: 1.3966 - regression_loss: 1.1750 - classification_loss: 0.2216
 68/500 [===>..........................] - ETA: 1:43 - loss: 1.4109 - regression_loss: 1.1860 - classification_loss: 0.2250
 69/500 [===>..........................] - ETA: 1:43 - loss: 1.4094 - regression_loss: 1.1847 - classification_loss: 0.2247
 70/500 [===>..........................] - ETA: 1:42 - loss: 1.4065 - regression_loss: 1.1825 - classification_loss: 0.2240
 71/500 [===>..........................] - ETA: 1:42 - loss: 1.4032 - regression_loss: 1.1802 - classification_loss: 0.2229
 72/500 [===>..........................] - ETA: 1:42 - loss: 1.4173 - regression_loss: 1.1903 - classification_loss: 0.2270
 73/500 [===>..........................] - ETA: 1:42 - loss: 1.4148 - regression_loss: 1.1882 - classification_loss: 0.2266
 74/500 [===>..........................] - ETA: 1:42 - loss: 1.4117 - regression_loss: 1.1868 - classification_loss: 0.2249
 75/500 [===>..........................] - ETA: 1:41 - loss: 1.4114 - regression_loss: 1.1861 - classification_loss: 0.2254
 76/500 [===>..........................] - ETA: 1:41 - loss: 1.4161 - regression_loss: 1.1901 - classification_loss: 0.2260
 77/500 [===>..........................] - ETA: 1:41 - loss: 1.4073 - regression_loss: 1.1835 - classification_loss: 0.2238
 78/500 [===>..........................] - ETA: 1:41 - loss: 1.4146 - regression_loss: 1.1889 - classification_loss: 0.2256
 79/500 [===>..........................] - ETA: 1:40 - loss: 1.4164 - regression_loss: 1.1905 - classification_loss: 0.2259
 80/500 [===>..........................] - ETA: 1:40 - loss: 1.4098 - regression_loss: 1.1851 - classification_loss: 0.2247
 81/500 [===>..........................] - ETA: 1:40 - loss: 1.4004 - regression_loss: 1.1770 - classification_loss: 0.2234
 82/500 [===>..........................] - ETA: 1:40 - loss: 1.3957 - regression_loss: 1.1734 - classification_loss: 0.2223
 83/500 [===>..........................] - ETA: 1:40 - loss: 1.4002 - regression_loss: 1.1767 - classification_loss: 0.2235
 84/500 [====>.........................] - ETA: 1:39 - loss: 1.3934 - regression_loss: 1.1711 - classification_loss: 0.2223
 85/500 [====>.........................] - ETA: 1:39 - loss: 1.3895 - regression_loss: 1.1685 - classification_loss: 0.2210
 86/500 [====>.........................] - ETA: 1:39 - loss: 1.3926 - regression_loss: 1.1717 - classification_loss: 0.2209
 87/500 [====>.........................] - ETA: 1:39 - loss: 1.3971 - regression_loss: 1.1750 - classification_loss: 0.2221
 88/500 [====>.........................] - ETA: 1:39 - loss: 1.3966 - regression_loss: 1.1742 - classification_loss: 0.2224
 89/500 [====>.........................] - ETA: 1:38 - loss: 1.4038 - regression_loss: 1.1804 - classification_loss: 0.2234
 90/500 [====>.........................] - ETA: 1:38 - loss: 1.4095 - regression_loss: 1.1858 - classification_loss: 0.2237
 91/500 [====>.........................] - ETA: 1:38 - loss: 1.4013 - regression_loss: 1.1793 - classification_loss: 0.2220
 92/500 [====>.........................] - ETA: 1:38 - loss: 1.3972 - regression_loss: 1.1763 - classification_loss: 0.2209
 93/500 [====>.........................] - ETA: 1:37 - loss: 1.3962 - regression_loss: 1.1763 - classification_loss: 0.2199
 94/500 [====>.........................] - ETA: 1:37 - loss: 1.3988 - regression_loss: 1.1787 - classification_loss: 0.2202
 95/500 [====>.........................] - ETA: 1:37 - loss: 1.4012 - regression_loss: 1.1812 - classification_loss: 0.2201
 96/500 [====>.........................] - ETA: 1:36 - loss: 1.4028 - regression_loss: 1.1825 - classification_loss: 0.2203
 97/500 [====>.........................] - ETA: 1:36 - loss: 1.4026 - regression_loss: 1.1831 - classification_loss: 0.2196
 98/500 [====>.........................] - ETA: 1:36 - loss: 1.3936 - regression_loss: 1.1758 - classification_loss: 0.2178
 99/500 [====>.........................] - ETA: 1:36 - loss: 1.3959 - regression_loss: 1.1773 - classification_loss: 0.2186
100/500 [=====>........................] - ETA: 1:36 - loss: 1.3952 - regression_loss: 1.1771 - classification_loss: 0.2181
101/500 [=====>........................] - ETA: 1:35 - loss: 1.3965 - regression_loss: 1.1775 - classification_loss: 0.2190
102/500 [=====>........................] - ETA: 1:35 - loss: 1.3901 - regression_loss: 1.1727 - classification_loss: 0.2174
103/500 [=====>........................] - ETA: 1:35 - loss: 1.3879 - regression_loss: 1.1714 - classification_loss: 0.2165
104/500 [=====>........................] - ETA: 1:35 - loss: 1.3926 - regression_loss: 1.1747 - classification_loss: 0.2179
105/500 [=====>........................] - ETA: 1:35 - loss: 1.3863 - regression_loss: 1.1696 - classification_loss: 0.2167
106/500 [=====>........................] - ETA: 1:34 - loss: 1.3884 - regression_loss: 1.1713 - classification_loss: 0.2171
107/500 [=====>........................] - ETA: 1:34 - loss: 1.3844 - regression_loss: 1.1683 - classification_loss: 0.2161
108/500 [=====>........................] - ETA: 1:34 - loss: 1.3869 - regression_loss: 1.1710 - classification_loss: 0.2158
109/500 [=====>........................] - ETA: 1:33 - loss: 1.3846 - regression_loss: 1.1697 - classification_loss: 0.2149
110/500 [=====>........................] - ETA: 1:33 - loss: 1.3867 - regression_loss: 1.1714 - classification_loss: 0.2153
111/500 [=====>........................] - ETA: 1:33 - loss: 1.3865 - regression_loss: 1.1718 - classification_loss: 0.2147
112/500 [=====>........................] - ETA: 1:33 - loss: 1.3846 - regression_loss: 1.1705 - classification_loss: 0.2141
113/500 [=====>........................] - ETA: 1:32 - loss: 1.3860 - regression_loss: 1.1715 - classification_loss: 0.2145
114/500 [=====>........................] - ETA: 1:32 - loss: 1.3836 - regression_loss: 1.1696 - classification_loss: 0.2140
115/500 [=====>........................] - ETA: 1:32 - loss: 1.3811 - regression_loss: 1.1678 - classification_loss: 0.2133
116/500 [=====>........................] - ETA: 1:32 - loss: 1.3853 - regression_loss: 1.1707 - classification_loss: 0.2147
117/500 [======>.......................] - ETA: 1:31 - loss: 1.3829 - regression_loss: 1.1689 - classification_loss: 0.2141
118/500 [======>.......................] - ETA: 1:31 - loss: 1.3816 - regression_loss: 1.1680 - classification_loss: 0.2136
119/500 [======>.......................] - ETA: 1:31 - loss: 1.3803 - regression_loss: 1.1674 - classification_loss: 0.2129
120/500 [======>.......................] - ETA: 1:31 - loss: 1.3861 - regression_loss: 1.1720 - classification_loss: 0.2141
121/500 [======>.......................] - ETA: 1:30 - loss: 1.3804 - regression_loss: 1.1674 - classification_loss: 0.2130
122/500 [======>.......................] - ETA: 1:30 - loss: 1.3796 - regression_loss: 1.1671 - classification_loss: 0.2125
123/500 [======>.......................] - ETA: 1:30 - loss: 1.3757 - regression_loss: 1.1643 - classification_loss: 0.2114
124/500 [======>.......................] - ETA: 1:30 - loss: 1.3762 - regression_loss: 1.1640 - classification_loss: 0.2122
125/500 [======>.......................] - ETA: 1:29 - loss: 1.3701 - regression_loss: 1.1589 - classification_loss: 0.2112
126/500 [======>.......................] - ETA: 1:29 - loss: 1.3677 - regression_loss: 1.1572 - classification_loss: 0.2105
127/500 [======>.......................] - ETA: 1:29 - loss: 1.3647 - regression_loss: 1.1548 - classification_loss: 0.2099
128/500 [======>.......................] - ETA: 1:29 - loss: 1.3569 - regression_loss: 1.1483 - classification_loss: 0.2085
129/500 [======>.......................] - ETA: 1:28 - loss: 1.3613 - regression_loss: 1.1526 - classification_loss: 0.2087
130/500 [======>.......................] - ETA: 1:28 - loss: 1.3615 - regression_loss: 1.1529 - classification_loss: 0.2085
131/500 [======>.......................] - ETA: 1:28 - loss: 1.3650 - regression_loss: 1.1559 - classification_loss: 0.2090
132/500 [======>.......................] - ETA: 1:28 - loss: 1.3650 - regression_loss: 1.1559 - classification_loss: 0.2091
133/500 [======>.......................] - ETA: 1:27 - loss: 1.3663 - regression_loss: 1.1566 - classification_loss: 0.2097
134/500 [=======>......................] - ETA: 1:27 - loss: 1.3654 - regression_loss: 1.1563 - classification_loss: 0.2091
135/500 [=======>......................] - ETA: 1:27 - loss: 1.3652 - regression_loss: 1.1562 - classification_loss: 0.2090
136/500 [=======>......................] - ETA: 1:27 - loss: 1.3628 - regression_loss: 1.1542 - classification_loss: 0.2086
137/500 [=======>......................] - ETA: 1:26 - loss: 1.3643 - regression_loss: 1.1552 - classification_loss: 0.2090
138/500 [=======>......................] - ETA: 1:26 - loss: 1.3587 - regression_loss: 1.1505 - classification_loss: 0.2082
139/500 [=======>......................] - ETA: 1:26 - loss: 1.3567 - regression_loss: 1.1491 - classification_loss: 0.2076
140/500 [=======>......................] - ETA: 1:26 - loss: 1.3557 - regression_loss: 1.1482 - classification_loss: 0.2076
141/500 [=======>......................] - ETA: 1:25 - loss: 1.3541 - regression_loss: 1.1469 - classification_loss: 0.2072
142/500 [=======>......................] - ETA: 1:25 - loss: 1.3526 - regression_loss: 1.1451 - classification_loss: 0.2075
143/500 [=======>......................] - ETA: 1:25 - loss: 1.3531 - regression_loss: 1.1458 - classification_loss: 0.2073
144/500 [=======>......................] - ETA: 1:25 - loss: 1.3496 - regression_loss: 1.1431 - classification_loss: 0.2066
145/500 [=======>......................] - ETA: 1:24 - loss: 1.3478 - regression_loss: 1.1410 - classification_loss: 0.2068
146/500 [=======>......................] - ETA: 1:24 - loss: 1.3467 - regression_loss: 1.1402 - classification_loss: 0.2065
147/500 [=======>......................] - ETA: 1:24 - loss: 1.3488 - regression_loss: 1.1420 - classification_loss: 0.2068
148/500 [=======>......................] - ETA: 1:24 - loss: 1.3461 - regression_loss: 1.1397 - classification_loss: 0.2064
149/500 [=======>......................] - ETA: 1:24 - loss: 1.3473 - regression_loss: 1.1403 - classification_loss: 0.2070
150/500 [========>.....................] - ETA: 1:23 - loss: 1.3528 - regression_loss: 1.1449 - classification_loss: 0.2079
151/500 [========>.....................] - ETA: 1:23 - loss: 1.3520 - regression_loss: 1.1443 - classification_loss: 0.2077
152/500 [========>.....................] - ETA: 1:23 - loss: 1.3508 - regression_loss: 1.1434 - classification_loss: 0.2074
153/500 [========>.....................] - ETA: 1:23 - loss: 1.3521 - regression_loss: 1.1443 - classification_loss: 0.2077
154/500 [========>.....................] - ETA: 1:22 - loss: 1.3530 - regression_loss: 1.1445 - classification_loss: 0.2085
155/500 [========>.....................] - ETA: 1:22 - loss: 1.3534 - regression_loss: 1.1448 - classification_loss: 0.2086
156/500 [========>.....................] - ETA: 1:22 - loss: 1.3474 - regression_loss: 1.1374 - classification_loss: 0.2100
157/500 [========>.....................] - ETA: 1:22 - loss: 1.3478 - regression_loss: 1.1376 - classification_loss: 0.2102
158/500 [========>.....................] - ETA: 1:21 - loss: 1.3483 - regression_loss: 1.1380 - classification_loss: 0.2103
159/500 [========>.....................] - ETA: 1:21 - loss: 1.3471 - regression_loss: 1.1374 - classification_loss: 0.2097
160/500 [========>.....................] - ETA: 1:21 - loss: 1.3495 - regression_loss: 1.1397 - classification_loss: 0.2098
161/500 [========>.....................] - ETA: 1:21 - loss: 1.3509 - regression_loss: 1.1409 - classification_loss: 0.2100
162/500 [========>.....................] - ETA: 1:21 - loss: 1.3525 - regression_loss: 1.1424 - classification_loss: 0.2102
163/500 [========>.....................] - ETA: 1:20 - loss: 1.3515 - regression_loss: 1.1417 - classification_loss: 0.2098
164/500 [========>.....................] - ETA: 1:20 - loss: 1.3519 - regression_loss: 1.1422 - classification_loss: 0.2097
165/500 [========>.....................] - ETA: 1:20 - loss: 1.3519 - regression_loss: 1.1421 - classification_loss: 0.2097
166/500 [========>.....................] - ETA: 1:20 - loss: 1.3545 - regression_loss: 1.1443 - classification_loss: 0.2101
167/500 [=========>....................] - ETA: 1:19 - loss: 1.3513 - regression_loss: 1.1417 - classification_loss: 0.2096
168/500 [=========>....................] - ETA: 1:19 - loss: 1.3504 - regression_loss: 1.1410 - classification_loss: 0.2094
169/500 [=========>....................] - ETA: 1:19 - loss: 1.3529 - regression_loss: 1.1422 - classification_loss: 0.2107
170/500 [=========>....................] - ETA: 1:19 - loss: 1.3506 - regression_loss: 1.1405 - classification_loss: 0.2101
171/500 [=========>....................] - ETA: 1:18 - loss: 1.3545 - regression_loss: 1.1434 - classification_loss: 0.2111
172/500 [=========>....................] - ETA: 1:18 - loss: 1.3545 - regression_loss: 1.1434 - classification_loss: 0.2111
173/500 [=========>....................] - ETA: 1:18 - loss: 1.3569 - regression_loss: 1.1456 - classification_loss: 0.2113
174/500 [=========>....................] - ETA: 1:18 - loss: 1.3594 - regression_loss: 1.1474 - classification_loss: 0.2120
175/500 [=========>....................] - ETA: 1:17 - loss: 1.3572 - regression_loss: 1.1458 - classification_loss: 0.2114
176/500 [=========>....................] - ETA: 1:17 - loss: 1.3567 - regression_loss: 1.1454 - classification_loss: 0.2113
177/500 [=========>....................] - ETA: 1:17 - loss: 1.3591 - regression_loss: 1.1475 - classification_loss: 0.2116
178/500 [=========>....................] - ETA: 1:17 - loss: 1.3630 - regression_loss: 1.1503 - classification_loss: 0.2127
179/500 [=========>....................] - ETA: 1:16 - loss: 1.3658 - regression_loss: 1.1527 - classification_loss: 0.2131
180/500 [=========>....................] - ETA: 1:16 - loss: 1.3677 - regression_loss: 1.1545 - classification_loss: 0.2132
181/500 [=========>....................] - ETA: 1:16 - loss: 1.3623 - regression_loss: 1.1501 - classification_loss: 0.2122
182/500 [=========>....................] - ETA: 1:16 - loss: 1.3629 - regression_loss: 1.1507 - classification_loss: 0.2122
183/500 [=========>....................] - ETA: 1:15 - loss: 1.3674 - regression_loss: 1.1547 - classification_loss: 0.2127
184/500 [==========>...................] - ETA: 1:15 - loss: 1.3669 - regression_loss: 1.1541 - classification_loss: 0.2129
185/500 [==========>...................] - ETA: 1:15 - loss: 1.3652 - regression_loss: 1.1531 - classification_loss: 0.2121
186/500 [==========>...................] - ETA: 1:15 - loss: 1.3668 - regression_loss: 1.1548 - classification_loss: 0.2120
187/500 [==========>...................] - ETA: 1:15 - loss: 1.3667 - regression_loss: 1.1548 - classification_loss: 0.2118
188/500 [==========>...................] - ETA: 1:14 - loss: 1.3683 - regression_loss: 1.1560 - classification_loss: 0.2123
189/500 [==========>...................] - ETA: 1:14 - loss: 1.3665 - regression_loss: 1.1545 - classification_loss: 0.2120
190/500 [==========>...................] - ETA: 1:14 - loss: 1.3682 - regression_loss: 1.1561 - classification_loss: 0.2121
191/500 [==========>...................] - ETA: 1:14 - loss: 1.3689 - regression_loss: 1.1569 - classification_loss: 0.2121
192/500 [==========>...................] - ETA: 1:13 - loss: 1.3689 - regression_loss: 1.1571 - classification_loss: 0.2118
193/500 [==========>...................] - ETA: 1:13 - loss: 1.3642 - regression_loss: 1.1511 - classification_loss: 0.2131
194/500 [==========>...................] - ETA: 1:13 - loss: 1.3648 - regression_loss: 1.1514 - classification_loss: 0.2135
195/500 [==========>...................] - ETA: 1:13 - loss: 1.3665 - regression_loss: 1.1523 - classification_loss: 0.2142
196/500 [==========>...................] - ETA: 1:12 - loss: 1.3645 - regression_loss: 1.1505 - classification_loss: 0.2140
197/500 [==========>...................] - ETA: 1:12 - loss: 1.3664 - regression_loss: 1.1521 - classification_loss: 0.2143
198/500 [==========>...................] - ETA: 1:12 - loss: 1.3662 - regression_loss: 1.1522 - classification_loss: 0.2140
199/500 [==========>...................] - ETA: 1:12 - loss: 1.3659 - regression_loss: 1.1523 - classification_loss: 0.2135
200/500 [===========>..................] - ETA: 1:11 - loss: 1.3655 - regression_loss: 1.1523 - classification_loss: 0.2132
201/500 [===========>..................] - ETA: 1:11 - loss: 1.3643 - regression_loss: 1.1515 - classification_loss: 0.2128
202/500 [===========>..................] - ETA: 1:11 - loss: 1.3658 - regression_loss: 1.1531 - classification_loss: 0.2127
203/500 [===========>..................] - ETA: 1:11 - loss: 1.3671 - regression_loss: 1.1540 - classification_loss: 0.2131
204/500 [===========>..................] - ETA: 1:10 - loss: 1.3670 - regression_loss: 1.1543 - classification_loss: 0.2126
205/500 [===========>..................] - ETA: 1:10 - loss: 1.3671 - regression_loss: 1.1548 - classification_loss: 0.2123
206/500 [===========>..................] - ETA: 1:10 - loss: 1.3707 - regression_loss: 1.1573 - classification_loss: 0.2134
207/500 [===========>..................] - ETA: 1:10 - loss: 1.3701 - regression_loss: 1.1571 - classification_loss: 0.2130
208/500 [===========>..................] - ETA: 1:09 - loss: 1.3713 - regression_loss: 1.1583 - classification_loss: 0.2129
209/500 [===========>..................] - ETA: 1:09 - loss: 1.3739 - regression_loss: 1.1601 - classification_loss: 0.2138
210/500 [===========>..................] - ETA: 1:09 - loss: 1.3706 - regression_loss: 1.1574 - classification_loss: 0.2132
211/500 [===========>..................] - ETA: 1:09 - loss: 1.3664 - regression_loss: 1.1541 - classification_loss: 0.2123
212/500 [===========>..................] - ETA: 1:09 - loss: 1.3664 - regression_loss: 1.1542 - classification_loss: 0.2121
213/500 [===========>..................] - ETA: 1:08 - loss: 1.3674 - regression_loss: 1.1551 - classification_loss: 0.2122
214/500 [===========>..................] - ETA: 1:08 - loss: 1.3695 - regression_loss: 1.1569 - classification_loss: 0.2126
215/500 [===========>..................] - ETA: 1:08 - loss: 1.3695 - regression_loss: 1.1572 - classification_loss: 0.2123
216/500 [===========>..................] - ETA: 1:08 - loss: 1.3684 - regression_loss: 1.1566 - classification_loss: 0.2118
217/500 [============>.................] - ETA: 1:07 - loss: 1.3689 - regression_loss: 1.1572 - classification_loss: 0.2117
218/500 [============>.................] - ETA: 1:07 - loss: 1.3690 - regression_loss: 1.1574 - classification_loss: 0.2116
219/500 [============>.................] - ETA: 1:07 - loss: 1.3675 - regression_loss: 1.1560 - classification_loss: 0.2114
220/500 [============>.................] - ETA: 1:07 - loss: 1.3643 - regression_loss: 1.1536 - classification_loss: 0.2108
221/500 [============>.................] - ETA: 1:06 - loss: 1.3659 - regression_loss: 1.1551 - classification_loss: 0.2109
222/500 [============>.................] - ETA: 1:06 - loss: 1.3692 - regression_loss: 1.1583 - classification_loss: 0.2109
223/500 [============>.................] - ETA: 1:06 - loss: 1.3682 - regression_loss: 1.1576 - classification_loss: 0.2106
224/500 [============>.................] - ETA: 1:06 - loss: 1.3686 - regression_loss: 1.1580 - classification_loss: 0.2106
225/500 [============>.................] - ETA: 1:05 - loss: 1.3716 - regression_loss: 1.1600 - classification_loss: 0.2116
226/500 [============>.................] - ETA: 1:05 - loss: 1.3677 - regression_loss: 1.1567 - classification_loss: 0.2110
227/500 [============>.................] - ETA: 1:05 - loss: 1.3672 - regression_loss: 1.1516 - classification_loss: 0.2156
228/500 [============>.................] - ETA: 1:05 - loss: 1.3651 - regression_loss: 1.1498 - classification_loss: 0.2153
229/500 [============>.................] - ETA: 1:05 - loss: 1.3608 - regression_loss: 1.1463 - classification_loss: 0.2146
230/500 [============>.................] - ETA: 1:04 - loss: 1.3600 - regression_loss: 1.1457 - classification_loss: 0.2144
231/500 [============>.................] - ETA: 1:04 - loss: 1.3616 - regression_loss: 1.1471 - classification_loss: 0.2145
232/500 [============>.................] - ETA: 1:04 - loss: 1.3608 - regression_loss: 1.1464 - classification_loss: 0.2144
233/500 [============>.................] - ETA: 1:04 - loss: 1.3629 - regression_loss: 1.1477 - classification_loss: 0.2152
234/500 [=============>................] - ETA: 1:03 - loss: 1.3632 - regression_loss: 1.1477 - classification_loss: 0.2156
235/500 [=============>................] - ETA: 1:03 - loss: 1.3720 - regression_loss: 1.1522 - classification_loss: 0.2198
236/500 [=============>................] - ETA: 1:03 - loss: 1.3697 - regression_loss: 1.1505 - classification_loss: 0.2192
237/500 [=============>................] - ETA: 1:03 - loss: 1.3701 - regression_loss: 1.1508 - classification_loss: 0.2193
238/500 [=============>................] - ETA: 1:02 - loss: 1.3700 - regression_loss: 1.1510 - classification_loss: 0.2189
239/500 [=============>................] - ETA: 1:02 - loss: 1.3726 - regression_loss: 1.1533 - classification_loss: 0.2193
240/500 [=============>................] - ETA: 1:02 - loss: 1.3729 - regression_loss: 1.1539 - classification_loss: 0.2191
241/500 [=============>................] - ETA: 1:02 - loss: 1.3689 - regression_loss: 1.1504 - classification_loss: 0.2185
242/500 [=============>................] - ETA: 1:01 - loss: 1.3649 - regression_loss: 1.1472 - classification_loss: 0.2177
243/500 [=============>................] - ETA: 1:01 - loss: 1.3608 - regression_loss: 1.1437 - classification_loss: 0.2171
244/500 [=============>................] - ETA: 1:01 - loss: 1.3590 - regression_loss: 1.1423 - classification_loss: 0.2167
245/500 [=============>................] - ETA: 1:01 - loss: 1.3593 - regression_loss: 1.1426 - classification_loss: 0.2167
246/500 [=============>................] - ETA: 1:00 - loss: 1.3582 - regression_loss: 1.1416 - classification_loss: 0.2166
247/500 [=============>................] - ETA: 1:00 - loss: 1.3595 - regression_loss: 1.1430 - classification_loss: 0.2164
248/500 [=============>................] - ETA: 1:00 - loss: 1.3589 - regression_loss: 1.1428 - classification_loss: 0.2161
249/500 [=============>................] - ETA: 1:00 - loss: 1.3591 - regression_loss: 1.1432 - classification_loss: 0.2159
250/500 [==============>...............] - ETA: 59s - loss: 1.3616 - regression_loss: 1.1452 - classification_loss: 0.2164
251/500 [==============>...............] - ETA: 59s - loss: 1.3633 - regression_loss: 1.1466 - classification_loss: 0.2167
252/500 [==============>...............] - ETA: 59s - loss: 1.3637 - regression_loss: 1.1472 - classification_loss: 0.2166
253/500 [==============>...............] - ETA: 59s - loss: 1.3659 - regression_loss: 1.1485 - classification_loss: 0.2174
254/500 [==============>...............] - ETA: 59s - loss: 1.3664 - regression_loss: 1.1489 - classification_loss: 0.2175
255/500 [==============>...............] - ETA: 58s - loss: 1.3655 - regression_loss: 1.1483 - classification_loss: 0.2172
256/500 [==============>...............] - ETA: 58s - loss: 1.3640 - regression_loss: 1.1471 - classification_loss: 0.2169
257/500 [==============>...............] - ETA: 58s - loss: 1.3619 - regression_loss: 1.1452 - classification_loss: 0.2168
258/500 [==============>...............] - ETA: 58s - loss: 1.3601 - regression_loss: 1.1438 - classification_loss: 0.2164
259/500 [==============>...............] - ETA: 57s - loss: 1.3637 - regression_loss: 1.1469 - classification_loss: 0.2167
260/500 [==============>...............] - ETA: 57s - loss: 1.3642 - regression_loss: 1.1470 - classification_loss: 0.2172
261/500 [==============>...............] - ETA: 57s - loss: 1.3622 - regression_loss: 1.1455 - classification_loss: 0.2167
262/500 [==============>...............] - ETA: 57s - loss: 1.3670 - regression_loss: 1.1411 - classification_loss: 0.2258
263/500 [==============>...............] - ETA: 56s - loss: 1.3689 - regression_loss: 1.1414 - classification_loss: 0.2275
264/500 [==============>...............] - ETA: 56s - loss: 1.3681 - regression_loss: 1.1410 - classification_loss: 0.2271
265/500 [==============>...............] - ETA: 56s - loss: 1.3690 - regression_loss: 1.1418 - classification_loss: 0.2273
266/500 [==============>...............] - ETA: 56s - loss: 1.3676 - regression_loss: 1.1405 - classification_loss: 0.2271
267/500 [===============>..............] - ETA: 55s - loss: 1.3656 - regression_loss: 1.1389 - classification_loss: 0.2267
268/500 [===============>..............] - ETA: 55s - loss: 1.3683 - regression_loss: 1.1410 - classification_loss: 0.2273
269/500 [===============>..............] - ETA: 55s - loss: 1.3672 - regression_loss: 1.1403 - classification_loss: 0.2270
270/500 [===============>..............] - ETA: 55s - loss: 1.3688 - regression_loss: 1.1419 - classification_loss: 0.2269
271/500 [===============>..............] - ETA: 54s - loss: 1.3712 - regression_loss: 1.1438 - classification_loss: 0.2274
272/500 [===============>..............] - ETA: 54s - loss: 1.3671 - regression_loss: 1.1404 - classification_loss: 0.2267
273/500 [===============>..............] - ETA: 54s - loss: 1.3688 - regression_loss: 1.1424 - classification_loss: 0.2264
274/500 [===============>..............] - ETA: 54s - loss: 1.3684 - regression_loss: 1.1421 - classification_loss: 0.2264
275/500 [===============>..............] - ETA: 53s - loss: 1.3704 - regression_loss: 1.1435 - classification_loss: 0.2269
276/500 [===============>..............] - ETA: 53s - loss: 1.3699 - regression_loss: 1.1431 - classification_loss: 0.2268
277/500 [===============>..............] - ETA: 53s - loss: 1.3696 - regression_loss: 1.1431 - classification_loss: 0.2265
278/500 [===============>..............] - ETA: 53s - loss: 1.3686 - regression_loss: 1.1424 - classification_loss: 0.2262
279/500 [===============>..............] - ETA: 53s - loss: 1.3682 - regression_loss: 1.1422 - classification_loss: 0.2260
280/500 [===============>..............] - ETA: 52s - loss: 1.3688 - regression_loss: 1.1426 - classification_loss: 0.2261
281/500 [===============>..............] - ETA: 52s - loss: 1.3674 - regression_loss: 1.1415 - classification_loss: 0.2259
282/500 [===============>..............] - ETA: 52s - loss: 1.3676 - regression_loss: 1.1416 - classification_loss: 0.2259
283/500 [===============>..............] - ETA: 52s - loss: 1.3711 - regression_loss: 1.1442 - classification_loss: 0.2269
284/500 [================>.............] - ETA: 51s - loss: 1.3711 - regression_loss: 1.1446 - classification_loss: 0.2265
285/500 [================>.............] - ETA: 51s - loss: 1.3694 - regression_loss: 1.1435 - classification_loss: 0.2260
286/500 [================>.............] - ETA: 51s - loss: 1.3696 - regression_loss: 1.1439 - classification_loss: 0.2258
287/500 [================>.............] - ETA: 51s - loss: 1.3719 - regression_loss: 1.1456 - classification_loss: 0.2263
288/500 [================>.............] - ETA: 50s - loss: 1.3724 - regression_loss: 1.1459 - classification_loss: 0.2265
289/500 [================>.............] - ETA: 50s - loss: 1.3705 - regression_loss: 1.1444 - classification_loss: 0.2261
290/500 [================>.............] - ETA: 50s - loss: 1.3699 - regression_loss: 1.1441 - classification_loss: 0.2257
291/500 [================>.............] - ETA: 50s - loss: 1.3721 - regression_loss: 1.1458 - classification_loss: 0.2263
292/500 [================>.............] - ETA: 49s - loss: 1.3715 - regression_loss: 1.1455 - classification_loss: 0.2260
293/500 [================>.............] - ETA: 49s - loss: 1.3714 - regression_loss: 1.1456 - classification_loss: 0.2258
294/500 [================>.............] - ETA: 49s - loss: 1.3705 - regression_loss: 1.1449 - classification_loss: 0.2257
295/500 [================>.............] - ETA: 49s - loss: 1.3714 - regression_loss: 1.1456 - classification_loss: 0.2258
296/500 [================>.............] - ETA: 48s - loss: 1.3686 - regression_loss: 1.1434 - classification_loss: 0.2252
297/500 [================>.............] - ETA: 48s - loss: 1.3685 - regression_loss: 1.1435 - classification_loss: 0.2250
298/500 [================>.............] - ETA: 48s - loss: 1.3695 - regression_loss: 1.1444 - classification_loss: 0.2252
299/500 [================>.............] - ETA: 48s - loss: 1.3669 - regression_loss: 1.1421 - classification_loss: 0.2247
300/500 [=================>............] - ETA: 47s - loss: 1.3680 - regression_loss: 1.1429 - classification_loss: 0.2252
301/500 [=================>............] - ETA: 47s - loss: 1.3719 - regression_loss: 1.1461 - classification_loss: 0.2258
302/500 [=================>............] - ETA: 47s - loss: 1.3731 - regression_loss: 1.1475 - classification_loss: 0.2256
303/500 [=================>............] - ETA: 47s - loss: 1.3741 - regression_loss: 1.1484 - classification_loss: 0.2258
304/500 [=================>............] - ETA: 47s - loss: 1.3729 - regression_loss: 1.1475 - classification_loss: 0.2254
305/500 [=================>............] - ETA: 46s - loss: 1.3727 - regression_loss: 1.1473 - classification_loss: 0.2254
306/500 [=================>............] - ETA: 46s - loss: 1.3702 - regression_loss: 1.1453 - classification_loss: 0.2248
307/500 [=================>............] - ETA: 46s - loss: 1.3722 - regression_loss: 1.1472 - classification_loss: 0.2251
308/500 [=================>............] - ETA: 46s - loss: 1.3756 - regression_loss: 1.1501 - classification_loss: 0.2256
309/500 [=================>............] - ETA: 45s - loss: 1.3744 - regression_loss: 1.1491 - classification_loss: 0.2252
310/500 [=================>............] - ETA: 45s - loss: 1.3712 - regression_loss: 1.1465 - classification_loss: 0.2246
311/500 [=================>............] - ETA: 45s - loss: 1.3676 - regression_loss: 1.1436 - classification_loss: 0.2240
312/500 [=================>............] - ETA: 45s - loss: 1.3670 - regression_loss: 1.1429 - classification_loss: 0.2242
313/500 [=================>............] - ETA: 44s - loss: 1.3669 - regression_loss: 1.1428 - classification_loss: 0.2241
314/500 [=================>............] - ETA: 44s - loss: 1.3667 - regression_loss: 1.1427 - classification_loss: 0.2239
315/500 [=================>............] - ETA: 44s - loss: 1.3671 - regression_loss: 1.1431 - classification_loss: 0.2240
316/500 [=================>............] - ETA: 44s - loss: 1.3667 - regression_loss: 1.1430 - classification_loss: 0.2237
317/500 [==================>...........] - ETA: 43s - loss: 1.3644 - regression_loss: 1.1412 - classification_loss: 0.2233
318/500 [==================>...........] - ETA: 43s - loss: 1.3647 - regression_loss: 1.1416 - classification_loss: 0.2232
319/500 [==================>...........] - ETA: 43s - loss: 1.3656 - regression_loss: 1.1424 - classification_loss: 0.2232
320/500 [==================>...........] - ETA: 43s - loss: 1.3663 - regression_loss: 1.1430 - classification_loss: 0.2234
321/500 [==================>...........] - ETA: 42s - loss: 1.3710 - regression_loss: 1.1464 - classification_loss: 0.2245
322/500 [==================>...........] - ETA: 42s - loss: 1.3703 - regression_loss: 1.1459 - classification_loss: 0.2244
323/500 [==================>...........] - ETA: 42s - loss: 1.3715 - regression_loss: 1.1468 - classification_loss: 0.2247
324/500 [==================>...........] - ETA: 42s - loss: 1.3715 - regression_loss: 1.1472 - classification_loss: 0.2244
325/500 [==================>...........] - ETA: 41s - loss: 1.3712 - regression_loss: 1.1469 - classification_loss: 0.2242
326/500 [==================>...........] - ETA: 41s - loss: 1.3711 - regression_loss: 1.1470 - classification_loss: 0.2241
327/500 [==================>...........] - ETA: 41s - loss: 1.3708 - regression_loss: 1.1468 - classification_loss: 0.2240
328/500 [==================>...........] - ETA: 41s - loss: 1.3707 - regression_loss: 1.1469 - classification_loss: 0.2238
329/500 [==================>...........] - ETA: 40s - loss: 1.3698 - regression_loss: 1.1463 - classification_loss: 0.2235
330/500 [==================>...........] - ETA: 40s - loss: 1.3713 - regression_loss: 1.1473 - classification_loss: 0.2239
331/500 [==================>...........] - ETA: 40s - loss: 1.3707 - regression_loss: 1.1471 - classification_loss: 0.2236
332/500 [==================>...........] - ETA: 40s - loss: 1.3697 - regression_loss: 1.1463 - classification_loss: 0.2234
333/500 [==================>...........] - ETA: 39s - loss: 1.3689 - regression_loss: 1.1456 - classification_loss: 0.2233
334/500 [===================>..........] - ETA: 39s - loss: 1.3679 - regression_loss: 1.1450 - classification_loss: 0.2229
335/500 [===================>..........] - ETA: 39s - loss: 1.3676 - regression_loss: 1.1448 - classification_loss: 0.2228
336/500 [===================>..........] - ETA: 39s - loss: 1.3675 - regression_loss: 1.1447 - classification_loss: 0.2227
337/500 [===================>..........] - ETA: 39s - loss: 1.3678 - regression_loss: 1.1450 - classification_loss: 0.2228
338/500 [===================>..........] - ETA: 38s - loss: 1.3688 - regression_loss: 1.1460 - classification_loss: 0.2229
339/500 [===================>..........] - ETA: 38s - loss: 1.3714 - regression_loss: 1.1480 - classification_loss: 0.2234
340/500 [===================>..........] - ETA: 38s - loss: 1.3693 - regression_loss: 1.1464 - classification_loss: 0.2230
341/500 [===================>..........] - ETA: 38s - loss: 1.3706 - regression_loss: 1.1471 - classification_loss: 0.2234
342/500 [===================>..........] - ETA: 37s - loss: 1.3703 - regression_loss: 1.1469 - classification_loss: 0.2234
343/500 [===================>..........] - ETA: 37s - loss: 1.3704 - regression_loss: 1.1472 - classification_loss: 0.2232
344/500 [===================>..........] - ETA: 37s - loss: 1.3718 - regression_loss: 1.1484 - classification_loss: 0.2234
345/500 [===================>..........] - ETA: 37s - loss: 1.3700 - regression_loss: 1.1470 - classification_loss: 0.2230
346/500 [===================>..........] - ETA: 36s - loss: 1.3706 - regression_loss: 1.1477 - classification_loss: 0.2229
347/500 [===================>..........] - ETA: 36s - loss: 1.3685 - regression_loss: 1.1460 - classification_loss: 0.2224
348/500 [===================>..........] - ETA: 36s - loss: 1.3682 - regression_loss: 1.1461 - classification_loss: 0.2221
349/500 [===================>..........] - ETA: 36s - loss: 1.3685 - regression_loss: 1.1464 - classification_loss: 0.2221
350/500 [====================>.........] - ETA: 35s - loss: 1.3692 - regression_loss: 1.1471 - classification_loss: 0.2221
351/500 [====================>.........] - ETA: 35s - loss: 1.3729 - regression_loss: 1.1505 - classification_loss: 0.2224
352/500 [====================>.........] - ETA: 35s - loss: 1.3727 - regression_loss: 1.1504 - classification_loss: 0.2223
353/500 [====================>.........] - ETA: 35s - loss: 1.3729 - regression_loss: 1.1509 - classification_loss: 0.2220
354/500 [====================>.........] - ETA: 34s - loss: 1.3720 - regression_loss: 1.1502 - classification_loss: 0.2217
355/500 [====================>.........] - ETA: 34s - loss: 1.3700 - regression_loss: 1.1487 - classification_loss: 0.2214
356/500 [====================>.........] - ETA: 34s - loss: 1.3697 - regression_loss: 1.1486 - classification_loss: 0.2211
357/500 [====================>.........] - ETA: 34s - loss: 1.3696 - regression_loss: 1.1483 - classification_loss: 0.2213
358/500 [====================>.........] - ETA: 34s - loss: 1.3683 - regression_loss: 1.1473 - classification_loss: 0.2211
359/500 [====================>.........] - ETA: 33s - loss: 1.3694 - regression_loss: 1.1481 - classification_loss: 0.2213
360/500 [====================>.........] - ETA: 33s - loss: 1.3666 - regression_loss: 1.1458 - classification_loss: 0.2208
361/500 [====================>.........] - ETA: 33s - loss: 1.3675 - regression_loss: 1.1466 - classification_loss: 0.2209
362/500 [====================>.........] - ETA: 33s - loss: 1.3654 - regression_loss: 1.1450 - classification_loss: 0.2204
363/500 [====================>.........] - ETA: 32s - loss: 1.3652 - regression_loss: 1.1450 - classification_loss: 0.2202
364/500 [====================>.........] - ETA: 32s - loss: 1.3657 - regression_loss: 1.1456 - classification_loss: 0.2202
365/500 [====================>.........] - ETA: 32s - loss: 1.3639 - regression_loss: 1.1440 - classification_loss: 0.2199
366/500 [====================>.........] - ETA: 32s - loss: 1.3647 - regression_loss: 1.1447 - classification_loss: 0.2200
367/500 [=====================>........] - ETA: 31s - loss: 1.3637 - regression_loss: 1.1440 - classification_loss: 0.2197
368/500 [=====================>........] - ETA: 31s - loss: 1.3647 - regression_loss: 1.1453 - classification_loss: 0.2194
369/500 [=====================>........] - ETA: 31s - loss: 1.3648 - regression_loss: 1.1452 - classification_loss: 0.2196
370/500 [=====================>........] - ETA: 31s - loss: 1.3669 - regression_loss: 1.1472 - classification_loss: 0.2197
371/500 [=====================>........] - ETA: 30s - loss: 1.3678 - regression_loss: 1.1480 - classification_loss: 0.2198
372/500 [=====================>........] - ETA: 30s - loss: 1.3690 - regression_loss: 1.1489 - classification_loss: 0.2201
373/500 [=====================>........] - ETA: 30s - loss: 1.3704 - regression_loss: 1.1502 - classification_loss: 0.2201
374/500 [=====================>........] - ETA: 30s - loss: 1.3708 - regression_loss: 1.1501 - classification_loss: 0.2207
375/500 [=====================>........] - ETA: 29s - loss: 1.3722 - regression_loss: 1.1511 - classification_loss: 0.2211
376/500 [=====================>........] - ETA: 29s - loss: 1.3700 - regression_loss: 1.1494 - classification_loss: 0.2206
377/500 [=====================>........] - ETA: 29s - loss: 1.3706 - regression_loss: 1.1501 - classification_loss: 0.2205
378/500 [=====================>........] - ETA: 29s - loss: 1.3709 - regression_loss: 1.1503 - classification_loss: 0.2206
379/500 [=====================>........] - ETA: 28s - loss: 1.3721 - regression_loss: 1.1512 - classification_loss: 0.2209
380/500 [=====================>........] - ETA: 28s - loss: 1.3716 - regression_loss: 1.1508 - classification_loss: 0.2207
381/500 [=====================>........] - ETA: 28s - loss: 1.3705 - regression_loss: 1.1500 - classification_loss: 0.2205
382/500 [=====================>........] - ETA: 28s - loss: 1.3709 - regression_loss: 1.1505 - classification_loss: 0.2204
383/500 [=====================>........] - ETA: 28s - loss: 1.3713 - regression_loss: 1.1509 - classification_loss: 0.2205
384/500 [======================>.......] - ETA: 27s - loss: 1.3701 - regression_loss: 1.1498 - classification_loss: 0.2202
385/500 [======================>.......] - ETA: 27s - loss: 1.3700 - regression_loss: 1.1499 - classification_loss: 0.2201
386/500 [======================>.......] - ETA: 27s - loss: 1.3702 - regression_loss: 1.1502 - classification_loss: 0.2201
387/500 [======================>.......] - ETA: 27s - loss: 1.3744 - regression_loss: 1.1529 - classification_loss: 0.2215
388/500 [======================>.......]
- ETA: 26s - loss: 1.3765 - regression_loss: 1.1547 - classification_loss: 0.2218 389/500 [======================>.......] - ETA: 26s - loss: 1.3764 - regression_loss: 1.1548 - classification_loss: 0.2216 390/500 [======================>.......] - ETA: 26s - loss: 1.3762 - regression_loss: 1.1547 - classification_loss: 0.2215 391/500 [======================>.......] - ETA: 26s - loss: 1.3755 - regression_loss: 1.1542 - classification_loss: 0.2213 392/500 [======================>.......] - ETA: 25s - loss: 1.3739 - regression_loss: 1.1528 - classification_loss: 0.2211 393/500 [======================>.......] - ETA: 25s - loss: 1.3761 - regression_loss: 1.1544 - classification_loss: 0.2218 394/500 [======================>.......] - ETA: 25s - loss: 1.3761 - regression_loss: 1.1544 - classification_loss: 0.2217 395/500 [======================>.......] - ETA: 25s - loss: 1.3748 - regression_loss: 1.1533 - classification_loss: 0.2214 396/500 [======================>.......] - ETA: 24s - loss: 1.3818 - regression_loss: 1.1584 - classification_loss: 0.2234 397/500 [======================>.......] - ETA: 24s - loss: 1.3797 - regression_loss: 1.1565 - classification_loss: 0.2231 398/500 [======================>.......] - ETA: 24s - loss: 1.3786 - regression_loss: 1.1556 - classification_loss: 0.2231 399/500 [======================>.......] - ETA: 24s - loss: 1.3779 - regression_loss: 1.1549 - classification_loss: 0.2230 400/500 [=======================>......] - ETA: 23s - loss: 1.3773 - regression_loss: 1.1545 - classification_loss: 0.2229 401/500 [=======================>......] - ETA: 23s - loss: 1.3772 - regression_loss: 1.1545 - classification_loss: 0.2226 402/500 [=======================>......] - ETA: 23s - loss: 1.3772 - regression_loss: 1.1546 - classification_loss: 0.2226 403/500 [=======================>......] - ETA: 23s - loss: 1.3755 - regression_loss: 1.1532 - classification_loss: 0.2222 404/500 [=======================>......] 
- ETA: 22s - loss: 1.3763 - regression_loss: 1.1542 - classification_loss: 0.2221 405/500 [=======================>......] - ETA: 22s - loss: 1.3761 - regression_loss: 1.1542 - classification_loss: 0.2219 406/500 [=======================>......] - ETA: 22s - loss: 1.3744 - regression_loss: 1.1528 - classification_loss: 0.2216 407/500 [=======================>......] - ETA: 22s - loss: 1.3754 - regression_loss: 1.1538 - classification_loss: 0.2216 408/500 [=======================>......] - ETA: 22s - loss: 1.3760 - regression_loss: 1.1542 - classification_loss: 0.2218 409/500 [=======================>......] - ETA: 21s - loss: 1.3774 - regression_loss: 1.1556 - classification_loss: 0.2218 410/500 [=======================>......] - ETA: 21s - loss: 1.3776 - regression_loss: 1.1558 - classification_loss: 0.2218 411/500 [=======================>......] - ETA: 21s - loss: 1.3785 - regression_loss: 1.1568 - classification_loss: 0.2218 412/500 [=======================>......] - ETA: 21s - loss: 1.3779 - regression_loss: 1.1564 - classification_loss: 0.2216 413/500 [=======================>......] - ETA: 20s - loss: 1.3793 - regression_loss: 1.1573 - classification_loss: 0.2220 414/500 [=======================>......] - ETA: 20s - loss: 1.3803 - regression_loss: 1.1579 - classification_loss: 0.2225 415/500 [=======================>......] - ETA: 20s - loss: 1.3809 - regression_loss: 1.1582 - classification_loss: 0.2227 416/500 [=======================>......] - ETA: 20s - loss: 1.3819 - regression_loss: 1.1590 - classification_loss: 0.2229 417/500 [========================>.....] - ETA: 19s - loss: 1.3820 - regression_loss: 1.1588 - classification_loss: 0.2231 418/500 [========================>.....] - ETA: 19s - loss: 1.3817 - regression_loss: 1.1587 - classification_loss: 0.2230 419/500 [========================>.....] - ETA: 19s - loss: 1.3814 - regression_loss: 1.1585 - classification_loss: 0.2229 420/500 [========================>.....] 
- ETA: 19s - loss: 1.3806 - regression_loss: 1.1578 - classification_loss: 0.2228 421/500 [========================>.....] - ETA: 18s - loss: 1.3790 - regression_loss: 1.1565 - classification_loss: 0.2225 422/500 [========================>.....] - ETA: 18s - loss: 1.3808 - regression_loss: 1.1581 - classification_loss: 0.2227 423/500 [========================>.....] - ETA: 18s - loss: 1.3800 - regression_loss: 1.1576 - classification_loss: 0.2224 424/500 [========================>.....] - ETA: 18s - loss: 1.3800 - regression_loss: 1.1576 - classification_loss: 0.2224 425/500 [========================>.....] - ETA: 17s - loss: 1.3803 - regression_loss: 1.1577 - classification_loss: 0.2226 426/500 [========================>.....] - ETA: 17s - loss: 1.3799 - regression_loss: 1.1576 - classification_loss: 0.2223 427/500 [========================>.....] - ETA: 17s - loss: 1.3816 - regression_loss: 1.1591 - classification_loss: 0.2225 428/500 [========================>.....] - ETA: 17s - loss: 1.3811 - regression_loss: 1.1588 - classification_loss: 0.2223 429/500 [========================>.....] - ETA: 17s - loss: 1.3822 - regression_loss: 1.1599 - classification_loss: 0.2223 430/500 [========================>.....] - ETA: 16s - loss: 1.3826 - regression_loss: 1.1604 - classification_loss: 0.2222 431/500 [========================>.....] - ETA: 16s - loss: 1.3828 - regression_loss: 1.1604 - classification_loss: 0.2224 432/500 [========================>.....] - ETA: 16s - loss: 1.3851 - regression_loss: 1.1623 - classification_loss: 0.2228 433/500 [========================>.....] - ETA: 16s - loss: 1.3847 - regression_loss: 1.1620 - classification_loss: 0.2227 434/500 [=========================>....] - ETA: 15s - loss: 1.3846 - regression_loss: 1.1619 - classification_loss: 0.2227 435/500 [=========================>....] - ETA: 15s - loss: 1.3838 - regression_loss: 1.1613 - classification_loss: 0.2225 436/500 [=========================>....] 
- ETA: 15s - loss: 1.3816 - regression_loss: 1.1594 - classification_loss: 0.2222 437/500 [=========================>....] - ETA: 15s - loss: 1.3806 - regression_loss: 1.1586 - classification_loss: 0.2220 438/500 [=========================>....] - ETA: 14s - loss: 1.3807 - regression_loss: 1.1588 - classification_loss: 0.2218 439/500 [=========================>....] - ETA: 14s - loss: 1.3821 - regression_loss: 1.1601 - classification_loss: 0.2220 440/500 [=========================>....] - ETA: 14s - loss: 1.3822 - regression_loss: 1.1602 - classification_loss: 0.2220 441/500 [=========================>....] - ETA: 14s - loss: 1.3808 - regression_loss: 1.1592 - classification_loss: 0.2216 442/500 [=========================>....] - ETA: 13s - loss: 1.3824 - regression_loss: 1.1607 - classification_loss: 0.2216 443/500 [=========================>....] - ETA: 13s - loss: 1.3825 - regression_loss: 1.1608 - classification_loss: 0.2217 444/500 [=========================>....] - ETA: 13s - loss: 1.3812 - regression_loss: 1.1596 - classification_loss: 0.2216 445/500 [=========================>....] - ETA: 13s - loss: 1.3799 - regression_loss: 1.1586 - classification_loss: 0.2213 446/500 [=========================>....] - ETA: 12s - loss: 1.3795 - regression_loss: 1.1583 - classification_loss: 0.2212 447/500 [=========================>....] - ETA: 12s - loss: 1.3791 - regression_loss: 1.1580 - classification_loss: 0.2212 448/500 [=========================>....] - ETA: 12s - loss: 1.3792 - regression_loss: 1.1580 - classification_loss: 0.2212 449/500 [=========================>....] - ETA: 12s - loss: 1.3808 - regression_loss: 1.1591 - classification_loss: 0.2217 450/500 [==========================>...] - ETA: 11s - loss: 1.3813 - regression_loss: 1.1595 - classification_loss: 0.2218 451/500 [==========================>...] - ETA: 11s - loss: 1.3809 - regression_loss: 1.1593 - classification_loss: 0.2216 452/500 [==========================>...] 
- ETA: 11s - loss: 1.3801 - regression_loss: 1.1587 - classification_loss: 0.2215 453/500 [==========================>...] - ETA: 11s - loss: 1.3785 - regression_loss: 1.1573 - classification_loss: 0.2212 454/500 [==========================>...] - ETA: 11s - loss: 1.3770 - regression_loss: 1.1559 - classification_loss: 0.2211 455/500 [==========================>...] - ETA: 10s - loss: 1.3762 - regression_loss: 1.1554 - classification_loss: 0.2208 456/500 [==========================>...] - ETA: 10s - loss: 1.3773 - regression_loss: 1.1559 - classification_loss: 0.2215 457/500 [==========================>...] - ETA: 10s - loss: 1.3779 - regression_loss: 1.1565 - classification_loss: 0.2214 458/500 [==========================>...] - ETA: 10s - loss: 1.3777 - regression_loss: 1.1564 - classification_loss: 0.2213 459/500 [==========================>...] - ETA: 9s - loss: 1.3781 - regression_loss: 1.1567 - classification_loss: 0.2214  460/500 [==========================>...] - ETA: 9s - loss: 1.3784 - regression_loss: 1.1570 - classification_loss: 0.2214 461/500 [==========================>...] - ETA: 9s - loss: 1.3803 - regression_loss: 1.1585 - classification_loss: 0.2218 462/500 [==========================>...] - ETA: 9s - loss: 1.3804 - regression_loss: 1.1586 - classification_loss: 0.2218 463/500 [==========================>...] - ETA: 8s - loss: 1.3809 - regression_loss: 1.1592 - classification_loss: 0.2217 464/500 [==========================>...] - ETA: 8s - loss: 1.3809 - regression_loss: 1.1591 - classification_loss: 0.2218 465/500 [==========================>...] - ETA: 8s - loss: 1.3806 - regression_loss: 1.1589 - classification_loss: 0.2217 466/500 [==========================>...] - ETA: 8s - loss: 1.3803 - regression_loss: 1.1586 - classification_loss: 0.2217 467/500 [===========================>..] - ETA: 7s - loss: 1.3814 - regression_loss: 1.1593 - classification_loss: 0.2220 468/500 [===========================>..] 
- ETA: 7s - loss: 1.3814 - regression_loss: 1.1594 - classification_loss: 0.2220 469/500 [===========================>..] - ETA: 7s - loss: 1.3810 - regression_loss: 1.1591 - classification_loss: 0.2218 470/500 [===========================>..] - ETA: 7s - loss: 1.3830 - regression_loss: 1.1608 - classification_loss: 0.2222 471/500 [===========================>..] - ETA: 6s - loss: 1.3831 - regression_loss: 1.1607 - classification_loss: 0.2224 472/500 [===========================>..] - ETA: 6s - loss: 1.3826 - regression_loss: 1.1604 - classification_loss: 0.2222 473/500 [===========================>..] - ETA: 6s - loss: 1.3838 - regression_loss: 1.1614 - classification_loss: 0.2224 474/500 [===========================>..] - ETA: 6s - loss: 1.3849 - regression_loss: 1.1624 - classification_loss: 0.2225 475/500 [===========================>..] - ETA: 5s - loss: 1.3839 - regression_loss: 1.1616 - classification_loss: 0.2223 476/500 [===========================>..] - ETA: 5s - loss: 1.3844 - regression_loss: 1.1620 - classification_loss: 0.2224 477/500 [===========================>..] - ETA: 5s - loss: 1.3832 - regression_loss: 1.1610 - classification_loss: 0.2221 478/500 [===========================>..] - ETA: 5s - loss: 1.3826 - regression_loss: 1.1607 - classification_loss: 0.2219 479/500 [===========================>..] - ETA: 5s - loss: 1.3812 - regression_loss: 1.1594 - classification_loss: 0.2217 480/500 [===========================>..] - ETA: 4s - loss: 1.3805 - regression_loss: 1.1589 - classification_loss: 0.2216 481/500 [===========================>..] - ETA: 4s - loss: 1.3798 - regression_loss: 1.1585 - classification_loss: 0.2214 482/500 [===========================>..] - ETA: 4s - loss: 1.3801 - regression_loss: 1.1589 - classification_loss: 0.2212 483/500 [===========================>..] - ETA: 4s - loss: 1.3801 - regression_loss: 1.1590 - classification_loss: 0.2211 484/500 [============================>.] 
- ETA: 3s - loss: 1.3801 - regression_loss: 1.1591 - classification_loss: 0.2210 485/500 [============================>.] - ETA: 3s - loss: 1.3796 - regression_loss: 1.1587 - classification_loss: 0.2209 486/500 [============================>.] - ETA: 3s - loss: 1.3799 - regression_loss: 1.1591 - classification_loss: 0.2208 487/500 [============================>.] - ETA: 3s - loss: 1.3796 - regression_loss: 1.1588 - classification_loss: 0.2208 488/500 [============================>.] - ETA: 2s - loss: 1.3809 - regression_loss: 1.1597 - classification_loss: 0.2212 489/500 [============================>.] - ETA: 2s - loss: 1.3816 - regression_loss: 1.1602 - classification_loss: 0.2214 490/500 [============================>.] - ETA: 2s - loss: 1.3809 - regression_loss: 1.1597 - classification_loss: 0.2212 491/500 [============================>.] - ETA: 2s - loss: 1.3809 - regression_loss: 1.1597 - classification_loss: 0.2212 492/500 [============================>.] - ETA: 1s - loss: 1.3799 - regression_loss: 1.1590 - classification_loss: 0.2209 493/500 [============================>.] - ETA: 1s - loss: 1.3803 - regression_loss: 1.1592 - classification_loss: 0.2211 494/500 [============================>.] - ETA: 1s - loss: 1.3824 - regression_loss: 1.1606 - classification_loss: 0.2218 495/500 [============================>.] - ETA: 1s - loss: 1.3831 - regression_loss: 1.1613 - classification_loss: 0.2217 496/500 [============================>.] - ETA: 0s - loss: 1.3842 - regression_loss: 1.1623 - classification_loss: 0.2218 497/500 [============================>.] - ETA: 0s - loss: 1.3854 - regression_loss: 1.1632 - classification_loss: 0.2222 498/500 [============================>.] - ETA: 0s - loss: 1.3857 - regression_loss: 1.1634 - classification_loss: 0.2223 499/500 [============================>.] 
- ETA: 0s - loss: 1.3862 - regression_loss: 1.1638 - classification_loss: 0.2224 500/500 [==============================] - 119s 239ms/step - loss: 1.3868 - regression_loss: 1.1644 - classification_loss: 0.2224 326 instances of class plum with average precision: 0.8024 mAP: 0.8024 Epoch 00081: saving model to ./training/snapshots/resnet50_pascal_81.h5 Epoch 82/150 1/500 [..............................] - ETA: 1:56 - loss: 1.7695 - regression_loss: 1.4813 - classification_loss: 0.2882 2/500 [..............................] - ETA: 1:58 - loss: 1.9091 - regression_loss: 1.6141 - classification_loss: 0.2949 3/500 [..............................] - ETA: 1:59 - loss: 1.4769 - regression_loss: 1.2622 - classification_loss: 0.2147 4/500 [..............................] - ETA: 1:58 - loss: 1.5301 - regression_loss: 1.2670 - classification_loss: 0.2631 5/500 [..............................] - ETA: 1:56 - loss: 1.3815 - regression_loss: 1.1518 - classification_loss: 0.2297 6/500 [..............................] - ETA: 1:56 - loss: 1.2633 - regression_loss: 1.0605 - classification_loss: 0.2028 7/500 [..............................] - ETA: 1:57 - loss: 1.3900 - regression_loss: 1.1601 - classification_loss: 0.2299 8/500 [..............................] - ETA: 1:56 - loss: 1.4493 - regression_loss: 1.1942 - classification_loss: 0.2551 9/500 [..............................] - ETA: 1:56 - loss: 1.6245 - regression_loss: 1.3338 - classification_loss: 0.2906 10/500 [..............................] - ETA: 1:54 - loss: 1.6119 - regression_loss: 1.3275 - classification_loss: 0.2844 11/500 [..............................] - ETA: 1:54 - loss: 1.5712 - regression_loss: 1.3016 - classification_loss: 0.2696 12/500 [..............................] - ETA: 1:54 - loss: 1.6022 - regression_loss: 1.3291 - classification_loss: 0.2731 13/500 [..............................] - ETA: 1:54 - loss: 1.5808 - regression_loss: 1.3154 - classification_loss: 0.2654 14/500 [..............................] 
- ETA: 1:54 - loss: 1.5071 - regression_loss: 1.2562 - classification_loss: 0.2509 15/500 [..............................] - ETA: 1:54 - loss: 1.5039 - regression_loss: 1.2517 - classification_loss: 0.2523 16/500 [..............................] - ETA: 1:54 - loss: 1.4962 - regression_loss: 1.2438 - classification_loss: 0.2524 17/500 [>.............................] - ETA: 1:55 - loss: 1.4564 - regression_loss: 1.2146 - classification_loss: 0.2418 18/500 [>.............................] - ETA: 1:55 - loss: 1.4574 - regression_loss: 1.2174 - classification_loss: 0.2400 19/500 [>.............................] - ETA: 1:54 - loss: 1.4662 - regression_loss: 1.2268 - classification_loss: 0.2394 20/500 [>.............................] - ETA: 1:54 - loss: 1.4396 - regression_loss: 1.2032 - classification_loss: 0.2363 21/500 [>.............................] - ETA: 1:54 - loss: 1.4205 - regression_loss: 1.1888 - classification_loss: 0.2317 22/500 [>.............................] - ETA: 1:54 - loss: 1.4127 - regression_loss: 1.1829 - classification_loss: 0.2298 23/500 [>.............................] - ETA: 1:53 - loss: 1.4265 - regression_loss: 1.1970 - classification_loss: 0.2295 24/500 [>.............................] - ETA: 1:53 - loss: 1.3926 - regression_loss: 1.1694 - classification_loss: 0.2232 25/500 [>.............................] - ETA: 1:52 - loss: 1.4166 - regression_loss: 1.1887 - classification_loss: 0.2278 26/500 [>.............................] - ETA: 1:52 - loss: 1.4523 - regression_loss: 1.2159 - classification_loss: 0.2364 27/500 [>.............................] - ETA: 1:52 - loss: 1.4217 - regression_loss: 1.1913 - classification_loss: 0.2304 28/500 [>.............................] - ETA: 1:52 - loss: 1.4336 - regression_loss: 1.2036 - classification_loss: 0.2301 29/500 [>.............................] - ETA: 1:52 - loss: 1.4317 - regression_loss: 1.2013 - classification_loss: 0.2304 30/500 [>.............................] 
- ETA: 1:51 - loss: 1.4360 - regression_loss: 1.2061 - classification_loss: 0.2299 31/500 [>.............................] - ETA: 1:51 - loss: 1.4247 - regression_loss: 1.1979 - classification_loss: 0.2268 32/500 [>.............................] - ETA: 1:51 - loss: 1.4231 - regression_loss: 1.1964 - classification_loss: 0.2266 33/500 [>.............................] - ETA: 1:51 - loss: 1.4312 - regression_loss: 1.2031 - classification_loss: 0.2282 34/500 [=>............................] - ETA: 1:51 - loss: 1.4222 - regression_loss: 1.1981 - classification_loss: 0.2241 35/500 [=>............................] - ETA: 1:50 - loss: 1.4331 - regression_loss: 1.2077 - classification_loss: 0.2254 36/500 [=>............................] - ETA: 1:50 - loss: 1.4374 - regression_loss: 1.2115 - classification_loss: 0.2259 37/500 [=>............................] - ETA: 1:50 - loss: 1.4364 - regression_loss: 1.2117 - classification_loss: 0.2248 38/500 [=>............................] - ETA: 1:49 - loss: 1.4504 - regression_loss: 1.2243 - classification_loss: 0.2261 39/500 [=>............................] - ETA: 1:49 - loss: 1.4529 - regression_loss: 1.2254 - classification_loss: 0.2274 40/500 [=>............................] - ETA: 1:49 - loss: 1.4629 - regression_loss: 1.2337 - classification_loss: 0.2292 41/500 [=>............................] - ETA: 1:49 - loss: 1.4656 - regression_loss: 1.2355 - classification_loss: 0.2301 42/500 [=>............................] - ETA: 1:49 - loss: 1.4656 - regression_loss: 1.2352 - classification_loss: 0.2305 43/500 [=>............................] - ETA: 1:49 - loss: 1.4875 - regression_loss: 1.2503 - classification_loss: 0.2373 44/500 [=>............................] - ETA: 1:49 - loss: 1.4924 - regression_loss: 1.2547 - classification_loss: 0.2377 45/500 [=>............................] - ETA: 1:48 - loss: 1.4807 - regression_loss: 1.2460 - classification_loss: 0.2347 46/500 [=>............................] 
- ETA: 1:48 - loss: 1.4896 - regression_loss: 1.2539 - classification_loss: 0.2357 47/500 [=>............................] - ETA: 1:48 - loss: 1.4980 - regression_loss: 1.2592 - classification_loss: 0.2387 48/500 [=>............................] - ETA: 1:48 - loss: 1.4926 - regression_loss: 1.2545 - classification_loss: 0.2381 49/500 [=>............................] - ETA: 1:48 - loss: 1.4913 - regression_loss: 1.2540 - classification_loss: 0.2373 50/500 [==>...........................] - ETA: 1:47 - loss: 1.4866 - regression_loss: 1.2503 - classification_loss: 0.2363 51/500 [==>...........................] - ETA: 1:47 - loss: 1.4792 - regression_loss: 1.2451 - classification_loss: 0.2341 52/500 [==>...........................] - ETA: 1:46 - loss: 1.4817 - regression_loss: 1.2495 - classification_loss: 0.2322 53/500 [==>...........................] - ETA: 1:46 - loss: 1.5029 - regression_loss: 1.2669 - classification_loss: 0.2360 54/500 [==>...........................] - ETA: 1:46 - loss: 1.4965 - regression_loss: 1.2590 - classification_loss: 0.2375 55/500 [==>...........................] - ETA: 1:46 - loss: 1.4965 - regression_loss: 1.2586 - classification_loss: 0.2378 56/500 [==>...........................] - ETA: 1:45 - loss: 1.4948 - regression_loss: 1.2577 - classification_loss: 0.2371 57/500 [==>...........................] - ETA: 1:45 - loss: 1.4745 - regression_loss: 1.2401 - classification_loss: 0.2344 58/500 [==>...........................] - ETA: 1:45 - loss: 1.4761 - regression_loss: 1.2406 - classification_loss: 0.2355 59/500 [==>...........................] - ETA: 1:45 - loss: 1.4761 - regression_loss: 1.2406 - classification_loss: 0.2355 60/500 [==>...........................] - ETA: 1:44 - loss: 1.4806 - regression_loss: 1.2447 - classification_loss: 0.2360 61/500 [==>...........................] - ETA: 1:44 - loss: 1.4787 - regression_loss: 1.2440 - classification_loss: 0.2347 62/500 [==>...........................] 
- ETA: 1:44 - loss: 1.4720 - regression_loss: 1.2392 - classification_loss: 0.2328 63/500 [==>...........................] - ETA: 1:44 - loss: 1.4751 - regression_loss: 1.2417 - classification_loss: 0.2334 64/500 [==>...........................] - ETA: 1:44 - loss: 1.4710 - regression_loss: 1.2393 - classification_loss: 0.2317 65/500 [==>...........................] - ETA: 1:43 - loss: 1.4719 - regression_loss: 1.2406 - classification_loss: 0.2313 66/500 [==>...........................] - ETA: 1:43 - loss: 1.4702 - regression_loss: 1.2398 - classification_loss: 0.2304 67/500 [===>..........................] - ETA: 1:43 - loss: 1.4802 - regression_loss: 1.2483 - classification_loss: 0.2319 68/500 [===>..........................] - ETA: 1:43 - loss: 1.4735 - regression_loss: 1.2431 - classification_loss: 0.2304 69/500 [===>..........................] - ETA: 1:42 - loss: 1.4663 - regression_loss: 1.2380 - classification_loss: 0.2283 70/500 [===>..........................] - ETA: 1:42 - loss: 1.4644 - regression_loss: 1.2367 - classification_loss: 0.2277 71/500 [===>..........................] - ETA: 1:42 - loss: 1.4719 - regression_loss: 1.2434 - classification_loss: 0.2286 72/500 [===>..........................] - ETA: 1:42 - loss: 1.4682 - regression_loss: 1.2404 - classification_loss: 0.2278 73/500 [===>..........................] - ETA: 1:41 - loss: 1.4606 - regression_loss: 1.2342 - classification_loss: 0.2264 74/500 [===>..........................] - ETA: 1:41 - loss: 1.4631 - regression_loss: 1.2354 - classification_loss: 0.2278 75/500 [===>..........................] - ETA: 1:41 - loss: 1.4680 - regression_loss: 1.2390 - classification_loss: 0.2290 76/500 [===>..........................] - ETA: 1:41 - loss: 1.4529 - regression_loss: 1.2261 - classification_loss: 0.2268 77/500 [===>..........................] - ETA: 1:40 - loss: 1.4530 - regression_loss: 1.2272 - classification_loss: 0.2259 78/500 [===>..........................] 
- ETA: 1:40 - loss: 1.4524 - regression_loss: 1.2271 - classification_loss: 0.2252 79/500 [===>..........................] - ETA: 1:40 - loss: 1.4461 - regression_loss: 1.2221 - classification_loss: 0.2240 80/500 [===>..........................] - ETA: 1:40 - loss: 1.4444 - regression_loss: 1.2206 - classification_loss: 0.2238 81/500 [===>..........................] - ETA: 1:39 - loss: 1.4464 - regression_loss: 1.2223 - classification_loss: 0.2241 82/500 [===>..........................] - ETA: 1:39 - loss: 1.4375 - regression_loss: 1.2149 - classification_loss: 0.2226 83/500 [===>..........................] - ETA: 1:39 - loss: 1.4308 - regression_loss: 1.2094 - classification_loss: 0.2214 84/500 [====>.........................] - ETA: 1:39 - loss: 1.4312 - regression_loss: 1.2093 - classification_loss: 0.2219 85/500 [====>.........................] - ETA: 1:39 - loss: 1.4333 - regression_loss: 1.2107 - classification_loss: 0.2225 86/500 [====>.........................] - ETA: 1:38 - loss: 1.4323 - regression_loss: 1.2100 - classification_loss: 0.2223 87/500 [====>.........................] - ETA: 1:38 - loss: 1.4246 - regression_loss: 1.2023 - classification_loss: 0.2223 88/500 [====>.........................] - ETA: 1:38 - loss: 1.4240 - regression_loss: 1.2031 - classification_loss: 0.2210 89/500 [====>.........................] - ETA: 1:38 - loss: 1.4163 - regression_loss: 1.1974 - classification_loss: 0.2189 90/500 [====>.........................] - ETA: 1:37 - loss: 1.4243 - regression_loss: 1.2050 - classification_loss: 0.2193 91/500 [====>.........................] - ETA: 1:37 - loss: 1.4228 - regression_loss: 1.2043 - classification_loss: 0.2184 92/500 [====>.........................] - ETA: 1:37 - loss: 1.4188 - regression_loss: 1.2016 - classification_loss: 0.2172 93/500 [====>.........................] - ETA: 1:37 - loss: 1.4090 - regression_loss: 1.1934 - classification_loss: 0.2156 94/500 [====>.........................] 
- ETA: 1:36 - loss: 1.4093 - regression_loss: 1.1936 - classification_loss: 0.2157
[per-step progress output for steps 95–429/500 condensed: total loss declined from ≈1.41 to ≈1.37, regression_loss ≈1.19 → ≈1.15, classification_loss held near ≈0.22]
430/500 [========================>.....] 
- ETA: 16s - loss: 1.3692 - regression_loss: 1.1484 - classification_loss: 0.2208 431/500 [========================>.....] - ETA: 16s - loss: 1.3703 - regression_loss: 1.1492 - classification_loss: 0.2211 432/500 [========================>.....] - ETA: 16s - loss: 1.3720 - regression_loss: 1.1503 - classification_loss: 0.2217 433/500 [========================>.....] - ETA: 15s - loss: 1.3721 - regression_loss: 1.1503 - classification_loss: 0.2217 434/500 [=========================>....] - ETA: 15s - loss: 1.3719 - regression_loss: 1.1504 - classification_loss: 0.2215 435/500 [=========================>....] - ETA: 15s - loss: 1.3744 - regression_loss: 1.1522 - classification_loss: 0.2222 436/500 [=========================>....] - ETA: 15s - loss: 1.3745 - regression_loss: 1.1523 - classification_loss: 0.2223 437/500 [=========================>....] - ETA: 14s - loss: 1.3742 - regression_loss: 1.1521 - classification_loss: 0.2221 438/500 [=========================>....] - ETA: 14s - loss: 1.3742 - regression_loss: 1.1519 - classification_loss: 0.2223 439/500 [=========================>....] - ETA: 14s - loss: 1.3759 - regression_loss: 1.1531 - classification_loss: 0.2228 440/500 [=========================>....] - ETA: 14s - loss: 1.3762 - regression_loss: 1.1533 - classification_loss: 0.2228 441/500 [=========================>....] - ETA: 13s - loss: 1.3764 - regression_loss: 1.1535 - classification_loss: 0.2230 442/500 [=========================>....] - ETA: 13s - loss: 1.3759 - regression_loss: 1.1532 - classification_loss: 0.2228 443/500 [=========================>....] - ETA: 13s - loss: 1.3759 - regression_loss: 1.1531 - classification_loss: 0.2228 444/500 [=========================>....] - ETA: 13s - loss: 1.3762 - regression_loss: 1.1534 - classification_loss: 0.2228 445/500 [=========================>....] - ETA: 12s - loss: 1.3765 - regression_loss: 1.1538 - classification_loss: 0.2227 446/500 [=========================>....] 
- ETA: 12s - loss: 1.3783 - regression_loss: 1.1552 - classification_loss: 0.2231 447/500 [=========================>....] - ETA: 12s - loss: 1.3778 - regression_loss: 1.1549 - classification_loss: 0.2229 448/500 [=========================>....] - ETA: 12s - loss: 1.3769 - regression_loss: 1.1542 - classification_loss: 0.2227 449/500 [=========================>....] - ETA: 12s - loss: 1.3751 - regression_loss: 1.1527 - classification_loss: 0.2224 450/500 [==========================>...] - ETA: 11s - loss: 1.3757 - regression_loss: 1.1531 - classification_loss: 0.2225 451/500 [==========================>...] - ETA: 11s - loss: 1.3762 - regression_loss: 1.1536 - classification_loss: 0.2226 452/500 [==========================>...] - ETA: 11s - loss: 1.3746 - regression_loss: 1.1524 - classification_loss: 0.2223 453/500 [==========================>...] - ETA: 11s - loss: 1.3741 - regression_loss: 1.1520 - classification_loss: 0.2221 454/500 [==========================>...] - ETA: 10s - loss: 1.3733 - regression_loss: 1.1512 - classification_loss: 0.2221 455/500 [==========================>...] - ETA: 10s - loss: 1.3724 - regression_loss: 1.1505 - classification_loss: 0.2218 456/500 [==========================>...] - ETA: 10s - loss: 1.3725 - regression_loss: 1.1507 - classification_loss: 0.2219 457/500 [==========================>...] - ETA: 10s - loss: 1.3714 - regression_loss: 1.1497 - classification_loss: 0.2217 458/500 [==========================>...] - ETA: 9s - loss: 1.3714 - regression_loss: 1.1498 - classification_loss: 0.2217  459/500 [==========================>...] - ETA: 9s - loss: 1.3715 - regression_loss: 1.1500 - classification_loss: 0.2214 460/500 [==========================>...] - ETA: 9s - loss: 1.3715 - regression_loss: 1.1503 - classification_loss: 0.2212 461/500 [==========================>...] - ETA: 9s - loss: 1.3718 - regression_loss: 1.1506 - classification_loss: 0.2212 462/500 [==========================>...] 
- ETA: 8s - loss: 1.3704 - regression_loss: 1.1495 - classification_loss: 0.2209 463/500 [==========================>...] - ETA: 8s - loss: 1.3699 - regression_loss: 1.1492 - classification_loss: 0.2207 464/500 [==========================>...] - ETA: 8s - loss: 1.3679 - regression_loss: 1.1475 - classification_loss: 0.2204 465/500 [==========================>...] - ETA: 8s - loss: 1.3681 - regression_loss: 1.1479 - classification_loss: 0.2203 466/500 [==========================>...] - ETA: 8s - loss: 1.3677 - regression_loss: 1.1475 - classification_loss: 0.2202 467/500 [===========================>..] - ETA: 7s - loss: 1.3686 - regression_loss: 1.1485 - classification_loss: 0.2202 468/500 [===========================>..] - ETA: 7s - loss: 1.3671 - regression_loss: 1.1472 - classification_loss: 0.2198 469/500 [===========================>..] - ETA: 7s - loss: 1.3654 - regression_loss: 1.1458 - classification_loss: 0.2196 470/500 [===========================>..] - ETA: 7s - loss: 1.3653 - regression_loss: 1.1458 - classification_loss: 0.2195 471/500 [===========================>..] - ETA: 6s - loss: 1.3650 - regression_loss: 1.1457 - classification_loss: 0.2193 472/500 [===========================>..] - ETA: 6s - loss: 1.3651 - regression_loss: 1.1456 - classification_loss: 0.2194 473/500 [===========================>..] - ETA: 6s - loss: 1.3633 - regression_loss: 1.1442 - classification_loss: 0.2191 474/500 [===========================>..] - ETA: 6s - loss: 1.3613 - regression_loss: 1.1424 - classification_loss: 0.2188 475/500 [===========================>..] - ETA: 5s - loss: 1.3612 - regression_loss: 1.1425 - classification_loss: 0.2188 476/500 [===========================>..] - ETA: 5s - loss: 1.3613 - regression_loss: 1.1425 - classification_loss: 0.2188 477/500 [===========================>..] - ETA: 5s - loss: 1.3629 - regression_loss: 1.1437 - classification_loss: 0.2193 478/500 [===========================>..] 
- ETA: 5s - loss: 1.3625 - regression_loss: 1.1435 - classification_loss: 0.2190 479/500 [===========================>..] - ETA: 4s - loss: 1.3621 - regression_loss: 1.1433 - classification_loss: 0.2188 480/500 [===========================>..] - ETA: 4s - loss: 1.3613 - regression_loss: 1.1427 - classification_loss: 0.2187 481/500 [===========================>..] - ETA: 4s - loss: 1.3619 - regression_loss: 1.1430 - classification_loss: 0.2188 482/500 [===========================>..] - ETA: 4s - loss: 1.3617 - regression_loss: 1.1427 - classification_loss: 0.2190 483/500 [===========================>..] - ETA: 4s - loss: 1.3605 - regression_loss: 1.1417 - classification_loss: 0.2188 484/500 [============================>.] - ETA: 3s - loss: 1.3619 - regression_loss: 1.1427 - classification_loss: 0.2192 485/500 [============================>.] - ETA: 3s - loss: 1.3623 - regression_loss: 1.1429 - classification_loss: 0.2194 486/500 [============================>.] - ETA: 3s - loss: 1.3631 - regression_loss: 1.1437 - classification_loss: 0.2194 487/500 [============================>.] - ETA: 3s - loss: 1.3628 - regression_loss: 1.1435 - classification_loss: 0.2193 488/500 [============================>.] - ETA: 2s - loss: 1.3642 - regression_loss: 1.1445 - classification_loss: 0.2197 489/500 [============================>.] - ETA: 2s - loss: 1.3653 - regression_loss: 1.1455 - classification_loss: 0.2197 490/500 [============================>.] - ETA: 2s - loss: 1.3653 - regression_loss: 1.1455 - classification_loss: 0.2198 491/500 [============================>.] - ETA: 2s - loss: 1.3649 - regression_loss: 1.1452 - classification_loss: 0.2197 492/500 [============================>.] - ETA: 1s - loss: 1.3657 - regression_loss: 1.1457 - classification_loss: 0.2199 493/500 [============================>.] - ETA: 1s - loss: 1.3657 - regression_loss: 1.1457 - classification_loss: 0.2200 494/500 [============================>.] 
- ETA: 1s - loss: 1.3636 - regression_loss: 1.1440 - classification_loss: 0.2197 495/500 [============================>.] - ETA: 1s - loss: 1.3640 - regression_loss: 1.1443 - classification_loss: 0.2197 496/500 [============================>.] - ETA: 0s - loss: 1.3647 - regression_loss: 1.1450 - classification_loss: 0.2197 497/500 [============================>.] - ETA: 0s - loss: 1.3645 - regression_loss: 1.1449 - classification_loss: 0.2196 498/500 [============================>.] - ETA: 0s - loss: 1.3630 - regression_loss: 1.1436 - classification_loss: 0.2194 499/500 [============================>.] - ETA: 0s - loss: 1.3631 - regression_loss: 1.1439 - classification_loss: 0.2193 500/500 [==============================] - 118s 237ms/step - loss: 1.3649 - regression_loss: 1.1452 - classification_loss: 0.2198 326 instances of class plum with average precision: 0.8042 mAP: 0.8042 Epoch 00082: saving model to ./training/snapshots/resnet50_pascal_82.h5 Epoch 83/150 1/500 [..............................] - ETA: 1:49 - loss: 1.0163 - regression_loss: 0.9093 - classification_loss: 0.1069 2/500 [..............................] - ETA: 1:47 - loss: 1.6028 - regression_loss: 1.3613 - classification_loss: 0.2415 3/500 [..............................] - ETA: 1:46 - loss: 1.3647 - regression_loss: 1.1410 - classification_loss: 0.2238 4/500 [..............................] - ETA: 1:48 - loss: 1.2542 - regression_loss: 1.0574 - classification_loss: 0.1967 5/500 [..............................] - ETA: 1:51 - loss: 1.1773 - regression_loss: 1.0050 - classification_loss: 0.1723 6/500 [..............................] - ETA: 1:54 - loss: 1.4795 - regression_loss: 1.2354 - classification_loss: 0.2441 7/500 [..............................] - ETA: 1:55 - loss: 1.4259 - regression_loss: 1.1916 - classification_loss: 0.2343 8/500 [..............................] - ETA: 1:54 - loss: 1.3879 - regression_loss: 1.1668 - classification_loss: 0.2210 9/500 [..............................] 
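Every progress line above reports the same decomposition: the total `loss` is the sum of the `regression_loss` and `classification_loss` terms (in keras-retinanet, smooth-L1 box regression plus focal classification, each averaged over the batches seen so far). A minimal sketch that parses one epoch-end summary line and checks this; the regex and the tolerance for 4-decimal rounding are my assumptions, not part of the logging code:

```python
import re

# One epoch-end summary line copied from the log above.
line = ("500/500 [==============================] - 118s 237ms/step"
        " - loss: 1.3649 - regression_loss: 1.1452 - classification_loss: 0.2198")

def parse_losses(text):
    """Extract the named loss values from a Keras progress-bar line."""
    return {name: float(value)
            for name, value in re.findall(r"(\w*loss): ([\d.]+)", text)}

losses = parse_losses(line)

# The printed total should equal the sum of the two components,
# up to rounding of the 4-decimal printed values.
total = losses["regression_loss"] + losses["classification_loss"]
assert abs(losses["loss"] - total) < 1e-3
```

The same check holds for any of the running per-step lines, which makes it a quick sanity filter when post-processing a saved training log.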
- ETA: 1:53 - loss: 1.3827 - regression_loss: 1.1690 - classification_loss: 0.2137
[... per-batch progress updates for steps 10-200 of epoch 83 elided ...]
201/500 [===========>..................]
- ETA: 1:13 - loss: 1.3406 - regression_loss: 1.1381 - classification_loss: 0.2025 202/500 [===========>..................] - ETA: 1:13 - loss: 1.3429 - regression_loss: 1.1395 - classification_loss: 0.2034 203/500 [===========>..................] - ETA: 1:12 - loss: 1.3424 - regression_loss: 1.1392 - classification_loss: 0.2031 204/500 [===========>..................] - ETA: 1:12 - loss: 1.3402 - regression_loss: 1.1373 - classification_loss: 0.2029 205/500 [===========>..................] - ETA: 1:12 - loss: 1.3409 - regression_loss: 1.1383 - classification_loss: 0.2026 206/500 [===========>..................] - ETA: 1:12 - loss: 1.3404 - regression_loss: 1.1381 - classification_loss: 0.2024 207/500 [===========>..................] - ETA: 1:11 - loss: 1.3401 - regression_loss: 1.1379 - classification_loss: 0.2021 208/500 [===========>..................] - ETA: 1:11 - loss: 1.3403 - regression_loss: 1.1386 - classification_loss: 0.2017 209/500 [===========>..................] - ETA: 1:11 - loss: 1.3415 - regression_loss: 1.1394 - classification_loss: 0.2021 210/500 [===========>..................] - ETA: 1:11 - loss: 1.3380 - regression_loss: 1.1365 - classification_loss: 0.2015 211/500 [===========>..................] - ETA: 1:10 - loss: 1.3378 - regression_loss: 1.1366 - classification_loss: 0.2012 212/500 [===========>..................] - ETA: 1:10 - loss: 1.3370 - regression_loss: 1.1360 - classification_loss: 0.2009 213/500 [===========>..................] - ETA: 1:10 - loss: 1.3393 - regression_loss: 1.1378 - classification_loss: 0.2015 214/500 [===========>..................] - ETA: 1:10 - loss: 1.3380 - regression_loss: 1.1369 - classification_loss: 0.2011 215/500 [===========>..................] - ETA: 1:09 - loss: 1.3393 - regression_loss: 1.1375 - classification_loss: 0.2018 216/500 [===========>..................] - ETA: 1:09 - loss: 1.3376 - regression_loss: 1.1360 - classification_loss: 0.2016 217/500 [============>.................] 
- ETA: 1:09 - loss: 1.3389 - regression_loss: 1.1371 - classification_loss: 0.2018 218/500 [============>.................] - ETA: 1:09 - loss: 1.3376 - regression_loss: 1.1361 - classification_loss: 0.2015 219/500 [============>.................] - ETA: 1:08 - loss: 1.3365 - regression_loss: 1.1353 - classification_loss: 0.2012 220/500 [============>.................] - ETA: 1:08 - loss: 1.3374 - regression_loss: 1.1359 - classification_loss: 0.2015 221/500 [============>.................] - ETA: 1:08 - loss: 1.3373 - regression_loss: 1.1361 - classification_loss: 0.2012 222/500 [============>.................] - ETA: 1:08 - loss: 1.3399 - regression_loss: 1.1378 - classification_loss: 0.2021 223/500 [============>.................] - ETA: 1:07 - loss: 1.3435 - regression_loss: 1.1405 - classification_loss: 0.2030 224/500 [============>.................] - ETA: 1:07 - loss: 1.3426 - regression_loss: 1.1398 - classification_loss: 0.2029 225/500 [============>.................] - ETA: 1:07 - loss: 1.3453 - regression_loss: 1.1420 - classification_loss: 0.2033 226/500 [============>.................] - ETA: 1:07 - loss: 1.3442 - regression_loss: 1.1411 - classification_loss: 0.2031 227/500 [============>.................] - ETA: 1:07 - loss: 1.3435 - regression_loss: 1.1409 - classification_loss: 0.2027 228/500 [============>.................] - ETA: 1:06 - loss: 1.3433 - regression_loss: 1.1405 - classification_loss: 0.2027 229/500 [============>.................] - ETA: 1:06 - loss: 1.3444 - regression_loss: 1.1412 - classification_loss: 0.2032 230/500 [============>.................] - ETA: 1:06 - loss: 1.3438 - regression_loss: 1.1408 - classification_loss: 0.2030 231/500 [============>.................] - ETA: 1:06 - loss: 1.3460 - regression_loss: 1.1425 - classification_loss: 0.2034 232/500 [============>.................] - ETA: 1:05 - loss: 1.3473 - regression_loss: 1.1439 - classification_loss: 0.2034 233/500 [============>.................] 
- ETA: 1:05 - loss: 1.3466 - regression_loss: 1.1434 - classification_loss: 0.2031 234/500 [=============>................] - ETA: 1:05 - loss: 1.3491 - regression_loss: 1.1457 - classification_loss: 0.2034 235/500 [=============>................] - ETA: 1:05 - loss: 1.3530 - regression_loss: 1.1485 - classification_loss: 0.2044 236/500 [=============>................] - ETA: 1:04 - loss: 1.3549 - regression_loss: 1.1501 - classification_loss: 0.2048 237/500 [=============>................] - ETA: 1:04 - loss: 1.3536 - regression_loss: 1.1488 - classification_loss: 0.2049 238/500 [=============>................] - ETA: 1:04 - loss: 1.3555 - regression_loss: 1.1500 - classification_loss: 0.2055 239/500 [=============>................] - ETA: 1:04 - loss: 1.3515 - regression_loss: 1.1465 - classification_loss: 0.2049 240/500 [=============>................] - ETA: 1:03 - loss: 1.3506 - regression_loss: 1.1458 - classification_loss: 0.2048 241/500 [=============>................] - ETA: 1:03 - loss: 1.3499 - regression_loss: 1.1451 - classification_loss: 0.2048 242/500 [=============>................] - ETA: 1:03 - loss: 1.3505 - regression_loss: 1.1459 - classification_loss: 0.2046 243/500 [=============>................] - ETA: 1:03 - loss: 1.3487 - regression_loss: 1.1447 - classification_loss: 0.2040 244/500 [=============>................] - ETA: 1:02 - loss: 1.3505 - regression_loss: 1.1460 - classification_loss: 0.2046 245/500 [=============>................] - ETA: 1:02 - loss: 1.3501 - regression_loss: 1.1455 - classification_loss: 0.2046 246/500 [=============>................] - ETA: 1:02 - loss: 1.3500 - regression_loss: 1.1456 - classification_loss: 0.2044 247/500 [=============>................] - ETA: 1:02 - loss: 1.3499 - regression_loss: 1.1454 - classification_loss: 0.2045 248/500 [=============>................] - ETA: 1:01 - loss: 1.3527 - regression_loss: 1.1475 - classification_loss: 0.2052 249/500 [=============>................] 
- ETA: 1:01 - loss: 1.3538 - regression_loss: 1.1486 - classification_loss: 0.2052 250/500 [==============>...............] - ETA: 1:01 - loss: 1.3555 - regression_loss: 1.1499 - classification_loss: 0.2056 251/500 [==============>...............] - ETA: 1:01 - loss: 1.3539 - regression_loss: 1.1488 - classification_loss: 0.2051 252/500 [==============>...............] - ETA: 1:00 - loss: 1.3544 - regression_loss: 1.1491 - classification_loss: 0.2053 253/500 [==============>...............] - ETA: 1:00 - loss: 1.3540 - regression_loss: 1.1487 - classification_loss: 0.2053 254/500 [==============>...............] - ETA: 1:00 - loss: 1.3533 - regression_loss: 1.1483 - classification_loss: 0.2051 255/500 [==============>...............] - ETA: 1:00 - loss: 1.3528 - regression_loss: 1.1479 - classification_loss: 0.2050 256/500 [==============>...............] - ETA: 59s - loss: 1.3532 - regression_loss: 1.1483 - classification_loss: 0.2049  257/500 [==============>...............] - ETA: 59s - loss: 1.3546 - regression_loss: 1.1495 - classification_loss: 0.2051 258/500 [==============>...............] - ETA: 59s - loss: 1.3555 - regression_loss: 1.1502 - classification_loss: 0.2053 259/500 [==============>...............] - ETA: 59s - loss: 1.3516 - regression_loss: 1.1468 - classification_loss: 0.2048 260/500 [==============>...............] - ETA: 58s - loss: 1.3528 - regression_loss: 1.1479 - classification_loss: 0.2048 261/500 [==============>...............] - ETA: 58s - loss: 1.3510 - regression_loss: 1.1465 - classification_loss: 0.2045 262/500 [==============>...............] - ETA: 58s - loss: 1.3527 - regression_loss: 1.1476 - classification_loss: 0.2050 263/500 [==============>...............] - ETA: 58s - loss: 1.3520 - regression_loss: 1.1472 - classification_loss: 0.2048 264/500 [==============>...............] - ETA: 57s - loss: 1.3532 - regression_loss: 1.1482 - classification_loss: 0.2050 265/500 [==============>...............] 
- ETA: 57s - loss: 1.3500 - regression_loss: 1.1456 - classification_loss: 0.2044 266/500 [==============>...............] - ETA: 57s - loss: 1.3526 - regression_loss: 1.1474 - classification_loss: 0.2052 267/500 [===============>..............] - ETA: 57s - loss: 1.3495 - regression_loss: 1.1449 - classification_loss: 0.2046 268/500 [===============>..............] - ETA: 57s - loss: 1.3484 - regression_loss: 1.1441 - classification_loss: 0.2043 269/500 [===============>..............] - ETA: 56s - loss: 1.3467 - regression_loss: 1.1428 - classification_loss: 0.2040 270/500 [===============>..............] - ETA: 56s - loss: 1.3454 - regression_loss: 1.1419 - classification_loss: 0.2035 271/500 [===============>..............] - ETA: 56s - loss: 1.3447 - regression_loss: 1.1413 - classification_loss: 0.2034 272/500 [===============>..............] - ETA: 56s - loss: 1.3456 - regression_loss: 1.1421 - classification_loss: 0.2034 273/500 [===============>..............] - ETA: 55s - loss: 1.3454 - regression_loss: 1.1419 - classification_loss: 0.2035 274/500 [===============>..............] - ETA: 55s - loss: 1.3458 - regression_loss: 1.1423 - classification_loss: 0.2035 275/500 [===============>..............] - ETA: 55s - loss: 1.3469 - regression_loss: 1.1430 - classification_loss: 0.2040 276/500 [===============>..............] - ETA: 55s - loss: 1.3481 - regression_loss: 1.1441 - classification_loss: 0.2041 277/500 [===============>..............] - ETA: 54s - loss: 1.3466 - regression_loss: 1.1429 - classification_loss: 0.2037 278/500 [===============>..............] - ETA: 54s - loss: 1.3475 - regression_loss: 1.1437 - classification_loss: 0.2038 279/500 [===============>..............] - ETA: 54s - loss: 1.3468 - regression_loss: 1.1432 - classification_loss: 0.2037 280/500 [===============>..............] - ETA: 54s - loss: 1.3502 - regression_loss: 1.1458 - classification_loss: 0.2044 281/500 [===============>..............] 
- ETA: 53s - loss: 1.3522 - regression_loss: 1.1474 - classification_loss: 0.2048 282/500 [===============>..............] - ETA: 53s - loss: 1.3518 - regression_loss: 1.1473 - classification_loss: 0.2045 283/500 [===============>..............] - ETA: 53s - loss: 1.3505 - regression_loss: 1.1463 - classification_loss: 0.2042 284/500 [================>.............] - ETA: 53s - loss: 1.3498 - regression_loss: 1.1458 - classification_loss: 0.2040 285/500 [================>.............] - ETA: 52s - loss: 1.3505 - regression_loss: 1.1462 - classification_loss: 0.2044 286/500 [================>.............] - ETA: 52s - loss: 1.3552 - regression_loss: 1.1498 - classification_loss: 0.2053 287/500 [================>.............] - ETA: 52s - loss: 1.3559 - regression_loss: 1.1504 - classification_loss: 0.2054 288/500 [================>.............] - ETA: 52s - loss: 1.3543 - regression_loss: 1.1491 - classification_loss: 0.2052 289/500 [================>.............] - ETA: 51s - loss: 1.3543 - regression_loss: 1.1493 - classification_loss: 0.2050 290/500 [================>.............] - ETA: 51s - loss: 1.3528 - regression_loss: 1.1482 - classification_loss: 0.2047 291/500 [================>.............] - ETA: 51s - loss: 1.3524 - regression_loss: 1.1480 - classification_loss: 0.2044 292/500 [================>.............] - ETA: 51s - loss: 1.3538 - regression_loss: 1.1495 - classification_loss: 0.2043 293/500 [================>.............] - ETA: 50s - loss: 1.3555 - regression_loss: 1.1510 - classification_loss: 0.2045 294/500 [================>.............] - ETA: 50s - loss: 1.3550 - regression_loss: 1.1508 - classification_loss: 0.2042 295/500 [================>.............] - ETA: 50s - loss: 1.3563 - regression_loss: 1.1515 - classification_loss: 0.2048 296/500 [================>.............] - ETA: 50s - loss: 1.3539 - regression_loss: 1.1495 - classification_loss: 0.2044 297/500 [================>.............] 
- ETA: 49s - loss: 1.3562 - regression_loss: 1.1511 - classification_loss: 0.2051 298/500 [================>.............] - ETA: 49s - loss: 1.3552 - regression_loss: 1.1503 - classification_loss: 0.2049 299/500 [================>.............] - ETA: 49s - loss: 1.3523 - regression_loss: 1.1479 - classification_loss: 0.2043 300/500 [=================>............] - ETA: 49s - loss: 1.3524 - regression_loss: 1.1483 - classification_loss: 0.2042 301/500 [=================>............] - ETA: 48s - loss: 1.3542 - regression_loss: 1.1498 - classification_loss: 0.2044 302/500 [=================>............] - ETA: 48s - loss: 1.3532 - regression_loss: 1.1490 - classification_loss: 0.2042 303/500 [=================>............] - ETA: 48s - loss: 1.3526 - regression_loss: 1.1480 - classification_loss: 0.2046 304/500 [=================>............] - ETA: 48s - loss: 1.3493 - regression_loss: 1.1453 - classification_loss: 0.2041 305/500 [=================>............] - ETA: 47s - loss: 1.3542 - regression_loss: 1.1490 - classification_loss: 0.2052 306/500 [=================>............] - ETA: 47s - loss: 1.3549 - regression_loss: 1.1495 - classification_loss: 0.2054 307/500 [=================>............] - ETA: 47s - loss: 1.3545 - regression_loss: 1.1493 - classification_loss: 0.2052 308/500 [=================>............] - ETA: 47s - loss: 1.3575 - regression_loss: 1.1518 - classification_loss: 0.2057 309/500 [=================>............] - ETA: 46s - loss: 1.3574 - regression_loss: 1.1514 - classification_loss: 0.2060 310/500 [=================>............] - ETA: 46s - loss: 1.3573 - regression_loss: 1.1512 - classification_loss: 0.2060 311/500 [=================>............] - ETA: 46s - loss: 1.3577 - regression_loss: 1.1514 - classification_loss: 0.2063 312/500 [=================>............] - ETA: 46s - loss: 1.3564 - regression_loss: 1.1505 - classification_loss: 0.2060 313/500 [=================>............] 
- ETA: 45s - loss: 1.3554 - regression_loss: 1.1496 - classification_loss: 0.2058 314/500 [=================>............] - ETA: 45s - loss: 1.3562 - regression_loss: 1.1502 - classification_loss: 0.2061 315/500 [=================>............] - ETA: 45s - loss: 1.3562 - regression_loss: 1.1501 - classification_loss: 0.2061 316/500 [=================>............] - ETA: 45s - loss: 1.3562 - regression_loss: 1.1501 - classification_loss: 0.2061 317/500 [==================>...........] - ETA: 44s - loss: 1.3556 - regression_loss: 1.1498 - classification_loss: 0.2058 318/500 [==================>...........] - ETA: 44s - loss: 1.3548 - regression_loss: 1.1493 - classification_loss: 0.2055 319/500 [==================>...........] - ETA: 44s - loss: 1.3556 - regression_loss: 1.1502 - classification_loss: 0.2055 320/500 [==================>...........] - ETA: 44s - loss: 1.3548 - regression_loss: 1.1495 - classification_loss: 0.2053 321/500 [==================>...........] - ETA: 43s - loss: 1.3549 - regression_loss: 1.1496 - classification_loss: 0.2054 322/500 [==================>...........] - ETA: 43s - loss: 1.3545 - regression_loss: 1.1488 - classification_loss: 0.2057 323/500 [==================>...........] - ETA: 43s - loss: 1.3552 - regression_loss: 1.1493 - classification_loss: 0.2059 324/500 [==================>...........] - ETA: 43s - loss: 1.3542 - regression_loss: 1.1484 - classification_loss: 0.2058 325/500 [==================>...........] - ETA: 42s - loss: 1.3555 - regression_loss: 1.1495 - classification_loss: 0.2060 326/500 [==================>...........] - ETA: 42s - loss: 1.3544 - regression_loss: 1.1486 - classification_loss: 0.2058 327/500 [==================>...........] - ETA: 42s - loss: 1.3530 - regression_loss: 1.1470 - classification_loss: 0.2060 328/500 [==================>...........] - ETA: 42s - loss: 1.3535 - regression_loss: 1.1476 - classification_loss: 0.2059 329/500 [==================>...........] 
- ETA: 41s - loss: 1.3534 - regression_loss: 1.1477 - classification_loss: 0.2057 330/500 [==================>...........] - ETA: 41s - loss: 1.3544 - regression_loss: 1.1489 - classification_loss: 0.2055 331/500 [==================>...........] - ETA: 41s - loss: 1.3538 - regression_loss: 1.1485 - classification_loss: 0.2053 332/500 [==================>...........] - ETA: 41s - loss: 1.3527 - regression_loss: 1.1476 - classification_loss: 0.2050 333/500 [==================>...........] - ETA: 40s - loss: 1.3521 - regression_loss: 1.1474 - classification_loss: 0.2047 334/500 [===================>..........] - ETA: 40s - loss: 1.3491 - regression_loss: 1.1449 - classification_loss: 0.2042 335/500 [===================>..........] - ETA: 40s - loss: 1.3495 - regression_loss: 1.1453 - classification_loss: 0.2041 336/500 [===================>..........] - ETA: 40s - loss: 1.3495 - regression_loss: 1.1454 - classification_loss: 0.2041 337/500 [===================>..........] - ETA: 40s - loss: 1.3486 - regression_loss: 1.1448 - classification_loss: 0.2038 338/500 [===================>..........] - ETA: 39s - loss: 1.3486 - regression_loss: 1.1449 - classification_loss: 0.2036 339/500 [===================>..........] - ETA: 39s - loss: 1.3487 - regression_loss: 1.1449 - classification_loss: 0.2038 340/500 [===================>..........] - ETA: 39s - loss: 1.3484 - regression_loss: 1.1445 - classification_loss: 0.2039 341/500 [===================>..........] - ETA: 39s - loss: 1.3504 - regression_loss: 1.1465 - classification_loss: 0.2040 342/500 [===================>..........] - ETA: 38s - loss: 1.3509 - regression_loss: 1.1472 - classification_loss: 0.2037 343/500 [===================>..........] - ETA: 38s - loss: 1.3510 - regression_loss: 1.1473 - classification_loss: 0.2037 344/500 [===================>..........] - ETA: 38s - loss: 1.3515 - regression_loss: 1.1477 - classification_loss: 0.2038 345/500 [===================>..........] 
- ETA: 38s - loss: 1.3509 - regression_loss: 1.1472 - classification_loss: 0.2037 346/500 [===================>..........] - ETA: 37s - loss: 1.3503 - regression_loss: 1.1467 - classification_loss: 0.2036 347/500 [===================>..........] - ETA: 37s - loss: 1.3505 - regression_loss: 1.1467 - classification_loss: 0.2037 348/500 [===================>..........] - ETA: 37s - loss: 1.3511 - regression_loss: 1.1472 - classification_loss: 0.2038 349/500 [===================>..........] - ETA: 37s - loss: 1.3511 - regression_loss: 1.1474 - classification_loss: 0.2037 350/500 [====================>.........] - ETA: 36s - loss: 1.3493 - regression_loss: 1.1459 - classification_loss: 0.2034 351/500 [====================>.........] - ETA: 36s - loss: 1.3502 - regression_loss: 1.1466 - classification_loss: 0.2036 352/500 [====================>.........] - ETA: 36s - loss: 1.3482 - regression_loss: 1.1449 - classification_loss: 0.2033 353/500 [====================>.........] - ETA: 36s - loss: 1.3474 - regression_loss: 1.1443 - classification_loss: 0.2031 354/500 [====================>.........] - ETA: 35s - loss: 1.3477 - regression_loss: 1.1442 - classification_loss: 0.2035 355/500 [====================>.........] - ETA: 35s - loss: 1.3481 - regression_loss: 1.1447 - classification_loss: 0.2034 356/500 [====================>.........] - ETA: 35s - loss: 1.3470 - regression_loss: 1.1438 - classification_loss: 0.2033 357/500 [====================>.........] - ETA: 35s - loss: 1.3488 - regression_loss: 1.1452 - classification_loss: 0.2035 358/500 [====================>.........] - ETA: 34s - loss: 1.3490 - regression_loss: 1.1453 - classification_loss: 0.2037 359/500 [====================>.........] - ETA: 34s - loss: 1.3517 - regression_loss: 1.1471 - classification_loss: 0.2046 360/500 [====================>.........] - ETA: 34s - loss: 1.3498 - regression_loss: 1.1455 - classification_loss: 0.2043 361/500 [====================>.........] 
- ETA: 34s - loss: 1.3490 - regression_loss: 1.1449 - classification_loss: 0.2041 362/500 [====================>.........] - ETA: 33s - loss: 1.3481 - regression_loss: 1.1443 - classification_loss: 0.2039 363/500 [====================>.........] - ETA: 33s - loss: 1.3474 - regression_loss: 1.1437 - classification_loss: 0.2037 364/500 [====================>.........] - ETA: 33s - loss: 1.3468 - regression_loss: 1.1433 - classification_loss: 0.2035 365/500 [====================>.........] - ETA: 33s - loss: 1.3462 - regression_loss: 1.1428 - classification_loss: 0.2034 366/500 [====================>.........] - ETA: 32s - loss: 1.3468 - regression_loss: 1.1432 - classification_loss: 0.2036 367/500 [=====================>........] - ETA: 32s - loss: 1.3454 - regression_loss: 1.1421 - classification_loss: 0.2033 368/500 [=====================>........] - ETA: 32s - loss: 1.3446 - regression_loss: 1.1413 - classification_loss: 0.2033 369/500 [=====================>........] - ETA: 32s - loss: 1.3425 - regression_loss: 1.1395 - classification_loss: 0.2031 370/500 [=====================>........] - ETA: 31s - loss: 1.3434 - regression_loss: 1.1399 - classification_loss: 0.2034 371/500 [=====================>........] - ETA: 31s - loss: 1.3421 - regression_loss: 1.1389 - classification_loss: 0.2032 372/500 [=====================>........] - ETA: 31s - loss: 1.3431 - regression_loss: 1.1398 - classification_loss: 0.2033 373/500 [=====================>........] - ETA: 31s - loss: 1.3455 - regression_loss: 1.1416 - classification_loss: 0.2040 374/500 [=====================>........] - ETA: 30s - loss: 1.3454 - regression_loss: 1.1414 - classification_loss: 0.2040 375/500 [=====================>........] - ETA: 30s - loss: 1.3481 - regression_loss: 1.1436 - classification_loss: 0.2045 376/500 [=====================>........] - ETA: 30s - loss: 1.3509 - regression_loss: 1.1456 - classification_loss: 0.2053 377/500 [=====================>........] 
- ETA: 29s - loss: 1.3508 - regression_loss: 1.1455 - classification_loss: 0.2052 378/500 [=====================>........] - ETA: 29s - loss: 1.3523 - regression_loss: 1.1468 - classification_loss: 0.2055 379/500 [=====================>........] - ETA: 29s - loss: 1.3574 - regression_loss: 1.1481 - classification_loss: 0.2093 380/500 [=====================>........] - ETA: 29s - loss: 1.3575 - regression_loss: 1.1484 - classification_loss: 0.2091 381/500 [=====================>........] - ETA: 28s - loss: 1.3569 - regression_loss: 1.1480 - classification_loss: 0.2089 382/500 [=====================>........] - ETA: 28s - loss: 1.3546 - regression_loss: 1.1450 - classification_loss: 0.2095 383/500 [=====================>........] - ETA: 28s - loss: 1.3543 - regression_loss: 1.1449 - classification_loss: 0.2094 384/500 [======================>.......] - ETA: 28s - loss: 1.3546 - regression_loss: 1.1453 - classification_loss: 0.2093 385/500 [======================>.......] - ETA: 27s - loss: 1.3568 - regression_loss: 1.1471 - classification_loss: 0.2098 386/500 [======================>.......] - ETA: 27s - loss: 1.3557 - regression_loss: 1.1461 - classification_loss: 0.2095 387/500 [======================>.......] - ETA: 27s - loss: 1.3579 - regression_loss: 1.1481 - classification_loss: 0.2098 388/500 [======================>.......] - ETA: 27s - loss: 1.3577 - regression_loss: 1.1478 - classification_loss: 0.2099 389/500 [======================>.......] - ETA: 26s - loss: 1.3585 - regression_loss: 1.1485 - classification_loss: 0.2100 390/500 [======================>.......] - ETA: 26s - loss: 1.3575 - regression_loss: 1.1476 - classification_loss: 0.2099 391/500 [======================>.......] - ETA: 26s - loss: 1.3567 - regression_loss: 1.1470 - classification_loss: 0.2097 392/500 [======================>.......] - ETA: 26s - loss: 1.3581 - regression_loss: 1.1481 - classification_loss: 0.2100 393/500 [======================>.......] 
- ETA: 25s - loss: 1.3593 - regression_loss: 1.1492 - classification_loss: 0.2102 394/500 [======================>.......] - ETA: 25s - loss: 1.3594 - regression_loss: 1.1493 - classification_loss: 0.2101 395/500 [======================>.......] - ETA: 25s - loss: 1.3607 - regression_loss: 1.1503 - classification_loss: 0.2104 396/500 [======================>.......] - ETA: 25s - loss: 1.3604 - regression_loss: 1.1500 - classification_loss: 0.2105 397/500 [======================>.......] - ETA: 24s - loss: 1.3584 - regression_loss: 1.1483 - classification_loss: 0.2101 398/500 [======================>.......] - ETA: 24s - loss: 1.3569 - regression_loss: 1.1470 - classification_loss: 0.2098 399/500 [======================>.......] - ETA: 24s - loss: 1.3588 - regression_loss: 1.1483 - classification_loss: 0.2105 400/500 [=======================>......] - ETA: 24s - loss: 1.3583 - regression_loss: 1.1480 - classification_loss: 0.2102 401/500 [=======================>......] - ETA: 23s - loss: 1.3573 - regression_loss: 1.1473 - classification_loss: 0.2099 402/500 [=======================>......] - ETA: 23s - loss: 1.3568 - regression_loss: 1.1470 - classification_loss: 0.2098 403/500 [=======================>......] - ETA: 23s - loss: 1.3558 - regression_loss: 1.1463 - classification_loss: 0.2095 404/500 [=======================>......] - ETA: 23s - loss: 1.3565 - regression_loss: 1.1467 - classification_loss: 0.2098 405/500 [=======================>......] - ETA: 22s - loss: 1.3547 - regression_loss: 1.1452 - classification_loss: 0.2095 406/500 [=======================>......] - ETA: 22s - loss: 1.3538 - regression_loss: 1.1446 - classification_loss: 0.2093 407/500 [=======================>......] - ETA: 22s - loss: 1.3544 - regression_loss: 1.1449 - classification_loss: 0.2095 408/500 [=======================>......] - ETA: 22s - loss: 1.3537 - regression_loss: 1.1444 - classification_loss: 0.2093 409/500 [=======================>......] 
[per-step progress output for steps 409-499 of epoch 83 omitted]
500/500 [==============================] - 119s 238ms/step - loss: 1.3594 - regression_loss: 1.1456 - classification_loss: 0.2138
326 instances of class plum with average precision: 0.8072
mAP: 0.8072
Epoch 00083: saving model to ./training/snapshots/resnet50_pascal_83.h5
Epoch 84/150
[per-step progress output for steps 1-244 of epoch 84 omitted]
- ETA: 55s - loss: 1.3917 - regression_loss: 1.1732 - classification_loss: 0.2185 245/500 [=============>................] - ETA: 55s - loss: 1.3920 - regression_loss: 1.1739 - classification_loss: 0.2182 246/500 [=============>................] - ETA: 55s - loss: 1.3980 - regression_loss: 1.1785 - classification_loss: 0.2196 247/500 [=============>................] - ETA: 55s - loss: 1.3956 - regression_loss: 1.1766 - classification_loss: 0.2190 248/500 [=============>................] - ETA: 54s - loss: 1.3948 - regression_loss: 1.1762 - classification_loss: 0.2187 249/500 [=============>................] - ETA: 54s - loss: 1.3960 - regression_loss: 1.1769 - classification_loss: 0.2191 250/500 [==============>...............] - ETA: 54s - loss: 1.3982 - regression_loss: 1.1786 - classification_loss: 0.2196 251/500 [==============>...............] - ETA: 54s - loss: 1.3943 - regression_loss: 1.1754 - classification_loss: 0.2189 252/500 [==============>...............] - ETA: 53s - loss: 1.3915 - regression_loss: 1.1730 - classification_loss: 0.2185 253/500 [==============>...............] - ETA: 53s - loss: 1.3923 - regression_loss: 1.1739 - classification_loss: 0.2184 254/500 [==============>...............] - ETA: 53s - loss: 1.3897 - regression_loss: 1.1719 - classification_loss: 0.2178 255/500 [==============>...............] - ETA: 53s - loss: 1.3884 - regression_loss: 1.1709 - classification_loss: 0.2175 256/500 [==============>...............] - ETA: 53s - loss: 1.3949 - regression_loss: 1.1759 - classification_loss: 0.2189 257/500 [==============>...............] - ETA: 52s - loss: 1.3931 - regression_loss: 1.1745 - classification_loss: 0.2186 258/500 [==============>...............] - ETA: 52s - loss: 1.3935 - regression_loss: 1.1750 - classification_loss: 0.2184 259/500 [==============>...............] - ETA: 52s - loss: 1.3934 - regression_loss: 1.1750 - classification_loss: 0.2185 260/500 [==============>...............] 
- ETA: 52s - loss: 1.3951 - regression_loss: 1.1761 - classification_loss: 0.2189 261/500 [==============>...............] - ETA: 51s - loss: 1.3942 - regression_loss: 1.1756 - classification_loss: 0.2186 262/500 [==============>...............] - ETA: 51s - loss: 1.3966 - regression_loss: 1.1775 - classification_loss: 0.2191 263/500 [==============>...............] - ETA: 51s - loss: 1.3965 - regression_loss: 1.1776 - classification_loss: 0.2189 264/500 [==============>...............] - ETA: 51s - loss: 1.4000 - regression_loss: 1.1801 - classification_loss: 0.2199 265/500 [==============>...............] - ETA: 51s - loss: 1.4027 - regression_loss: 1.1818 - classification_loss: 0.2209 266/500 [==============>...............] - ETA: 50s - loss: 1.4001 - regression_loss: 1.1798 - classification_loss: 0.2204 267/500 [===============>..............] - ETA: 50s - loss: 1.4012 - regression_loss: 1.1807 - classification_loss: 0.2205 268/500 [===============>..............] - ETA: 50s - loss: 1.3998 - regression_loss: 1.1796 - classification_loss: 0.2202 269/500 [===============>..............] - ETA: 50s - loss: 1.3994 - regression_loss: 1.1794 - classification_loss: 0.2200 270/500 [===============>..............] - ETA: 49s - loss: 1.3993 - regression_loss: 1.1795 - classification_loss: 0.2198 271/500 [===============>..............] - ETA: 49s - loss: 1.3956 - regression_loss: 1.1764 - classification_loss: 0.2192 272/500 [===============>..............] - ETA: 49s - loss: 1.3936 - regression_loss: 1.1747 - classification_loss: 0.2189 273/500 [===============>..............] - ETA: 49s - loss: 1.3966 - regression_loss: 1.1774 - classification_loss: 0.2192 274/500 [===============>..............] - ETA: 49s - loss: 1.3972 - regression_loss: 1.1781 - classification_loss: 0.2190 275/500 [===============>..............] - ETA: 48s - loss: 1.3982 - regression_loss: 1.1789 - classification_loss: 0.2193 276/500 [===============>..............] 
- ETA: 48s - loss: 1.3988 - regression_loss: 1.1792 - classification_loss: 0.2195 277/500 [===============>..............] - ETA: 48s - loss: 1.3969 - regression_loss: 1.1778 - classification_loss: 0.2191 278/500 [===============>..............] - ETA: 48s - loss: 1.3967 - regression_loss: 1.1775 - classification_loss: 0.2192 279/500 [===============>..............] - ETA: 48s - loss: 1.3971 - regression_loss: 1.1778 - classification_loss: 0.2193 280/500 [===============>..............] - ETA: 47s - loss: 1.3982 - regression_loss: 1.1785 - classification_loss: 0.2198 281/500 [===============>..............] - ETA: 47s - loss: 1.4008 - regression_loss: 1.1801 - classification_loss: 0.2207 282/500 [===============>..............] - ETA: 47s - loss: 1.4020 - regression_loss: 1.1811 - classification_loss: 0.2209 283/500 [===============>..............] - ETA: 47s - loss: 1.4010 - regression_loss: 1.1803 - classification_loss: 0.2207 284/500 [================>.............] - ETA: 47s - loss: 1.3978 - regression_loss: 1.1777 - classification_loss: 0.2201 285/500 [================>.............] - ETA: 46s - loss: 1.3972 - regression_loss: 1.1774 - classification_loss: 0.2198 286/500 [================>.............] - ETA: 46s - loss: 1.3979 - regression_loss: 1.1780 - classification_loss: 0.2199 287/500 [================>.............] - ETA: 46s - loss: 1.3959 - regression_loss: 1.1764 - classification_loss: 0.2195 288/500 [================>.............] - ETA: 46s - loss: 1.3986 - regression_loss: 1.1787 - classification_loss: 0.2199 289/500 [================>.............] - ETA: 45s - loss: 1.3999 - regression_loss: 1.1800 - classification_loss: 0.2200 290/500 [================>.............] - ETA: 45s - loss: 1.3991 - regression_loss: 1.1791 - classification_loss: 0.2200 291/500 [================>.............] - ETA: 45s - loss: 1.3986 - regression_loss: 1.1789 - classification_loss: 0.2197 292/500 [================>.............] 
- ETA: 45s - loss: 1.3971 - regression_loss: 1.1777 - classification_loss: 0.2194 293/500 [================>.............] - ETA: 45s - loss: 1.3981 - regression_loss: 1.1787 - classification_loss: 0.2194 294/500 [================>.............] - ETA: 44s - loss: 1.3982 - regression_loss: 1.1789 - classification_loss: 0.2193 295/500 [================>.............] - ETA: 44s - loss: 1.3972 - regression_loss: 1.1780 - classification_loss: 0.2191 296/500 [================>.............] - ETA: 44s - loss: 1.3987 - regression_loss: 1.1794 - classification_loss: 0.2192 297/500 [================>.............] - ETA: 44s - loss: 1.3979 - regression_loss: 1.1784 - classification_loss: 0.2195 298/500 [================>.............] - ETA: 43s - loss: 1.3966 - regression_loss: 1.1776 - classification_loss: 0.2190 299/500 [================>.............] - ETA: 43s - loss: 1.3955 - regression_loss: 1.1768 - classification_loss: 0.2187 300/500 [=================>............] - ETA: 43s - loss: 1.3937 - regression_loss: 1.1754 - classification_loss: 0.2183 301/500 [=================>............] - ETA: 43s - loss: 1.3936 - regression_loss: 1.1754 - classification_loss: 0.2182 302/500 [=================>............] - ETA: 43s - loss: 1.3914 - regression_loss: 1.1737 - classification_loss: 0.2178 303/500 [=================>............] - ETA: 42s - loss: 1.3914 - regression_loss: 1.1736 - classification_loss: 0.2178 304/500 [=================>............] - ETA: 42s - loss: 1.3902 - regression_loss: 1.1726 - classification_loss: 0.2176 305/500 [=================>............] - ETA: 42s - loss: 1.3883 - regression_loss: 1.1712 - classification_loss: 0.2172 306/500 [=================>............] - ETA: 42s - loss: 1.3880 - regression_loss: 1.1709 - classification_loss: 0.2171 307/500 [=================>............] - ETA: 41s - loss: 1.3868 - regression_loss: 1.1701 - classification_loss: 0.2167 308/500 [=================>............] 
- ETA: 41s - loss: 1.3850 - regression_loss: 1.1686 - classification_loss: 0.2164 309/500 [=================>............] - ETA: 41s - loss: 1.3868 - regression_loss: 1.1701 - classification_loss: 0.2168 310/500 [=================>............] - ETA: 41s - loss: 1.3862 - regression_loss: 1.1695 - classification_loss: 0.2167 311/500 [=================>............] - ETA: 41s - loss: 1.3859 - regression_loss: 1.1691 - classification_loss: 0.2168 312/500 [=================>............] - ETA: 40s - loss: 1.3865 - regression_loss: 1.1688 - classification_loss: 0.2177 313/500 [=================>............] - ETA: 40s - loss: 1.3883 - regression_loss: 1.1700 - classification_loss: 0.2182 314/500 [=================>............] - ETA: 40s - loss: 1.3892 - regression_loss: 1.1707 - classification_loss: 0.2186 315/500 [=================>............] - ETA: 40s - loss: 1.3898 - regression_loss: 1.1711 - classification_loss: 0.2186 316/500 [=================>............] - ETA: 40s - loss: 1.3903 - regression_loss: 1.1715 - classification_loss: 0.2187 317/500 [==================>...........] - ETA: 39s - loss: 1.3894 - regression_loss: 1.1710 - classification_loss: 0.2184 318/500 [==================>...........] - ETA: 39s - loss: 1.3875 - regression_loss: 1.1694 - classification_loss: 0.2181 319/500 [==================>...........] - ETA: 39s - loss: 1.3902 - regression_loss: 1.1715 - classification_loss: 0.2187 320/500 [==================>...........] - ETA: 39s - loss: 1.3889 - regression_loss: 1.1705 - classification_loss: 0.2184 321/500 [==================>...........] - ETA: 38s - loss: 1.3866 - regression_loss: 1.1685 - classification_loss: 0.2181 322/500 [==================>...........] - ETA: 38s - loss: 1.3853 - regression_loss: 1.1672 - classification_loss: 0.2181 323/500 [==================>...........] - ETA: 38s - loss: 1.3850 - regression_loss: 1.1671 - classification_loss: 0.2179 324/500 [==================>...........] 
- ETA: 38s - loss: 1.3854 - regression_loss: 1.1673 - classification_loss: 0.2181 325/500 [==================>...........] - ETA: 38s - loss: 1.3853 - regression_loss: 1.1672 - classification_loss: 0.2181 326/500 [==================>...........] - ETA: 37s - loss: 1.3850 - regression_loss: 1.1671 - classification_loss: 0.2179 327/500 [==================>...........] - ETA: 37s - loss: 1.3866 - regression_loss: 1.1684 - classification_loss: 0.2181 328/500 [==================>...........] - ETA: 37s - loss: 1.3852 - regression_loss: 1.1672 - classification_loss: 0.2180 329/500 [==================>...........] - ETA: 37s - loss: 1.3859 - regression_loss: 1.1680 - classification_loss: 0.2179 330/500 [==================>...........] - ETA: 37s - loss: 1.3864 - regression_loss: 1.1682 - classification_loss: 0.2181 331/500 [==================>...........] - ETA: 36s - loss: 1.3867 - regression_loss: 1.1688 - classification_loss: 0.2179 332/500 [==================>...........] - ETA: 36s - loss: 1.3835 - regression_loss: 1.1661 - classification_loss: 0.2174 333/500 [==================>...........] - ETA: 36s - loss: 1.3824 - regression_loss: 1.1653 - classification_loss: 0.2171 334/500 [===================>..........] - ETA: 36s - loss: 1.3811 - regression_loss: 1.1641 - classification_loss: 0.2171 335/500 [===================>..........] - ETA: 35s - loss: 1.3799 - regression_loss: 1.1631 - classification_loss: 0.2168 336/500 [===================>..........] - ETA: 35s - loss: 1.3795 - regression_loss: 1.1629 - classification_loss: 0.2166 337/500 [===================>..........] - ETA: 35s - loss: 1.3790 - regression_loss: 1.1627 - classification_loss: 0.2163 338/500 [===================>..........] - ETA: 35s - loss: 1.3780 - regression_loss: 1.1619 - classification_loss: 0.2161 339/500 [===================>..........] - ETA: 35s - loss: 1.3769 - regression_loss: 1.1611 - classification_loss: 0.2158 340/500 [===================>..........] 
- ETA: 34s - loss: 1.3766 - regression_loss: 1.1609 - classification_loss: 0.2157 341/500 [===================>..........] - ETA: 34s - loss: 1.3757 - regression_loss: 1.1602 - classification_loss: 0.2154 342/500 [===================>..........] - ETA: 34s - loss: 1.3757 - regression_loss: 1.1604 - classification_loss: 0.2153 343/500 [===================>..........] - ETA: 34s - loss: 1.3753 - regression_loss: 1.1602 - classification_loss: 0.2151 344/500 [===================>..........] - ETA: 34s - loss: 1.3747 - regression_loss: 1.1597 - classification_loss: 0.2150 345/500 [===================>..........] - ETA: 33s - loss: 1.3750 - regression_loss: 1.1602 - classification_loss: 0.2148 346/500 [===================>..........] - ETA: 33s - loss: 1.3755 - regression_loss: 1.1608 - classification_loss: 0.2147 347/500 [===================>..........] - ETA: 33s - loss: 1.3763 - regression_loss: 1.1617 - classification_loss: 0.2146 348/500 [===================>..........] - ETA: 33s - loss: 1.3804 - regression_loss: 1.1648 - classification_loss: 0.2156 349/500 [===================>..........] - ETA: 32s - loss: 1.3814 - regression_loss: 1.1655 - classification_loss: 0.2159 350/500 [====================>.........] - ETA: 32s - loss: 1.3803 - regression_loss: 1.1646 - classification_loss: 0.2157 351/500 [====================>.........] - ETA: 32s - loss: 1.3791 - regression_loss: 1.1636 - classification_loss: 0.2155 352/500 [====================>.........] - ETA: 32s - loss: 1.3772 - regression_loss: 1.1621 - classification_loss: 0.2151 353/500 [====================>.........] - ETA: 32s - loss: 1.3780 - regression_loss: 1.1629 - classification_loss: 0.2151 354/500 [====================>.........] - ETA: 31s - loss: 1.3758 - regression_loss: 1.1611 - classification_loss: 0.2147 355/500 [====================>.........] - ETA: 31s - loss: 1.3748 - regression_loss: 1.1604 - classification_loss: 0.2144 356/500 [====================>.........] 
- ETA: 31s - loss: 1.3769 - regression_loss: 1.1619 - classification_loss: 0.2149 357/500 [====================>.........] - ETA: 31s - loss: 1.3793 - regression_loss: 1.1636 - classification_loss: 0.2157 358/500 [====================>.........] - ETA: 30s - loss: 1.3782 - regression_loss: 1.1628 - classification_loss: 0.2154 359/500 [====================>.........] - ETA: 30s - loss: 1.3777 - regression_loss: 1.1625 - classification_loss: 0.2152 360/500 [====================>.........] - ETA: 30s - loss: 1.3774 - regression_loss: 1.1624 - classification_loss: 0.2150 361/500 [====================>.........] - ETA: 30s - loss: 1.3770 - regression_loss: 1.1621 - classification_loss: 0.2148 362/500 [====================>.........] - ETA: 30s - loss: 1.3776 - regression_loss: 1.1626 - classification_loss: 0.2150 363/500 [====================>.........] - ETA: 29s - loss: 1.3768 - regression_loss: 1.1619 - classification_loss: 0.2149 364/500 [====================>.........] - ETA: 29s - loss: 1.3769 - regression_loss: 1.1622 - classification_loss: 0.2148 365/500 [====================>.........] - ETA: 29s - loss: 1.3783 - regression_loss: 1.1633 - classification_loss: 0.2151 366/500 [====================>.........] - ETA: 29s - loss: 1.3779 - regression_loss: 1.1628 - classification_loss: 0.2151 367/500 [=====================>........] - ETA: 28s - loss: 1.3879 - regression_loss: 1.1684 - classification_loss: 0.2196 368/500 [=====================>........] - ETA: 28s - loss: 1.3887 - regression_loss: 1.1691 - classification_loss: 0.2196 369/500 [=====================>........] - ETA: 28s - loss: 1.3899 - regression_loss: 1.1701 - classification_loss: 0.2198 370/500 [=====================>........] - ETA: 28s - loss: 1.3885 - regression_loss: 1.1690 - classification_loss: 0.2194 371/500 [=====================>........] - ETA: 28s - loss: 1.3871 - regression_loss: 1.1679 - classification_loss: 0.2191 372/500 [=====================>........] 
- ETA: 27s - loss: 1.3866 - regression_loss: 1.1676 - classification_loss: 0.2190 373/500 [=====================>........] - ETA: 27s - loss: 1.3852 - regression_loss: 1.1666 - classification_loss: 0.2186 374/500 [=====================>........] - ETA: 27s - loss: 1.3856 - regression_loss: 1.1670 - classification_loss: 0.2186 375/500 [=====================>........] - ETA: 27s - loss: 1.3853 - regression_loss: 1.1668 - classification_loss: 0.2185 376/500 [=====================>........] - ETA: 27s - loss: 1.3863 - regression_loss: 1.1676 - classification_loss: 0.2186 377/500 [=====================>........] - ETA: 26s - loss: 1.3864 - regression_loss: 1.1679 - classification_loss: 0.2185 378/500 [=====================>........] - ETA: 26s - loss: 1.3894 - regression_loss: 1.1700 - classification_loss: 0.2194 379/500 [=====================>........] - ETA: 26s - loss: 1.3899 - regression_loss: 1.1704 - classification_loss: 0.2195 380/500 [=====================>........] - ETA: 26s - loss: 1.3906 - regression_loss: 1.1710 - classification_loss: 0.2197 381/500 [=====================>........] - ETA: 25s - loss: 1.3904 - regression_loss: 1.1708 - classification_loss: 0.2196 382/500 [=====================>........] - ETA: 25s - loss: 1.3898 - regression_loss: 1.1704 - classification_loss: 0.2194 383/500 [=====================>........] - ETA: 25s - loss: 1.3896 - regression_loss: 1.1703 - classification_loss: 0.2193 384/500 [======================>.......] - ETA: 25s - loss: 1.3891 - regression_loss: 1.1699 - classification_loss: 0.2192 385/500 [======================>.......] - ETA: 25s - loss: 1.3894 - regression_loss: 1.1704 - classification_loss: 0.2190 386/500 [======================>.......] - ETA: 24s - loss: 1.3890 - regression_loss: 1.1700 - classification_loss: 0.2189 387/500 [======================>.......] - ETA: 24s - loss: 1.3888 - regression_loss: 1.1700 - classification_loss: 0.2188 388/500 [======================>.......] 
- ETA: 24s - loss: 1.3898 - regression_loss: 1.1708 - classification_loss: 0.2190 389/500 [======================>.......] - ETA: 24s - loss: 1.3894 - regression_loss: 1.1707 - classification_loss: 0.2188 390/500 [======================>.......] - ETA: 23s - loss: 1.3890 - regression_loss: 1.1705 - classification_loss: 0.2186 391/500 [======================>.......] - ETA: 23s - loss: 1.3867 - regression_loss: 1.1686 - classification_loss: 0.2182 392/500 [======================>.......] - ETA: 23s - loss: 1.3875 - regression_loss: 1.1693 - classification_loss: 0.2182 393/500 [======================>.......] - ETA: 23s - loss: 1.3873 - regression_loss: 1.1691 - classification_loss: 0.2182 394/500 [======================>.......] - ETA: 23s - loss: 1.3856 - regression_loss: 1.1677 - classification_loss: 0.2179 395/500 [======================>.......] - ETA: 22s - loss: 1.3866 - regression_loss: 1.1685 - classification_loss: 0.2181 396/500 [======================>.......] - ETA: 22s - loss: 1.3867 - regression_loss: 1.1687 - classification_loss: 0.2180 397/500 [======================>.......] - ETA: 22s - loss: 1.3879 - regression_loss: 1.1697 - classification_loss: 0.2181 398/500 [======================>.......] - ETA: 22s - loss: 1.3872 - regression_loss: 1.1692 - classification_loss: 0.2180 399/500 [======================>.......] - ETA: 22s - loss: 1.3868 - regression_loss: 1.1691 - classification_loss: 0.2177 400/500 [=======================>......] - ETA: 21s - loss: 1.3856 - regression_loss: 1.1681 - classification_loss: 0.2175 401/500 [=======================>......] - ETA: 21s - loss: 1.3890 - regression_loss: 1.1709 - classification_loss: 0.2181 402/500 [=======================>......] - ETA: 21s - loss: 1.3882 - regression_loss: 1.1702 - classification_loss: 0.2179 403/500 [=======================>......] - ETA: 21s - loss: 1.3885 - regression_loss: 1.1705 - classification_loss: 0.2179 404/500 [=======================>......] 
- ETA: 20s - loss: 1.3869 - regression_loss: 1.1693 - classification_loss: 0.2176 405/500 [=======================>......] - ETA: 20s - loss: 1.3891 - regression_loss: 1.1715 - classification_loss: 0.2177 406/500 [=======================>......] - ETA: 20s - loss: 1.3882 - regression_loss: 1.1707 - classification_loss: 0.2175 407/500 [=======================>......] - ETA: 20s - loss: 1.3886 - regression_loss: 1.1709 - classification_loss: 0.2177 408/500 [=======================>......] - ETA: 20s - loss: 1.3911 - regression_loss: 1.1733 - classification_loss: 0.2178 409/500 [=======================>......] - ETA: 19s - loss: 1.3916 - regression_loss: 1.1736 - classification_loss: 0.2180 410/500 [=======================>......] - ETA: 19s - loss: 1.3929 - regression_loss: 1.1747 - classification_loss: 0.2182 411/500 [=======================>......] - ETA: 19s - loss: 1.3920 - regression_loss: 1.1740 - classification_loss: 0.2180 412/500 [=======================>......] - ETA: 19s - loss: 1.3914 - regression_loss: 1.1735 - classification_loss: 0.2178 413/500 [=======================>......] - ETA: 18s - loss: 1.3929 - regression_loss: 1.1748 - classification_loss: 0.2182 414/500 [=======================>......] - ETA: 18s - loss: 1.3932 - regression_loss: 1.1747 - classification_loss: 0.2186 415/500 [=======================>......] - ETA: 18s - loss: 1.3938 - regression_loss: 1.1753 - classification_loss: 0.2185 416/500 [=======================>......] - ETA: 18s - loss: 1.3949 - regression_loss: 1.1761 - classification_loss: 0.2188 417/500 [========================>.....] - ETA: 18s - loss: 1.3961 - regression_loss: 1.1770 - classification_loss: 0.2191 418/500 [========================>.....] - ETA: 17s - loss: 1.3958 - regression_loss: 1.1767 - classification_loss: 0.2191 419/500 [========================>.....] - ETA: 17s - loss: 1.3958 - regression_loss: 1.1768 - classification_loss: 0.2190 420/500 [========================>.....] 
- ETA: 17s - loss: 1.3956 - regression_loss: 1.1767 - classification_loss: 0.2189 421/500 [========================>.....] - ETA: 17s - loss: 1.3964 - regression_loss: 1.1775 - classification_loss: 0.2189 422/500 [========================>.....] - ETA: 16s - loss: 1.3954 - regression_loss: 1.1767 - classification_loss: 0.2187 423/500 [========================>.....] - ETA: 16s - loss: 1.3964 - regression_loss: 1.1776 - classification_loss: 0.2188 424/500 [========================>.....] - ETA: 16s - loss: 1.3970 - regression_loss: 1.1780 - classification_loss: 0.2190 425/500 [========================>.....] - ETA: 16s - loss: 1.3947 - regression_loss: 1.1761 - classification_loss: 0.2186 426/500 [========================>.....] - ETA: 16s - loss: 1.3945 - regression_loss: 1.1760 - classification_loss: 0.2185 427/500 [========================>.....] - ETA: 15s - loss: 1.3932 - regression_loss: 1.1749 - classification_loss: 0.2183 428/500 [========================>.....] - ETA: 15s - loss: 1.3934 - regression_loss: 1.1751 - classification_loss: 0.2184 429/500 [========================>.....] - ETA: 15s - loss: 1.3932 - regression_loss: 1.1750 - classification_loss: 0.2182 430/500 [========================>.....] - ETA: 15s - loss: 1.3957 - regression_loss: 1.1769 - classification_loss: 0.2188 431/500 [========================>.....] - ETA: 15s - loss: 1.3949 - regression_loss: 1.1760 - classification_loss: 0.2189 432/500 [========================>.....] - ETA: 14s - loss: 1.3943 - regression_loss: 1.1756 - classification_loss: 0.2186 433/500 [========================>.....] - ETA: 14s - loss: 1.3953 - regression_loss: 1.1759 - classification_loss: 0.2194 434/500 [=========================>....] - ETA: 14s - loss: 1.3944 - regression_loss: 1.1751 - classification_loss: 0.2192 435/500 [=========================>....] - ETA: 14s - loss: 1.3948 - regression_loss: 1.1753 - classification_loss: 0.2195 436/500 [=========================>....] 
- ETA: 13s - loss: 1.3941 - regression_loss: 1.1748 - classification_loss: 0.2193 437/500 [=========================>....] - ETA: 13s - loss: 1.3922 - regression_loss: 1.1731 - classification_loss: 0.2191 438/500 [=========================>....] - ETA: 13s - loss: 1.3917 - regression_loss: 1.1728 - classification_loss: 0.2188 439/500 [=========================>....] - ETA: 13s - loss: 1.3913 - regression_loss: 1.1725 - classification_loss: 0.2189 440/500 [=========================>....] - ETA: 13s - loss: 1.3902 - regression_loss: 1.1715 - classification_loss: 0.2187 441/500 [=========================>....] - ETA: 12s - loss: 1.3914 - regression_loss: 1.1724 - classification_loss: 0.2190 442/500 [=========================>....] - ETA: 12s - loss: 1.3910 - regression_loss: 1.1722 - classification_loss: 0.2188 443/500 [=========================>....] - ETA: 12s - loss: 1.3891 - regression_loss: 1.1707 - classification_loss: 0.2185 444/500 [=========================>....] - ETA: 12s - loss: 1.3891 - regression_loss: 1.1706 - classification_loss: 0.2185 445/500 [=========================>....] - ETA: 12s - loss: 1.3884 - regression_loss: 1.1697 - classification_loss: 0.2187 446/500 [=========================>....] - ETA: 11s - loss: 1.3894 - regression_loss: 1.1706 - classification_loss: 0.2188 447/500 [=========================>....] - ETA: 11s - loss: 1.3908 - regression_loss: 1.1719 - classification_loss: 0.2188 448/500 [=========================>....] - ETA: 11s - loss: 1.3906 - regression_loss: 1.1718 - classification_loss: 0.2188 449/500 [=========================>....] - ETA: 11s - loss: 1.3904 - regression_loss: 1.1716 - classification_loss: 0.2189 450/500 [==========================>...] - ETA: 10s - loss: 1.3887 - regression_loss: 1.1702 - classification_loss: 0.2185 451/500 [==========================>...] - ETA: 10s - loss: 1.3883 - regression_loss: 1.1699 - classification_loss: 0.2183 452/500 [==========================>...] 
- ETA: 10s - loss: 1.3880 - regression_loss: 1.1698 - classification_loss: 0.2182
500/500 [==============================] - 109s 218ms/step - loss: 1.3878 - regression_loss: 1.1708 - classification_loss: 0.2170
326 instances of class plum with average precision: 0.8128
mAP: 0.8128
Epoch 00084: saving model to ./training/snapshots/resnet50_pascal_84.h5
Epoch 85/150
  1/500 [..............................] - ETA: 2:00 - loss: 2.0820 - regression_loss: 1.7810 - classification_loss: 0.3010
 14/500 [..............................]
- ETA: 1:44 - loss: 1.3104 - regression_loss: 1.1488 - classification_loss: 0.1615
286/500 [================>.............]
- ETA: 46s - loss: 1.4043 - regression_loss: 1.1859 - classification_loss: 0.2184 287/500 [================>.............] - ETA: 46s - loss: 1.4038 - regression_loss: 1.1857 - classification_loss: 0.2182 288/500 [================>.............] - ETA: 46s - loss: 1.4048 - regression_loss: 1.1868 - classification_loss: 0.2179 289/500 [================>.............] - ETA: 46s - loss: 1.4015 - regression_loss: 1.1841 - classification_loss: 0.2174 290/500 [================>.............] - ETA: 46s - loss: 1.4000 - regression_loss: 1.1827 - classification_loss: 0.2173 291/500 [================>.............] - ETA: 45s - loss: 1.3985 - regression_loss: 1.1815 - classification_loss: 0.2170 292/500 [================>.............] - ETA: 45s - loss: 1.3995 - regression_loss: 1.1824 - classification_loss: 0.2171 293/500 [================>.............] - ETA: 45s - loss: 1.4000 - regression_loss: 1.1826 - classification_loss: 0.2174 294/500 [================>.............] - ETA: 45s - loss: 1.4003 - regression_loss: 1.1830 - classification_loss: 0.2172 295/500 [================>.............] - ETA: 44s - loss: 1.3993 - regression_loss: 1.1823 - classification_loss: 0.2171 296/500 [================>.............] - ETA: 44s - loss: 1.3975 - regression_loss: 1.1809 - classification_loss: 0.2166 297/500 [================>.............] - ETA: 44s - loss: 1.3967 - regression_loss: 1.1800 - classification_loss: 0.2166 298/500 [================>.............] - ETA: 44s - loss: 1.3959 - regression_loss: 1.1795 - classification_loss: 0.2164 299/500 [================>.............] - ETA: 44s - loss: 1.3928 - regression_loss: 1.1769 - classification_loss: 0.2159 300/500 [=================>............] - ETA: 43s - loss: 1.3950 - regression_loss: 1.1784 - classification_loss: 0.2166 301/500 [=================>............] - ETA: 43s - loss: 1.3959 - regression_loss: 1.1792 - classification_loss: 0.2167 302/500 [=================>............] 
- ETA: 43s - loss: 1.3962 - regression_loss: 1.1793 - classification_loss: 0.2169 303/500 [=================>............] - ETA: 43s - loss: 1.3948 - regression_loss: 1.1781 - classification_loss: 0.2167 304/500 [=================>............] - ETA: 42s - loss: 1.3955 - regression_loss: 1.1786 - classification_loss: 0.2169 305/500 [=================>............] - ETA: 42s - loss: 1.3955 - regression_loss: 1.1788 - classification_loss: 0.2167 306/500 [=================>............] - ETA: 42s - loss: 1.3962 - regression_loss: 1.1793 - classification_loss: 0.2168 307/500 [=================>............] - ETA: 42s - loss: 1.3956 - regression_loss: 1.1788 - classification_loss: 0.2168 308/500 [=================>............] - ETA: 42s - loss: 1.3955 - regression_loss: 1.1788 - classification_loss: 0.2167 309/500 [=================>............] - ETA: 41s - loss: 1.3951 - regression_loss: 1.1787 - classification_loss: 0.2164 310/500 [=================>............] - ETA: 41s - loss: 1.3948 - regression_loss: 1.1783 - classification_loss: 0.2165 311/500 [=================>............] - ETA: 41s - loss: 1.3937 - regression_loss: 1.1775 - classification_loss: 0.2162 312/500 [=================>............] - ETA: 41s - loss: 1.3947 - regression_loss: 1.1785 - classification_loss: 0.2162 313/500 [=================>............] - ETA: 40s - loss: 1.3953 - regression_loss: 1.1791 - classification_loss: 0.2163 314/500 [=================>............] - ETA: 40s - loss: 1.3974 - regression_loss: 1.1806 - classification_loss: 0.2168 315/500 [=================>............] - ETA: 40s - loss: 1.3976 - regression_loss: 1.1812 - classification_loss: 0.2165 316/500 [=================>............] - ETA: 40s - loss: 1.3967 - regression_loss: 1.1804 - classification_loss: 0.2163 317/500 [==================>...........] - ETA: 40s - loss: 1.3962 - regression_loss: 1.1800 - classification_loss: 0.2162 318/500 [==================>...........] 
- ETA: 39s - loss: 1.3967 - regression_loss: 1.1803 - classification_loss: 0.2164 319/500 [==================>...........] - ETA: 39s - loss: 1.3985 - regression_loss: 1.1818 - classification_loss: 0.2167 320/500 [==================>...........] - ETA: 39s - loss: 1.3998 - regression_loss: 1.1829 - classification_loss: 0.2169 321/500 [==================>...........] - ETA: 39s - loss: 1.4008 - regression_loss: 1.1839 - classification_loss: 0.2169 322/500 [==================>...........] - ETA: 39s - loss: 1.4012 - regression_loss: 1.1842 - classification_loss: 0.2170 323/500 [==================>...........] - ETA: 38s - loss: 1.4005 - regression_loss: 1.1836 - classification_loss: 0.2169 324/500 [==================>...........] - ETA: 38s - loss: 1.4025 - regression_loss: 1.1851 - classification_loss: 0.2174 325/500 [==================>...........] - ETA: 38s - loss: 1.4044 - regression_loss: 1.1865 - classification_loss: 0.2179 326/500 [==================>...........] - ETA: 38s - loss: 1.4027 - regression_loss: 1.1852 - classification_loss: 0.2175 327/500 [==================>...........] - ETA: 38s - loss: 1.4032 - regression_loss: 1.1858 - classification_loss: 0.2175 328/500 [==================>...........] - ETA: 37s - loss: 1.4001 - regression_loss: 1.1832 - classification_loss: 0.2169 329/500 [==================>...........] - ETA: 37s - loss: 1.4002 - regression_loss: 1.1834 - classification_loss: 0.2169 330/500 [==================>...........] - ETA: 37s - loss: 1.4014 - regression_loss: 1.1843 - classification_loss: 0.2171 331/500 [==================>...........] - ETA: 37s - loss: 1.4003 - regression_loss: 1.1834 - classification_loss: 0.2169 332/500 [==================>...........] - ETA: 36s - loss: 1.3986 - regression_loss: 1.1819 - classification_loss: 0.2167 333/500 [==================>...........] - ETA: 36s - loss: 1.3957 - regression_loss: 1.1795 - classification_loss: 0.2162 334/500 [===================>..........] 
- ETA: 36s - loss: 1.3947 - regression_loss: 1.1788 - classification_loss: 0.2159 335/500 [===================>..........] - ETA: 36s - loss: 1.3947 - regression_loss: 1.1789 - classification_loss: 0.2158 336/500 [===================>..........] - ETA: 36s - loss: 1.3935 - regression_loss: 1.1780 - classification_loss: 0.2155 337/500 [===================>..........] - ETA: 35s - loss: 1.3940 - regression_loss: 1.1785 - classification_loss: 0.2155 338/500 [===================>..........] - ETA: 35s - loss: 1.3952 - regression_loss: 1.1792 - classification_loss: 0.2160 339/500 [===================>..........] - ETA: 35s - loss: 1.3965 - regression_loss: 1.1802 - classification_loss: 0.2163 340/500 [===================>..........] - ETA: 35s - loss: 1.4004 - regression_loss: 1.1833 - classification_loss: 0.2170 341/500 [===================>..........] - ETA: 34s - loss: 1.3987 - regression_loss: 1.1819 - classification_loss: 0.2168 342/500 [===================>..........] - ETA: 34s - loss: 1.3975 - regression_loss: 1.1811 - classification_loss: 0.2164 343/500 [===================>..........] - ETA: 34s - loss: 1.3973 - regression_loss: 1.1810 - classification_loss: 0.2163 344/500 [===================>..........] - ETA: 34s - loss: 1.3972 - regression_loss: 1.1808 - classification_loss: 0.2164 345/500 [===================>..........] - ETA: 34s - loss: 1.3987 - regression_loss: 1.1821 - classification_loss: 0.2166 346/500 [===================>..........] - ETA: 33s - loss: 1.3987 - regression_loss: 1.1822 - classification_loss: 0.2165 347/500 [===================>..........] - ETA: 33s - loss: 1.3969 - regression_loss: 1.1808 - classification_loss: 0.2161 348/500 [===================>..........] - ETA: 33s - loss: 1.3977 - regression_loss: 1.1812 - classification_loss: 0.2164 349/500 [===================>..........] - ETA: 33s - loss: 1.3953 - regression_loss: 1.1778 - classification_loss: 0.2175 350/500 [====================>.........] 
- ETA: 32s - loss: 1.3958 - regression_loss: 1.1782 - classification_loss: 0.2176 351/500 [====================>.........] - ETA: 32s - loss: 1.3945 - regression_loss: 1.1772 - classification_loss: 0.2173 352/500 [====================>.........] - ETA: 32s - loss: 1.3926 - regression_loss: 1.1757 - classification_loss: 0.2169 353/500 [====================>.........] - ETA: 32s - loss: 1.3936 - regression_loss: 1.1765 - classification_loss: 0.2171 354/500 [====================>.........] - ETA: 32s - loss: 1.3958 - regression_loss: 1.1782 - classification_loss: 0.2176 355/500 [====================>.........] - ETA: 31s - loss: 1.3954 - regression_loss: 1.1779 - classification_loss: 0.2176 356/500 [====================>.........] - ETA: 31s - loss: 1.3943 - regression_loss: 1.1769 - classification_loss: 0.2173 357/500 [====================>.........] - ETA: 31s - loss: 1.3948 - regression_loss: 1.1776 - classification_loss: 0.2172 358/500 [====================>.........] - ETA: 31s - loss: 1.3937 - regression_loss: 1.1767 - classification_loss: 0.2170 359/500 [====================>.........] - ETA: 30s - loss: 1.3928 - regression_loss: 1.1760 - classification_loss: 0.2169 360/500 [====================>.........] - ETA: 30s - loss: 1.3930 - regression_loss: 1.1762 - classification_loss: 0.2169 361/500 [====================>.........] - ETA: 30s - loss: 1.3928 - regression_loss: 1.1761 - classification_loss: 0.2166 362/500 [====================>.........] - ETA: 30s - loss: 1.3897 - regression_loss: 1.1736 - classification_loss: 0.2162 363/500 [====================>.........] - ETA: 30s - loss: 1.3911 - regression_loss: 1.1745 - classification_loss: 0.2165 364/500 [====================>.........] - ETA: 29s - loss: 1.3904 - regression_loss: 1.1741 - classification_loss: 0.2163 365/500 [====================>.........] - ETA: 29s - loss: 1.3902 - regression_loss: 1.1740 - classification_loss: 0.2162 366/500 [====================>.........] 
- ETA: 29s - loss: 1.3892 - regression_loss: 1.1733 - classification_loss: 0.2159 367/500 [=====================>........] - ETA: 29s - loss: 1.3889 - regression_loss: 1.1731 - classification_loss: 0.2158 368/500 [=====================>........] - ETA: 29s - loss: 1.3895 - regression_loss: 1.1737 - classification_loss: 0.2158 369/500 [=====================>........] - ETA: 28s - loss: 1.3892 - regression_loss: 1.1736 - classification_loss: 0.2156 370/500 [=====================>........] - ETA: 28s - loss: 1.3890 - regression_loss: 1.1735 - classification_loss: 0.2155 371/500 [=====================>........] - ETA: 28s - loss: 1.3892 - regression_loss: 1.1737 - classification_loss: 0.2156 372/500 [=====================>........] - ETA: 28s - loss: 1.3870 - regression_loss: 1.1719 - classification_loss: 0.2152 373/500 [=====================>........] - ETA: 27s - loss: 1.3862 - regression_loss: 1.1714 - classification_loss: 0.2148 374/500 [=====================>........] - ETA: 27s - loss: 1.3849 - regression_loss: 1.1704 - classification_loss: 0.2145 375/500 [=====================>........] - ETA: 27s - loss: 1.3829 - regression_loss: 1.1687 - classification_loss: 0.2142 376/500 [=====================>........] - ETA: 27s - loss: 1.3821 - regression_loss: 1.1682 - classification_loss: 0.2140 377/500 [=====================>........] - ETA: 27s - loss: 1.3826 - regression_loss: 1.1684 - classification_loss: 0.2141 378/500 [=====================>........] - ETA: 26s - loss: 1.3821 - regression_loss: 1.1682 - classification_loss: 0.2139 379/500 [=====================>........] - ETA: 26s - loss: 1.3808 - regression_loss: 1.1673 - classification_loss: 0.2136 380/500 [=====================>........] - ETA: 26s - loss: 1.3811 - regression_loss: 1.1677 - classification_loss: 0.2135 381/500 [=====================>........] - ETA: 26s - loss: 1.3813 - regression_loss: 1.1679 - classification_loss: 0.2134 382/500 [=====================>........] 
- ETA: 25s - loss: 1.3805 - regression_loss: 1.1672 - classification_loss: 0.2133 383/500 [=====================>........] - ETA: 25s - loss: 1.3806 - regression_loss: 1.1672 - classification_loss: 0.2133 384/500 [======================>.......] - ETA: 25s - loss: 1.3796 - regression_loss: 1.1665 - classification_loss: 0.2131 385/500 [======================>.......] - ETA: 25s - loss: 1.3780 - regression_loss: 1.1652 - classification_loss: 0.2128 386/500 [======================>.......] - ETA: 25s - loss: 1.3777 - regression_loss: 1.1650 - classification_loss: 0.2127 387/500 [======================>.......] - ETA: 24s - loss: 1.3783 - regression_loss: 1.1654 - classification_loss: 0.2129 388/500 [======================>.......] - ETA: 24s - loss: 1.3778 - regression_loss: 1.1651 - classification_loss: 0.2127 389/500 [======================>.......] - ETA: 24s - loss: 1.3766 - regression_loss: 1.1641 - classification_loss: 0.2125 390/500 [======================>.......] - ETA: 24s - loss: 1.3781 - regression_loss: 1.1653 - classification_loss: 0.2127 391/500 [======================>.......] - ETA: 23s - loss: 1.3779 - regression_loss: 1.1655 - classification_loss: 0.2125 392/500 [======================>.......] - ETA: 23s - loss: 1.3781 - regression_loss: 1.1654 - classification_loss: 0.2127 393/500 [======================>.......] - ETA: 23s - loss: 1.3781 - regression_loss: 1.1655 - classification_loss: 0.2126 394/500 [======================>.......] - ETA: 23s - loss: 1.3775 - regression_loss: 1.1647 - classification_loss: 0.2128 395/500 [======================>.......] - ETA: 23s - loss: 1.3778 - regression_loss: 1.1650 - classification_loss: 0.2128 396/500 [======================>.......] - ETA: 22s - loss: 1.3802 - regression_loss: 1.1669 - classification_loss: 0.2133 397/500 [======================>.......] - ETA: 22s - loss: 1.3793 - regression_loss: 1.1661 - classification_loss: 0.2132 398/500 [======================>.......] 
- ETA: 22s - loss: 1.3785 - regression_loss: 1.1655 - classification_loss: 0.2130 399/500 [======================>.......] - ETA: 22s - loss: 1.3773 - regression_loss: 1.1646 - classification_loss: 0.2127 400/500 [=======================>......] - ETA: 21s - loss: 1.3751 - regression_loss: 1.1626 - classification_loss: 0.2125 401/500 [=======================>......] - ETA: 21s - loss: 1.3739 - regression_loss: 1.1615 - classification_loss: 0.2123 402/500 [=======================>......] - ETA: 21s - loss: 1.3740 - regression_loss: 1.1619 - classification_loss: 0.2121 403/500 [=======================>......] - ETA: 21s - loss: 1.3754 - regression_loss: 1.1631 - classification_loss: 0.2123 404/500 [=======================>......] - ETA: 21s - loss: 1.3755 - regression_loss: 1.1632 - classification_loss: 0.2123 405/500 [=======================>......] - ETA: 20s - loss: 1.3764 - regression_loss: 1.1640 - classification_loss: 0.2124 406/500 [=======================>......] - ETA: 20s - loss: 1.3768 - regression_loss: 1.1642 - classification_loss: 0.2126 407/500 [=======================>......] - ETA: 20s - loss: 1.3763 - regression_loss: 1.1639 - classification_loss: 0.2123 408/500 [=======================>......] - ETA: 20s - loss: 1.3756 - regression_loss: 1.1636 - classification_loss: 0.2120 409/500 [=======================>......] - ETA: 19s - loss: 1.3763 - regression_loss: 1.1641 - classification_loss: 0.2121 410/500 [=======================>......] - ETA: 19s - loss: 1.3762 - regression_loss: 1.1642 - classification_loss: 0.2121 411/500 [=======================>......] - ETA: 19s - loss: 1.3769 - regression_loss: 1.1647 - classification_loss: 0.2122 412/500 [=======================>......] - ETA: 19s - loss: 1.3769 - regression_loss: 1.1646 - classification_loss: 0.2123 413/500 [=======================>......] - ETA: 19s - loss: 1.3768 - regression_loss: 1.1645 - classification_loss: 0.2122 414/500 [=======================>......] 
- ETA: 18s - loss: 1.3799 - regression_loss: 1.1670 - classification_loss: 0.2129 415/500 [=======================>......] - ETA: 18s - loss: 1.3806 - regression_loss: 1.1675 - classification_loss: 0.2130 416/500 [=======================>......] - ETA: 18s - loss: 1.3816 - regression_loss: 1.1683 - classification_loss: 0.2133 417/500 [========================>.....] - ETA: 18s - loss: 1.3818 - regression_loss: 1.1685 - classification_loss: 0.2134 418/500 [========================>.....] - ETA: 17s - loss: 1.3808 - regression_loss: 1.1677 - classification_loss: 0.2131 419/500 [========================>.....] - ETA: 17s - loss: 1.3813 - regression_loss: 1.1680 - classification_loss: 0.2133 420/500 [========================>.....] - ETA: 17s - loss: 1.3805 - regression_loss: 1.1675 - classification_loss: 0.2130 421/500 [========================>.....] - ETA: 17s - loss: 1.3808 - regression_loss: 1.1677 - classification_loss: 0.2130 422/500 [========================>.....] - ETA: 17s - loss: 1.3811 - regression_loss: 1.1680 - classification_loss: 0.2130 423/500 [========================>.....] - ETA: 16s - loss: 1.3807 - regression_loss: 1.1679 - classification_loss: 0.2128 424/500 [========================>.....] - ETA: 16s - loss: 1.3797 - regression_loss: 1.1671 - classification_loss: 0.2126 425/500 [========================>.....] - ETA: 16s - loss: 1.3804 - regression_loss: 1.1675 - classification_loss: 0.2130 426/500 [========================>.....] - ETA: 16s - loss: 1.3783 - regression_loss: 1.1657 - classification_loss: 0.2126 427/500 [========================>.....] - ETA: 16s - loss: 1.3801 - regression_loss: 1.1671 - classification_loss: 0.2130 428/500 [========================>.....] - ETA: 15s - loss: 1.3807 - regression_loss: 1.1678 - classification_loss: 0.2129 429/500 [========================>.....] - ETA: 15s - loss: 1.3808 - regression_loss: 1.1681 - classification_loss: 0.2127 430/500 [========================>.....] 
- ETA: 15s - loss: 1.3800 - regression_loss: 1.1676 - classification_loss: 0.2124 431/500 [========================>.....] - ETA: 15s - loss: 1.3800 - regression_loss: 1.1676 - classification_loss: 0.2125 432/500 [========================>.....] - ETA: 14s - loss: 1.3827 - regression_loss: 1.1692 - classification_loss: 0.2134 433/500 [========================>.....] - ETA: 14s - loss: 1.3818 - regression_loss: 1.1685 - classification_loss: 0.2133 434/500 [=========================>....] - ETA: 14s - loss: 1.3810 - regression_loss: 1.1680 - classification_loss: 0.2131 435/500 [=========================>....] - ETA: 14s - loss: 1.3810 - regression_loss: 1.1680 - classification_loss: 0.2130 436/500 [=========================>....] - ETA: 14s - loss: 1.3805 - regression_loss: 1.1677 - classification_loss: 0.2128 437/500 [=========================>....] - ETA: 13s - loss: 1.3808 - regression_loss: 1.1679 - classification_loss: 0.2128 438/500 [=========================>....] - ETA: 13s - loss: 1.3798 - regression_loss: 1.1670 - classification_loss: 0.2128 439/500 [=========================>....] - ETA: 13s - loss: 1.3788 - regression_loss: 1.1663 - classification_loss: 0.2125 440/500 [=========================>....] - ETA: 13s - loss: 1.3801 - regression_loss: 1.1674 - classification_loss: 0.2127 441/500 [=========================>....] - ETA: 12s - loss: 1.3804 - regression_loss: 1.1678 - classification_loss: 0.2126 442/500 [=========================>....] - ETA: 12s - loss: 1.3799 - regression_loss: 1.1675 - classification_loss: 0.2123 443/500 [=========================>....] - ETA: 12s - loss: 1.3797 - regression_loss: 1.1675 - classification_loss: 0.2123 444/500 [=========================>....] - ETA: 12s - loss: 1.3815 - regression_loss: 1.1687 - classification_loss: 0.2128 445/500 [=========================>....] - ETA: 12s - loss: 1.3824 - regression_loss: 1.1694 - classification_loss: 0.2130 446/500 [=========================>....] 
- ETA: 11s - loss: 1.3840 - regression_loss: 1.1710 - classification_loss: 0.2130 447/500 [=========================>....] - ETA: 11s - loss: 1.3824 - regression_loss: 1.1698 - classification_loss: 0.2127 448/500 [=========================>....] - ETA: 11s - loss: 1.3819 - regression_loss: 1.1694 - classification_loss: 0.2125 449/500 [=========================>....] - ETA: 11s - loss: 1.3815 - regression_loss: 1.1691 - classification_loss: 0.2123 450/500 [==========================>...] - ETA: 10s - loss: 1.3818 - regression_loss: 1.1694 - classification_loss: 0.2125 451/500 [==========================>...] - ETA: 10s - loss: 1.3812 - regression_loss: 1.1688 - classification_loss: 0.2124 452/500 [==========================>...] - ETA: 10s - loss: 1.3823 - regression_loss: 1.1697 - classification_loss: 0.2126 453/500 [==========================>...] - ETA: 10s - loss: 1.3832 - regression_loss: 1.1704 - classification_loss: 0.2128 454/500 [==========================>...] - ETA: 10s - loss: 1.3839 - regression_loss: 1.1709 - classification_loss: 0.2130 455/500 [==========================>...] - ETA: 9s - loss: 1.3826 - regression_loss: 1.1698 - classification_loss: 0.2128  456/500 [==========================>...] - ETA: 9s - loss: 1.3821 - regression_loss: 1.1695 - classification_loss: 0.2126 457/500 [==========================>...] - ETA: 9s - loss: 1.3816 - regression_loss: 1.1691 - classification_loss: 0.2125 458/500 [==========================>...] - ETA: 9s - loss: 1.3822 - regression_loss: 1.1697 - classification_loss: 0.2125 459/500 [==========================>...] - ETA: 8s - loss: 1.3830 - regression_loss: 1.1703 - classification_loss: 0.2126 460/500 [==========================>...] - ETA: 8s - loss: 1.3834 - regression_loss: 1.1706 - classification_loss: 0.2128 461/500 [==========================>...] - ETA: 8s - loss: 1.3829 - regression_loss: 1.1702 - classification_loss: 0.2127 462/500 [==========================>...] 
- ETA: 8s - loss: 1.3830 - regression_loss: 1.1705 - classification_loss: 0.2125 463/500 [==========================>...] - ETA: 8s - loss: 1.3822 - regression_loss: 1.1698 - classification_loss: 0.2123 464/500 [==========================>...] - ETA: 7s - loss: 1.3816 - regression_loss: 1.1694 - classification_loss: 0.2122 465/500 [==========================>...] - ETA: 7s - loss: 1.3844 - regression_loss: 1.1714 - classification_loss: 0.2130 466/500 [==========================>...] - ETA: 7s - loss: 1.3837 - regression_loss: 1.1707 - classification_loss: 0.2131 467/500 [===========================>..] - ETA: 7s - loss: 1.3834 - regression_loss: 1.1704 - classification_loss: 0.2130 468/500 [===========================>..] - ETA: 6s - loss: 1.3837 - regression_loss: 1.1707 - classification_loss: 0.2130 469/500 [===========================>..] - ETA: 6s - loss: 1.3838 - regression_loss: 1.1709 - classification_loss: 0.2129 470/500 [===========================>..] - ETA: 6s - loss: 1.3836 - regression_loss: 1.1708 - classification_loss: 0.2128 471/500 [===========================>..] - ETA: 6s - loss: 1.3835 - regression_loss: 1.1708 - classification_loss: 0.2127 472/500 [===========================>..] - ETA: 6s - loss: 1.3843 - regression_loss: 1.1715 - classification_loss: 0.2128 473/500 [===========================>..] - ETA: 5s - loss: 1.3843 - regression_loss: 1.1713 - classification_loss: 0.2130 474/500 [===========================>..] - ETA: 5s - loss: 1.3848 - regression_loss: 1.1718 - classification_loss: 0.2130 475/500 [===========================>..] - ETA: 5s - loss: 1.3830 - regression_loss: 1.1703 - classification_loss: 0.2126 476/500 [===========================>..] - ETA: 5s - loss: 1.3876 - regression_loss: 1.1738 - classification_loss: 0.2139 477/500 [===========================>..] - ETA: 5s - loss: 1.3882 - regression_loss: 1.1744 - classification_loss: 0.2137 478/500 [===========================>..] 
- ETA: 4s - loss: 1.3889 - regression_loss: 1.1751 - classification_loss: 0.2138 479/500 [===========================>..] - ETA: 4s - loss: 1.3893 - regression_loss: 1.1753 - classification_loss: 0.2140 480/500 [===========================>..] - ETA: 4s - loss: 1.3906 - regression_loss: 1.1766 - classification_loss: 0.2140 481/500 [===========================>..] - ETA: 4s - loss: 1.3902 - regression_loss: 1.1762 - classification_loss: 0.2139 482/500 [===========================>..] - ETA: 3s - loss: 1.3907 - regression_loss: 1.1763 - classification_loss: 0.2144 483/500 [===========================>..] - ETA: 3s - loss: 1.3903 - regression_loss: 1.1761 - classification_loss: 0.2142 484/500 [============================>.] - ETA: 3s - loss: 1.3920 - regression_loss: 1.1775 - classification_loss: 0.2145 485/500 [============================>.] - ETA: 3s - loss: 1.3919 - regression_loss: 1.1774 - classification_loss: 0.2145 486/500 [============================>.] - ETA: 3s - loss: 1.3922 - regression_loss: 1.1778 - classification_loss: 0.2145 487/500 [============================>.] - ETA: 2s - loss: 1.3922 - regression_loss: 1.1778 - classification_loss: 0.2144 488/500 [============================>.] - ETA: 2s - loss: 1.3926 - regression_loss: 1.1781 - classification_loss: 0.2145 489/500 [============================>.] - ETA: 2s - loss: 1.3914 - regression_loss: 1.1771 - classification_loss: 0.2144 490/500 [============================>.] - ETA: 2s - loss: 1.3911 - regression_loss: 1.1767 - classification_loss: 0.2144 491/500 [============================>.] - ETA: 1s - loss: 1.3922 - regression_loss: 1.1775 - classification_loss: 0.2147 492/500 [============================>.] - ETA: 1s - loss: 1.3916 - regression_loss: 1.1767 - classification_loss: 0.2149 493/500 [============================>.] - ETA: 1s - loss: 1.3908 - regression_loss: 1.1760 - classification_loss: 0.2149 494/500 [============================>.] 
- ETA: 1s - loss: 1.3908 - regression_loss: 1.1761 - classification_loss: 0.2147
[... per-batch progress for batches 495-499 of epoch 85 truncated ...]
500/500 [==============================] - 109s 218ms/step - loss: 1.3929 - regression_loss: 1.1783 - classification_loss: 0.2146
326 instances of class plum with average precision: 0.8075
mAP: 0.8075
Epoch 00085: saving model to ./training/snapshots/resnet50_pascal_85.h5
Epoch 86/150
[... per-batch progress for batches 1-328 of epoch 86 truncated ...]
329/500 [==================>...........]
- ETA: 37s - loss: 1.3697 - regression_loss: 1.1536 - classification_loss: 0.2161 330/500 [==================>...........] - ETA: 37s - loss: 1.3708 - regression_loss: 1.1541 - classification_loss: 0.2166 331/500 [==================>...........] - ETA: 36s - loss: 1.3715 - regression_loss: 1.1548 - classification_loss: 0.2167 332/500 [==================>...........] - ETA: 36s - loss: 1.3709 - regression_loss: 1.1543 - classification_loss: 0.2166 333/500 [==================>...........] - ETA: 36s - loss: 1.3759 - regression_loss: 1.1587 - classification_loss: 0.2172 334/500 [===================>..........] - ETA: 36s - loss: 1.3759 - regression_loss: 1.1589 - classification_loss: 0.2170 335/500 [===================>..........] - ETA: 35s - loss: 1.3772 - regression_loss: 1.1600 - classification_loss: 0.2173 336/500 [===================>..........] - ETA: 35s - loss: 1.3801 - regression_loss: 1.1621 - classification_loss: 0.2180 337/500 [===================>..........] - ETA: 35s - loss: 1.3777 - regression_loss: 1.1599 - classification_loss: 0.2178 338/500 [===================>..........] - ETA: 35s - loss: 1.3772 - regression_loss: 1.1596 - classification_loss: 0.2175 339/500 [===================>..........] - ETA: 35s - loss: 1.3746 - regression_loss: 1.1576 - classification_loss: 0.2169 340/500 [===================>..........] - ETA: 34s - loss: 1.3721 - regression_loss: 1.1556 - classification_loss: 0.2165 341/500 [===================>..........] - ETA: 34s - loss: 1.3733 - regression_loss: 1.1565 - classification_loss: 0.2169 342/500 [===================>..........] - ETA: 34s - loss: 1.3759 - regression_loss: 1.1585 - classification_loss: 0.2174 343/500 [===================>..........] - ETA: 34s - loss: 1.3783 - regression_loss: 1.1606 - classification_loss: 0.2177 344/500 [===================>..........] - ETA: 33s - loss: 1.3791 - regression_loss: 1.1614 - classification_loss: 0.2178 345/500 [===================>..........] 
- ETA: 33s - loss: 1.3787 - regression_loss: 1.1610 - classification_loss: 0.2177 346/500 [===================>..........] - ETA: 33s - loss: 1.3791 - regression_loss: 1.1615 - classification_loss: 0.2176 347/500 [===================>..........] - ETA: 33s - loss: 1.3781 - regression_loss: 1.1605 - classification_loss: 0.2176 348/500 [===================>..........] - ETA: 33s - loss: 1.3777 - regression_loss: 1.1603 - classification_loss: 0.2174 349/500 [===================>..........] - ETA: 32s - loss: 1.3766 - regression_loss: 1.1593 - classification_loss: 0.2174 350/500 [====================>.........] - ETA: 32s - loss: 1.3760 - regression_loss: 1.1588 - classification_loss: 0.2172 351/500 [====================>.........] - ETA: 32s - loss: 1.3748 - regression_loss: 1.1579 - classification_loss: 0.2169 352/500 [====================>.........] - ETA: 32s - loss: 1.3741 - regression_loss: 1.1575 - classification_loss: 0.2166 353/500 [====================>.........] - ETA: 31s - loss: 1.3739 - regression_loss: 1.1574 - classification_loss: 0.2165 354/500 [====================>.........] - ETA: 31s - loss: 1.3732 - regression_loss: 1.1568 - classification_loss: 0.2164 355/500 [====================>.........] - ETA: 31s - loss: 1.3739 - regression_loss: 1.1575 - classification_loss: 0.2165 356/500 [====================>.........] - ETA: 31s - loss: 1.3749 - regression_loss: 1.1583 - classification_loss: 0.2166 357/500 [====================>.........] - ETA: 31s - loss: 1.3762 - regression_loss: 1.1593 - classification_loss: 0.2169 358/500 [====================>.........] - ETA: 30s - loss: 1.3771 - regression_loss: 1.1602 - classification_loss: 0.2168 359/500 [====================>.........] - ETA: 30s - loss: 1.3784 - regression_loss: 1.1610 - classification_loss: 0.2174 360/500 [====================>.........] - ETA: 30s - loss: 1.3797 - regression_loss: 1.1620 - classification_loss: 0.2177 361/500 [====================>.........] 
- ETA: 30s - loss: 1.3776 - regression_loss: 1.1603 - classification_loss: 0.2173 362/500 [====================>.........] - ETA: 30s - loss: 1.3782 - regression_loss: 1.1607 - classification_loss: 0.2175 363/500 [====================>.........] - ETA: 29s - loss: 1.3802 - regression_loss: 1.1621 - classification_loss: 0.2180 364/500 [====================>.........] - ETA: 29s - loss: 1.3788 - regression_loss: 1.1610 - classification_loss: 0.2178 365/500 [====================>.........] - ETA: 29s - loss: 1.3802 - regression_loss: 1.1620 - classification_loss: 0.2182 366/500 [====================>.........] - ETA: 29s - loss: 1.3781 - regression_loss: 1.1603 - classification_loss: 0.2178 367/500 [=====================>........] - ETA: 28s - loss: 1.3775 - regression_loss: 1.1599 - classification_loss: 0.2176 368/500 [=====================>........] - ETA: 28s - loss: 1.3761 - regression_loss: 1.1589 - classification_loss: 0.2173 369/500 [=====================>........] - ETA: 28s - loss: 1.3773 - regression_loss: 1.1599 - classification_loss: 0.2174 370/500 [=====================>........] - ETA: 28s - loss: 1.3760 - regression_loss: 1.1589 - classification_loss: 0.2171 371/500 [=====================>........] - ETA: 28s - loss: 1.3739 - regression_loss: 1.1573 - classification_loss: 0.2166 372/500 [=====================>........] - ETA: 27s - loss: 1.3722 - regression_loss: 1.1559 - classification_loss: 0.2162 373/500 [=====================>........] - ETA: 27s - loss: 1.3724 - regression_loss: 1.1560 - classification_loss: 0.2164 374/500 [=====================>........] - ETA: 27s - loss: 1.3728 - regression_loss: 1.1561 - classification_loss: 0.2166 375/500 [=====================>........] - ETA: 27s - loss: 1.3718 - regression_loss: 1.1554 - classification_loss: 0.2164 376/500 [=====================>........] - ETA: 26s - loss: 1.3703 - regression_loss: 1.1541 - classification_loss: 0.2162 377/500 [=====================>........] 
- ETA: 26s - loss: 1.3703 - regression_loss: 1.1542 - classification_loss: 0.2161 378/500 [=====================>........] - ETA: 26s - loss: 1.3694 - regression_loss: 1.1536 - classification_loss: 0.2158 379/500 [=====================>........] - ETA: 26s - loss: 1.3694 - regression_loss: 1.1536 - classification_loss: 0.2158 380/500 [=====================>........] - ETA: 26s - loss: 1.3690 - regression_loss: 1.1533 - classification_loss: 0.2157 381/500 [=====================>........] - ETA: 25s - loss: 1.3700 - regression_loss: 1.1542 - classification_loss: 0.2159 382/500 [=====================>........] - ETA: 25s - loss: 1.3705 - regression_loss: 1.1546 - classification_loss: 0.2159 383/500 [=====================>........] - ETA: 25s - loss: 1.3709 - regression_loss: 1.1549 - classification_loss: 0.2161 384/500 [======================>.......] - ETA: 25s - loss: 1.3717 - regression_loss: 1.1556 - classification_loss: 0.2161 385/500 [======================>.......] - ETA: 24s - loss: 1.3721 - regression_loss: 1.1560 - classification_loss: 0.2161 386/500 [======================>.......] - ETA: 24s - loss: 1.3727 - regression_loss: 1.1566 - classification_loss: 0.2161 387/500 [======================>.......] - ETA: 24s - loss: 1.3701 - regression_loss: 1.1545 - classification_loss: 0.2156 388/500 [======================>.......] - ETA: 24s - loss: 1.3683 - regression_loss: 1.1530 - classification_loss: 0.2152 389/500 [======================>.......] - ETA: 24s - loss: 1.3689 - regression_loss: 1.1536 - classification_loss: 0.2153 390/500 [======================>.......] - ETA: 23s - loss: 1.3678 - regression_loss: 1.1528 - classification_loss: 0.2150 391/500 [======================>.......] - ETA: 23s - loss: 1.3663 - regression_loss: 1.1516 - classification_loss: 0.2147 392/500 [======================>.......] - ETA: 23s - loss: 1.3671 - regression_loss: 1.1524 - classification_loss: 0.2146 393/500 [======================>.......] 
- ETA: 23s - loss: 1.3661 - regression_loss: 1.1517 - classification_loss: 0.2144 394/500 [======================>.......] - ETA: 23s - loss: 1.3664 - regression_loss: 1.1517 - classification_loss: 0.2147 395/500 [======================>.......] - ETA: 22s - loss: 1.3658 - regression_loss: 1.1513 - classification_loss: 0.2145 396/500 [======================>.......] - ETA: 22s - loss: 1.3675 - regression_loss: 1.1527 - classification_loss: 0.2148 397/500 [======================>.......] - ETA: 22s - loss: 1.3692 - regression_loss: 1.1542 - classification_loss: 0.2150 398/500 [======================>.......] - ETA: 22s - loss: 1.3693 - regression_loss: 1.1542 - classification_loss: 0.2151 399/500 [======================>.......] - ETA: 21s - loss: 1.3699 - regression_loss: 1.1546 - classification_loss: 0.2153 400/500 [=======================>......] - ETA: 21s - loss: 1.3688 - regression_loss: 1.1538 - classification_loss: 0.2150 401/500 [=======================>......] - ETA: 21s - loss: 1.3692 - regression_loss: 1.1543 - classification_loss: 0.2149 402/500 [=======================>......] - ETA: 21s - loss: 1.3692 - regression_loss: 1.1514 - classification_loss: 0.2178 403/500 [=======================>......] - ETA: 21s - loss: 1.3711 - regression_loss: 1.1528 - classification_loss: 0.2182 404/500 [=======================>......] - ETA: 20s - loss: 1.3711 - regression_loss: 1.1530 - classification_loss: 0.2181 405/500 [=======================>......] - ETA: 20s - loss: 1.3718 - regression_loss: 1.1537 - classification_loss: 0.2181 406/500 [=======================>......] - ETA: 20s - loss: 1.3722 - regression_loss: 1.1542 - classification_loss: 0.2181 407/500 [=======================>......] - ETA: 20s - loss: 1.3724 - regression_loss: 1.1544 - classification_loss: 0.2180 408/500 [=======================>......] - ETA: 19s - loss: 1.3733 - regression_loss: 1.1554 - classification_loss: 0.2179 409/500 [=======================>......] 
- ETA: 19s - loss: 1.3751 - regression_loss: 1.1568 - classification_loss: 0.2183 410/500 [=======================>......] - ETA: 19s - loss: 1.3758 - regression_loss: 1.1574 - classification_loss: 0.2184 411/500 [=======================>......] - ETA: 19s - loss: 1.3750 - regression_loss: 1.1570 - classification_loss: 0.2181 412/500 [=======================>......] - ETA: 19s - loss: 1.3744 - regression_loss: 1.1565 - classification_loss: 0.2179 413/500 [=======================>......] - ETA: 18s - loss: 1.3743 - regression_loss: 1.1564 - classification_loss: 0.2179 414/500 [=======================>......] - ETA: 18s - loss: 1.3730 - regression_loss: 1.1554 - classification_loss: 0.2175 415/500 [=======================>......] - ETA: 18s - loss: 1.3731 - regression_loss: 1.1555 - classification_loss: 0.2176 416/500 [=======================>......] - ETA: 18s - loss: 1.3728 - regression_loss: 1.1553 - classification_loss: 0.2175 417/500 [========================>.....] - ETA: 18s - loss: 1.3710 - regression_loss: 1.1539 - classification_loss: 0.2171 418/500 [========================>.....] - ETA: 17s - loss: 1.3729 - regression_loss: 1.1553 - classification_loss: 0.2176 419/500 [========================>.....] - ETA: 17s - loss: 1.3713 - regression_loss: 1.1541 - classification_loss: 0.2172 420/500 [========================>.....] - ETA: 17s - loss: 1.3691 - regression_loss: 1.1522 - classification_loss: 0.2169 421/500 [========================>.....] - ETA: 17s - loss: 1.3700 - regression_loss: 1.1527 - classification_loss: 0.2173 422/500 [========================>.....] - ETA: 16s - loss: 1.3704 - regression_loss: 1.1535 - classification_loss: 0.2170 423/500 [========================>.....] - ETA: 16s - loss: 1.3704 - regression_loss: 1.1535 - classification_loss: 0.2169 424/500 [========================>.....] - ETA: 16s - loss: 1.3695 - regression_loss: 1.1529 - classification_loss: 0.2167 425/500 [========================>.....] 
- ETA: 16s - loss: 1.3696 - regression_loss: 1.1530 - classification_loss: 0.2166 426/500 [========================>.....] - ETA: 16s - loss: 1.3675 - regression_loss: 1.1513 - classification_loss: 0.2162 427/500 [========================>.....] - ETA: 15s - loss: 1.3669 - regression_loss: 1.1508 - classification_loss: 0.2160 428/500 [========================>.....] - ETA: 15s - loss: 1.3669 - regression_loss: 1.1510 - classification_loss: 0.2159 429/500 [========================>.....] - ETA: 15s - loss: 1.3686 - regression_loss: 1.1525 - classification_loss: 0.2161 430/500 [========================>.....] - ETA: 15s - loss: 1.3686 - regression_loss: 1.1526 - classification_loss: 0.2159 431/500 [========================>.....] - ETA: 15s - loss: 1.3672 - regression_loss: 1.1513 - classification_loss: 0.2159 432/500 [========================>.....] - ETA: 14s - loss: 1.3674 - regression_loss: 1.1516 - classification_loss: 0.2158 433/500 [========================>.....] - ETA: 14s - loss: 1.3692 - regression_loss: 1.1528 - classification_loss: 0.2164 434/500 [=========================>....] - ETA: 14s - loss: 1.3684 - regression_loss: 1.1521 - classification_loss: 0.2163 435/500 [=========================>....] - ETA: 14s - loss: 1.3684 - regression_loss: 1.1522 - classification_loss: 0.2162 436/500 [=========================>....] - ETA: 13s - loss: 1.3678 - regression_loss: 1.1515 - classification_loss: 0.2163 437/500 [=========================>....] - ETA: 13s - loss: 1.3683 - regression_loss: 1.1516 - classification_loss: 0.2166 438/500 [=========================>....] - ETA: 13s - loss: 1.3682 - regression_loss: 1.1516 - classification_loss: 0.2166 439/500 [=========================>....] - ETA: 13s - loss: 1.3682 - regression_loss: 1.1517 - classification_loss: 0.2166 440/500 [=========================>....] - ETA: 13s - loss: 1.3673 - regression_loss: 1.1509 - classification_loss: 0.2164 441/500 [=========================>....] 
- ETA: 12s - loss: 1.3662 - regression_loss: 1.1499 - classification_loss: 0.2163 442/500 [=========================>....] - ETA: 12s - loss: 1.3664 - regression_loss: 1.1502 - classification_loss: 0.2162 443/500 [=========================>....] - ETA: 12s - loss: 1.3667 - regression_loss: 1.1505 - classification_loss: 0.2162 444/500 [=========================>....] - ETA: 12s - loss: 1.3659 - regression_loss: 1.1498 - classification_loss: 0.2160 445/500 [=========================>....] - ETA: 11s - loss: 1.3658 - regression_loss: 1.1500 - classification_loss: 0.2158 446/500 [=========================>....] - ETA: 11s - loss: 1.3644 - regression_loss: 1.1488 - classification_loss: 0.2156 447/500 [=========================>....] - ETA: 11s - loss: 1.3636 - regression_loss: 1.1483 - classification_loss: 0.2154 448/500 [=========================>....] - ETA: 11s - loss: 1.3634 - regression_loss: 1.1481 - classification_loss: 0.2153 449/500 [=========================>....] - ETA: 11s - loss: 1.3639 - regression_loss: 1.1483 - classification_loss: 0.2156 450/500 [==========================>...] - ETA: 10s - loss: 1.3630 - regression_loss: 1.1477 - classification_loss: 0.2153 451/500 [==========================>...] - ETA: 10s - loss: 1.3630 - regression_loss: 1.1477 - classification_loss: 0.2154 452/500 [==========================>...] - ETA: 10s - loss: 1.3628 - regression_loss: 1.1475 - classification_loss: 0.2153 453/500 [==========================>...] - ETA: 10s - loss: 1.3628 - regression_loss: 1.1475 - classification_loss: 0.2153 454/500 [==========================>...] - ETA: 10s - loss: 1.3627 - regression_loss: 1.1475 - classification_loss: 0.2152 455/500 [==========================>...] - ETA: 9s - loss: 1.3633 - regression_loss: 1.1481 - classification_loss: 0.2152  456/500 [==========================>...] - ETA: 9s - loss: 1.3628 - regression_loss: 1.1479 - classification_loss: 0.2149 457/500 [==========================>...] 
- ETA: 9s - loss: 1.3619 - regression_loss: 1.1471 - classification_loss: 0.2148 458/500 [==========================>...] - ETA: 9s - loss: 1.3628 - regression_loss: 1.1478 - classification_loss: 0.2149 459/500 [==========================>...] - ETA: 8s - loss: 1.3616 - regression_loss: 1.1469 - classification_loss: 0.2147 460/500 [==========================>...] - ETA: 8s - loss: 1.3611 - regression_loss: 1.1464 - classification_loss: 0.2146 461/500 [==========================>...] - ETA: 8s - loss: 1.3593 - regression_loss: 1.1450 - classification_loss: 0.2143 462/500 [==========================>...] - ETA: 8s - loss: 1.3602 - regression_loss: 1.1459 - classification_loss: 0.2144 463/500 [==========================>...] - ETA: 8s - loss: 1.3599 - regression_loss: 1.1457 - classification_loss: 0.2142 464/500 [==========================>...] - ETA: 7s - loss: 1.3609 - regression_loss: 1.1465 - classification_loss: 0.2144 465/500 [==========================>...] - ETA: 7s - loss: 1.3608 - regression_loss: 1.1465 - classification_loss: 0.2143 466/500 [==========================>...] - ETA: 7s - loss: 1.3593 - regression_loss: 1.1453 - classification_loss: 0.2140 467/500 [===========================>..] - ETA: 7s - loss: 1.3587 - regression_loss: 1.1448 - classification_loss: 0.2139 468/500 [===========================>..] - ETA: 6s - loss: 1.3597 - regression_loss: 1.1455 - classification_loss: 0.2142 469/500 [===========================>..] - ETA: 6s - loss: 1.3589 - regression_loss: 1.1448 - classification_loss: 0.2141 470/500 [===========================>..] - ETA: 6s - loss: 1.3586 - regression_loss: 1.1444 - classification_loss: 0.2142 471/500 [===========================>..] - ETA: 6s - loss: 1.3576 - regression_loss: 1.1437 - classification_loss: 0.2139 472/500 [===========================>..] - ETA: 6s - loss: 1.3581 - regression_loss: 1.1441 - classification_loss: 0.2140 473/500 [===========================>..] 
- ETA: 5s - loss: 1.3583 - regression_loss: 1.1443 - classification_loss: 0.2140 474/500 [===========================>..] - ETA: 5s - loss: 1.3588 - regression_loss: 1.1449 - classification_loss: 0.2139 475/500 [===========================>..] - ETA: 5s - loss: 1.3587 - regression_loss: 1.1448 - classification_loss: 0.2139 476/500 [===========================>..] - ETA: 5s - loss: 1.3585 - regression_loss: 1.1447 - classification_loss: 0.2138 477/500 [===========================>..] - ETA: 5s - loss: 1.3591 - regression_loss: 1.1453 - classification_loss: 0.2137 478/500 [===========================>..] - ETA: 4s - loss: 1.3571 - regression_loss: 1.1437 - classification_loss: 0.2134 479/500 [===========================>..] - ETA: 4s - loss: 1.3574 - regression_loss: 1.1438 - classification_loss: 0.2136 480/500 [===========================>..] - ETA: 4s - loss: 1.3582 - regression_loss: 1.1446 - classification_loss: 0.2136 481/500 [===========================>..] - ETA: 4s - loss: 1.3609 - regression_loss: 1.1469 - classification_loss: 0.2141 482/500 [===========================>..] - ETA: 3s - loss: 1.3630 - regression_loss: 1.1490 - classification_loss: 0.2140 483/500 [===========================>..] - ETA: 3s - loss: 1.3640 - regression_loss: 1.1498 - classification_loss: 0.2142 484/500 [============================>.] - ETA: 3s - loss: 1.3641 - regression_loss: 1.1499 - classification_loss: 0.2142 485/500 [============================>.] - ETA: 3s - loss: 1.3644 - regression_loss: 1.1500 - classification_loss: 0.2144 486/500 [============================>.] - ETA: 3s - loss: 1.3632 - regression_loss: 1.1490 - classification_loss: 0.2142 487/500 [============================>.] - ETA: 2s - loss: 1.3637 - regression_loss: 1.1495 - classification_loss: 0.2142 488/500 [============================>.] - ETA: 2s - loss: 1.3639 - regression_loss: 1.1497 - classification_loss: 0.2142 489/500 [============================>.] 
- ETA: 2s - loss: 1.3639 - regression_loss: 1.1497 - classification_loss: 0.2142 490/500 [============================>.] - ETA: 2s - loss: 1.3629 - regression_loss: 1.1488 - classification_loss: 0.2140 491/500 [============================>.] - ETA: 1s - loss: 1.3632 - regression_loss: 1.1490 - classification_loss: 0.2142 492/500 [============================>.] - ETA: 1s - loss: 1.3623 - regression_loss: 1.1483 - classification_loss: 0.2140 493/500 [============================>.] - ETA: 1s - loss: 1.3616 - regression_loss: 1.1477 - classification_loss: 0.2139 494/500 [============================>.] - ETA: 1s - loss: 1.3608 - regression_loss: 1.1471 - classification_loss: 0.2137 495/500 [============================>.] - ETA: 1s - loss: 1.3606 - regression_loss: 1.1470 - classification_loss: 0.2136 496/500 [============================>.] - ETA: 0s - loss: 1.3617 - regression_loss: 1.1477 - classification_loss: 0.2140 497/500 [============================>.] - ETA: 0s - loss: 1.3619 - regression_loss: 1.1479 - classification_loss: 0.2140 498/500 [============================>.] - ETA: 0s - loss: 1.3618 - regression_loss: 1.1478 - classification_loss: 0.2140 499/500 [============================>.] - ETA: 0s - loss: 1.3610 - regression_loss: 1.1472 - classification_loss: 0.2138 500/500 [==============================] - 109s 219ms/step - loss: 1.3630 - regression_loss: 1.1484 - classification_loss: 0.2145 326 instances of class plum with average precision: 0.8055 mAP: 0.8055 Epoch 00086: saving model to ./training/snapshots/resnet50_pascal_86.h5 Epoch 00086: ReduceLROnPlateau reducing learning rate to 9.999999974752428e-08. Epoch 87/150 1/500 [..............................] - ETA: 1:52 - loss: 1.4940 - regression_loss: 1.3923 - classification_loss: 0.1017 2/500 [..............................] - ETA: 1:57 - loss: 1.5745 - regression_loss: 1.4471 - classification_loss: 0.1274 3/500 [..............................] 
- ETA: 1:59 - loss: 1.5690 - regression_loss: 1.3888 - classification_loss: 0.1801 4/500 [..............................] - ETA: 1:58 - loss: 1.5720 - regression_loss: 1.3614 - classification_loss: 0.2106 5/500 [..............................] - ETA: 1:58 - loss: 1.6354 - regression_loss: 1.4034 - classification_loss: 0.2320 6/500 [..............................] - ETA: 1:58 - loss: 1.5445 - regression_loss: 1.3288 - classification_loss: 0.2157 7/500 [..............................] - ETA: 1:59 - loss: 1.5265 - regression_loss: 1.3128 - classification_loss: 0.2137 8/500 [..............................] - ETA: 1:59 - loss: 1.5656 - regression_loss: 1.3449 - classification_loss: 0.2208 9/500 [..............................] - ETA: 1:58 - loss: 1.6411 - regression_loss: 1.4042 - classification_loss: 0.2369 10/500 [..............................] - ETA: 1:56 - loss: 1.5209 - regression_loss: 1.3029 - classification_loss: 0.2180 11/500 [..............................] - ETA: 1:55 - loss: 1.5272 - regression_loss: 1.2999 - classification_loss: 0.2274 12/500 [..............................] - ETA: 1:54 - loss: 1.4891 - regression_loss: 1.2635 - classification_loss: 0.2256 13/500 [..............................] - ETA: 1:52 - loss: 1.4371 - regression_loss: 1.2200 - classification_loss: 0.2170 14/500 [..............................] - ETA: 1:52 - loss: 1.4306 - regression_loss: 1.2182 - classification_loss: 0.2123 15/500 [..............................] - ETA: 1:51 - loss: 1.3559 - regression_loss: 1.1543 - classification_loss: 0.2016 16/500 [..............................] - ETA: 1:50 - loss: 1.3814 - regression_loss: 1.1742 - classification_loss: 0.2072 17/500 [>.............................] - ETA: 1:49 - loss: 1.3648 - regression_loss: 1.1629 - classification_loss: 0.2019 18/500 [>.............................] - ETA: 1:49 - loss: 1.3768 - regression_loss: 1.1751 - classification_loss: 0.2017 19/500 [>.............................] 
- ETA: 1:48 - loss: 1.3793 - regression_loss: 1.1738 - classification_loss: 0.2055 20/500 [>.............................] - ETA: 1:47 - loss: 1.3774 - regression_loss: 1.1734 - classification_loss: 0.2040 21/500 [>.............................] - ETA: 1:47 - loss: 1.3652 - regression_loss: 1.1618 - classification_loss: 0.2034 22/500 [>.............................] - ETA: 1:46 - loss: 1.3401 - regression_loss: 1.1415 - classification_loss: 0.1986 23/500 [>.............................] - ETA: 1:46 - loss: 1.3645 - regression_loss: 1.1540 - classification_loss: 0.2104 24/500 [>.............................] - ETA: 1:46 - loss: 1.3533 - regression_loss: 1.1461 - classification_loss: 0.2072 25/500 [>.............................] - ETA: 1:45 - loss: 1.3281 - regression_loss: 1.1223 - classification_loss: 0.2058 26/500 [>.............................] - ETA: 1:45 - loss: 1.3312 - regression_loss: 1.1252 - classification_loss: 0.2060 27/500 [>.............................] - ETA: 1:45 - loss: 1.3397 - regression_loss: 1.1327 - classification_loss: 0.2070 28/500 [>.............................] - ETA: 1:45 - loss: 1.3302 - regression_loss: 1.1233 - classification_loss: 0.2068 29/500 [>.............................] - ETA: 1:44 - loss: 1.3444 - regression_loss: 1.1360 - classification_loss: 0.2083 30/500 [>.............................] - ETA: 1:44 - loss: 1.3504 - regression_loss: 1.1407 - classification_loss: 0.2097 31/500 [>.............................] - ETA: 1:44 - loss: 1.3335 - regression_loss: 1.1273 - classification_loss: 0.2063 32/500 [>.............................] - ETA: 1:43 - loss: 1.3335 - regression_loss: 1.1286 - classification_loss: 0.2049 33/500 [>.............................] - ETA: 1:43 - loss: 1.3132 - regression_loss: 1.1126 - classification_loss: 0.2006 34/500 [=>............................] - ETA: 1:43 - loss: 1.3238 - regression_loss: 1.1209 - classification_loss: 0.2029 35/500 [=>............................] 
[Condensed: Keras per-batch progress output for batches 36-371/500 of this epoch; the carriage-return progress-bar frames were captured verbatim. Sampled running averages:]
 batch    ETA    loss    regression_loss  classification_loss
  36/500  1:42   1.3521  1.1422           0.2099
  50/500  1:38   1.3973  1.1777           0.2196
  75/500  1:32   1.3669  1.1416           0.2253
 100/500  1:26   1.3334  1.1182           0.2152
 125/500  1:20   1.3291  1.1176           0.2115
 150/500  1:15   1.3086  1.1020           0.2065
 175/500  1:09   1.3074  1.1014           0.2060
 200/500  1:04   1.3361  1.1261           0.2099
 225/500  58s    1.3345  1.1215           0.2130
 250/500  53s    1.3407  1.1276           0.2131
 275/500  48s    1.3300  1.1214           0.2086
 300/500  42s    1.3382  1.1267           0.2116
 325/500  37s    1.3511  1.1382           0.2128
 350/500  32s    1.3517  1.1396           0.2121
 370/500  27s    1.3453  1.1356           0.2097
- ETA: 27s - loss: 1.3451 - regression_loss: 1.1355 - classification_loss: 0.2096 372/500 [=====================>........] - ETA: 27s - loss: 1.3464 - regression_loss: 1.1366 - classification_loss: 0.2098 373/500 [=====================>........] - ETA: 27s - loss: 1.3447 - regression_loss: 1.1352 - classification_loss: 0.2095 374/500 [=====================>........] - ETA: 26s - loss: 1.3458 - regression_loss: 1.1360 - classification_loss: 0.2097 375/500 [=====================>........] - ETA: 26s - loss: 1.3465 - regression_loss: 1.1366 - classification_loss: 0.2098 376/500 [=====================>........] - ETA: 26s - loss: 1.3478 - regression_loss: 1.1377 - classification_loss: 0.2102 377/500 [=====================>........] - ETA: 26s - loss: 1.3490 - regression_loss: 1.1384 - classification_loss: 0.2106 378/500 [=====================>........] - ETA: 26s - loss: 1.3476 - regression_loss: 1.1373 - classification_loss: 0.2103 379/500 [=====================>........] - ETA: 25s - loss: 1.3472 - regression_loss: 1.1372 - classification_loss: 0.2101 380/500 [=====================>........] - ETA: 25s - loss: 1.3459 - regression_loss: 1.1359 - classification_loss: 0.2100 381/500 [=====================>........] - ETA: 25s - loss: 1.3452 - regression_loss: 1.1355 - classification_loss: 0.2097 382/500 [=====================>........] - ETA: 25s - loss: 1.3457 - regression_loss: 1.1356 - classification_loss: 0.2100 383/500 [=====================>........] - ETA: 25s - loss: 1.3450 - regression_loss: 1.1352 - classification_loss: 0.2098 384/500 [======================>.......] - ETA: 24s - loss: 1.3455 - regression_loss: 1.1358 - classification_loss: 0.2097 385/500 [======================>.......] - ETA: 24s - loss: 1.3439 - regression_loss: 1.1345 - classification_loss: 0.2094 386/500 [======================>.......] - ETA: 24s - loss: 1.3481 - regression_loss: 1.1380 - classification_loss: 0.2101 387/500 [======================>.......] 
- ETA: 24s - loss: 1.3473 - regression_loss: 1.1374 - classification_loss: 0.2099 388/500 [======================>.......] - ETA: 23s - loss: 1.3462 - regression_loss: 1.1366 - classification_loss: 0.2096 389/500 [======================>.......] - ETA: 23s - loss: 1.3483 - regression_loss: 1.1381 - classification_loss: 0.2101 390/500 [======================>.......] - ETA: 23s - loss: 1.3495 - regression_loss: 1.1391 - classification_loss: 0.2104 391/500 [======================>.......] - ETA: 23s - loss: 1.3492 - regression_loss: 1.1389 - classification_loss: 0.2103 392/500 [======================>.......] - ETA: 23s - loss: 1.3496 - regression_loss: 1.1391 - classification_loss: 0.2105 393/500 [======================>.......] - ETA: 22s - loss: 1.3522 - regression_loss: 1.1409 - classification_loss: 0.2112 394/500 [======================>.......] - ETA: 22s - loss: 1.3539 - regression_loss: 1.1425 - classification_loss: 0.2114 395/500 [======================>.......] - ETA: 22s - loss: 1.3743 - regression_loss: 1.1439 - classification_loss: 0.2303 396/500 [======================>.......] - ETA: 22s - loss: 1.3729 - regression_loss: 1.1428 - classification_loss: 0.2301 397/500 [======================>.......] - ETA: 22s - loss: 1.3724 - regression_loss: 1.1426 - classification_loss: 0.2299 398/500 [======================>.......] - ETA: 21s - loss: 1.3716 - regression_loss: 1.1421 - classification_loss: 0.2296 399/500 [======================>.......] - ETA: 21s - loss: 1.3708 - regression_loss: 1.1416 - classification_loss: 0.2293 400/500 [=======================>......] - ETA: 21s - loss: 1.3739 - regression_loss: 1.1424 - classification_loss: 0.2315 401/500 [=======================>......] - ETA: 21s - loss: 1.3731 - regression_loss: 1.1419 - classification_loss: 0.2312 402/500 [=======================>......] - ETA: 20s - loss: 1.3739 - regression_loss: 1.1424 - classification_loss: 0.2315 403/500 [=======================>......] 
- ETA: 20s - loss: 1.3722 - regression_loss: 1.1412 - classification_loss: 0.2310 404/500 [=======================>......] - ETA: 20s - loss: 1.3713 - regression_loss: 1.1404 - classification_loss: 0.2310 405/500 [=======================>......] - ETA: 20s - loss: 1.3718 - regression_loss: 1.1409 - classification_loss: 0.2309 406/500 [=======================>......] - ETA: 20s - loss: 1.3727 - regression_loss: 1.1418 - classification_loss: 0.2310 407/500 [=======================>......] - ETA: 19s - loss: 1.3752 - regression_loss: 1.1442 - classification_loss: 0.2310 408/500 [=======================>......] - ETA: 19s - loss: 1.3740 - regression_loss: 1.1431 - classification_loss: 0.2310 409/500 [=======================>......] - ETA: 19s - loss: 1.3741 - regression_loss: 1.1433 - classification_loss: 0.2308 410/500 [=======================>......] - ETA: 19s - loss: 1.3750 - regression_loss: 1.1441 - classification_loss: 0.2309 411/500 [=======================>......] - ETA: 19s - loss: 1.3727 - regression_loss: 1.1423 - classification_loss: 0.2305 412/500 [=======================>......] - ETA: 18s - loss: 1.3722 - regression_loss: 1.1421 - classification_loss: 0.2302 413/500 [=======================>......] - ETA: 18s - loss: 1.3721 - regression_loss: 1.1419 - classification_loss: 0.2302 414/500 [=======================>......] - ETA: 18s - loss: 1.3728 - regression_loss: 1.1429 - classification_loss: 0.2300 415/500 [=======================>......] - ETA: 18s - loss: 1.3733 - regression_loss: 1.1433 - classification_loss: 0.2300 416/500 [=======================>......] - ETA: 17s - loss: 1.3732 - regression_loss: 1.1433 - classification_loss: 0.2299 417/500 [========================>.....] - ETA: 17s - loss: 1.3728 - regression_loss: 1.1432 - classification_loss: 0.2297 418/500 [========================>.....] - ETA: 17s - loss: 1.3736 - regression_loss: 1.1439 - classification_loss: 0.2297 419/500 [========================>.....] 
- ETA: 17s - loss: 1.3744 - regression_loss: 1.1447 - classification_loss: 0.2297 420/500 [========================>.....] - ETA: 17s - loss: 1.3750 - regression_loss: 1.1452 - classification_loss: 0.2298 421/500 [========================>.....] - ETA: 16s - loss: 1.3742 - regression_loss: 1.1447 - classification_loss: 0.2296 422/500 [========================>.....] - ETA: 16s - loss: 1.3745 - regression_loss: 1.1449 - classification_loss: 0.2296 423/500 [========================>.....] - ETA: 16s - loss: 1.3744 - regression_loss: 1.1449 - classification_loss: 0.2295 424/500 [========================>.....] - ETA: 16s - loss: 1.3743 - regression_loss: 1.1449 - classification_loss: 0.2294 425/500 [========================>.....] - ETA: 16s - loss: 1.3743 - regression_loss: 1.1450 - classification_loss: 0.2294 426/500 [========================>.....] - ETA: 15s - loss: 1.3740 - regression_loss: 1.1449 - classification_loss: 0.2291 427/500 [========================>.....] - ETA: 15s - loss: 1.3754 - regression_loss: 1.1460 - classification_loss: 0.2294 428/500 [========================>.....] - ETA: 15s - loss: 1.3748 - regression_loss: 1.1456 - classification_loss: 0.2292 429/500 [========================>.....] - ETA: 15s - loss: 1.3754 - regression_loss: 1.1460 - classification_loss: 0.2294 430/500 [========================>.....] - ETA: 15s - loss: 1.3754 - regression_loss: 1.1463 - classification_loss: 0.2292 431/500 [========================>.....] - ETA: 14s - loss: 1.3756 - regression_loss: 1.1463 - classification_loss: 0.2292 432/500 [========================>.....] - ETA: 14s - loss: 1.3744 - regression_loss: 1.1453 - classification_loss: 0.2291 433/500 [========================>.....] - ETA: 14s - loss: 1.3741 - regression_loss: 1.1450 - classification_loss: 0.2290 434/500 [=========================>....] - ETA: 14s - loss: 1.3748 - regression_loss: 1.1456 - classification_loss: 0.2292 435/500 [=========================>....] 
- ETA: 13s - loss: 1.3745 - regression_loss: 1.1455 - classification_loss: 0.2290 436/500 [=========================>....] - ETA: 13s - loss: 1.3761 - regression_loss: 1.1461 - classification_loss: 0.2300 437/500 [=========================>....] - ETA: 13s - loss: 1.3750 - regression_loss: 1.1452 - classification_loss: 0.2298 438/500 [=========================>....] - ETA: 13s - loss: 1.3768 - regression_loss: 1.1467 - classification_loss: 0.2301 439/500 [=========================>....] - ETA: 13s - loss: 1.3757 - regression_loss: 1.1459 - classification_loss: 0.2298 440/500 [=========================>....] - ETA: 12s - loss: 1.3752 - regression_loss: 1.1455 - classification_loss: 0.2297 441/500 [=========================>....] - ETA: 12s - loss: 1.3742 - regression_loss: 1.1447 - classification_loss: 0.2294 442/500 [=========================>....] - ETA: 12s - loss: 1.3732 - regression_loss: 1.1439 - classification_loss: 0.2292 443/500 [=========================>....] - ETA: 12s - loss: 1.3730 - regression_loss: 1.1438 - classification_loss: 0.2292 444/500 [=========================>....] - ETA: 12s - loss: 1.3726 - regression_loss: 1.1436 - classification_loss: 0.2291 445/500 [=========================>....] - ETA: 11s - loss: 1.3713 - regression_loss: 1.1423 - classification_loss: 0.2290 446/500 [=========================>....] - ETA: 11s - loss: 1.3723 - regression_loss: 1.1432 - classification_loss: 0.2291 447/500 [=========================>....] - ETA: 11s - loss: 1.3727 - regression_loss: 1.1436 - classification_loss: 0.2291 448/500 [=========================>....] - ETA: 11s - loss: 1.3726 - regression_loss: 1.1436 - classification_loss: 0.2289 449/500 [=========================>....] - ETA: 10s - loss: 1.3711 - regression_loss: 1.1425 - classification_loss: 0.2285 450/500 [==========================>...] - ETA: 10s - loss: 1.3709 - regression_loss: 1.1424 - classification_loss: 0.2285 451/500 [==========================>...] 
- ETA: 10s - loss: 1.3720 - regression_loss: 1.1432 - classification_loss: 0.2288 452/500 [==========================>...] - ETA: 10s - loss: 1.3739 - regression_loss: 1.1445 - classification_loss: 0.2294 453/500 [==========================>...] - ETA: 10s - loss: 1.3749 - regression_loss: 1.1450 - classification_loss: 0.2299 454/500 [==========================>...] - ETA: 9s - loss: 1.3755 - regression_loss: 1.1457 - classification_loss: 0.2299  455/500 [==========================>...] - ETA: 9s - loss: 1.3731 - regression_loss: 1.1436 - classification_loss: 0.2294 456/500 [==========================>...] - ETA: 9s - loss: 1.3736 - regression_loss: 1.1444 - classification_loss: 0.2292 457/500 [==========================>...] - ETA: 9s - loss: 1.3726 - regression_loss: 1.1437 - classification_loss: 0.2289 458/500 [==========================>...] - ETA: 9s - loss: 1.3721 - regression_loss: 1.1434 - classification_loss: 0.2287 459/500 [==========================>...] - ETA: 8s - loss: 1.3724 - regression_loss: 1.1438 - classification_loss: 0.2286 460/500 [==========================>...] - ETA: 8s - loss: 1.3704 - regression_loss: 1.1423 - classification_loss: 0.2281 461/500 [==========================>...] - ETA: 8s - loss: 1.3705 - regression_loss: 1.1424 - classification_loss: 0.2280 462/500 [==========================>...] - ETA: 8s - loss: 1.3716 - regression_loss: 1.1432 - classification_loss: 0.2283 463/500 [==========================>...] - ETA: 7s - loss: 1.3719 - regression_loss: 1.1437 - classification_loss: 0.2282 464/500 [==========================>...] - ETA: 7s - loss: 1.3728 - regression_loss: 1.1446 - classification_loss: 0.2282 465/500 [==========================>...] - ETA: 7s - loss: 1.3729 - regression_loss: 1.1447 - classification_loss: 0.2281 466/500 [==========================>...] - ETA: 7s - loss: 1.3724 - regression_loss: 1.1443 - classification_loss: 0.2281 467/500 [===========================>..] 
- ETA: 7s - loss: 1.3724 - regression_loss: 1.1444 - classification_loss: 0.2279 468/500 [===========================>..] - ETA: 6s - loss: 1.3726 - regression_loss: 1.1447 - classification_loss: 0.2280 469/500 [===========================>..] - ETA: 6s - loss: 1.3715 - regression_loss: 1.1437 - classification_loss: 0.2278 470/500 [===========================>..] - ETA: 6s - loss: 1.3718 - regression_loss: 1.1440 - classification_loss: 0.2278 471/500 [===========================>..] - ETA: 6s - loss: 1.3727 - regression_loss: 1.1451 - classification_loss: 0.2277 472/500 [===========================>..] - ETA: 6s - loss: 1.3707 - regression_loss: 1.1435 - classification_loss: 0.2273 473/500 [===========================>..] - ETA: 5s - loss: 1.3707 - regression_loss: 1.1435 - classification_loss: 0.2272 474/500 [===========================>..] - ETA: 5s - loss: 1.3690 - regression_loss: 1.1421 - classification_loss: 0.2269 475/500 [===========================>..] - ETA: 5s - loss: 1.3690 - regression_loss: 1.1424 - classification_loss: 0.2266 476/500 [===========================>..] - ETA: 5s - loss: 1.3689 - regression_loss: 1.1424 - classification_loss: 0.2265 477/500 [===========================>..] - ETA: 4s - loss: 1.3689 - regression_loss: 1.1425 - classification_loss: 0.2264 478/500 [===========================>..] - ETA: 4s - loss: 1.3689 - regression_loss: 1.1426 - classification_loss: 0.2263 479/500 [===========================>..] - ETA: 4s - loss: 1.3703 - regression_loss: 1.1439 - classification_loss: 0.2264 480/500 [===========================>..] - ETA: 4s - loss: 1.3703 - regression_loss: 1.1439 - classification_loss: 0.2264 481/500 [===========================>..] - ETA: 4s - loss: 1.3711 - regression_loss: 1.1446 - classification_loss: 0.2264 482/500 [===========================>..] - ETA: 3s - loss: 1.3701 - regression_loss: 1.1440 - classification_loss: 0.2261 483/500 [===========================>..] 
- ETA: 3s - loss: 1.3703 - regression_loss: 1.1443 - classification_loss: 0.2260 484/500 [============================>.] - ETA: 3s - loss: 1.3702 - regression_loss: 1.1442 - classification_loss: 0.2259 485/500 [============================>.] - ETA: 3s - loss: 1.3700 - regression_loss: 1.1442 - classification_loss: 0.2258 486/500 [============================>.] - ETA: 3s - loss: 1.3702 - regression_loss: 1.1444 - classification_loss: 0.2257 487/500 [============================>.] - ETA: 2s - loss: 1.3695 - regression_loss: 1.1439 - classification_loss: 0.2256 488/500 [============================>.] - ETA: 2s - loss: 1.3675 - regression_loss: 1.1422 - classification_loss: 0.2253 489/500 [============================>.] - ETA: 2s - loss: 1.3683 - regression_loss: 1.1430 - classification_loss: 0.2253 490/500 [============================>.] - ETA: 2s - loss: 1.3713 - regression_loss: 1.1455 - classification_loss: 0.2258 491/500 [============================>.] - ETA: 1s - loss: 1.3715 - regression_loss: 1.1458 - classification_loss: 0.2257 492/500 [============================>.] - ETA: 1s - loss: 1.3724 - regression_loss: 1.1465 - classification_loss: 0.2259 493/500 [============================>.] - ETA: 1s - loss: 1.3728 - regression_loss: 1.1467 - classification_loss: 0.2261 494/500 [============================>.] - ETA: 1s - loss: 1.3743 - regression_loss: 1.1478 - classification_loss: 0.2265 495/500 [============================>.] - ETA: 1s - loss: 1.3746 - regression_loss: 1.1481 - classification_loss: 0.2265 496/500 [============================>.] - ETA: 0s - loss: 1.3752 - regression_loss: 1.1485 - classification_loss: 0.2267 497/500 [============================>.] - ETA: 0s - loss: 1.3739 - regression_loss: 1.1475 - classification_loss: 0.2264 498/500 [============================>.] - ETA: 0s - loss: 1.3738 - regression_loss: 1.1474 - classification_loss: 0.2264 499/500 [============================>.] 
- ETA: 0s - loss: 1.3747 - regression_loss: 1.1484 - classification_loss: 0.2263
500/500 [==============================] - 107s 215ms/step - loss: 1.3756 - regression_loss: 1.1492 - classification_loss: 0.2265
326 instances of class plum with average precision: 0.8060
mAP: 0.8060
Epoch 00087: saving model to ./training/snapshots/resnet50_pascal_87.h5
Epoch 88/150
1/500 [..............................] - ETA: 2:05 - loss: 1.9176 - regression_loss: 1.6133 - classification_loss: 0.3043
[per-step progress for steps 2-13 elided; loss fluctuated between ~1.37 and ~1.67 over the first batches]
14/500 [..............................]
- ETA: 1:49 - loss: 1.5344 - regression_loss: 1.3128 - classification_loss: 0.2216
[per-step progress for steps 15-141 elided; running loss settled from ~1.53 toward ~1.33 as the epoch averages stabilized]
142/500 [=======>......................]
- ETA: 1:17 - loss: 1.3247 - regression_loss: 1.1192 - classification_loss: 0.2055 143/500 [=======>......................] - ETA: 1:17 - loss: 1.3280 - regression_loss: 1.1222 - classification_loss: 0.2058 144/500 [=======>......................] - ETA: 1:17 - loss: 1.3306 - regression_loss: 1.1251 - classification_loss: 0.2055 145/500 [=======>......................] - ETA: 1:17 - loss: 1.3283 - regression_loss: 1.1231 - classification_loss: 0.2052 146/500 [=======>......................] - ETA: 1:17 - loss: 1.3284 - regression_loss: 1.1233 - classification_loss: 0.2051 147/500 [=======>......................] - ETA: 1:16 - loss: 1.3229 - regression_loss: 1.1189 - classification_loss: 0.2041 148/500 [=======>......................] - ETA: 1:16 - loss: 1.3255 - regression_loss: 1.1211 - classification_loss: 0.2044 149/500 [=======>......................] - ETA: 1:16 - loss: 1.3300 - regression_loss: 1.1247 - classification_loss: 0.2053 150/500 [========>.....................] - ETA: 1:16 - loss: 1.3363 - regression_loss: 1.1301 - classification_loss: 0.2062 151/500 [========>.....................] - ETA: 1:16 - loss: 1.3331 - regression_loss: 1.1273 - classification_loss: 0.2058 152/500 [========>.....................] - ETA: 1:15 - loss: 1.3294 - regression_loss: 1.1239 - classification_loss: 0.2055 153/500 [========>.....................] - ETA: 1:15 - loss: 1.3289 - regression_loss: 1.1237 - classification_loss: 0.2053 154/500 [========>.....................] - ETA: 1:15 - loss: 1.3294 - regression_loss: 1.1243 - classification_loss: 0.2051 155/500 [========>.....................] - ETA: 1:15 - loss: 1.3297 - regression_loss: 1.1244 - classification_loss: 0.2052 156/500 [========>.....................] - ETA: 1:14 - loss: 1.3283 - regression_loss: 1.1232 - classification_loss: 0.2051 157/500 [========>.....................] - ETA: 1:14 - loss: 1.3317 - regression_loss: 1.1257 - classification_loss: 0.2059 158/500 [========>.....................] 
- ETA: 1:14 - loss: 1.3275 - regression_loss: 1.1223 - classification_loss: 0.2052 159/500 [========>.....................] - ETA: 1:14 - loss: 1.3304 - regression_loss: 1.1247 - classification_loss: 0.2057 160/500 [========>.....................] - ETA: 1:14 - loss: 1.3304 - regression_loss: 1.1249 - classification_loss: 0.2056 161/500 [========>.....................] - ETA: 1:13 - loss: 1.3332 - regression_loss: 1.1262 - classification_loss: 0.2070 162/500 [========>.....................] - ETA: 1:13 - loss: 1.3275 - regression_loss: 1.1213 - classification_loss: 0.2062 163/500 [========>.....................] - ETA: 1:13 - loss: 1.3327 - regression_loss: 1.1249 - classification_loss: 0.2078 164/500 [========>.....................] - ETA: 1:13 - loss: 1.3311 - regression_loss: 1.1238 - classification_loss: 0.2073 165/500 [========>.....................] - ETA: 1:12 - loss: 1.3309 - regression_loss: 1.1236 - classification_loss: 0.2073 166/500 [========>.....................] - ETA: 1:12 - loss: 1.3373 - regression_loss: 1.1282 - classification_loss: 0.2091 167/500 [=========>....................] - ETA: 1:12 - loss: 1.3342 - regression_loss: 1.1258 - classification_loss: 0.2083 168/500 [=========>....................] - ETA: 1:12 - loss: 1.3330 - regression_loss: 1.1252 - classification_loss: 0.2079 169/500 [=========>....................] - ETA: 1:11 - loss: 1.3332 - regression_loss: 1.1254 - classification_loss: 0.2079 170/500 [=========>....................] - ETA: 1:11 - loss: 1.3374 - regression_loss: 1.1282 - classification_loss: 0.2091 171/500 [=========>....................] - ETA: 1:11 - loss: 1.3388 - regression_loss: 1.1298 - classification_loss: 0.2090 172/500 [=========>....................] - ETA: 1:11 - loss: 1.3384 - regression_loss: 1.1296 - classification_loss: 0.2088 173/500 [=========>....................] - ETA: 1:11 - loss: 1.3443 - regression_loss: 1.1336 - classification_loss: 0.2107 174/500 [=========>....................] 
- ETA: 1:10 - loss: 1.3470 - regression_loss: 1.1358 - classification_loss: 0.2112 175/500 [=========>....................] - ETA: 1:10 - loss: 1.3514 - regression_loss: 1.1399 - classification_loss: 0.2116 176/500 [=========>....................] - ETA: 1:10 - loss: 1.3522 - regression_loss: 1.1405 - classification_loss: 0.2117 177/500 [=========>....................] - ETA: 1:10 - loss: 1.3521 - regression_loss: 1.1401 - classification_loss: 0.2119 178/500 [=========>....................] - ETA: 1:10 - loss: 1.3521 - regression_loss: 1.1401 - classification_loss: 0.2120 179/500 [=========>....................] - ETA: 1:09 - loss: 1.3502 - regression_loss: 1.1389 - classification_loss: 0.2113 180/500 [=========>....................] - ETA: 1:09 - loss: 1.3487 - regression_loss: 1.1381 - classification_loss: 0.2106 181/500 [=========>....................] - ETA: 1:09 - loss: 1.3474 - regression_loss: 1.1371 - classification_loss: 0.2103 182/500 [=========>....................] - ETA: 1:09 - loss: 1.3463 - regression_loss: 1.1364 - classification_loss: 0.2099 183/500 [=========>....................] - ETA: 1:09 - loss: 1.3469 - regression_loss: 1.1370 - classification_loss: 0.2099 184/500 [==========>...................] - ETA: 1:08 - loss: 1.3437 - regression_loss: 1.1341 - classification_loss: 0.2096 185/500 [==========>...................] - ETA: 1:08 - loss: 1.3436 - regression_loss: 1.1343 - classification_loss: 0.2093 186/500 [==========>...................] - ETA: 1:08 - loss: 1.3454 - regression_loss: 1.1362 - classification_loss: 0.2092 187/500 [==========>...................] - ETA: 1:08 - loss: 1.3453 - regression_loss: 1.1362 - classification_loss: 0.2091 188/500 [==========>...................] - ETA: 1:08 - loss: 1.3500 - regression_loss: 1.1302 - classification_loss: 0.2198 189/500 [==========>...................] - ETA: 1:08 - loss: 1.3520 - regression_loss: 1.1318 - classification_loss: 0.2202 190/500 [==========>...................] 
- ETA: 1:07 - loss: 1.3553 - regression_loss: 1.1341 - classification_loss: 0.2212 191/500 [==========>...................] - ETA: 1:07 - loss: 1.3553 - regression_loss: 1.1345 - classification_loss: 0.2207 192/500 [==========>...................] - ETA: 1:07 - loss: 1.3534 - regression_loss: 1.1331 - classification_loss: 0.2203 193/500 [==========>...................] - ETA: 1:07 - loss: 1.3546 - regression_loss: 1.1340 - classification_loss: 0.2207 194/500 [==========>...................] - ETA: 1:07 - loss: 1.3545 - regression_loss: 1.1342 - classification_loss: 0.2203 195/500 [==========>...................] - ETA: 1:06 - loss: 1.3573 - regression_loss: 1.1366 - classification_loss: 0.2207 196/500 [==========>...................] - ETA: 1:06 - loss: 1.3589 - regression_loss: 1.1382 - classification_loss: 0.2207 197/500 [==========>...................] - ETA: 1:06 - loss: 1.3544 - regression_loss: 1.1345 - classification_loss: 0.2200 198/500 [==========>...................] - ETA: 1:06 - loss: 1.3532 - regression_loss: 1.1335 - classification_loss: 0.2197 199/500 [==========>...................] - ETA: 1:06 - loss: 1.3498 - regression_loss: 1.1305 - classification_loss: 0.2192 200/500 [===========>..................] - ETA: 1:05 - loss: 1.3472 - regression_loss: 1.1287 - classification_loss: 0.2186 201/500 [===========>..................] - ETA: 1:05 - loss: 1.3464 - regression_loss: 1.1279 - classification_loss: 0.2185 202/500 [===========>..................] - ETA: 1:05 - loss: 1.3489 - regression_loss: 1.1299 - classification_loss: 0.2190 203/500 [===========>..................] - ETA: 1:05 - loss: 1.3548 - regression_loss: 1.1346 - classification_loss: 0.2202 204/500 [===========>..................] - ETA: 1:04 - loss: 1.3557 - regression_loss: 1.1357 - classification_loss: 0.2200 205/500 [===========>..................] - ETA: 1:04 - loss: 1.3577 - regression_loss: 1.1375 - classification_loss: 0.2202 206/500 [===========>..................] 
- ETA: 1:04 - loss: 1.3571 - regression_loss: 1.1365 - classification_loss: 0.2206 207/500 [===========>..................] - ETA: 1:04 - loss: 1.3558 - regression_loss: 1.1357 - classification_loss: 0.2202 208/500 [===========>..................] - ETA: 1:04 - loss: 1.3543 - regression_loss: 1.1346 - classification_loss: 0.2198 209/500 [===========>..................] - ETA: 1:03 - loss: 1.3529 - regression_loss: 1.1335 - classification_loss: 0.2194 210/500 [===========>..................] - ETA: 1:03 - loss: 1.3492 - regression_loss: 1.1304 - classification_loss: 0.2188 211/500 [===========>..................] - ETA: 1:03 - loss: 1.3515 - regression_loss: 1.1327 - classification_loss: 0.2188 212/500 [===========>..................] - ETA: 1:03 - loss: 1.3514 - regression_loss: 1.1327 - classification_loss: 0.2187 213/500 [===========>..................] - ETA: 1:02 - loss: 1.3548 - regression_loss: 1.1359 - classification_loss: 0.2189 214/500 [===========>..................] - ETA: 1:02 - loss: 1.3567 - regression_loss: 1.1374 - classification_loss: 0.2193 215/500 [===========>..................] - ETA: 1:02 - loss: 1.3574 - regression_loss: 1.1380 - classification_loss: 0.2194 216/500 [===========>..................] - ETA: 1:02 - loss: 1.3559 - regression_loss: 1.1370 - classification_loss: 0.2189 217/500 [============>.................] - ETA: 1:02 - loss: 1.3549 - regression_loss: 1.1363 - classification_loss: 0.2186 218/500 [============>.................] - ETA: 1:01 - loss: 1.3588 - regression_loss: 1.1386 - classification_loss: 0.2203 219/500 [============>.................] - ETA: 1:01 - loss: 1.3576 - regression_loss: 1.1378 - classification_loss: 0.2198 220/500 [============>.................] - ETA: 1:01 - loss: 1.3592 - regression_loss: 1.1390 - classification_loss: 0.2203 221/500 [============>.................] - ETA: 1:01 - loss: 1.3579 - regression_loss: 1.1377 - classification_loss: 0.2202 222/500 [============>.................] 
- ETA: 1:01 - loss: 1.3566 - regression_loss: 1.1368 - classification_loss: 0.2198 223/500 [============>.................] - ETA: 1:00 - loss: 1.3561 - regression_loss: 1.1363 - classification_loss: 0.2197 224/500 [============>.................] - ETA: 1:00 - loss: 1.3549 - regression_loss: 1.1351 - classification_loss: 0.2198 225/500 [============>.................] - ETA: 1:00 - loss: 1.3580 - regression_loss: 1.1381 - classification_loss: 0.2199 226/500 [============>.................] - ETA: 1:00 - loss: 1.3628 - regression_loss: 1.1414 - classification_loss: 0.2213 227/500 [============>.................] - ETA: 59s - loss: 1.3628 - regression_loss: 1.1415 - classification_loss: 0.2213  228/500 [============>.................] - ETA: 59s - loss: 1.3598 - regression_loss: 1.1391 - classification_loss: 0.2208 229/500 [============>.................] - ETA: 59s - loss: 1.3621 - regression_loss: 1.1410 - classification_loss: 0.2211 230/500 [============>.................] - ETA: 59s - loss: 1.3581 - regression_loss: 1.1377 - classification_loss: 0.2204 231/500 [============>.................] - ETA: 59s - loss: 1.3535 - regression_loss: 1.1339 - classification_loss: 0.2195 232/500 [============>.................] - ETA: 58s - loss: 1.3545 - regression_loss: 1.1351 - classification_loss: 0.2193 233/500 [============>.................] - ETA: 58s - loss: 1.3539 - regression_loss: 1.1352 - classification_loss: 0.2187 234/500 [=============>................] - ETA: 58s - loss: 1.3524 - regression_loss: 1.1342 - classification_loss: 0.2182 235/500 [=============>................] - ETA: 58s - loss: 1.3507 - regression_loss: 1.1327 - classification_loss: 0.2180 236/500 [=============>................] - ETA: 58s - loss: 1.3491 - regression_loss: 1.1313 - classification_loss: 0.2178 237/500 [=============>................] - ETA: 57s - loss: 1.3494 - regression_loss: 1.1316 - classification_loss: 0.2178 238/500 [=============>................] 
- ETA: 57s - loss: 1.3508 - regression_loss: 1.1329 - classification_loss: 0.2179 239/500 [=============>................] - ETA: 57s - loss: 1.3478 - regression_loss: 1.1304 - classification_loss: 0.2174 240/500 [=============>................] - ETA: 57s - loss: 1.3497 - regression_loss: 1.1322 - classification_loss: 0.2176 241/500 [=============>................] - ETA: 57s - loss: 1.3492 - regression_loss: 1.1319 - classification_loss: 0.2174 242/500 [=============>................] - ETA: 56s - loss: 1.3497 - regression_loss: 1.1324 - classification_loss: 0.2173 243/500 [=============>................] - ETA: 56s - loss: 1.3504 - regression_loss: 1.1328 - classification_loss: 0.2175 244/500 [=============>................] - ETA: 56s - loss: 1.3508 - regression_loss: 1.1334 - classification_loss: 0.2174 245/500 [=============>................] - ETA: 56s - loss: 1.3522 - regression_loss: 1.1344 - classification_loss: 0.2178 246/500 [=============>................] - ETA: 55s - loss: 1.3541 - regression_loss: 1.1362 - classification_loss: 0.2179 247/500 [=============>................] - ETA: 55s - loss: 1.3532 - regression_loss: 1.1355 - classification_loss: 0.2177 248/500 [=============>................] - ETA: 55s - loss: 1.3526 - regression_loss: 1.1352 - classification_loss: 0.2174 249/500 [=============>................] - ETA: 55s - loss: 1.3532 - regression_loss: 1.1359 - classification_loss: 0.2174 250/500 [==============>...............] - ETA: 55s - loss: 1.3535 - regression_loss: 1.1356 - classification_loss: 0.2179 251/500 [==============>...............] - ETA: 54s - loss: 1.3557 - regression_loss: 1.1373 - classification_loss: 0.2184 252/500 [==============>...............] - ETA: 54s - loss: 1.3592 - regression_loss: 1.1405 - classification_loss: 0.2187 253/500 [==============>...............] - ETA: 54s - loss: 1.3603 - regression_loss: 1.1415 - classification_loss: 0.2188 254/500 [==============>...............] 
- ETA: 54s - loss: 1.3623 - regression_loss: 1.1438 - classification_loss: 0.2185 255/500 [==============>...............] - ETA: 53s - loss: 1.3656 - regression_loss: 1.1464 - classification_loss: 0.2192 256/500 [==============>...............] - ETA: 53s - loss: 1.3663 - regression_loss: 1.1468 - classification_loss: 0.2194 257/500 [==============>...............] - ETA: 53s - loss: 1.3653 - regression_loss: 1.1461 - classification_loss: 0.2192 258/500 [==============>...............] - ETA: 53s - loss: 1.3648 - regression_loss: 1.1458 - classification_loss: 0.2190 259/500 [==============>...............] - ETA: 52s - loss: 1.3646 - regression_loss: 1.1457 - classification_loss: 0.2189 260/500 [==============>...............] - ETA: 52s - loss: 1.3642 - regression_loss: 1.1456 - classification_loss: 0.2187 261/500 [==============>...............] - ETA: 52s - loss: 1.3648 - regression_loss: 1.1458 - classification_loss: 0.2189 262/500 [==============>...............] - ETA: 52s - loss: 1.3630 - regression_loss: 1.1444 - classification_loss: 0.2186 263/500 [==============>...............] - ETA: 52s - loss: 1.3621 - regression_loss: 1.1439 - classification_loss: 0.2182 264/500 [==============>...............] - ETA: 51s - loss: 1.3630 - regression_loss: 1.1450 - classification_loss: 0.2180 265/500 [==============>...............] - ETA: 51s - loss: 1.3640 - regression_loss: 1.1464 - classification_loss: 0.2176 266/500 [==============>...............] - ETA: 51s - loss: 1.3637 - regression_loss: 1.1463 - classification_loss: 0.2174 267/500 [===============>..............] - ETA: 51s - loss: 1.3636 - regression_loss: 1.1463 - classification_loss: 0.2173 268/500 [===============>..............] - ETA: 51s - loss: 1.3631 - regression_loss: 1.1459 - classification_loss: 0.2171 269/500 [===============>..............] - ETA: 50s - loss: 1.3632 - regression_loss: 1.1461 - classification_loss: 0.2171 270/500 [===============>..............] 
- ETA: 50s - loss: 1.3649 - regression_loss: 1.1478 - classification_loss: 0.2170 271/500 [===============>..............] - ETA: 50s - loss: 1.3657 - regression_loss: 1.1488 - classification_loss: 0.2169 272/500 [===============>..............] - ETA: 50s - loss: 1.3666 - regression_loss: 1.1495 - classification_loss: 0.2171 273/500 [===============>..............] - ETA: 50s - loss: 1.3667 - regression_loss: 1.1496 - classification_loss: 0.2171 274/500 [===============>..............] - ETA: 49s - loss: 1.3655 - regression_loss: 1.1488 - classification_loss: 0.2167 275/500 [===============>..............] - ETA: 49s - loss: 1.3665 - regression_loss: 1.1496 - classification_loss: 0.2169 276/500 [===============>..............] - ETA: 49s - loss: 1.3654 - regression_loss: 1.1488 - classification_loss: 0.2166 277/500 [===============>..............] - ETA: 49s - loss: 1.3668 - regression_loss: 1.1500 - classification_loss: 0.2168 278/500 [===============>..............] - ETA: 48s - loss: 1.3649 - regression_loss: 1.1484 - classification_loss: 0.2165 279/500 [===============>..............] - ETA: 48s - loss: 1.3649 - regression_loss: 1.1485 - classification_loss: 0.2164 280/500 [===============>..............] - ETA: 48s - loss: 1.3643 - regression_loss: 1.1482 - classification_loss: 0.2161 281/500 [===============>..............] - ETA: 48s - loss: 1.3620 - regression_loss: 1.1463 - classification_loss: 0.2156 282/500 [===============>..............] - ETA: 47s - loss: 1.3631 - regression_loss: 1.1474 - classification_loss: 0.2157 283/500 [===============>..............] - ETA: 47s - loss: 1.3644 - regression_loss: 1.1486 - classification_loss: 0.2158 284/500 [================>.............] - ETA: 47s - loss: 1.3711 - regression_loss: 1.1477 - classification_loss: 0.2235 285/500 [================>.............] - ETA: 47s - loss: 1.3692 - regression_loss: 1.1461 - classification_loss: 0.2231 286/500 [================>.............] 
- ETA: 47s - loss: 1.3664 - regression_loss: 1.1440 - classification_loss: 0.2224 287/500 [================>.............] - ETA: 46s - loss: 1.3642 - regression_loss: 1.1423 - classification_loss: 0.2219 288/500 [================>.............] - ETA: 46s - loss: 1.3637 - regression_loss: 1.1417 - classification_loss: 0.2220 289/500 [================>.............] - ETA: 46s - loss: 1.3614 - regression_loss: 1.1398 - classification_loss: 0.2215 290/500 [================>.............] - ETA: 46s - loss: 1.3643 - regression_loss: 1.1424 - classification_loss: 0.2220 291/500 [================>.............] - ETA: 45s - loss: 1.3638 - regression_loss: 1.1423 - classification_loss: 0.2215 292/500 [================>.............] - ETA: 45s - loss: 1.3644 - regression_loss: 1.1427 - classification_loss: 0.2217 293/500 [================>.............] - ETA: 45s - loss: 1.3659 - regression_loss: 1.1435 - classification_loss: 0.2224 294/500 [================>.............] - ETA: 45s - loss: 1.3657 - regression_loss: 1.1435 - classification_loss: 0.2222 295/500 [================>.............] - ETA: 45s - loss: 1.3676 - regression_loss: 1.1449 - classification_loss: 0.2227 296/500 [================>.............] - ETA: 44s - loss: 1.3662 - regression_loss: 1.1438 - classification_loss: 0.2223 297/500 [================>.............] - ETA: 44s - loss: 1.3641 - regression_loss: 1.1424 - classification_loss: 0.2217 298/500 [================>.............] - ETA: 44s - loss: 1.3624 - regression_loss: 1.1412 - classification_loss: 0.2212 299/500 [================>.............] - ETA: 44s - loss: 1.3606 - regression_loss: 1.1396 - classification_loss: 0.2210 300/500 [=================>............] - ETA: 44s - loss: 1.3611 - regression_loss: 1.1403 - classification_loss: 0.2208 301/500 [=================>............] - ETA: 43s - loss: 1.3599 - regression_loss: 1.1394 - classification_loss: 0.2205 302/500 [=================>............] 
- ETA: 43s - loss: 1.3599 - regression_loss: 1.1395 - classification_loss: 0.2204 303/500 [=================>............] - ETA: 43s - loss: 1.3615 - regression_loss: 1.1409 - classification_loss: 0.2206 304/500 [=================>............] - ETA: 43s - loss: 1.3613 - regression_loss: 1.1406 - classification_loss: 0.2207 305/500 [=================>............] - ETA: 42s - loss: 1.3621 - regression_loss: 1.1413 - classification_loss: 0.2208 306/500 [=================>............] - ETA: 42s - loss: 1.3621 - regression_loss: 1.1415 - classification_loss: 0.2206 307/500 [=================>............] - ETA: 42s - loss: 1.3601 - regression_loss: 1.1399 - classification_loss: 0.2203 308/500 [=================>............] - ETA: 42s - loss: 1.3607 - regression_loss: 1.1408 - classification_loss: 0.2199 309/500 [=================>............] - ETA: 42s - loss: 1.3632 - regression_loss: 1.1427 - classification_loss: 0.2205 310/500 [=================>............] - ETA: 41s - loss: 1.3645 - regression_loss: 1.1442 - classification_loss: 0.2202 311/500 [=================>............] - ETA: 41s - loss: 1.3644 - regression_loss: 1.1441 - classification_loss: 0.2203 312/500 [=================>............] - ETA: 41s - loss: 1.3637 - regression_loss: 1.1436 - classification_loss: 0.2201 313/500 [=================>............] - ETA: 41s - loss: 1.3626 - regression_loss: 1.1429 - classification_loss: 0.2198 314/500 [=================>............] - ETA: 40s - loss: 1.3618 - regression_loss: 1.1424 - classification_loss: 0.2194 315/500 [=================>............] - ETA: 40s - loss: 1.3634 - regression_loss: 1.1436 - classification_loss: 0.2197 316/500 [=================>............] - ETA: 40s - loss: 1.3626 - regression_loss: 1.1431 - classification_loss: 0.2196 317/500 [==================>...........] - ETA: 40s - loss: 1.3603 - regression_loss: 1.1412 - classification_loss: 0.2191 318/500 [==================>...........] 
- ETA: 40s - loss: 1.3619 - regression_loss: 1.1428 - classification_loss: 0.2191 319/500 [==================>...........] - ETA: 39s - loss: 1.3609 - regression_loss: 1.1420 - classification_loss: 0.2189 320/500 [==================>...........] - ETA: 39s - loss: 1.3641 - regression_loss: 1.1450 - classification_loss: 0.2191 321/500 [==================>...........] - ETA: 39s - loss: 1.3610 - regression_loss: 1.1425 - classification_loss: 0.2185 322/500 [==================>...........] - ETA: 39s - loss: 1.3612 - regression_loss: 1.1430 - classification_loss: 0.2182 323/500 [==================>...........] - ETA: 39s - loss: 1.3613 - regression_loss: 1.1428 - classification_loss: 0.2185 324/500 [==================>...........] - ETA: 38s - loss: 1.3612 - regression_loss: 1.1428 - classification_loss: 0.2184 325/500 [==================>...........] - ETA: 38s - loss: 1.3618 - regression_loss: 1.1435 - classification_loss: 0.2183 326/500 [==================>...........] - ETA: 38s - loss: 1.3610 - regression_loss: 1.1431 - classification_loss: 0.2180 327/500 [==================>...........] - ETA: 38s - loss: 1.3581 - regression_loss: 1.1406 - classification_loss: 0.2175 328/500 [==================>...........] - ETA: 37s - loss: 1.3568 - regression_loss: 1.1395 - classification_loss: 0.2172 329/500 [==================>...........] - ETA: 37s - loss: 1.3568 - regression_loss: 1.1396 - classification_loss: 0.2172 330/500 [==================>...........] - ETA: 37s - loss: 1.3554 - regression_loss: 1.1385 - classification_loss: 0.2168 331/500 [==================>...........] - ETA: 37s - loss: 1.3576 - regression_loss: 1.1404 - classification_loss: 0.2172 332/500 [==================>...........] - ETA: 37s - loss: 1.3595 - regression_loss: 1.1421 - classification_loss: 0.2174 333/500 [==================>...........] - ETA: 36s - loss: 1.3621 - regression_loss: 1.1439 - classification_loss: 0.2182 334/500 [===================>..........] 
- ETA: 36s - loss: 1.3628 - regression_loss: 1.1444 - classification_loss: 0.2184 335/500 [===================>..........] - ETA: 36s - loss: 1.3659 - regression_loss: 1.1467 - classification_loss: 0.2192 336/500 [===================>..........] - ETA: 36s - loss: 1.3653 - regression_loss: 1.1463 - classification_loss: 0.2190 337/500 [===================>..........] - ETA: 35s - loss: 1.3632 - regression_loss: 1.1446 - classification_loss: 0.2186 338/500 [===================>..........] - ETA: 35s - loss: 1.3607 - regression_loss: 1.1426 - classification_loss: 0.2181 339/500 [===================>..........] - ETA: 35s - loss: 1.3622 - regression_loss: 1.1439 - classification_loss: 0.2183 340/500 [===================>..........] - ETA: 35s - loss: 1.3617 - regression_loss: 1.1436 - classification_loss: 0.2181 341/500 [===================>..........] - ETA: 35s - loss: 1.3638 - regression_loss: 1.1454 - classification_loss: 0.2184 342/500 [===================>..........] - ETA: 34s - loss: 1.3636 - regression_loss: 1.1455 - classification_loss: 0.2181 343/500 [===================>..........] - ETA: 34s - loss: 1.3633 - regression_loss: 1.1453 - classification_loss: 0.2179 344/500 [===================>..........] - ETA: 34s - loss: 1.3656 - regression_loss: 1.1470 - classification_loss: 0.2186 345/500 [===================>..........] - ETA: 34s - loss: 1.3659 - regression_loss: 1.1474 - classification_loss: 0.2185 346/500 [===================>..........] - ETA: 34s - loss: 1.3651 - regression_loss: 1.1470 - classification_loss: 0.2181 347/500 [===================>..........] - ETA: 33s - loss: 1.3657 - regression_loss: 1.1474 - classification_loss: 0.2183 348/500 [===================>..........] - ETA: 33s - loss: 1.3656 - regression_loss: 1.1472 - classification_loss: 0.2184 349/500 [===================>..........] - ETA: 33s - loss: 1.3648 - regression_loss: 1.1467 - classification_loss: 0.2181 350/500 [====================>.........] 
[Epoch 88, steps 351–499/500: per-step progress-bar updates collapsed; running loss hovered around 1.36–1.37 (regression ≈ 1.15, classification ≈ 0.22)]
500/500 [==============================] - 113s 225ms/step - loss: 1.3673 - regression_loss: 1.1477 - classification_loss: 0.2196
326 instances of class plum with average precision: 0.8045
mAP: 0.8045
Epoch 00088: saving model to ./training/snapshots/resnet50_pascal_88.h5
Epoch 89/150
[Epoch 89, steps 1–185/500: per-step progress-bar updates collapsed; last full reading at step 184/500 - loss: 1.3093 - regression_loss: 1.1055 - classification_loss: 0.2037]
- ETA: 1:15 - loss: 1.3087 - regression_loss: 1.1053 - classification_loss: 0.2034 186/500 [==========>...................] - ETA: 1:15 - loss: 1.3111 - regression_loss: 1.1073 - classification_loss: 0.2038 187/500 [==========>...................] - ETA: 1:14 - loss: 1.3094 - regression_loss: 1.1062 - classification_loss: 0.2032 188/500 [==========>...................] - ETA: 1:14 - loss: 1.3105 - regression_loss: 1.1071 - classification_loss: 0.2034 189/500 [==========>...................] - ETA: 1:14 - loss: 1.3102 - regression_loss: 1.1069 - classification_loss: 0.2034 190/500 [==========>...................] - ETA: 1:14 - loss: 1.3115 - regression_loss: 1.1077 - classification_loss: 0.2038 191/500 [==========>...................] - ETA: 1:13 - loss: 1.3101 - regression_loss: 1.1065 - classification_loss: 0.2036 192/500 [==========>...................] - ETA: 1:13 - loss: 1.3101 - regression_loss: 1.1069 - classification_loss: 0.2032 193/500 [==========>...................] - ETA: 1:13 - loss: 1.3094 - regression_loss: 1.1063 - classification_loss: 0.2030 194/500 [==========>...................] - ETA: 1:13 - loss: 1.3079 - regression_loss: 1.1053 - classification_loss: 0.2026 195/500 [==========>...................] - ETA: 1:12 - loss: 1.3083 - regression_loss: 1.1054 - classification_loss: 0.2029 196/500 [==========>...................] - ETA: 1:12 - loss: 1.3089 - regression_loss: 1.1060 - classification_loss: 0.2029 197/500 [==========>...................] - ETA: 1:12 - loss: 1.3042 - regression_loss: 1.1023 - classification_loss: 0.2019 198/500 [==========>...................] - ETA: 1:12 - loss: 1.3039 - regression_loss: 1.1022 - classification_loss: 0.2017 199/500 [==========>...................] - ETA: 1:12 - loss: 1.3037 - regression_loss: 1.1022 - classification_loss: 0.2016 200/500 [===========>..................] - ETA: 1:11 - loss: 1.3062 - regression_loss: 1.1042 - classification_loss: 0.2019 201/500 [===========>..................] 
- ETA: 1:11 - loss: 1.3079 - regression_loss: 1.1058 - classification_loss: 0.2021 202/500 [===========>..................] - ETA: 1:11 - loss: 1.3075 - regression_loss: 1.1055 - classification_loss: 0.2020 203/500 [===========>..................] - ETA: 1:11 - loss: 1.3093 - regression_loss: 1.1070 - classification_loss: 0.2023 204/500 [===========>..................] - ETA: 1:10 - loss: 1.3119 - regression_loss: 1.1087 - classification_loss: 0.2033 205/500 [===========>..................] - ETA: 1:10 - loss: 1.3120 - regression_loss: 1.1089 - classification_loss: 0.2032 206/500 [===========>..................] - ETA: 1:10 - loss: 1.3111 - regression_loss: 1.1079 - classification_loss: 0.2032 207/500 [===========>..................] - ETA: 1:10 - loss: 1.3097 - regression_loss: 1.1069 - classification_loss: 0.2027 208/500 [===========>..................] - ETA: 1:09 - loss: 1.3092 - regression_loss: 1.1068 - classification_loss: 0.2023 209/500 [===========>..................] - ETA: 1:09 - loss: 1.3095 - regression_loss: 1.1073 - classification_loss: 0.2022 210/500 [===========>..................] - ETA: 1:09 - loss: 1.3089 - regression_loss: 1.1069 - classification_loss: 0.2019 211/500 [===========>..................] - ETA: 1:09 - loss: 1.3117 - regression_loss: 1.1096 - classification_loss: 0.2021 212/500 [===========>..................] - ETA: 1:09 - loss: 1.3100 - regression_loss: 1.1084 - classification_loss: 0.2016 213/500 [===========>..................] - ETA: 1:08 - loss: 1.3158 - regression_loss: 1.1128 - classification_loss: 0.2031 214/500 [===========>..................] - ETA: 1:08 - loss: 1.3141 - regression_loss: 1.1116 - classification_loss: 0.2026 215/500 [===========>..................] - ETA: 1:08 - loss: 1.3175 - regression_loss: 1.1139 - classification_loss: 0.2035 216/500 [===========>..................] - ETA: 1:08 - loss: 1.3180 - regression_loss: 1.1146 - classification_loss: 0.2034 217/500 [============>.................] 
- ETA: 1:07 - loss: 1.3176 - regression_loss: 1.1146 - classification_loss: 0.2030 218/500 [============>.................] - ETA: 1:07 - loss: 1.3192 - regression_loss: 1.1157 - classification_loss: 0.2035 219/500 [============>.................] - ETA: 1:07 - loss: 1.3188 - regression_loss: 1.1152 - classification_loss: 0.2035 220/500 [============>.................] - ETA: 1:07 - loss: 1.3179 - regression_loss: 1.1149 - classification_loss: 0.2031 221/500 [============>.................] - ETA: 1:06 - loss: 1.3164 - regression_loss: 1.1135 - classification_loss: 0.2030 222/500 [============>.................] - ETA: 1:06 - loss: 1.3196 - regression_loss: 1.1155 - classification_loss: 0.2041 223/500 [============>.................] - ETA: 1:06 - loss: 1.3202 - regression_loss: 1.1165 - classification_loss: 0.2037 224/500 [============>.................] - ETA: 1:06 - loss: 1.3215 - regression_loss: 1.1178 - classification_loss: 0.2037 225/500 [============>.................] - ETA: 1:06 - loss: 1.3236 - regression_loss: 1.1191 - classification_loss: 0.2045 226/500 [============>.................] - ETA: 1:05 - loss: 1.3213 - regression_loss: 1.1172 - classification_loss: 0.2040 227/500 [============>.................] - ETA: 1:05 - loss: 1.3195 - regression_loss: 1.1157 - classification_loss: 0.2038 228/500 [============>.................] - ETA: 1:05 - loss: 1.3210 - regression_loss: 1.1169 - classification_loss: 0.2041 229/500 [============>.................] - ETA: 1:05 - loss: 1.3219 - regression_loss: 1.1178 - classification_loss: 0.2041 230/500 [============>.................] - ETA: 1:04 - loss: 1.3226 - regression_loss: 1.1184 - classification_loss: 0.2042 231/500 [============>.................] - ETA: 1:04 - loss: 1.3230 - regression_loss: 1.1186 - classification_loss: 0.2043 232/500 [============>.................] - ETA: 1:04 - loss: 1.3222 - regression_loss: 1.1182 - classification_loss: 0.2040 233/500 [============>.................] 
- ETA: 1:04 - loss: 1.3254 - regression_loss: 1.1207 - classification_loss: 0.2047 234/500 [=============>................] - ETA: 1:03 - loss: 1.3250 - regression_loss: 1.1205 - classification_loss: 0.2044 235/500 [=============>................] - ETA: 1:03 - loss: 1.3251 - regression_loss: 1.1204 - classification_loss: 0.2047 236/500 [=============>................] - ETA: 1:03 - loss: 1.3273 - regression_loss: 1.1221 - classification_loss: 0.2052 237/500 [=============>................] - ETA: 1:03 - loss: 1.3238 - regression_loss: 1.1190 - classification_loss: 0.2047 238/500 [=============>................] - ETA: 1:02 - loss: 1.3228 - regression_loss: 1.1183 - classification_loss: 0.2045 239/500 [=============>................] - ETA: 1:02 - loss: 1.3222 - regression_loss: 1.1179 - classification_loss: 0.2043 240/500 [=============>................] - ETA: 1:02 - loss: 1.3221 - regression_loss: 1.1181 - classification_loss: 0.2040 241/500 [=============>................] - ETA: 1:02 - loss: 1.3215 - regression_loss: 1.1177 - classification_loss: 0.2038 242/500 [=============>................] - ETA: 1:02 - loss: 1.3227 - regression_loss: 1.1188 - classification_loss: 0.2039 243/500 [=============>................] - ETA: 1:01 - loss: 1.3232 - regression_loss: 1.1192 - classification_loss: 0.2040 244/500 [=============>................] - ETA: 1:01 - loss: 1.3243 - regression_loss: 1.1201 - classification_loss: 0.2043 245/500 [=============>................] - ETA: 1:01 - loss: 1.3253 - regression_loss: 1.1210 - classification_loss: 0.2044 246/500 [=============>................] - ETA: 1:01 - loss: 1.3247 - regression_loss: 1.1206 - classification_loss: 0.2042 247/500 [=============>................] - ETA: 1:00 - loss: 1.3269 - regression_loss: 1.1222 - classification_loss: 0.2047 248/500 [=============>................] - ETA: 1:00 - loss: 1.3231 - regression_loss: 1.1190 - classification_loss: 0.2040 249/500 [=============>................] 
- ETA: 1:00 - loss: 1.3248 - regression_loss: 1.1204 - classification_loss: 0.2044 250/500 [==============>...............] - ETA: 1:00 - loss: 1.3265 - regression_loss: 1.1217 - classification_loss: 0.2049 251/500 [==============>...............] - ETA: 59s - loss: 1.3243 - regression_loss: 1.1199 - classification_loss: 0.2044  252/500 [==============>...............] - ETA: 59s - loss: 1.3239 - regression_loss: 1.1199 - classification_loss: 0.2040 253/500 [==============>...............] - ETA: 59s - loss: 1.3253 - regression_loss: 1.1207 - classification_loss: 0.2046 254/500 [==============>...............] - ETA: 59s - loss: 1.3256 - regression_loss: 1.1215 - classification_loss: 0.2041 255/500 [==============>...............] - ETA: 58s - loss: 1.3239 - regression_loss: 1.1201 - classification_loss: 0.2038 256/500 [==============>...............] - ETA: 58s - loss: 1.3211 - regression_loss: 1.1179 - classification_loss: 0.2032 257/500 [==============>...............] - ETA: 58s - loss: 1.3217 - regression_loss: 1.1184 - classification_loss: 0.2033 258/500 [==============>...............] - ETA: 58s - loss: 1.3251 - regression_loss: 1.1211 - classification_loss: 0.2040 259/500 [==============>...............] - ETA: 58s - loss: 1.3278 - regression_loss: 1.1230 - classification_loss: 0.2048 260/500 [==============>...............] - ETA: 57s - loss: 1.3316 - regression_loss: 1.1256 - classification_loss: 0.2060 261/500 [==============>...............] - ETA: 57s - loss: 1.3311 - regression_loss: 1.1253 - classification_loss: 0.2058 262/500 [==============>...............] - ETA: 57s - loss: 1.3321 - regression_loss: 1.1263 - classification_loss: 0.2058 263/500 [==============>...............] - ETA: 57s - loss: 1.3316 - regression_loss: 1.1259 - classification_loss: 0.2057 264/500 [==============>...............] - ETA: 56s - loss: 1.3326 - regression_loss: 1.1265 - classification_loss: 0.2061 265/500 [==============>...............] 
- ETA: 56s - loss: 1.3344 - regression_loss: 1.1279 - classification_loss: 0.2064 266/500 [==============>...............] - ETA: 56s - loss: 1.3320 - regression_loss: 1.1261 - classification_loss: 0.2060 267/500 [===============>..............] - ETA: 56s - loss: 1.3322 - regression_loss: 1.1265 - classification_loss: 0.2057 268/500 [===============>..............] - ETA: 55s - loss: 1.3312 - regression_loss: 1.1258 - classification_loss: 0.2054 269/500 [===============>..............] - ETA: 55s - loss: 1.3335 - regression_loss: 1.1277 - classification_loss: 0.2058 270/500 [===============>..............] - ETA: 55s - loss: 1.3334 - regression_loss: 1.1272 - classification_loss: 0.2062 271/500 [===============>..............] - ETA: 55s - loss: 1.3333 - regression_loss: 1.1274 - classification_loss: 0.2059 272/500 [===============>..............] - ETA: 54s - loss: 1.3314 - regression_loss: 1.1259 - classification_loss: 0.2055 273/500 [===============>..............] - ETA: 54s - loss: 1.3320 - regression_loss: 1.1264 - classification_loss: 0.2056 274/500 [===============>..............] - ETA: 54s - loss: 1.3305 - regression_loss: 1.1253 - classification_loss: 0.2052 275/500 [===============>..............] - ETA: 54s - loss: 1.3290 - regression_loss: 1.1236 - classification_loss: 0.2054 276/500 [===============>..............] - ETA: 53s - loss: 1.3257 - regression_loss: 1.1210 - classification_loss: 0.2047 277/500 [===============>..............] - ETA: 53s - loss: 1.3277 - regression_loss: 1.1223 - classification_loss: 0.2054 278/500 [===============>..............] - ETA: 53s - loss: 1.3284 - regression_loss: 1.1230 - classification_loss: 0.2054 279/500 [===============>..............] - ETA: 53s - loss: 1.3274 - regression_loss: 1.1223 - classification_loss: 0.2051 280/500 [===============>..............] - ETA: 53s - loss: 1.3268 - regression_loss: 1.1219 - classification_loss: 0.2049 281/500 [===============>..............] 
- ETA: 52s - loss: 1.3260 - regression_loss: 1.1214 - classification_loss: 0.2046 282/500 [===============>..............] - ETA: 52s - loss: 1.3312 - regression_loss: 1.1253 - classification_loss: 0.2059 283/500 [===============>..............] - ETA: 52s - loss: 1.3312 - regression_loss: 1.1253 - classification_loss: 0.2059 284/500 [================>.............] - ETA: 52s - loss: 1.3288 - regression_loss: 1.1233 - classification_loss: 0.2055 285/500 [================>.............] - ETA: 51s - loss: 1.3273 - regression_loss: 1.1222 - classification_loss: 0.2050 286/500 [================>.............] - ETA: 51s - loss: 1.3276 - regression_loss: 1.1228 - classification_loss: 0.2048 287/500 [================>.............] - ETA: 51s - loss: 1.3286 - regression_loss: 1.1237 - classification_loss: 0.2048 288/500 [================>.............] - ETA: 51s - loss: 1.3302 - regression_loss: 1.1251 - classification_loss: 0.2050 289/500 [================>.............] - ETA: 50s - loss: 1.3308 - regression_loss: 1.1255 - classification_loss: 0.2053 290/500 [================>.............] - ETA: 50s - loss: 1.3319 - regression_loss: 1.1263 - classification_loss: 0.2056 291/500 [================>.............] - ETA: 50s - loss: 1.3300 - regression_loss: 1.1247 - classification_loss: 0.2053 292/500 [================>.............] - ETA: 50s - loss: 1.3303 - regression_loss: 1.1250 - classification_loss: 0.2053 293/500 [================>.............] - ETA: 49s - loss: 1.3326 - regression_loss: 1.1266 - classification_loss: 0.2060 294/500 [================>.............] - ETA: 49s - loss: 1.3321 - regression_loss: 1.1263 - classification_loss: 0.2058 295/500 [================>.............] - ETA: 49s - loss: 1.3339 - regression_loss: 1.1279 - classification_loss: 0.2060 296/500 [================>.............] - ETA: 49s - loss: 1.3374 - regression_loss: 1.1307 - classification_loss: 0.2067 297/500 [================>.............] 
- ETA: 48s - loss: 1.3346 - regression_loss: 1.1286 - classification_loss: 0.2061 298/500 [================>.............] - ETA: 48s - loss: 1.3361 - regression_loss: 1.1296 - classification_loss: 0.2065 299/500 [================>.............] - ETA: 48s - loss: 1.3342 - regression_loss: 1.1281 - classification_loss: 0.2061 300/500 [=================>............] - ETA: 48s - loss: 1.3356 - regression_loss: 1.1295 - classification_loss: 0.2061 301/500 [=================>............] - ETA: 47s - loss: 1.3339 - regression_loss: 1.1281 - classification_loss: 0.2058 302/500 [=================>............] - ETA: 47s - loss: 1.3357 - regression_loss: 1.1299 - classification_loss: 0.2058 303/500 [=================>............] - ETA: 47s - loss: 1.3340 - regression_loss: 1.1285 - classification_loss: 0.2055 304/500 [=================>............] - ETA: 47s - loss: 1.3337 - regression_loss: 1.1283 - classification_loss: 0.2055 305/500 [=================>............] - ETA: 46s - loss: 1.3333 - regression_loss: 1.1278 - classification_loss: 0.2054 306/500 [=================>............] - ETA: 46s - loss: 1.3311 - regression_loss: 1.1262 - classification_loss: 0.2050 307/500 [=================>............] - ETA: 46s - loss: 1.3304 - regression_loss: 1.1256 - classification_loss: 0.2047 308/500 [=================>............] - ETA: 46s - loss: 1.3294 - regression_loss: 1.1250 - classification_loss: 0.2044 309/500 [=================>............] - ETA: 46s - loss: 1.3265 - regression_loss: 1.1226 - classification_loss: 0.2039 310/500 [=================>............] - ETA: 45s - loss: 1.3275 - regression_loss: 1.1234 - classification_loss: 0.2042 311/500 [=================>............] - ETA: 45s - loss: 1.3280 - regression_loss: 1.1239 - classification_loss: 0.2042 312/500 [=================>............] - ETA: 45s - loss: 1.3296 - regression_loss: 1.1255 - classification_loss: 0.2041 313/500 [=================>............] 
- ETA: 45s - loss: 1.3314 - regression_loss: 1.1270 - classification_loss: 0.2044 314/500 [=================>............] - ETA: 44s - loss: 1.3288 - regression_loss: 1.1249 - classification_loss: 0.2039 315/500 [=================>............] - ETA: 44s - loss: 1.3301 - regression_loss: 1.1260 - classification_loss: 0.2041 316/500 [=================>............] - ETA: 44s - loss: 1.3307 - regression_loss: 1.1263 - classification_loss: 0.2044 317/500 [==================>...........] - ETA: 44s - loss: 1.3305 - regression_loss: 1.1261 - classification_loss: 0.2044 318/500 [==================>...........] - ETA: 43s - loss: 1.3303 - regression_loss: 1.1256 - classification_loss: 0.2047 319/500 [==================>...........] - ETA: 43s - loss: 1.3299 - regression_loss: 1.1253 - classification_loss: 0.2046 320/500 [==================>...........] - ETA: 43s - loss: 1.3303 - regression_loss: 1.1259 - classification_loss: 0.2044 321/500 [==================>...........] - ETA: 43s - loss: 1.3293 - regression_loss: 1.1251 - classification_loss: 0.2042 322/500 [==================>...........] - ETA: 42s - loss: 1.3296 - regression_loss: 1.1255 - classification_loss: 0.2042 323/500 [==================>...........] - ETA: 42s - loss: 1.3310 - regression_loss: 1.1266 - classification_loss: 0.2043 324/500 [==================>...........] - ETA: 42s - loss: 1.3300 - regression_loss: 1.1257 - classification_loss: 0.2043 325/500 [==================>...........] - ETA: 42s - loss: 1.3297 - regression_loss: 1.1256 - classification_loss: 0.2041 326/500 [==================>...........] - ETA: 41s - loss: 1.3288 - regression_loss: 1.1249 - classification_loss: 0.2039 327/500 [==================>...........] - ETA: 41s - loss: 1.3300 - regression_loss: 1.1260 - classification_loss: 0.2040 328/500 [==================>...........] - ETA: 41s - loss: 1.3293 - regression_loss: 1.1255 - classification_loss: 0.2038 329/500 [==================>...........] 
- ETA: 41s - loss: 1.3299 - regression_loss: 1.1263 - classification_loss: 0.2037 330/500 [==================>...........] - ETA: 40s - loss: 1.3305 - regression_loss: 1.1268 - classification_loss: 0.2037 331/500 [==================>...........] - ETA: 40s - loss: 1.3310 - regression_loss: 1.1273 - classification_loss: 0.2037 332/500 [==================>...........] - ETA: 40s - loss: 1.3314 - regression_loss: 1.1278 - classification_loss: 0.2036 333/500 [==================>...........] - ETA: 40s - loss: 1.3335 - regression_loss: 1.1291 - classification_loss: 0.2043 334/500 [===================>..........] - ETA: 40s - loss: 1.3334 - regression_loss: 1.1292 - classification_loss: 0.2042 335/500 [===================>..........] - ETA: 39s - loss: 1.3324 - regression_loss: 1.1280 - classification_loss: 0.2044 336/500 [===================>..........] - ETA: 39s - loss: 1.3316 - regression_loss: 1.1274 - classification_loss: 0.2041 337/500 [===================>..........] - ETA: 39s - loss: 1.3339 - regression_loss: 1.1291 - classification_loss: 0.2048 338/500 [===================>..........] - ETA: 39s - loss: 1.3321 - regression_loss: 1.1275 - classification_loss: 0.2046 339/500 [===================>..........] - ETA: 38s - loss: 1.3329 - regression_loss: 1.1283 - classification_loss: 0.2046 340/500 [===================>..........] - ETA: 38s - loss: 1.3330 - regression_loss: 1.1283 - classification_loss: 0.2048 341/500 [===================>..........] - ETA: 38s - loss: 1.3318 - regression_loss: 1.1272 - classification_loss: 0.2046 342/500 [===================>..........] - ETA: 38s - loss: 1.3305 - regression_loss: 1.1261 - classification_loss: 0.2044 343/500 [===================>..........] - ETA: 37s - loss: 1.3296 - regression_loss: 1.1256 - classification_loss: 0.2040 344/500 [===================>..........] - ETA: 37s - loss: 1.3293 - regression_loss: 1.1252 - classification_loss: 0.2041 345/500 [===================>..........] 
- ETA: 37s - loss: 1.3314 - regression_loss: 1.1267 - classification_loss: 0.2046 346/500 [===================>..........] - ETA: 37s - loss: 1.3316 - regression_loss: 1.1269 - classification_loss: 0.2046 347/500 [===================>..........] - ETA: 36s - loss: 1.3295 - regression_loss: 1.1252 - classification_loss: 0.2043 348/500 [===================>..........] - ETA: 36s - loss: 1.3277 - regression_loss: 1.1238 - classification_loss: 0.2040 349/500 [===================>..........] - ETA: 36s - loss: 1.3266 - regression_loss: 1.1229 - classification_loss: 0.2037 350/500 [====================>.........] - ETA: 36s - loss: 1.3284 - regression_loss: 1.1246 - classification_loss: 0.2038 351/500 [====================>.........] - ETA: 35s - loss: 1.3285 - regression_loss: 1.1247 - classification_loss: 0.2038 352/500 [====================>.........] - ETA: 35s - loss: 1.3300 - regression_loss: 1.1260 - classification_loss: 0.2040 353/500 [====================>.........] - ETA: 35s - loss: 1.3325 - regression_loss: 1.1280 - classification_loss: 0.2046 354/500 [====================>.........] - ETA: 35s - loss: 1.3322 - regression_loss: 1.1278 - classification_loss: 0.2043 355/500 [====================>.........] - ETA: 34s - loss: 1.3344 - regression_loss: 1.1291 - classification_loss: 0.2054 356/500 [====================>.........] - ETA: 34s - loss: 1.3317 - regression_loss: 1.1267 - classification_loss: 0.2050 357/500 [====================>.........] - ETA: 34s - loss: 1.3300 - regression_loss: 1.1254 - classification_loss: 0.2046 358/500 [====================>.........] - ETA: 34s - loss: 1.3315 - regression_loss: 1.1265 - classification_loss: 0.2051 359/500 [====================>.........] - ETA: 34s - loss: 1.3331 - regression_loss: 1.1276 - classification_loss: 0.2055 360/500 [====================>.........] - ETA: 33s - loss: 1.3375 - regression_loss: 1.1306 - classification_loss: 0.2069 361/500 [====================>.........] 
- ETA: 33s - loss: 1.3395 - regression_loss: 1.1323 - classification_loss: 0.2072 362/500 [====================>.........] - ETA: 33s - loss: 1.3404 - regression_loss: 1.1330 - classification_loss: 0.2073 363/500 [====================>.........] - ETA: 33s - loss: 1.3397 - regression_loss: 1.1325 - classification_loss: 0.2072 364/500 [====================>.........] - ETA: 32s - loss: 1.3411 - regression_loss: 1.1336 - classification_loss: 0.2075 365/500 [====================>.........] - ETA: 32s - loss: 1.3388 - regression_loss: 1.1318 - classification_loss: 0.2071 366/500 [====================>.........] - ETA: 32s - loss: 1.3369 - regression_loss: 1.1300 - classification_loss: 0.2069 367/500 [=====================>........] - ETA: 32s - loss: 1.3367 - regression_loss: 1.1300 - classification_loss: 0.2067 368/500 [=====================>........] - ETA: 31s - loss: 1.3356 - regression_loss: 1.1291 - classification_loss: 0.2065 369/500 [=====================>........] - ETA: 31s - loss: 1.3376 - regression_loss: 1.1308 - classification_loss: 0.2068 370/500 [=====================>........] - ETA: 31s - loss: 1.3376 - regression_loss: 1.1308 - classification_loss: 0.2067 371/500 [=====================>........] - ETA: 31s - loss: 1.3370 - regression_loss: 1.1303 - classification_loss: 0.2067 372/500 [=====================>........] - ETA: 30s - loss: 1.3378 - regression_loss: 1.1312 - classification_loss: 0.2067 373/500 [=====================>........] - ETA: 30s - loss: 1.3382 - regression_loss: 1.1315 - classification_loss: 0.2067 374/500 [=====================>........] - ETA: 30s - loss: 1.3397 - regression_loss: 1.1326 - classification_loss: 0.2071 375/500 [=====================>........] - ETA: 30s - loss: 1.3383 - regression_loss: 1.1313 - classification_loss: 0.2070 376/500 [=====================>........] - ETA: 29s - loss: 1.3396 - regression_loss: 1.1323 - classification_loss: 0.2074 377/500 [=====================>........] 
- ETA: 29s - loss: 1.3389 - regression_loss: 1.1316 - classification_loss: 0.2073 378/500 [=====================>........] - ETA: 29s - loss: 1.3533 - regression_loss: 1.1340 - classification_loss: 0.2194 379/500 [=====================>........] - ETA: 29s - loss: 1.3527 - regression_loss: 1.1336 - classification_loss: 0.2191 380/500 [=====================>........] - ETA: 28s - loss: 1.3533 - regression_loss: 1.1342 - classification_loss: 0.2191 381/500 [=====================>........] - ETA: 28s - loss: 1.3534 - regression_loss: 1.1345 - classification_loss: 0.2189 382/500 [=====================>........] - ETA: 28s - loss: 1.3526 - regression_loss: 1.1340 - classification_loss: 0.2186 383/500 [=====================>........] - ETA: 28s - loss: 1.3516 - regression_loss: 1.1333 - classification_loss: 0.2183 384/500 [======================>.......] - ETA: 28s - loss: 1.3508 - regression_loss: 1.1326 - classification_loss: 0.2182 385/500 [======================>.......] - ETA: 27s - loss: 1.3505 - regression_loss: 1.1324 - classification_loss: 0.2181 386/500 [======================>.......] - ETA: 27s - loss: 1.3502 - regression_loss: 1.1322 - classification_loss: 0.2180 387/500 [======================>.......] - ETA: 27s - loss: 1.3509 - regression_loss: 1.1327 - classification_loss: 0.2182 388/500 [======================>.......] - ETA: 27s - loss: 1.3511 - regression_loss: 1.1327 - classification_loss: 0.2184 389/500 [======================>.......] - ETA: 26s - loss: 1.3510 - regression_loss: 1.1328 - classification_loss: 0.2182 390/500 [======================>.......] - ETA: 26s - loss: 1.3513 - regression_loss: 1.1328 - classification_loss: 0.2184 391/500 [======================>.......] - ETA: 26s - loss: 1.3517 - regression_loss: 1.1332 - classification_loss: 0.2184 392/500 [======================>.......] - ETA: 26s - loss: 1.3531 - regression_loss: 1.1345 - classification_loss: 0.2186 393/500 [======================>.......] 
[per-step training progress elided]
500/500 [==============================] - 121s 242ms/step - loss: 1.3555 - regression_loss: 1.1368 - classification_loss: 0.2186
326 instances of class plum with average precision: 0.8070
mAP: 0.8070
Epoch 00089: saving model to ./training/snapshots/resnet50_pascal_89.h5
Epoch 90/150
[per-step training progress elided]
- ETA: 1:06 - loss: 1.3302 - regression_loss: 1.1208 - classification_loss: 0.2094 229/500 [============>.................] - ETA: 1:06 - loss: 1.3314 - regression_loss: 1.1218 - classification_loss: 0.2096 230/500 [============>.................] - ETA: 1:06 - loss: 1.3313 - regression_loss: 1.1216 - classification_loss: 0.2096 231/500 [============>.................] - ETA: 1:06 - loss: 1.3335 - regression_loss: 1.1237 - classification_loss: 0.2098 232/500 [============>.................] - ETA: 1:05 - loss: 1.3290 - regression_loss: 1.1198 - classification_loss: 0.2091 233/500 [============>.................] - ETA: 1:05 - loss: 1.3321 - regression_loss: 1.1222 - classification_loss: 0.2100 234/500 [=============>................] - ETA: 1:05 - loss: 1.3353 - regression_loss: 1.1244 - classification_loss: 0.2109 235/500 [=============>................] - ETA: 1:04 - loss: 1.3332 - regression_loss: 1.1227 - classification_loss: 0.2105 236/500 [=============>................] - ETA: 1:04 - loss: 1.3343 - regression_loss: 1.1237 - classification_loss: 0.2107 237/500 [=============>................] - ETA: 1:04 - loss: 1.3337 - regression_loss: 1.1233 - classification_loss: 0.2104 238/500 [=============>................] - ETA: 1:04 - loss: 1.3290 - regression_loss: 1.1193 - classification_loss: 0.2098 239/500 [=============>................] - ETA: 1:03 - loss: 1.3304 - regression_loss: 1.1196 - classification_loss: 0.2108 240/500 [=============>................] - ETA: 1:03 - loss: 1.3304 - regression_loss: 1.1199 - classification_loss: 0.2105 241/500 [=============>................] - ETA: 1:03 - loss: 1.3286 - regression_loss: 1.1187 - classification_loss: 0.2099 242/500 [=============>................] - ETA: 1:03 - loss: 1.3302 - regression_loss: 1.1199 - classification_loss: 0.2103 243/500 [=============>................] - ETA: 1:03 - loss: 1.3300 - regression_loss: 1.1197 - classification_loss: 0.2104 244/500 [=============>................] 
- ETA: 1:02 - loss: 1.3302 - regression_loss: 1.1193 - classification_loss: 0.2109 245/500 [=============>................] - ETA: 1:02 - loss: 1.3313 - regression_loss: 1.1204 - classification_loss: 0.2109 246/500 [=============>................] - ETA: 1:02 - loss: 1.3315 - regression_loss: 1.1208 - classification_loss: 0.2108 247/500 [=============>................] - ETA: 1:02 - loss: 1.3332 - regression_loss: 1.1222 - classification_loss: 0.2110 248/500 [=============>................] - ETA: 1:01 - loss: 1.3320 - regression_loss: 1.1215 - classification_loss: 0.2105 249/500 [=============>................] - ETA: 1:01 - loss: 1.3321 - regression_loss: 1.1220 - classification_loss: 0.2101 250/500 [==============>...............] - ETA: 1:01 - loss: 1.3338 - regression_loss: 1.1233 - classification_loss: 0.2105 251/500 [==============>...............] - ETA: 1:01 - loss: 1.3306 - regression_loss: 1.1207 - classification_loss: 0.2100 252/500 [==============>...............] - ETA: 1:00 - loss: 1.3316 - regression_loss: 1.1215 - classification_loss: 0.2101 253/500 [==============>...............] - ETA: 1:00 - loss: 1.3312 - regression_loss: 1.1213 - classification_loss: 0.2099 254/500 [==============>...............] - ETA: 1:00 - loss: 1.3308 - regression_loss: 1.1210 - classification_loss: 0.2097 255/500 [==============>...............] - ETA: 1:00 - loss: 1.3320 - regression_loss: 1.1220 - classification_loss: 0.2100 256/500 [==============>...............] - ETA: 59s - loss: 1.3305 - regression_loss: 1.1209 - classification_loss: 0.2096  257/500 [==============>...............] - ETA: 59s - loss: 1.3283 - regression_loss: 1.1192 - classification_loss: 0.2091 258/500 [==============>...............] - ETA: 59s - loss: 1.3290 - regression_loss: 1.1200 - classification_loss: 0.2090 259/500 [==============>...............] - ETA: 59s - loss: 1.3325 - regression_loss: 1.1229 - classification_loss: 0.2096 260/500 [==============>...............] 
- ETA: 58s - loss: 1.3345 - regression_loss: 1.1247 - classification_loss: 0.2098 261/500 [==============>...............] - ETA: 58s - loss: 1.3348 - regression_loss: 1.1252 - classification_loss: 0.2096 262/500 [==============>...............] - ETA: 58s - loss: 1.3335 - regression_loss: 1.1243 - classification_loss: 0.2093 263/500 [==============>...............] - ETA: 58s - loss: 1.3357 - regression_loss: 1.1261 - classification_loss: 0.2096 264/500 [==============>...............] - ETA: 57s - loss: 1.3370 - regression_loss: 1.1273 - classification_loss: 0.2097 265/500 [==============>...............] - ETA: 57s - loss: 1.3393 - regression_loss: 1.1290 - classification_loss: 0.2103 266/500 [==============>...............] - ETA: 57s - loss: 1.3392 - regression_loss: 1.1289 - classification_loss: 0.2103 267/500 [===============>..............] - ETA: 57s - loss: 1.3370 - regression_loss: 1.1271 - classification_loss: 0.2100 268/500 [===============>..............] - ETA: 56s - loss: 1.3396 - regression_loss: 1.1296 - classification_loss: 0.2100 269/500 [===============>..............] - ETA: 56s - loss: 1.3434 - regression_loss: 1.1324 - classification_loss: 0.2111 270/500 [===============>..............] - ETA: 56s - loss: 1.3441 - regression_loss: 1.1333 - classification_loss: 0.2108 271/500 [===============>..............] - ETA: 56s - loss: 1.3439 - regression_loss: 1.1332 - classification_loss: 0.2107 272/500 [===============>..............] - ETA: 55s - loss: 1.3443 - regression_loss: 1.1336 - classification_loss: 0.2107 273/500 [===============>..............] - ETA: 55s - loss: 1.3452 - regression_loss: 1.1344 - classification_loss: 0.2108 274/500 [===============>..............] - ETA: 55s - loss: 1.3451 - regression_loss: 1.1344 - classification_loss: 0.2108 275/500 [===============>..............] - ETA: 55s - loss: 1.3456 - regression_loss: 1.1346 - classification_loss: 0.2110 276/500 [===============>..............] 
- ETA: 54s - loss: 1.3455 - regression_loss: 1.1348 - classification_loss: 0.2108 277/500 [===============>..............] - ETA: 54s - loss: 1.3424 - regression_loss: 1.1322 - classification_loss: 0.2101 278/500 [===============>..............] - ETA: 54s - loss: 1.3424 - regression_loss: 1.1322 - classification_loss: 0.2103 279/500 [===============>..............] - ETA: 54s - loss: 1.3434 - regression_loss: 1.1329 - classification_loss: 0.2105 280/500 [===============>..............] - ETA: 53s - loss: 1.3432 - regression_loss: 1.1328 - classification_loss: 0.2103 281/500 [===============>..............] - ETA: 53s - loss: 1.3417 - regression_loss: 1.1317 - classification_loss: 0.2100 282/500 [===============>..............] - ETA: 53s - loss: 1.3411 - regression_loss: 1.1313 - classification_loss: 0.2098 283/500 [===============>..............] - ETA: 53s - loss: 1.3425 - regression_loss: 1.1326 - classification_loss: 0.2100 284/500 [================>.............] - ETA: 52s - loss: 1.3440 - regression_loss: 1.1340 - classification_loss: 0.2100 285/500 [================>.............] - ETA: 52s - loss: 1.3470 - regression_loss: 1.1360 - classification_loss: 0.2110 286/500 [================>.............] - ETA: 52s - loss: 1.3472 - regression_loss: 1.1321 - classification_loss: 0.2151 287/500 [================>.............] - ETA: 52s - loss: 1.3473 - regression_loss: 1.1325 - classification_loss: 0.2148 288/500 [================>.............] - ETA: 51s - loss: 1.3478 - regression_loss: 1.1325 - classification_loss: 0.2153 289/500 [================>.............] - ETA: 51s - loss: 1.3509 - regression_loss: 1.1340 - classification_loss: 0.2170 290/500 [================>.............] - ETA: 51s - loss: 1.3543 - regression_loss: 1.1366 - classification_loss: 0.2177 291/500 [================>.............] - ETA: 51s - loss: 1.3549 - regression_loss: 1.1372 - classification_loss: 0.2177 292/500 [================>.............] 
- ETA: 50s - loss: 1.3539 - regression_loss: 1.1364 - classification_loss: 0.2175 293/500 [================>.............] - ETA: 50s - loss: 1.3515 - regression_loss: 1.1344 - classification_loss: 0.2171 294/500 [================>.............] - ETA: 50s - loss: 1.3536 - regression_loss: 1.1360 - classification_loss: 0.2176 295/500 [================>.............] - ETA: 50s - loss: 1.3526 - regression_loss: 1.1353 - classification_loss: 0.2173 296/500 [================>.............] - ETA: 50s - loss: 1.3537 - regression_loss: 1.1363 - classification_loss: 0.2174 297/500 [================>.............] - ETA: 49s - loss: 1.3572 - regression_loss: 1.1389 - classification_loss: 0.2183 298/500 [================>.............] - ETA: 49s - loss: 1.3595 - regression_loss: 1.1410 - classification_loss: 0.2185 299/500 [================>.............] - ETA: 49s - loss: 1.3578 - regression_loss: 1.1397 - classification_loss: 0.2181 300/500 [=================>............] - ETA: 49s - loss: 1.3586 - regression_loss: 1.1402 - classification_loss: 0.2183 301/500 [=================>............] - ETA: 48s - loss: 1.3561 - regression_loss: 1.1382 - classification_loss: 0.2179 302/500 [=================>............] - ETA: 48s - loss: 1.3543 - regression_loss: 1.1367 - classification_loss: 0.2175 303/500 [=================>............] - ETA: 48s - loss: 1.3564 - regression_loss: 1.1387 - classification_loss: 0.2177 304/500 [=================>............] - ETA: 48s - loss: 1.3572 - regression_loss: 1.1394 - classification_loss: 0.2178 305/500 [=================>............] - ETA: 47s - loss: 1.3577 - regression_loss: 1.1399 - classification_loss: 0.2178 306/500 [=================>............] - ETA: 47s - loss: 1.3589 - regression_loss: 1.1411 - classification_loss: 0.2178 307/500 [=================>............] - ETA: 47s - loss: 1.3595 - regression_loss: 1.1419 - classification_loss: 0.2176 308/500 [=================>............] 
- ETA: 47s - loss: 1.3586 - regression_loss: 1.1413 - classification_loss: 0.2173 309/500 [=================>............] - ETA: 46s - loss: 1.3577 - regression_loss: 1.1408 - classification_loss: 0.2169 310/500 [=================>............] - ETA: 46s - loss: 1.3572 - regression_loss: 1.1406 - classification_loss: 0.2167 311/500 [=================>............] - ETA: 46s - loss: 1.3562 - regression_loss: 1.1398 - classification_loss: 0.2165 312/500 [=================>............] - ETA: 46s - loss: 1.3554 - regression_loss: 1.1391 - classification_loss: 0.2163 313/500 [=================>............] - ETA: 45s - loss: 1.3543 - regression_loss: 1.1383 - classification_loss: 0.2160 314/500 [=================>............] - ETA: 45s - loss: 1.3540 - regression_loss: 1.1382 - classification_loss: 0.2157 315/500 [=================>............] - ETA: 45s - loss: 1.3555 - regression_loss: 1.1395 - classification_loss: 0.2160 316/500 [=================>............] - ETA: 45s - loss: 1.3557 - regression_loss: 1.1397 - classification_loss: 0.2160 317/500 [==================>...........] - ETA: 44s - loss: 1.3560 - regression_loss: 1.1402 - classification_loss: 0.2158 318/500 [==================>...........] - ETA: 44s - loss: 1.3567 - regression_loss: 1.1404 - classification_loss: 0.2162 319/500 [==================>...........] - ETA: 44s - loss: 1.3558 - regression_loss: 1.1399 - classification_loss: 0.2159 320/500 [==================>...........] - ETA: 44s - loss: 1.3535 - regression_loss: 1.1380 - classification_loss: 0.2155 321/500 [==================>...........] - ETA: 43s - loss: 1.3528 - regression_loss: 1.1373 - classification_loss: 0.2155 322/500 [==================>...........] - ETA: 43s - loss: 1.3537 - regression_loss: 1.1377 - classification_loss: 0.2159 323/500 [==================>...........] - ETA: 43s - loss: 1.3545 - regression_loss: 1.1383 - classification_loss: 0.2162 324/500 [==================>...........] 
- ETA: 43s - loss: 1.3528 - regression_loss: 1.1369 - classification_loss: 0.2159 325/500 [==================>...........] - ETA: 42s - loss: 1.3534 - regression_loss: 1.1377 - classification_loss: 0.2157 326/500 [==================>...........] - ETA: 42s - loss: 1.3555 - regression_loss: 1.1398 - classification_loss: 0.2158 327/500 [==================>...........] - ETA: 42s - loss: 1.3573 - regression_loss: 1.1409 - classification_loss: 0.2164 328/500 [==================>...........] - ETA: 42s - loss: 1.3583 - regression_loss: 1.1415 - classification_loss: 0.2168 329/500 [==================>...........] - ETA: 41s - loss: 1.3589 - regression_loss: 1.1419 - classification_loss: 0.2170 330/500 [==================>...........] - ETA: 41s - loss: 1.3583 - regression_loss: 1.1415 - classification_loss: 0.2168 331/500 [==================>...........] - ETA: 41s - loss: 1.3571 - regression_loss: 1.1407 - classification_loss: 0.2165 332/500 [==================>...........] - ETA: 41s - loss: 1.3564 - regression_loss: 1.1401 - classification_loss: 0.2163 333/500 [==================>...........] - ETA: 40s - loss: 1.3560 - regression_loss: 1.1399 - classification_loss: 0.2161 334/500 [===================>..........] - ETA: 40s - loss: 1.3586 - regression_loss: 1.1420 - classification_loss: 0.2166 335/500 [===================>..........] - ETA: 40s - loss: 1.3591 - regression_loss: 1.1426 - classification_loss: 0.2165 336/500 [===================>..........] - ETA: 40s - loss: 1.3609 - regression_loss: 1.1440 - classification_loss: 0.2170 337/500 [===================>..........] - ETA: 40s - loss: 1.3594 - regression_loss: 1.1427 - classification_loss: 0.2167 338/500 [===================>..........] - ETA: 39s - loss: 1.3586 - regression_loss: 1.1422 - classification_loss: 0.2164 339/500 [===================>..........] - ETA: 39s - loss: 1.3575 - regression_loss: 1.1412 - classification_loss: 0.2163 340/500 [===================>..........] 
- ETA: 39s - loss: 1.3578 - regression_loss: 1.1415 - classification_loss: 0.2163 341/500 [===================>..........] - ETA: 39s - loss: 1.3576 - regression_loss: 1.1411 - classification_loss: 0.2165 342/500 [===================>..........] - ETA: 38s - loss: 1.3575 - regression_loss: 1.1410 - classification_loss: 0.2165 343/500 [===================>..........] - ETA: 38s - loss: 1.3580 - regression_loss: 1.1414 - classification_loss: 0.2166 344/500 [===================>..........] - ETA: 38s - loss: 1.3588 - regression_loss: 1.1420 - classification_loss: 0.2168 345/500 [===================>..........] - ETA: 38s - loss: 1.3609 - regression_loss: 1.1437 - classification_loss: 0.2171 346/500 [===================>..........] - ETA: 37s - loss: 1.3599 - regression_loss: 1.1430 - classification_loss: 0.2169 347/500 [===================>..........] - ETA: 37s - loss: 1.3600 - regression_loss: 1.1430 - classification_loss: 0.2170 348/500 [===================>..........] - ETA: 37s - loss: 1.3586 - regression_loss: 1.1420 - classification_loss: 0.2166 349/500 [===================>..........] - ETA: 37s - loss: 1.3574 - regression_loss: 1.1411 - classification_loss: 0.2163 350/500 [====================>.........] - ETA: 36s - loss: 1.3567 - regression_loss: 1.1406 - classification_loss: 0.2161 351/500 [====================>.........] - ETA: 36s - loss: 1.3591 - regression_loss: 1.1423 - classification_loss: 0.2168 352/500 [====================>.........] - ETA: 36s - loss: 1.3580 - regression_loss: 1.1414 - classification_loss: 0.2166 353/500 [====================>.........] - ETA: 36s - loss: 1.3596 - regression_loss: 1.1429 - classification_loss: 0.2167 354/500 [====================>.........] - ETA: 35s - loss: 1.3592 - regression_loss: 1.1428 - classification_loss: 0.2165 355/500 [====================>.........] - ETA: 35s - loss: 1.3593 - regression_loss: 1.1428 - classification_loss: 0.2165 356/500 [====================>.........] 
- ETA: 35s - loss: 1.3593 - regression_loss: 1.1427 - classification_loss: 0.2166 357/500 [====================>.........] - ETA: 35s - loss: 1.3587 - regression_loss: 1.1420 - classification_loss: 0.2166 358/500 [====================>.........] - ETA: 34s - loss: 1.3591 - regression_loss: 1.1426 - classification_loss: 0.2165 359/500 [====================>.........] - ETA: 34s - loss: 1.3589 - regression_loss: 1.1424 - classification_loss: 0.2165 360/500 [====================>.........] - ETA: 34s - loss: 1.3596 - regression_loss: 1.1430 - classification_loss: 0.2166 361/500 [====================>.........] - ETA: 34s - loss: 1.3599 - regression_loss: 1.1437 - classification_loss: 0.2163 362/500 [====================>.........] - ETA: 33s - loss: 1.3610 - regression_loss: 1.1445 - classification_loss: 0.2165 363/500 [====================>.........] - ETA: 33s - loss: 1.3599 - regression_loss: 1.1437 - classification_loss: 0.2163 364/500 [====================>.........] - ETA: 33s - loss: 1.3585 - regression_loss: 1.1426 - classification_loss: 0.2159 365/500 [====================>.........] - ETA: 33s - loss: 1.3591 - regression_loss: 1.1432 - classification_loss: 0.2160 366/500 [====================>.........] - ETA: 32s - loss: 1.3583 - regression_loss: 1.1423 - classification_loss: 0.2159 367/500 [=====================>........] - ETA: 32s - loss: 1.3573 - regression_loss: 1.1416 - classification_loss: 0.2157 368/500 [=====================>........] - ETA: 32s - loss: 1.3583 - regression_loss: 1.1427 - classification_loss: 0.2156 369/500 [=====================>........] - ETA: 32s - loss: 1.3590 - regression_loss: 1.1434 - classification_loss: 0.2156 370/500 [=====================>........] - ETA: 31s - loss: 1.3588 - regression_loss: 1.1433 - classification_loss: 0.2155 371/500 [=====================>........] - ETA: 31s - loss: 1.3595 - regression_loss: 1.1440 - classification_loss: 0.2155 372/500 [=====================>........] 
- ETA: 31s - loss: 1.3570 - regression_loss: 1.1419 - classification_loss: 0.2151 373/500 [=====================>........] - ETA: 31s - loss: 1.3566 - regression_loss: 1.1418 - classification_loss: 0.2148 374/500 [=====================>........] - ETA: 30s - loss: 1.3564 - regression_loss: 1.1417 - classification_loss: 0.2147 375/500 [=====================>........] - ETA: 30s - loss: 1.3556 - regression_loss: 1.1411 - classification_loss: 0.2145 376/500 [=====================>........] - ETA: 30s - loss: 1.3581 - regression_loss: 1.1430 - classification_loss: 0.2151 377/500 [=====================>........] - ETA: 30s - loss: 1.3567 - regression_loss: 1.1418 - classification_loss: 0.2148 378/500 [=====================>........] - ETA: 29s - loss: 1.3539 - regression_loss: 1.1396 - classification_loss: 0.2143 379/500 [=====================>........] - ETA: 29s - loss: 1.3564 - regression_loss: 1.1416 - classification_loss: 0.2148 380/500 [=====================>........] - ETA: 29s - loss: 1.3559 - regression_loss: 1.1410 - classification_loss: 0.2149 381/500 [=====================>........] - ETA: 29s - loss: 1.3559 - regression_loss: 1.1411 - classification_loss: 0.2148 382/500 [=====================>........] - ETA: 28s - loss: 1.3551 - regression_loss: 1.1406 - classification_loss: 0.2145 383/500 [=====================>........] - ETA: 28s - loss: 1.3565 - regression_loss: 1.1417 - classification_loss: 0.2148 384/500 [======================>.......] - ETA: 28s - loss: 1.3565 - regression_loss: 1.1419 - classification_loss: 0.2145 385/500 [======================>.......] - ETA: 28s - loss: 1.3600 - regression_loss: 1.1444 - classification_loss: 0.2156 386/500 [======================>.......] - ETA: 27s - loss: 1.3596 - regression_loss: 1.1440 - classification_loss: 0.2156 387/500 [======================>.......] - ETA: 27s - loss: 1.3612 - regression_loss: 1.1454 - classification_loss: 0.2158 388/500 [======================>.......] 
- ETA: 27s - loss: 1.3597 - regression_loss: 1.1443 - classification_loss: 0.2155 389/500 [======================>.......] - ETA: 27s - loss: 1.3615 - regression_loss: 1.1454 - classification_loss: 0.2160 390/500 [======================>.......] - ETA: 26s - loss: 1.3628 - regression_loss: 1.1466 - classification_loss: 0.2162 391/500 [======================>.......] - ETA: 26s - loss: 1.3620 - regression_loss: 1.1460 - classification_loss: 0.2160 392/500 [======================>.......] - ETA: 26s - loss: 1.3622 - regression_loss: 1.1463 - classification_loss: 0.2160 393/500 [======================>.......] - ETA: 26s - loss: 1.3605 - regression_loss: 1.1449 - classification_loss: 0.2157 394/500 [======================>.......] - ETA: 25s - loss: 1.3612 - regression_loss: 1.1451 - classification_loss: 0.2161 395/500 [======================>.......] - ETA: 25s - loss: 1.3606 - regression_loss: 1.1448 - classification_loss: 0.2159 396/500 [======================>.......] - ETA: 25s - loss: 1.3606 - regression_loss: 1.1448 - classification_loss: 0.2158 397/500 [======================>.......] - ETA: 25s - loss: 1.3642 - regression_loss: 1.1475 - classification_loss: 0.2167 398/500 [======================>.......] - ETA: 25s - loss: 1.3645 - regression_loss: 1.1478 - classification_loss: 0.2167 399/500 [======================>.......] - ETA: 24s - loss: 1.3647 - regression_loss: 1.1481 - classification_loss: 0.2166 400/500 [=======================>......] - ETA: 24s - loss: 1.3644 - regression_loss: 1.1480 - classification_loss: 0.2164 401/500 [=======================>......] - ETA: 24s - loss: 1.3626 - regression_loss: 1.1464 - classification_loss: 0.2162 402/500 [=======================>......] - ETA: 24s - loss: 1.3639 - regression_loss: 1.1474 - classification_loss: 0.2166 403/500 [=======================>......] - ETA: 23s - loss: 1.3658 - regression_loss: 1.1490 - classification_loss: 0.2168 404/500 [=======================>......] 
- ETA: 23s - loss: 1.3646 - regression_loss: 1.1482 - classification_loss: 0.2164 405/500 [=======================>......] - ETA: 23s - loss: 1.3647 - regression_loss: 1.1483 - classification_loss: 0.2164 406/500 [=======================>......] - ETA: 23s - loss: 1.3637 - regression_loss: 1.1475 - classification_loss: 0.2162 407/500 [=======================>......] - ETA: 22s - loss: 1.3647 - regression_loss: 1.1484 - classification_loss: 0.2163 408/500 [=======================>......] - ETA: 22s - loss: 1.3651 - regression_loss: 1.1489 - classification_loss: 0.2163 409/500 [=======================>......] - ETA: 22s - loss: 1.3654 - regression_loss: 1.1493 - classification_loss: 0.2161 410/500 [=======================>......] - ETA: 22s - loss: 1.3646 - regression_loss: 1.1487 - classification_loss: 0.2159 411/500 [=======================>......] - ETA: 21s - loss: 1.3639 - regression_loss: 1.1483 - classification_loss: 0.2157 412/500 [=======================>......] - ETA: 21s - loss: 1.3632 - regression_loss: 1.1477 - classification_loss: 0.2155 413/500 [=======================>......] - ETA: 21s - loss: 1.3618 - regression_loss: 1.1466 - classification_loss: 0.2152 414/500 [=======================>......] - ETA: 21s - loss: 1.3619 - regression_loss: 1.1466 - classification_loss: 0.2152 415/500 [=======================>......] - ETA: 20s - loss: 1.3607 - regression_loss: 1.1455 - classification_loss: 0.2153 416/500 [=======================>......] - ETA: 20s - loss: 1.3600 - regression_loss: 1.1451 - classification_loss: 0.2149 417/500 [========================>.....] - ETA: 20s - loss: 1.3598 - regression_loss: 1.1449 - classification_loss: 0.2149 418/500 [========================>.....] - ETA: 20s - loss: 1.3588 - regression_loss: 1.1439 - classification_loss: 0.2149 419/500 [========================>.....] - ETA: 19s - loss: 1.3581 - regression_loss: 1.1434 - classification_loss: 0.2147 420/500 [========================>.....] 
- ETA: 19s - loss: 1.3595 - regression_loss: 1.1443 - classification_loss: 0.2152 421/500 [========================>.....] - ETA: 19s - loss: 1.3603 - regression_loss: 1.1449 - classification_loss: 0.2154 422/500 [========================>.....] - ETA: 19s - loss: 1.3600 - regression_loss: 1.1446 - classification_loss: 0.2154 423/500 [========================>.....] - ETA: 18s - loss: 1.3604 - regression_loss: 1.1448 - classification_loss: 0.2156 424/500 [========================>.....] - ETA: 18s - loss: 1.3600 - regression_loss: 1.1445 - classification_loss: 0.2155 425/500 [========================>.....] - ETA: 18s - loss: 1.3599 - regression_loss: 1.1444 - classification_loss: 0.2155 426/500 [========================>.....] - ETA: 18s - loss: 1.3601 - regression_loss: 1.1446 - classification_loss: 0.2155 427/500 [========================>.....] - ETA: 17s - loss: 1.3602 - regression_loss: 1.1446 - classification_loss: 0.2156 428/500 [========================>.....] - ETA: 17s - loss: 1.3591 - regression_loss: 1.1437 - classification_loss: 0.2154 429/500 [========================>.....] - ETA: 17s - loss: 1.3616 - regression_loss: 1.1458 - classification_loss: 0.2158 430/500 [========================>.....] - ETA: 17s - loss: 1.3611 - regression_loss: 1.1455 - classification_loss: 0.2156 431/500 [========================>.....] - ETA: 16s - loss: 1.3590 - regression_loss: 1.1437 - classification_loss: 0.2152 432/500 [========================>.....] - ETA: 16s - loss: 1.3588 - regression_loss: 1.1437 - classification_loss: 0.2151 433/500 [========================>.....] - ETA: 16s - loss: 1.3603 - regression_loss: 1.1449 - classification_loss: 0.2154 434/500 [=========================>....] - ETA: 16s - loss: 1.3585 - regression_loss: 1.1433 - classification_loss: 0.2151 435/500 [=========================>....] - ETA: 15s - loss: 1.3587 - regression_loss: 1.1435 - classification_loss: 0.2152 436/500 [=========================>....] 
500/500 [==============================] - 123s 245ms/step - loss: 1.3489 - regression_loss: 1.1374 - classification_loss: 0.2115
326 instances of class plum with average precision: 0.8082
mAP: 0.8082
Epoch 00090: saving model to ./training/snapshots/resnet50_pascal_90.h5
Epoch 91/150
[per-step progress output for epoch 91, steps 1-270, elided; loss hovered around 1.33 by step 270]
- ETA: 56s - loss: 1.3325 - regression_loss: 1.1217 - classification_loss: 0.2108 271/500 [===============>..............] - ETA: 56s - loss: 1.3304 - regression_loss: 1.1201 - classification_loss: 0.2104 272/500 [===============>..............] - ETA: 55s - loss: 1.3296 - regression_loss: 1.1196 - classification_loss: 0.2100 273/500 [===============>..............] - ETA: 55s - loss: 1.3316 - regression_loss: 1.1211 - classification_loss: 0.2105 274/500 [===============>..............] - ETA: 55s - loss: 1.3304 - regression_loss: 1.1203 - classification_loss: 0.2101 275/500 [===============>..............] - ETA: 55s - loss: 1.3302 - regression_loss: 1.1200 - classification_loss: 0.2103 276/500 [===============>..............] - ETA: 54s - loss: 1.3321 - regression_loss: 1.1218 - classification_loss: 0.2104 277/500 [===============>..............] - ETA: 54s - loss: 1.3321 - regression_loss: 1.1218 - classification_loss: 0.2103 278/500 [===============>..............] - ETA: 54s - loss: 1.3318 - regression_loss: 1.1215 - classification_loss: 0.2103 279/500 [===============>..............] - ETA: 54s - loss: 1.3333 - regression_loss: 1.1228 - classification_loss: 0.2106 280/500 [===============>..............] - ETA: 54s - loss: 1.3328 - regression_loss: 1.1225 - classification_loss: 0.2102 281/500 [===============>..............] - ETA: 53s - loss: 1.3307 - regression_loss: 1.1210 - classification_loss: 0.2097 282/500 [===============>..............] - ETA: 53s - loss: 1.3306 - regression_loss: 1.1211 - classification_loss: 0.2095 283/500 [===============>..............] - ETA: 53s - loss: 1.3311 - regression_loss: 1.1218 - classification_loss: 0.2093 284/500 [================>.............] - ETA: 53s - loss: 1.3309 - regression_loss: 1.1217 - classification_loss: 0.2092 285/500 [================>.............] - ETA: 52s - loss: 1.3327 - regression_loss: 1.1233 - classification_loss: 0.2094 286/500 [================>.............] 
- ETA: 52s - loss: 1.3342 - regression_loss: 1.1245 - classification_loss: 0.2097 287/500 [================>.............] - ETA: 52s - loss: 1.3349 - regression_loss: 1.1252 - classification_loss: 0.2097 288/500 [================>.............] - ETA: 52s - loss: 1.3348 - regression_loss: 1.1253 - classification_loss: 0.2095 289/500 [================>.............] - ETA: 51s - loss: 1.3345 - regression_loss: 1.1253 - classification_loss: 0.2093 290/500 [================>.............] - ETA: 51s - loss: 1.3321 - regression_loss: 1.1233 - classification_loss: 0.2088 291/500 [================>.............] - ETA: 51s - loss: 1.3316 - regression_loss: 1.1230 - classification_loss: 0.2087 292/500 [================>.............] - ETA: 51s - loss: 1.3315 - regression_loss: 1.1229 - classification_loss: 0.2086 293/500 [================>.............] - ETA: 50s - loss: 1.3329 - regression_loss: 1.1239 - classification_loss: 0.2090 294/500 [================>.............] - ETA: 50s - loss: 1.3316 - regression_loss: 1.1230 - classification_loss: 0.2087 295/500 [================>.............] - ETA: 50s - loss: 1.3307 - regression_loss: 1.1223 - classification_loss: 0.2085 296/500 [================>.............] - ETA: 49s - loss: 1.3310 - regression_loss: 1.1226 - classification_loss: 0.2084 297/500 [================>.............] - ETA: 49s - loss: 1.3298 - regression_loss: 1.1218 - classification_loss: 0.2080 298/500 [================>.............] - ETA: 49s - loss: 1.3273 - regression_loss: 1.1195 - classification_loss: 0.2078 299/500 [================>.............] - ETA: 49s - loss: 1.3257 - regression_loss: 1.1182 - classification_loss: 0.2075 300/500 [=================>............] - ETA: 48s - loss: 1.3249 - regression_loss: 1.1178 - classification_loss: 0.2071 301/500 [=================>............] - ETA: 48s - loss: 1.3251 - regression_loss: 1.1181 - classification_loss: 0.2071 302/500 [=================>............] 
- ETA: 48s - loss: 1.3249 - regression_loss: 1.1178 - classification_loss: 0.2071 303/500 [=================>............] - ETA: 48s - loss: 1.3249 - regression_loss: 1.1181 - classification_loss: 0.2068 304/500 [=================>............] - ETA: 47s - loss: 1.3250 - regression_loss: 1.1183 - classification_loss: 0.2067 305/500 [=================>............] - ETA: 47s - loss: 1.3254 - regression_loss: 1.1183 - classification_loss: 0.2071 306/500 [=================>............] - ETA: 47s - loss: 1.3274 - regression_loss: 1.1200 - classification_loss: 0.2073 307/500 [=================>............] - ETA: 47s - loss: 1.3285 - regression_loss: 1.1207 - classification_loss: 0.2078 308/500 [=================>............] - ETA: 46s - loss: 1.3274 - regression_loss: 1.1200 - classification_loss: 0.2074 309/500 [=================>............] - ETA: 46s - loss: 1.3271 - regression_loss: 1.1197 - classification_loss: 0.2074 310/500 [=================>............] - ETA: 46s - loss: 1.3263 - regression_loss: 1.1191 - classification_loss: 0.2072 311/500 [=================>............] - ETA: 46s - loss: 1.3264 - regression_loss: 1.1191 - classification_loss: 0.2073 312/500 [=================>............] - ETA: 45s - loss: 1.3261 - regression_loss: 1.1189 - classification_loss: 0.2073 313/500 [=================>............] - ETA: 45s - loss: 1.3298 - regression_loss: 1.1223 - classification_loss: 0.2075 314/500 [=================>............] - ETA: 45s - loss: 1.3307 - regression_loss: 1.1228 - classification_loss: 0.2079 315/500 [=================>............] - ETA: 45s - loss: 1.3296 - regression_loss: 1.1219 - classification_loss: 0.2078 316/500 [=================>............] - ETA: 45s - loss: 1.3312 - regression_loss: 1.1230 - classification_loss: 0.2082 317/500 [==================>...........] - ETA: 44s - loss: 1.3291 - regression_loss: 1.1212 - classification_loss: 0.2079 318/500 [==================>...........] 
- ETA: 44s - loss: 1.3264 - regression_loss: 1.1189 - classification_loss: 0.2074 319/500 [==================>...........] - ETA: 44s - loss: 1.3271 - regression_loss: 1.1195 - classification_loss: 0.2076 320/500 [==================>...........] - ETA: 44s - loss: 1.3272 - regression_loss: 1.1198 - classification_loss: 0.2074 321/500 [==================>...........] - ETA: 43s - loss: 1.3268 - regression_loss: 1.1196 - classification_loss: 0.2072 322/500 [==================>...........] - ETA: 43s - loss: 1.3271 - regression_loss: 1.1201 - classification_loss: 0.2070 323/500 [==================>...........] - ETA: 43s - loss: 1.3270 - regression_loss: 1.1200 - classification_loss: 0.2070 324/500 [==================>...........] - ETA: 43s - loss: 1.3271 - regression_loss: 1.1199 - classification_loss: 0.2072 325/500 [==================>...........] - ETA: 42s - loss: 1.3278 - regression_loss: 1.1205 - classification_loss: 0.2073 326/500 [==================>...........] - ETA: 42s - loss: 1.3268 - regression_loss: 1.1199 - classification_loss: 0.2069 327/500 [==================>...........] - ETA: 42s - loss: 1.3276 - regression_loss: 1.1203 - classification_loss: 0.2073 328/500 [==================>...........] - ETA: 42s - loss: 1.3271 - regression_loss: 1.1197 - classification_loss: 0.2074 329/500 [==================>...........] - ETA: 41s - loss: 1.3282 - regression_loss: 1.1205 - classification_loss: 0.2077 330/500 [==================>...........] - ETA: 41s - loss: 1.3283 - regression_loss: 1.1208 - classification_loss: 0.2075 331/500 [==================>...........] - ETA: 41s - loss: 1.3252 - regression_loss: 1.1182 - classification_loss: 0.2070 332/500 [==================>...........] - ETA: 41s - loss: 1.3219 - regression_loss: 1.1155 - classification_loss: 0.2065 333/500 [==================>...........] - ETA: 40s - loss: 1.3233 - regression_loss: 1.1166 - classification_loss: 0.2066 334/500 [===================>..........] 
- ETA: 40s - loss: 1.3212 - regression_loss: 1.1150 - classification_loss: 0.2062 335/500 [===================>..........] - ETA: 40s - loss: 1.3205 - regression_loss: 1.1145 - classification_loss: 0.2060 336/500 [===================>..........] - ETA: 40s - loss: 1.3207 - regression_loss: 1.1148 - classification_loss: 0.2059 337/500 [===================>..........] - ETA: 39s - loss: 1.3208 - regression_loss: 1.1149 - classification_loss: 0.2059 338/500 [===================>..........] - ETA: 39s - loss: 1.3187 - regression_loss: 1.1116 - classification_loss: 0.2071 339/500 [===================>..........] - ETA: 39s - loss: 1.3193 - regression_loss: 1.1120 - classification_loss: 0.2073 340/500 [===================>..........] - ETA: 39s - loss: 1.3220 - regression_loss: 1.1149 - classification_loss: 0.2071 341/500 [===================>..........] - ETA: 38s - loss: 1.3217 - regression_loss: 1.1148 - classification_loss: 0.2069 342/500 [===================>..........] - ETA: 38s - loss: 1.3212 - regression_loss: 1.1143 - classification_loss: 0.2069 343/500 [===================>..........] - ETA: 38s - loss: 1.3206 - regression_loss: 1.1138 - classification_loss: 0.2068 344/500 [===================>..........] - ETA: 38s - loss: 1.3207 - regression_loss: 1.1140 - classification_loss: 0.2068 345/500 [===================>..........] - ETA: 37s - loss: 1.3222 - regression_loss: 1.1153 - classification_loss: 0.2069 346/500 [===================>..........] - ETA: 37s - loss: 1.3219 - regression_loss: 1.1151 - classification_loss: 0.2067 347/500 [===================>..........] - ETA: 37s - loss: 1.3206 - regression_loss: 1.1141 - classification_loss: 0.2065 348/500 [===================>..........] - ETA: 37s - loss: 1.3176 - regression_loss: 1.1115 - classification_loss: 0.2060 349/500 [===================>..........] - ETA: 36s - loss: 1.3181 - regression_loss: 1.1120 - classification_loss: 0.2062 350/500 [====================>.........] 
- ETA: 36s - loss: 1.3186 - regression_loss: 1.1122 - classification_loss: 0.2063 351/500 [====================>.........] - ETA: 36s - loss: 1.3183 - regression_loss: 1.1120 - classification_loss: 0.2063 352/500 [====================>.........] - ETA: 36s - loss: 1.3186 - regression_loss: 1.1124 - classification_loss: 0.2062 353/500 [====================>.........] - ETA: 35s - loss: 1.3218 - regression_loss: 1.1149 - classification_loss: 0.2069 354/500 [====================>.........] - ETA: 35s - loss: 1.3204 - regression_loss: 1.1139 - classification_loss: 0.2065 355/500 [====================>.........] - ETA: 35s - loss: 1.3210 - regression_loss: 1.1143 - classification_loss: 0.2067 356/500 [====================>.........] - ETA: 35s - loss: 1.3208 - regression_loss: 1.1141 - classification_loss: 0.2068 357/500 [====================>.........] - ETA: 34s - loss: 1.3190 - regression_loss: 1.1124 - classification_loss: 0.2066 358/500 [====================>.........] - ETA: 34s - loss: 1.3199 - regression_loss: 1.1130 - classification_loss: 0.2069 359/500 [====================>.........] - ETA: 34s - loss: 1.3191 - regression_loss: 1.1123 - classification_loss: 0.2068 360/500 [====================>.........] - ETA: 34s - loss: 1.3165 - regression_loss: 1.1102 - classification_loss: 0.2063 361/500 [====================>.........] - ETA: 34s - loss: 1.3174 - regression_loss: 1.1108 - classification_loss: 0.2066 362/500 [====================>.........] - ETA: 33s - loss: 1.3184 - regression_loss: 1.1116 - classification_loss: 0.2068 363/500 [====================>.........] - ETA: 33s - loss: 1.3187 - regression_loss: 1.1121 - classification_loss: 0.2066 364/500 [====================>.........] - ETA: 33s - loss: 1.3190 - regression_loss: 1.1123 - classification_loss: 0.2067 365/500 [====================>.........] - ETA: 33s - loss: 1.3176 - regression_loss: 1.1111 - classification_loss: 0.2065 366/500 [====================>.........] 
- ETA: 32s - loss: 1.3191 - regression_loss: 1.1123 - classification_loss: 0.2068 367/500 [=====================>........] - ETA: 32s - loss: 1.3194 - regression_loss: 1.1126 - classification_loss: 0.2068 368/500 [=====================>........] - ETA: 32s - loss: 1.3178 - regression_loss: 1.1115 - classification_loss: 0.2064 369/500 [=====================>........] - ETA: 32s - loss: 1.3191 - regression_loss: 1.1126 - classification_loss: 0.2065 370/500 [=====================>........] - ETA: 31s - loss: 1.3191 - regression_loss: 1.1126 - classification_loss: 0.2064 371/500 [=====================>........] - ETA: 31s - loss: 1.3203 - regression_loss: 1.1135 - classification_loss: 0.2068 372/500 [=====================>........] - ETA: 31s - loss: 1.3219 - regression_loss: 1.1148 - classification_loss: 0.2070 373/500 [=====================>........] - ETA: 31s - loss: 1.3205 - regression_loss: 1.1138 - classification_loss: 0.2067 374/500 [=====================>........] - ETA: 30s - loss: 1.3204 - regression_loss: 1.1139 - classification_loss: 0.2066 375/500 [=====================>........] - ETA: 30s - loss: 1.3200 - regression_loss: 1.1136 - classification_loss: 0.2064 376/500 [=====================>........] - ETA: 30s - loss: 1.3201 - regression_loss: 1.1138 - classification_loss: 0.2063 377/500 [=====================>........] - ETA: 30s - loss: 1.3220 - regression_loss: 1.1147 - classification_loss: 0.2072 378/500 [=====================>........] - ETA: 29s - loss: 1.3206 - regression_loss: 1.1137 - classification_loss: 0.2068 379/500 [=====================>........] - ETA: 29s - loss: 1.3225 - regression_loss: 1.1153 - classification_loss: 0.2073 380/500 [=====================>........] - ETA: 29s - loss: 1.3231 - regression_loss: 1.1160 - classification_loss: 0.2072 381/500 [=====================>........] - ETA: 29s - loss: 1.3249 - regression_loss: 1.1174 - classification_loss: 0.2074 382/500 [=====================>........] 
- ETA: 28s - loss: 1.3225 - regression_loss: 1.1156 - classification_loss: 0.2070 383/500 [=====================>........] - ETA: 28s - loss: 1.3224 - regression_loss: 1.1156 - classification_loss: 0.2069 384/500 [======================>.......] - ETA: 28s - loss: 1.3205 - regression_loss: 1.1138 - classification_loss: 0.2066 385/500 [======================>.......] - ETA: 28s - loss: 1.3196 - regression_loss: 1.1132 - classification_loss: 0.2064 386/500 [======================>.......] - ETA: 27s - loss: 1.3200 - regression_loss: 1.1137 - classification_loss: 0.2063 387/500 [======================>.......] - ETA: 27s - loss: 1.3179 - regression_loss: 1.1120 - classification_loss: 0.2059 388/500 [======================>.......] - ETA: 27s - loss: 1.3190 - regression_loss: 1.1129 - classification_loss: 0.2061 389/500 [======================>.......] - ETA: 27s - loss: 1.3201 - regression_loss: 1.1138 - classification_loss: 0.2063 390/500 [======================>.......] - ETA: 26s - loss: 1.3228 - regression_loss: 1.1159 - classification_loss: 0.2069 391/500 [======================>.......] - ETA: 26s - loss: 1.3207 - regression_loss: 1.1142 - classification_loss: 0.2065 392/500 [======================>.......] - ETA: 26s - loss: 1.3188 - regression_loss: 1.1127 - classification_loss: 0.2061 393/500 [======================>.......] - ETA: 26s - loss: 1.3183 - regression_loss: 1.1124 - classification_loss: 0.2059 394/500 [======================>.......] - ETA: 25s - loss: 1.3188 - regression_loss: 1.1129 - classification_loss: 0.2059 395/500 [======================>.......] - ETA: 25s - loss: 1.3190 - regression_loss: 1.1131 - classification_loss: 0.2059 396/500 [======================>.......] - ETA: 25s - loss: 1.3175 - regression_loss: 1.1117 - classification_loss: 0.2057 397/500 [======================>.......] - ETA: 25s - loss: 1.3171 - regression_loss: 1.1116 - classification_loss: 0.2056 398/500 [======================>.......] 
- ETA: 24s - loss: 1.3169 - regression_loss: 1.1115 - classification_loss: 0.2054 399/500 [======================>.......] - ETA: 24s - loss: 1.3169 - regression_loss: 1.1115 - classification_loss: 0.2054 400/500 [=======================>......] - ETA: 24s - loss: 1.3196 - regression_loss: 1.1140 - classification_loss: 0.2057 401/500 [=======================>......] - ETA: 24s - loss: 1.3236 - regression_loss: 1.1170 - classification_loss: 0.2066 402/500 [=======================>......] - ETA: 23s - loss: 1.3237 - regression_loss: 1.1172 - classification_loss: 0.2065 403/500 [=======================>......] - ETA: 23s - loss: 1.3249 - regression_loss: 1.1180 - classification_loss: 0.2069 404/500 [=======================>......] - ETA: 23s - loss: 1.3269 - regression_loss: 1.1194 - classification_loss: 0.2075 405/500 [=======================>......] - ETA: 23s - loss: 1.3262 - regression_loss: 1.1190 - classification_loss: 0.2073 406/500 [=======================>......] - ETA: 22s - loss: 1.3261 - regression_loss: 1.1189 - classification_loss: 0.2072 407/500 [=======================>......] - ETA: 22s - loss: 1.3286 - regression_loss: 1.1207 - classification_loss: 0.2079 408/500 [=======================>......] - ETA: 22s - loss: 1.3286 - regression_loss: 1.1206 - classification_loss: 0.2080 409/500 [=======================>......] - ETA: 22s - loss: 1.3296 - regression_loss: 1.1214 - classification_loss: 0.2082 410/500 [=======================>......] - ETA: 22s - loss: 1.3292 - regression_loss: 1.1211 - classification_loss: 0.2080 411/500 [=======================>......] - ETA: 21s - loss: 1.3293 - regression_loss: 1.1214 - classification_loss: 0.2079 412/500 [=======================>......] - ETA: 21s - loss: 1.3301 - regression_loss: 1.1221 - classification_loss: 0.2080 413/500 [=======================>......] - ETA: 21s - loss: 1.3319 - regression_loss: 1.1236 - classification_loss: 0.2082 414/500 [=======================>......] 
- ETA: 21s - loss: 1.3319 - regression_loss: 1.1236 - classification_loss: 0.2083 415/500 [=======================>......] - ETA: 20s - loss: 1.3333 - regression_loss: 1.1247 - classification_loss: 0.2086 416/500 [=======================>......] - ETA: 20s - loss: 1.3344 - regression_loss: 1.1257 - classification_loss: 0.2088 417/500 [========================>.....] - ETA: 20s - loss: 1.3342 - regression_loss: 1.1255 - classification_loss: 0.2087 418/500 [========================>.....] - ETA: 20s - loss: 1.3340 - regression_loss: 1.1254 - classification_loss: 0.2086 419/500 [========================>.....] - ETA: 19s - loss: 1.3332 - regression_loss: 1.1246 - classification_loss: 0.2086 420/500 [========================>.....] - ETA: 19s - loss: 1.3323 - regression_loss: 1.1238 - classification_loss: 0.2085 421/500 [========================>.....] - ETA: 19s - loss: 1.3329 - regression_loss: 1.1241 - classification_loss: 0.2088 422/500 [========================>.....] - ETA: 19s - loss: 1.3315 - regression_loss: 1.1229 - classification_loss: 0.2087 423/500 [========================>.....] - ETA: 18s - loss: 1.3330 - regression_loss: 1.1240 - classification_loss: 0.2089 424/500 [========================>.....] - ETA: 18s - loss: 1.3328 - regression_loss: 1.1241 - classification_loss: 0.2087 425/500 [========================>.....] - ETA: 18s - loss: 1.3339 - regression_loss: 1.1247 - classification_loss: 0.2092 426/500 [========================>.....] - ETA: 18s - loss: 1.3326 - regression_loss: 1.1237 - classification_loss: 0.2090 427/500 [========================>.....] - ETA: 17s - loss: 1.3338 - regression_loss: 1.1245 - classification_loss: 0.2093 428/500 [========================>.....] - ETA: 17s - loss: 1.3324 - regression_loss: 1.1234 - classification_loss: 0.2090 429/500 [========================>.....] - ETA: 17s - loss: 1.3344 - regression_loss: 1.1248 - classification_loss: 0.2096 430/500 [========================>.....] 
- ETA: 17s - loss: 1.3346 - regression_loss: 1.1251 - classification_loss: 0.2095 431/500 [========================>.....] - ETA: 16s - loss: 1.3342 - regression_loss: 1.1248 - classification_loss: 0.2094 432/500 [========================>.....] - ETA: 16s - loss: 1.3349 - regression_loss: 1.1257 - classification_loss: 0.2092 433/500 [========================>.....] - ETA: 16s - loss: 1.3373 - regression_loss: 1.1276 - classification_loss: 0.2097 434/500 [=========================>....] - ETA: 16s - loss: 1.3378 - regression_loss: 1.1279 - classification_loss: 0.2099 435/500 [=========================>....] - ETA: 15s - loss: 1.3375 - regression_loss: 1.1275 - classification_loss: 0.2100 436/500 [=========================>....] - ETA: 15s - loss: 1.3359 - regression_loss: 1.1263 - classification_loss: 0.2096 437/500 [=========================>....] - ETA: 15s - loss: 1.3371 - regression_loss: 1.1271 - classification_loss: 0.2100 438/500 [=========================>....] - ETA: 15s - loss: 1.3358 - regression_loss: 1.1260 - classification_loss: 0.2097 439/500 [=========================>....] - ETA: 14s - loss: 1.3349 - regression_loss: 1.1254 - classification_loss: 0.2095 440/500 [=========================>....] - ETA: 14s - loss: 1.3391 - regression_loss: 1.1285 - classification_loss: 0.2106 441/500 [=========================>....] - ETA: 14s - loss: 1.3389 - regression_loss: 1.1284 - classification_loss: 0.2105 442/500 [=========================>....] - ETA: 14s - loss: 1.3386 - regression_loss: 1.1282 - classification_loss: 0.2104 443/500 [=========================>....] - ETA: 13s - loss: 1.3366 - regression_loss: 1.1265 - classification_loss: 0.2101 444/500 [=========================>....] - ETA: 13s - loss: 1.3353 - regression_loss: 1.1253 - classification_loss: 0.2101 445/500 [=========================>....] - ETA: 13s - loss: 1.3369 - regression_loss: 1.1263 - classification_loss: 0.2106 446/500 [=========================>....] 
- ETA: 13s - loss: 1.3348 - regression_loss: 1.1246 - classification_loss: 0.2102 447/500 [=========================>....] - ETA: 12s - loss: 1.3339 - regression_loss: 1.1239 - classification_loss: 0.2100 448/500 [=========================>....] - ETA: 12s - loss: 1.3348 - regression_loss: 1.1246 - classification_loss: 0.2102 449/500 [=========================>....] - ETA: 12s - loss: 1.3350 - regression_loss: 1.1249 - classification_loss: 0.2101 450/500 [==========================>...] - ETA: 12s - loss: 1.3359 - regression_loss: 1.1256 - classification_loss: 0.2103 451/500 [==========================>...] - ETA: 11s - loss: 1.3349 - regression_loss: 1.1249 - classification_loss: 0.2101 452/500 [==========================>...] - ETA: 11s - loss: 1.3364 - regression_loss: 1.1259 - classification_loss: 0.2105 453/500 [==========================>...] - ETA: 11s - loss: 1.3360 - regression_loss: 1.1256 - classification_loss: 0.2104 454/500 [==========================>...] - ETA: 11s - loss: 1.3350 - regression_loss: 1.1247 - classification_loss: 0.2103 455/500 [==========================>...] - ETA: 11s - loss: 1.3343 - regression_loss: 1.1241 - classification_loss: 0.2101 456/500 [==========================>...] - ETA: 10s - loss: 1.3337 - regression_loss: 1.1237 - classification_loss: 0.2099 457/500 [==========================>...] - ETA: 10s - loss: 1.3345 - regression_loss: 1.1243 - classification_loss: 0.2102 458/500 [==========================>...] - ETA: 10s - loss: 1.3356 - regression_loss: 1.1249 - classification_loss: 0.2106 459/500 [==========================>...] - ETA: 10s - loss: 1.3364 - regression_loss: 1.1255 - classification_loss: 0.2108 460/500 [==========================>...] - ETA: 9s - loss: 1.3369 - regression_loss: 1.1262 - classification_loss: 0.2107  461/500 [==========================>...] - ETA: 9s - loss: 1.3378 - regression_loss: 1.1270 - classification_loss: 0.2108 462/500 [==========================>...] 
- ETA: 9s - loss: 1.3372 - regression_loss: 1.1266 - classification_loss: 0.2107 463/500 [==========================>...] - ETA: 9s - loss: 1.3364 - regression_loss: 1.1259 - classification_loss: 0.2104 464/500 [==========================>...] - ETA: 8s - loss: 1.3363 - regression_loss: 1.1258 - classification_loss: 0.2105 465/500 [==========================>...] - ETA: 8s - loss: 1.3366 - regression_loss: 1.1260 - classification_loss: 0.2106 466/500 [==========================>...] - ETA: 8s - loss: 1.3354 - regression_loss: 1.1250 - classification_loss: 0.2104 467/500 [===========================>..] - ETA: 8s - loss: 1.3348 - regression_loss: 1.1246 - classification_loss: 0.2102 468/500 [===========================>..] - ETA: 7s - loss: 1.3339 - regression_loss: 1.1238 - classification_loss: 0.2101 469/500 [===========================>..] - ETA: 7s - loss: 1.3335 - regression_loss: 1.1235 - classification_loss: 0.2100 470/500 [===========================>..] - ETA: 7s - loss: 1.3340 - regression_loss: 1.1240 - classification_loss: 0.2100 471/500 [===========================>..] - ETA: 7s - loss: 1.3358 - regression_loss: 1.1254 - classification_loss: 0.2104 472/500 [===========================>..] - ETA: 6s - loss: 1.3357 - regression_loss: 1.1253 - classification_loss: 0.2104 473/500 [===========================>..] - ETA: 6s - loss: 1.3370 - regression_loss: 1.1263 - classification_loss: 0.2107 474/500 [===========================>..] - ETA: 6s - loss: 1.3379 - regression_loss: 1.1270 - classification_loss: 0.2109 475/500 [===========================>..] - ETA: 6s - loss: 1.3361 - regression_loss: 1.1255 - classification_loss: 0.2106 476/500 [===========================>..] - ETA: 5s - loss: 1.3360 - regression_loss: 1.1256 - classification_loss: 0.2105 477/500 [===========================>..] - ETA: 5s - loss: 1.3358 - regression_loss: 1.1253 - classification_loss: 0.2105 478/500 [===========================>..] 
- ETA: 5s - loss: 1.3348 - regression_loss: 1.1246 - classification_loss: 0.2102 479/500 [===========================>..] - ETA: 5s - loss: 1.3334 - regression_loss: 1.1235 - classification_loss: 0.2099 480/500 [===========================>..] - ETA: 4s - loss: 1.3342 - regression_loss: 1.1242 - classification_loss: 0.2100 481/500 [===========================>..] - ETA: 4s - loss: 1.3335 - regression_loss: 1.1236 - classification_loss: 0.2099 482/500 [===========================>..] - ETA: 4s - loss: 1.3339 - regression_loss: 1.1238 - classification_loss: 0.2101 483/500 [===========================>..] - ETA: 4s - loss: 1.3328 - regression_loss: 1.1228 - classification_loss: 0.2100 484/500 [============================>.] - ETA: 3s - loss: 1.3315 - regression_loss: 1.1218 - classification_loss: 0.2098 485/500 [============================>.] - ETA: 3s - loss: 1.3314 - regression_loss: 1.1218 - classification_loss: 0.2096 486/500 [============================>.] - ETA: 3s - loss: 1.3315 - regression_loss: 1.1219 - classification_loss: 0.2095 487/500 [============================>.] - ETA: 3s - loss: 1.3321 - regression_loss: 1.1225 - classification_loss: 0.2096 488/500 [============================>.] - ETA: 2s - loss: 1.3317 - regression_loss: 1.1222 - classification_loss: 0.2094 489/500 [============================>.] - ETA: 2s - loss: 1.3312 - regression_loss: 1.1220 - classification_loss: 0.2092 490/500 [============================>.] - ETA: 2s - loss: 1.3296 - regression_loss: 1.1206 - classification_loss: 0.2091 491/500 [============================>.] - ETA: 2s - loss: 1.3293 - regression_loss: 1.1203 - classification_loss: 0.2090 492/500 [============================>.] - ETA: 1s - loss: 1.3296 - regression_loss: 1.1205 - classification_loss: 0.2091 493/500 [============================>.] - ETA: 1s - loss: 1.3301 - regression_loss: 1.1208 - classification_loss: 0.2092 494/500 [============================>.] 
- ETA: 1s - loss: 1.3303 - regression_loss: 1.1211 - classification_loss: 0.2092 495/500 [============================>.] - ETA: 1s - loss: 1.3288 - regression_loss: 1.1199 - classification_loss: 0.2089 496/500 [============================>.] - ETA: 0s - loss: 1.3283 - regression_loss: 1.1196 - classification_loss: 0.2087 497/500 [============================>.] - ETA: 0s - loss: 1.3301 - regression_loss: 1.1208 - classification_loss: 0.2093 498/500 [============================>.] - ETA: 0s - loss: 1.3305 - regression_loss: 1.1212 - classification_loss: 0.2094 499/500 [============================>.] - ETA: 0s - loss: 1.3315 - regression_loss: 1.1220 - classification_loss: 0.2095 500/500 [==============================] - 123s 245ms/step - loss: 1.3313 - regression_loss: 1.1218 - classification_loss: 0.2095 326 instances of class plum with average precision: 0.8069 mAP: 0.8069 Epoch 00091: saving model to ./training/snapshots/resnet50_pascal_91.h5