loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
*****************************************
Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
*****************************************
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
------------------------ arguments ------------------------
batches_per_epoch ............................... 312
channel_last .................................... True
ddp ............................................. False
exit_num ........................................ 300
fuse_bn_add_relu ................................ True
fuse_bn_relu .................................... True
gpu_stat_file ................................... None
grad_clipping ................................... 0.0
graph ........................................... True
label_smoothing ................................. 0.1
learning_rate ................................... 4.096
legacy_init ..................................... False
load_path ....................................... None
lr_decay_type ................................... cosine
metric_local .................................... True
metric_train_acc ................................ True
momentum ........................................ 0.875
nccl_fusion_max_ops ............................. 24
nccl_fusion_threshold_mb ........................ 16
num_classes ..................................... 1000
num_devices_per_node ............................ 8
num_epochs ...................................... 1
num_nodes ....................................... 1
ofrecord_part_num ............................... 256
ofrecord_path ................................... /dataset/79846248
print_interval .................................. 100
print_timestamp ................................. False
samples_per_epoch ............................... 1281167
save_init ....................................... False
save_path ....................................... None
scale_grad ...................................... True
skip_eval ....................................... True
synthetic_data .................................. False
total_batches ................................... -1
train_batch_size ................................ 512
train_global_batch_size ......................... 4096
use_fp16 ........................................ True
use_gpu_decode .................................. False
val_batch_size .................................. 50
val_batches_per_epoch ........................... 125
val_global_batch_size ........................... 400
val_samples_per_epoch ........................... 50000
warmup_epochs ................................... 5
weight_decay .................................... 3.0517578125e-05
zero_init_residual .............................. True
-------------------- end of arguments ---------------------
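The batch-size arguments above are mutually consistent for a single node with 8 devices: 512 samples per device across 8 devices gives the global training batch of 4096, 50 per device gives the global validation batch of 400, and 1,281,167 training samples then yield the reported 312 batches per epoch. A minimal sanity-check sketch (variable names are illustrative, not taken from the training script):

# Sanity-check the batch-size arguments from the log above.
# All names here are illustrative; the actual script may differ.
num_nodes = 1
num_devices_per_node = 8
train_batch_size = 512          # per device
val_batch_size = 50             # per device

world_size = num_nodes * num_devices_per_node
assert train_batch_size * world_size == 4096   # train_global_batch_size
assert val_batch_size * world_size == 400      # val_global_batch_size

# 1,281,167 ImageNet training samples // 4096 per global step = 312 batches_per_epoch
print(1281167 // 4096)  # -> 312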
***** Model Init *****
***** Model Init Finish, time elapsed: 2.86618 s *****
[rank:0] [train], epoch: 0/1, iter: 100/312, loss: 0.86766, top1: 0.00092, throughput: 444.98 | 2022-04-12 16:42:50.100
[rank:1] [train], epoch: 0/1, iter: 100/312, loss: 0.86751, top1: 0.00102, throughput: 444.96 | 2022-04-12 16:42:50.103
[rank:2] [train], epoch: 0/1, iter: 100/312, loss: 0.86759, top1: 0.00104, throughput: 444.99 | 2022-04-12 16:42:50.100
[rank:3] [train], epoch: 0/1, iter: 100/312, loss: 0.86754, top1: 0.00070, throughput: 444.95 | 2022-04-12 16:42:50.103
[rank:4] [train], epoch: 0/1, iter: 100/312, loss: 0.86779, top1: 0.00100, throughput: 444.95 | 2022-04-12 16:42:50.103
[rank:5] [train], epoch: 0/1, iter: 100/312, loss: 0.86757, top1: 0.00084, throughput: 444.98 | 2022-04-12 16:42:50.099
[rank:6] [train], epoch: 0/1, iter: 100/312, loss: 0.86757, top1: 0.00107, throughput: 444.95 | 2022-04-12 16:42:50.104
[rank:7] [train], epoch: 0/1, iter: 100/312, loss: 0.86768, top1: 0.00111, throughput: 444.98 | 2022-04-12 16:42:50.102
timestamp, name, driver_version, utilization.gpu [%], utilization.memory [%], memory.total [MiB], memory.free [MiB], memory.used [MiB]
2022/04/12 16:42:50.414, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 64 %, 32510 MiB, 8629 MiB, 23881 MiB
2022/04/12 16:42:50.422, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 79 %, 32510 MiB, 8658 MiB, 23852 MiB
2022/04/12 16:42:50.426, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 54 %, 32510 MiB, 8810 MiB, 23700 MiB
2022/04/12 16:42:50.431, Tesla V100-SXM2-32GB, 470.57.02, 98 %, 93 %, 32510 MiB, 8742 MiB, 23768 MiB
2022/04/12 16:42:50.438, Tesla V100-SXM2-32GB, 470.57.02, 98 %, 91 %, 32510 MiB, 8758 MiB, 23752 MiB
2022/04/12 16:42:50.446, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 58 %, 32510 MiB, 8622 MiB, 23888 MiB
2022/04/12 16:42:50.455, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 86 %, 32510 MiB, 8530 MiB, 23980 MiB
2022/04/12 16:42:50.474, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 86 %, 32510 MiB, 8690 MiB, 23820 MiB
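The column layout of the GPU snapshot above matches nvidia-smi's CSV query output, with one row per V100. A sketch of how such a snapshot could be collected from Python (the wrapper function is illustrative; only the nvidia-smi query fields themselves are standard):

import subprocess

# Query per-GPU utilization and memory in the same CSV layout as the log above.
QUERY_FIELDS = (
    "timestamp,name,driver_version,"
    "utilization.gpu,utilization.memory,"
    "memory.total,memory.free,memory.used"
)

def gpu_snapshot() -> str:
    # Returns the CSV header plus one line per visible GPU.
    return subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY_FIELDS}", "--format=csv"],
        capture_output=True, text=True, check=True,
    ).stdout

if __name__ == "__main__":
    print(gpu_snapshot())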
[rank:0] [train], epoch: 0/1, iter: 200/312, loss: 0.86751, top1: 0.00105, throughput: 1333.23 | 2022-04-12 16:43:28.503
[rank:1] [train], epoch: 0/1, iter: 200/312, loss: 0.86786, top1: 0.00080, throughput: 1333.28 | 2022-04-12 16:43:28.505
[rank:2] [train], epoch: 0/1, iter: 200/312, loss: 0.86780, top1: 0.00096, throughput: 1333.20 | 2022-04-12 16:43:28.504
[rank:3] [train], epoch: 0/1, iter: 200/312, loss: 0.86752, top1: 0.00102, throughput: 1333.36 | 2022-04-12 16:43:28.502
[rank:4] [train], epoch: 0/1, iter: 200/312, loss: 0.86757, top1: 0.00088, throughput: 1333.30 | 2022-04-12 16:43:28.503
[rank:5] [train], epoch: 0/1, iter: 200/312, loss: 0.86743, top1: 0.00078, throughput: 1333.21 | 2022-04-12 16:43:28.502
[rank:6] [train], epoch: 0/1, iter: 200/312, loss: 0.86760, top1: 0.00100, throughput: 1333.32 | 2022-04-12 16:43:28.504
[rank:7] [train], epoch: 0/1, iter: 200/312, loss: 0.86762, top1: 0.00088, throughput: 1333.33 | 2022-04-12 16:43:28.502
[rank:0] [train], epoch: 0/1, iter: 300/312, loss: 0.86804, top1: 0.00092, throughput: 1340.93 | 2022-04-12 16:44:06.686
[rank:1] [train], epoch: 0/1, iter: 300/312, loss: 0.86752, top1: 0.00090, throughput: 1340.97 | 2022-04-12 16:44:06.686
[rank:2] [train], epoch: 0/1, iter: 300/312, loss: 0.86756, top1: 0.00098, throughput: 1340.98 | 2022-04-12 16:44:06.685
[rank:3] [train], epoch: 0/1, iter: 300/312, loss: 0.86743, top1: 0.00098, throughput: 1340.87 | 2022-04-12 16:44:06.686
[rank:4] [train], epoch: 0/1, iter: 300/312, loss: 0.86784, top1: 0.00080, throughput: 1340.92 | 2022-04-12 16:44:06.686
[rank:5] [train], epoch: 0/1, iter: 300/312, loss: 0.86763, top1: 0.00078, throughput: 1340.92 | 2022-04-12 16:44:06.685
[rank:6] [train], epoch: 0/1, iter: 300/312, loss: 0.86770, top1: 0.00096, throughput: 1340.89 | 2022-04-12 16:44:06.688
[rank:7] [train], epoch: 0/1, iter: 300/312, loss: 0.86744, top1: 0.00094, throughput: 1340.85 | 2022-04-12 16:44:06.687
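The reported throughput is per rank and can be cross-checked from the log itself: each rank processes print_interval x train_batch_size = 100 x 512 = 51,200 samples between reports, and the iter-100 to iter-200 interval spans roughly 38.4 s, i.e. about 1333 samples/s per rank (the lower 444.98 samples/s of the first interval presumably includes graph compilation and warm-up). A back-of-the-envelope check on rank 0's timestamps; the script's own accounting may differ slightly:

from datetime import datetime

# Rough check of rank 0's reported throughput between iter 100 and iter 200.
t100 = datetime.fromisoformat("2022-04-12 16:42:50.100")
t200 = datetime.fromisoformat("2022-04-12 16:43:28.503")

samples = 100 * 512                      # print_interval * train_batch_size (per rank)
elapsed = (t200 - t100).total_seconds()  # ~38.4 s
print(samples / elapsed)                 # ~1333 samples/s, matching the logged 1333.23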