loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
*****************************************
Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
*****************************************
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
------------------------ arguments ------------------------
batches_per_epoch ............................... 312
channel_last .................................... True
ddp ............................................. False
exit_num ........................................ 300
fuse_bn_add_relu ................................ True
fuse_bn_relu .................................... True
gpu_stat_file ................................... None
grad_clipping ................................... 0.0
graph ........................................... True
label_smoothing ................................. 0.1
learning_rate ................................... 4.096
legacy_init ..................................... False
load_path ....................................... None
lr_decay_type ................................... cosine
metric_local .................................... True
metric_train_acc ................................ True
momentum ........................................ 0.875
nccl_fusion_max_ops ............................. 24
nccl_fusion_threshold_mb ........................ 16
num_classes ..................................... 1000
num_devices_per_node ............................ 8
num_epochs ...................................... 1
num_nodes ....................................... 1
ofrecord_part_num ............................... 256
ofrecord_path ................................... /dataset/79846248
print_interval .................................. 100
print_timestamp ................................. False
samples_per_epoch ............................... 1281167
save_init ....................................... False
save_path ....................................... None
scale_grad ...................................... True
skip_eval ....................................... True
synthetic_data .................................. False
total_batches ................................... -1
train_batch_size ................................ 512
train_global_batch_size ......................... 4096
use_fp16 ........................................ True
use_gpu_decode .................................. True
val_batch_size .................................. 50
val_batches_per_epoch ........................... 125
val_global_batch_size ........................... 400
val_samples_per_epoch ........................... 50000
warmup_epochs ................................... 5
weight_decay .................................... 3.0517578125e-05
zero_init_residual .............................. True
-------------------- end of arguments ---------------------
***** Model Init *****
***** Model Init Finish, time elapsed: 2.96135 s *****
[rank:2] [train], epoch: 0/1, iter: 100/312, loss: 0.86721, top1: 0.00113, throughput: 423.18 | 2022-05-23 09:40:44.081
[rank:1] [train], epoch: 0/1, iter: 100/312, loss: 0.86726, top1: 0.00090, throughput: 423.20 | 2022-05-23 09:40:44.083
[rank:3] [train], epoch: 0/1, iter: 100/312, loss: 0.86737, top1: 0.00105, throughput: 423.21 | 2022-05-23 09:40:44.081
[rank:0] [train], epoch: 0/1, iter: 100/312, loss: 0.86704, top1: 0.00094, throughput: 423.21 | 2022-05-23 09:40:44.082
[rank:4] [train], epoch: 0/1, iter: 100/312, loss: 0.86730, top1: 0.00094, throughput: 423.18 | 2022-05-23 09:40:44.082
[rank:7] [train], epoch: 0/1, iter: 100/312, loss: 0.86705, top1: 0.00098, throughput: 423.20 | 2022-05-23 09:40:44.084
[rank:5] [train], epoch: 0/1, iter: 100/312, loss: 0.86721, top1: 0.00100, throughput: 423.20 | 2022-05-23 09:40:44.082
[rank:6] [train], epoch: 0/1, iter: 100/312, loss: 0.86717, top1: 0.00105, throughput: 423.12 | 2022-05-23 09:40:44.105
timestamp, name, driver_version, utilization.gpu [%], utilization.memory [%], memory.total [MiB], memory.free [MiB], memory.used [MiB]
2022/05/23 09:40:44.349, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 66 %, 32510 MiB, 7805 MiB, 24705 MiB
2022/05/23 09:40:44.357, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 75 %, 32510 MiB, 7790 MiB, 24720 MiB
2022/05/23 09:40:44.364, Tesla V100-SXM2-32GB, 470.57.02, 96 %, 87 %, 32510 MiB, 7980 MiB, 24530 MiB
2022/05/23 09:40:44.372, Tesla V100-SXM2-32GB, 470.57.02, 94 %, 85 %, 32510 MiB, 7918 MiB, 24592 MiB
2022/05/23 09:40:44.381, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 89 %, 32510 MiB, 7914 MiB, 24596 MiB
2022/05/23 09:40:44.390, Tesla V100-SXM2-32GB, 470.57.02, 86 %, 79 %, 32510 MiB, 7780 MiB, 24730 MiB
2022/05/23 09:40:44.398, Tesla V100-SXM2-32GB, 470.57.02, 93 %, 85 %, 32510 MiB, 7668 MiB, 24842 MiB
2022/05/23 09:40:44.407, Tesla V100-SXM2-32GB, 470.57.02, 74 %, 68 %, 32510 MiB, 7834 MiB, 24676 MiB
[rank:6] [train], epoch: 0/1, iter: 200/312, loss: 0.86740, top1: 0.00102, throughput: 1354.11 | 2022-05-23 09:41:21.916
[rank:7] [train], epoch: 0/1, iter: 200/312, loss: 0.86731, top1: 0.00111, throughput: 1353.37 | 2022-05-23 09:41:21.915
[rank:5] [train], epoch: 0/1, iter: 200/312, loss: 0.86736, top1: 0.00078, throughput: 1353.32 | 2022-05-23 09:41:21.915
[rank:4] [train], epoch: 0/1, iter: 200/312, loss: 0.86731, top1: 0.00102, throughput: 1353.32 | 2022-05-23 09:41:21.915
[rank:2] [train], epoch: 0/1, iter: 200/312, loss: 0.86735, top1: 0.00145, throughput: 1353.23 | 2022-05-23 09:41:21.917
[rank:1] [train], epoch: 0/1, iter: 200/312, loss: 0.86700, top1: 0.00107, throughput: 1353.29 | 2022-05-23 09:41:21.916
[rank:3] [train], epoch: 0/1, iter: 200/312, loss: 0.86712, top1: 0.00119, throughput: 1353.27 | 2022-05-23 09:41:21.915
[rank:0] [train], epoch: 0/1, iter: 200/312, loss: 0.86721, top1: 0.00121, throughput: 1352.92 | 2022-05-23 09:41:21.926
[rank:6] [train], epoch: 0/1, iter: 300/312, loss: 0.86703, top1: 0.00129, throughput: 1377.97 | 2022-05-23 09:41:59.072
[rank:1] [train], epoch: 0/1, iter: 300/312, loss: 0.86726, top1: 0.00141, throughput: 1378.08 | 2022-05-23 09:41:59.070
[rank:0] [train], epoch: 0/1, iter: 300/312, loss: 0.86731, top1: 0.00105, throughput: 1378.40 | 2022-05-23 09:41:59.071
[rank:2] [train], epoch: 0/1, iter: 300/312, loss: 0.86732, top1: 0.00125, throughput: 1377.99 | 2022-05-23 09:41:59.072
[rank:7] [train], epoch: 0/1, iter: 300/312, loss: 0.86707, top1: 0.00137, throughput: 1378.01 | 2022-05-23 09:41:59.070
[rank:5] [train], epoch: 0/1, iter: 300/312, loss: 0.86705, top1: 0.00119, throughput: 1377.63 | 2022-05-23 09:41:59.080
[rank:3] [train], epoch: 0/1, iter: 300/312, loss: 0.86713, top1: 0.00119, throughput: 1377.92 | 2022-05-23 09:41:59.073
[rank:4] [train], epoch: 0/1, iter: 300/312, loss: 0.86720, top1: 0.00115, throughput: 1377.18 | 2022-05-23 09:41:59.093
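For reference, the headline numbers in this log can be cross-checked from the arguments above. The following is a minimal sketch, not part of the benchmark script, and it assumes that the logged per-rank throughput (metric_local is True) and the linear learning-rate scaling rule with a base of 0.256 at global batch 256 apply; both assumptions are illustrative, not confirmed by the log itself.

```python
# Illustrative cross-check of the logged configuration (assumptions noted below).

train_batch_size = 512        # per-rank batch size from the arguments
num_devices_per_node = 8
num_nodes = 1

# Global batch size implied by the per-rank size and device count.
global_batch = train_batch_size * num_devices_per_node * num_nodes
assert global_batch == 4096   # matches train_global_batch_size in the log

# Assumption: each "throughput" reading is per-rank samples/s (metric_local=True),
# so the aggregate rate around iter 300 is roughly 8 * ~1378 samples/s.
per_rank_throughput = 1378.0
total_throughput = per_rank_throughput * num_devices_per_node * num_nodes
print(f"approx. aggregate throughput: {total_throughput:.0f} samples/s")  # ~11024

# Assumption: linear LR scaling from a base of 0.256 at global batch 256,
# which reproduces the logged learning_rate of 4.096.
base_lr, base_batch = 0.256, 256
print(f"scaled learning rate: {base_lr * global_batch / base_batch}")  # 4.096
```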