loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
*****************************************
Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
*****************************************
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
------------------------ arguments ------------------------
batches_per_epoch ............................... 312
channel_last .................................... True
ddp ............................................. False
exit_num ........................................ 300
fuse_bn_add_relu ................................ True
fuse_bn_relu .................................... True
gpu_stat_file ................................... None
grad_clipping ................................... 0.0
graph ........................................... True
label_smoothing ................................. 0.1
learning_rate ................................... 4.096
legacy_init ..................................... False
load_path ....................................... None
lr_decay_type ................................... cosine
metric_local .................................... True
metric_train_acc ................................ True
momentum ........................................ 0.875
nccl_fusion_max_ops ............................. 24
nccl_fusion_threshold_mb ........................ 16
num_classes ..................................... 1000
num_devices_per_node ............................ 8
num_epochs ...................................... 1
num_nodes ....................................... 1
ofrecord_part_num ............................... 256
ofrecord_path ................................... /dataset/79846248
print_interval .................................. 100
print_timestamp ................................. False
samples_per_epoch ............................... 1281167
save_init ....................................... False
save_path ....................................... None
scale_grad ...................................... True
skip_eval ....................................... True
synthetic_data .................................. False
total_batches ................................... -1
train_batch_size ................................ 512
train_global_batch_size ......................... 4096
use_fp16 ........................................ True
use_gpu_decode .................................. True
val_batch_size .................................. 50
val_batches_per_epoch ........................... 125
val_global_batch_size ........................... 400
val_samples_per_epoch ........................... 50000
warmup_epochs ................................... 5
weight_decay .................................... 3.0517578125e-05
zero_init_residual .............................. True
-------------------- end of arguments ---------------------
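The derived values in the argument dump follow from the per-device settings. A minimal sketch of that arithmetic (plain Python, not taken from the benchmark code; the linear learning-rate scaling convention with a base LR of 0.256 at batch size 256 is an assumption, not stated in the log):

    # Reproduce the derived values printed in the argument dump above.
    samples_per_epoch = 1281167            # ImageNet-1k training images
    train_batch_size = 512                 # per-device batch size
    num_devices_per_node = 8
    num_nodes = 1

    train_global_batch_size = train_batch_size * num_devices_per_node * num_nodes
    assert train_global_batch_size == 4096

    batches_per_epoch = samples_per_epoch // train_global_batch_size
    assert batches_per_epoch == 312        # matches "iter: .../312" in the log below

    val_global_batch_size = 50 * num_devices_per_node * num_nodes
    assert 50000 // val_global_batch_size == 125   # val_batches_per_epoch

    # Assumed convention (not in the log): linear LR scaling from 0.256 at batch 256.
    learning_rate = 0.256 * train_global_batch_size / 256
    assert abs(learning_rate - 4.096) < 1e-12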
***** Model Init *****
***** Model Init Finish, time elapsed: 2.96926 s *****
[rank:0] [train], epoch: 0/1, iter: 100/312, loss: 0.86726, top1: 0.00070, throughput: 421.63 | 2022-05-14 02:03:02.434
[rank:2] [train], epoch: 0/1, iter: 100/312, loss: 0.86739, top1: 0.00094, throughput: 421.63 | 2022-05-14 02:03:02.434
[rank:7] [train], epoch: 0/1, iter: 100/312, loss: 0.86719, top1: 0.00086, throughput: 421.66 | 2022-05-14 02:03:02.434
[rank:3] [train], epoch: 0/1, iter: 100/312, loss: 0.86712, top1: 0.00082, throughput: 421.63 | 2022-05-14 02:03:02.435
[rank:5] [train], epoch: 0/1, iter: 100/312, loss: 0.86711, top1: 0.00076, throughput: 421.65 | 2022-05-14 02:03:02.435
[rank:4] [train], epoch: 0/1, iter: 100/312, loss: 0.86714, top1: 0.00102, throughput: 421.62 | 2022-05-14 02:03:02.435
[rank:6] [train], epoch: 0/1, iter: 100/312, loss: 0.86730, top1: 0.00096, throughput: 421.63 | 2022-05-14 02:03:02.434
[rank:1] [train], epoch: 0/1, iter: 100/312, loss: 0.86727, top1: 0.00068, throughput: 421.64 | 2022-05-14 02:03:02.437
timestamp, name, driver_version, utilization.gpu [%], utilization.memory [%], memory.total [MiB], memory.free [MiB], memory.used [MiB]
2022/05/14 02:03:02.672, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 70 %, 32510 MiB, 7791 MiB, 24719 MiB
2022/05/14 02:03:02.679, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 65 %, 32510 MiB, 7776 MiB, 24734 MiB
2022/05/14 02:03:02.684, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 73 %, 32510 MiB, 7972 MiB, 24538 MiB
2022/05/14 02:03:02.691, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 63 %, 32510 MiB, 7892 MiB, 24618 MiB
2022/05/14 02:03:02.699, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 70 %, 32510 MiB, 7904 MiB, 24606 MiB
2022/05/14 02:03:02.707, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 68 %, 32510 MiB, 7816 MiB, 24694 MiB
2022/05/14 02:03:02.716, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 71 %, 32510 MiB, 7664 MiB, 24846 MiB
2022/05/14 02:03:02.724, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 87 %, 32510 MiB, 7814 MiB, 24696 MiB
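The per-GPU readings above use the same CSV layout that an nvidia-smi query produces. A small reproduction sketch (plain Python; assumes nvidia-smi is on PATH, and is not necessarily the exact command the benchmark's GPU-stat logging invokes):

    import subprocess

    # Query the same fields as the CSV header above; prints one header line
    # plus one row per visible GPU.
    fields = ("timestamp,name,driver_version,utilization.gpu,utilization.memory,"
              "memory.total,memory.free,memory.used")
    result = subprocess.run(
        ["nvidia-smi", f"--query-gpu={fields}", "--format=csv"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)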
[rank:0] [train], epoch: 0/1, iter: 200/312, loss: 0.86707, top1: 0.00107, throughput: 1341.20 | 2022-05-14 02:03:40.609
[rank:6] [train], epoch: 0/1, iter: 200/312, loss: 0.86731, top1: 0.00078, throughput: 1341.17 | 2022-05-14 02:03:40.610
[rank:1] [train], epoch: 0/1, iter: 200/312, loss: 0.86709, top1: 0.00086, throughput: 1341.30 | 2022-05-14 02:03:40.609
[rank:2] [train], epoch: 0/1, iter: 200/312, loss: 0.86721, top1: 0.00084, throughput: 1341.18 | 2022-05-14 02:03:40.610
[rank:7] [train], epoch: 0/1, iter: 200/312, loss: 0.86714, top1: 0.00100, throughput: 1341.18 | 2022-05-14 02:03:40.609
[rank:3] [train], epoch: 0/1, iter: 200/312, loss: 0.86695, top1: 0.00109, throughput: 1341.18 | 2022-05-14 02:03:40.610
[rank:5] [train], epoch: 0/1, iter: 200/312, loss: 0.86713, top1: 0.00084, throughput: 1341.10 | 2022-05-14 02:03:40.612
[rank:4] [train], epoch: 0/1, iter: 200/312, loss: 0.86703, top1: 0.00113, throughput: 1341.16 | 2022-05-14 02:03:40.611
[rank:0] [train], epoch: 0/1, iter: 300/312, loss: 0.86736, top1: 0.00074, throughput: 1376.32 | 2022-05-14 02:04:17.809
[rank:3] [train], epoch: 0/1, iter: 300/312, loss: 0.86710, top1: 0.00123, throughput: 1376.34 | 2022-05-14 02:04:17.810
[rank:6] [train], epoch: 0/1, iter: 300/312, loss: 0.86713, top1: 0.00115, throughput: 1376.34 | 2022-05-14 02:04:17.810
[rank:4] [train], epoch: 0/1, iter: 300/312, loss: 0.86694, top1: 0.00088, throughput: 1376.36 | 2022-05-14 02:04:17.811
[rank:2] [train], epoch: 0/1, iter: 300/312, loss: 0.86707, top1: 0.00127, throughput: 1376.27 | 2022-05-14 02:04:17.812
[rank:7] [train], epoch: 0/1, iter: 300/312, loss: 0.86719, top1: 0.00107, throughput: 1376.32 | 2022-05-14 02:04:17.810
[rank:5] [train], epoch: 0/1, iter: 300/312, loss: 0.86724, top1: 0.00074, throughput: 1376.42 | 2022-05-14 02:04:17.810
[rank:1] [train], epoch: 0/1, iter: 300/312, loss: 0.86701, top1: 0.00113, throughput: 1376.29 | 2022-05-14 02:04:17.810
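The printed throughput is consistent with per-rank images per second averaged over the last print_interval iterations (metric_local is True in the arguments above). A quick check using the rank:0 timestamps (plain Python sketch, not from the benchmark code; the 8x aggregate is only an estimate):

    from datetime import datetime

    # Wall time between the iter-200 and iter-300 prints on rank 0.
    t200 = datetime.fromisoformat("2022-05-14 02:03:40.609")
    t300 = datetime.fromisoformat("2022-05-14 02:04:17.809")
    elapsed = (t300 - t200).total_seconds()          # 37.2 s for 100 iterations

    print_interval = 100
    train_batch_size = 512                           # per-device batch size
    throughput = print_interval * train_batch_size / elapsed
    print(f"{throughput:.2f} img/s per rank")        # ~1376, close to the printed 1376.32
    print(f"{8 * throughput:.0f} img/s across 8 ranks (estimate)")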