loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
*****************************************
Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
*****************************************
------------------------ arguments ------------------------
batches_per_epoch ............................... 312
channel_last .................................... True
ddp ............................................. False
exit_num ........................................ 300
fuse_bn_add_relu ................................ True
fuse_bn_relu .................................... True
gpu_stat_file ................................... None
grad_clipping ................................... 0.0
graph ........................................... True
label_smoothing ................................. 0.1
learning_rate ................................... 4.096
legacy_init ..................................... False
load_path ....................................... None
lr_decay_type ................................... cosine
metric_local .................................... True
metric_train_acc ................................ True
momentum ........................................ 0.875
nccl_fusion_max_ops ............................. 24
nccl_fusion_threshold_mb ........................ 16
num_classes ..................................... 1000
num_devices_per_node ............................ 8
num_epochs ...................................... 1
num_nodes ....................................... 1
ofrecord_part_num ............................... 256
ofrecord_path ................................... /dataset/79846248
print_interval .................................. 100
print_timestamp ................................. False
samples_per_epoch ............................... 1281167
save_init ....................................... False
save_path ....................................... None
scale_grad ...................................... True
skip_eval ....................................... True
synthetic_data .................................. False
total_batches ................................... -1
train_batch_size ................................ 512
train_global_batch_size ......................... 4096
use_fp16 ........................................ True
use_gpu_decode .................................. False
val_batch_size .................................. 50
val_batches_per_epoch ........................... 125
val_global_batch_size ........................... 400
val_samples_per_epoch ........................... 50000
warmup_epochs ................................... 5
weight_decay .................................... 3.0517578125e-05
zero_init_residual .............................. True
-------------------- end of arguments ---------------------
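For reference, the global batch settings above follow directly from the per-device values: 512 samples per device across 8 devices on 1 node gives the global batch size of 4096, and 1281167 training samples divided by that gives the 312 batches per epoch. A minimal arithmetic sketch (variable names are illustrative, not taken from the training script):

    # Illustrative check of the derived batch settings printed above.
    train_batch_size = 512           # per-device batch size
    num_devices_per_node = 8
    num_nodes = 1
    samples_per_epoch = 1281167      # ImageNet-1k training set size

    train_global_batch_size = train_batch_size * num_devices_per_node * num_nodes
    batches_per_epoch = samples_per_epoch // train_global_batch_size

    assert train_global_batch_size == 4096   # matches the printed argument
    assert batches_per_epoch == 312          # matches the printed argument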
***** Model Init *****
***** Model Init Finish, time elapsed: 3.15909 s *****
[rank:7] [train], epoch: 0/1, iter: 100/312, loss: 0.86764, top1: 0.00131, throughput: 447.51 | 2022-05-23 09:48:26.436
[rank:2] [train], epoch: 0/1, iter: 100/312, loss: 0.86771, top1: 0.00123, throughput: 447.50 | 2022-05-23 09:48:26.434
[rank:1] [train], epoch: 0/1, iter: 100/312, loss: 0.86744, top1: 0.00129, throughput: 447.50 | 2022-05-23 09:48:26.436
[rank:4] [train], epoch: 0/1, iter: 100/312, loss: 0.86752, top1: 0.00121, throughput: 447.49 | 2022-05-23 09:48:26.436
[rank:6] [train], epoch: 0/1, iter: 100/312, loss: 0.86735, top1: 0.00141, throughput: 447.50 | 2022-05-23 09:48:26.436
[rank:3] [train], epoch: 0/1, iter: 100/312, loss: 0.86789, top1: 0.00129, throughput: 447.51 | 2022-05-23 09:48:26.437
[rank:5] [train], epoch: 0/1, iter: 100/312, loss: 0.86761, top1: 0.00162, throughput: 447.51 | 2022-05-23 09:48:26.437
[rank:0] [train], epoch: 0/1, iter: 100/312, loss: 0.86769, top1: 0.00137, throughput: 447.53 | 2022-05-23 09:48:26.438
timestamp, name, driver_version, utilization.gpu [%], utilization.memory [%], memory.total [MiB], memory.free [MiB], memory.used [MiB]
2022/05/23 09:48:26.728, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 58 %, 32510 MiB, 8555 MiB, 23955 MiB
2022/05/23 09:48:26.735, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 50 %, 32510 MiB, 8562 MiB, 23948 MiB
2022/05/23 09:48:26.741, Tesla V100-SXM2-32GB, 470.57.02, 98 %, 81 %, 32510 MiB, 8716 MiB, 23794 MiB
2022/05/23 09:48:26.747, Tesla V100-SXM2-32GB, 470.57.02, 99 %, 78 %, 32510 MiB, 8668 MiB, 23842 MiB
2022/05/23 09:48:26.755, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 48 %, 32510 MiB, 8684 MiB, 23826 MiB
2022/05/23 09:48:26.768, Tesla V100-SXM2-32GB, 470.57.02, 97 %, 86 %, 32510 MiB, 8548 MiB, 23962 MiB
2022/05/23 09:48:26.776, Tesla V100-SXM2-32GB, 470.57.02, 92 %, 80 %, 32510 MiB, 8436 MiB, 24074 MiB
2022/05/23 09:48:26.791, Tesla V100-SXM2-32GB, 470.57.02, 99 %, 85 %, 32510 MiB, 8616 MiB, 23894 MiB
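The per-GPU snapshot above has the column layout of an nvidia-smi CSV query. A minimal sketch of how a comparable snapshot could be collected from Python (assuming nvidia-smi is on PATH; this is illustrative, not the monitoring code used by the benchmark script):

    import subprocess

    # Query the same fields that appear in the header above; --format=csv
    # prints one header line followed by one line per visible GPU.
    QUERY_FIELDS = (
        "timestamp,name,driver_version,"
        "utilization.gpu,utilization.memory,"
        "memory.total,memory.free,memory.used"
    )

    def gpu_snapshot():
        out = subprocess.run(
            ["nvidia-smi", f"--query-gpu={QUERY_FIELDS}", "--format=csv"],
            capture_output=True, text=True, check=True,
        ).stdout
        header, *rows = [line.strip() for line in out.splitlines() if line.strip()]
        return header, rows

    if __name__ == "__main__":
        header, rows = gpu_snapshot()
        print(header)
        for row in rows:   # one line per GPU
            print(row)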
[rank:0] [train], epoch: 0/1, iter: 200/312, loss: 0.86772, top1: 0.00131, throughput: 1338.07 | 2022-05-23 09:49:04.702
[rank:7] [train], epoch: 0/1, iter: 200/312, loss: 0.86769, top1: 0.00139, throughput: 1338.04 | 2022-05-23 09:49:04.700
[rank:1] [train], epoch: 0/1, iter: 200/312, loss: 0.86745, top1: 0.00168, throughput: 1338.02 | 2022-05-23 09:49:04.702
[rank:5] [train], epoch: 0/1, iter: 200/312, loss: 0.86760, top1: 0.00133, throughput: 1338.02 | 2022-05-23 09:49:04.702
[rank:2] [train], epoch: 0/1, iter: 200/312, loss: 0.86745, top1: 0.00166, throughput: 1337.91 | 2022-05-23 09:49:04.703
[rank:4] [train], epoch: 0/1, iter: 200/312, loss: 0.86762, top1: 0.00152, throughput: 1338.00 | 2022-05-23 09:49:04.702
[rank:3] [train], epoch: 0/1, iter: 200/312, loss: 0.86776, top1: 0.00143, throughput: 1337.99 | 2022-05-23 09:49:04.703
[rank:6] [train], epoch: 0/1, iter: 200/312, loss: 0.86750, top1: 0.00145, throughput: 1337.78 | 2022-05-23 09:49:04.708
[rank:2] [train], epoch: 0/1, iter: 300/312, loss: 0.86738, top1: 0.00137, throughput: 1350.72 | 2022-05-23 09:49:42.609
[rank:5] [train], epoch: 0/1, iter: 300/312, loss: 0.86753, top1: 0.00164, throughput: 1350.68 | 2022-05-23 09:49:42.609
[rank:4] [train], epoch: 0/1, iter: 300/312, loss: 0.86756, top1: 0.00170, throughput: 1350.65 | 2022-05-23 09:49:42.610
[rank:0] [train], epoch: 0/1, iter: 300/312, loss: 0.86756, top1: 0.00143, throughput: 1350.55 | 2022-05-23 09:49:42.612
[rank:6] [train], epoch: 0/1, iter: 300/312, loss: 0.86757, top1: 0.00150, throughput: 1350.81 | 2022-05-23 09:49:42.612
[rank:1] [train], epoch: 0/1, iter: 300/312, loss: 0.86785, top1: 0.00152, throughput: 1350.64 | 2022-05-23 09:49:42.610
[rank:3] [train], epoch: 0/1, iter: 300/312, loss: 0.86749, top1: 0.00146, throughput: 1350.60 | 2022-05-23 09:49:42.612
[rank:7] [train], epoch: 0/1, iter: 300/312, loss: 0.86773, top1: 0.00137, throughput: 1350.55 | 2022-05-23 09:49:42.611
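Each rank reports its own throughput (metric_local is True above), so a rough node-level rate at a given iteration is the sum across ranks; at iteration 300, the eight values of roughly 1350 samples/s add up to about 10,800 samples/s. A minimal parsing sketch, assuming the per-rank log lines keep the format shown above and the file name resnet50_train.log is illustrative:

    import re
    from collections import defaultdict

    # Matches lines like:
    # [rank:0] [train], epoch: 0/1, iter: 300/312, loss: 0.86756, top1: 0.00143, throughput: 1350.55 | 2022-05-23 09:49:42.612
    LINE_RE = re.compile(
        r"\[rank:(\d+)\] \[train\].*?iter: (\d+)/\d+.*?throughput: ([\d.]+)"
    )

    def aggregate_throughput(log_path):
        per_iter = defaultdict(dict)          # iter -> {rank: local samples/s}
        with open(log_path) as f:
            for line in f:
                m = LINE_RE.search(line)
                if m:
                    rank, it, thr = int(m.group(1)), int(m.group(2)), float(m.group(3))
                    per_iter[it][rank] = thr
        # Sum the rank-local rates to estimate total samples/s on the node.
        return {it: sum(ranks.values()) for it, ranks in sorted(per_iter.items())}

    if __name__ == "__main__":
        for it, total in aggregate_throughput("resnet50_train.log").items():
            print(f"iter {it}: ~{total:.0f} samples/s across ranks")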