loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
------------------------ arguments ------------------------
  batches_per_epoch ............................... 5004
  channel_last .................................... False
  ddp ............................................. True
  exit_num ........................................ 300
  fuse_bn_add_relu ................................ False
  fuse_bn_relu .................................... False
  gpu_stat_file ................................... None
  grad_clipping ................................... 0.0
  graph ........................................... False
  label_smoothing ................................. 0.1
  learning_rate ................................... 0.256
  legacy_init ..................................... False
  load_path ....................................... None
  lr_decay_type ................................... cosine
  metric_local .................................... True
  metric_train_acc ................................ True
  momentum ........................................ 0.875
  nccl_fusion_max_ops ............................. 24
  nccl_fusion_threshold_mb ........................ 16
  num_classes ..................................... 1000
  num_devices_per_node ............................ 1
  num_epochs ...................................... 1
  num_nodes ....................................... 1
  ofrecord_part_num ............................... 256
  ofrecord_path ................................... /dataset/79846248
  print_interval .................................. 100
  print_timestamp ................................. False
  samples_per_epoch ............................... 1281167
  save_init ....................................... False
  save_path ....................................... None
  scale_grad ...................................... False
  skip_eval ....................................... True
  synthetic_data .................................. False
  total_batches ................................... -1
  train_batch_size ................................ 256
  train_global_batch_size ......................... 256
  use_fp16 ........................................ False
  use_gpu_decode .................................. False
  val_batch_size .................................. 50
  val_batches_per_epoch ........................... 1000
  val_global_batch_size ........................... 50
  val_samples_per_epoch ........................... 50000
  warmup_epochs ................................... 5
  weight_decay .................................... 3.0517578125e-05
  zero_init_residual .............................. True
-------------------- end of arguments ---------------------
***** Model Init *****
***** Model Init Finish, time escapled: 1.81012 s *****
[rank:0] [train], epoch: 0/1, iter: 100/5004, loss: 6.94049, lr: 0.000000, top1: 0.00141, throughput: 311.52 | 2022-05-22 01:50:46.535
timestamp, name, driver_version, utilization.gpu [%], utilization.memory [%], memory.total [MiB], memory.free [MiB], memory.used [MiB]
2022/05/22 01:50:46.643, Tesla V100-SXM2-32GB, 470.57.02, 45 %, 33 %, 32510 MiB, 5444 MiB, 27066 MiB
[rank:0] [train], epoch: 0/1, iter: 200/5004, loss: 6.94044, lr: 0.000000, top1: 0.00109, throughput: 318.75 | 2022-05-22 01:52:06.849
[rank:0] [train], epoch: 0/1, iter: 300/5004, loss: 6.93898, lr: 0.000000, top1: 0.00148, throughput: 319.75 | 2022-05-22 01:53:26.911
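
The throughput column (samples/s) can be cross-checked from the per-100-iteration wall-clock timestamps and train_batch_size in the arguments dump. The following is a minimal sketch for that arithmetic, not part of the benchmark scripts; the parse_ts helper and variable names are chosen here for illustration, and the timestamps are copied from the log above.

from datetime import datetime

def parse_ts(s: str) -> datetime:
    # Log timestamps look like "2022-05-22 01:52:06.849"
    return datetime.strptime(s, "%Y-%m-%d %H:%M:%S.%f")

train_batch_size = 256   # from the arguments dump above
print_interval = 100     # iterations between consecutive [train] log lines

t_iter100 = parse_ts("2022-05-22 01:50:46.535")
t_iter200 = parse_ts("2022-05-22 01:52:06.849")

elapsed = (t_iter200 - t_iter100).total_seconds()          # ~80.31 s
throughput = train_batch_size * print_interval / elapsed   # ~318.7 samples/s
print(f"elapsed: {elapsed:.3f} s, throughput: {throughput:.2f} samples/s")
# Agrees with the logged "throughput: 318.75" for iter 200.

The same calculation for iter 200 to iter 300 (80.062 s for 25600 samples) gives roughly 319.8 samples/s, matching the logged 319.75.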