loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
*****************************************
Setting OMP_NUM_THREADS environment variable for each process to 1 by default, to avoid overloading your system; please further tune the variable for optimal performance in your application as needed.
*****************************************
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
loaded library: /usr/lib/x86_64-linux-gnu/libibverbs.so.1
W20220428 01:41:39.022450 404 rpc_client.cpp:190] LoadServer 10.7.183.15 Failed at 0 times error_code 14 error_message failed to connect to all addresses
W20220428 01:41:39.022454 405 rpc_client.cpp:190] LoadServer 10.7.183.15 Failed at 0 times error_code 14 error_message failed to connect to all addresses
W20220428 01:41:39.022447 403 rpc_client.cpp:190] LoadServer 10.7.183.15 Failed at 0 times error_code 14 error_message failed to connect to all addresses
------------------------ arguments ------------------------
batches_per_epoch ............................... 312
channel_last .................................... True
ddp ............................................. False
exit_num ........................................ 300
fuse_bn_add_relu ................................ True
fuse_bn_relu .................................... True
gpu_stat_file ................................... None
grad_clipping ................................... 0.0
graph ........................................... True
label_smoothing ................................. 0.1
learning_rate ................................... 4.096
legacy_init ..................................... False
load_path ....................................... None
lr_decay_type ................................... cosine
metric_local .................................... True
metric_train_acc ................................ True
momentum ........................................ 0.875
nccl_fusion_max_ops ............................. 24
nccl_fusion_threshold_mb ........................ 16
num_classes ..................................... 1000
num_devices_per_node ............................ 8
num_epochs ...................................... 1
num_nodes ....................................... 1
ofrecord_part_num ............................... 256
ofrecord_path ................................... /dataset/79846248
print_interval .................................. 100
print_timestamp ................................. False
samples_per_epoch ............................... 1281167
save_init ....................................... False
save_path ....................................... None
scale_grad ...................................... True
skip_eval ....................................... True
synthetic_data .................................. False
total_batches ................................... -1
train_batch_size ................................ 512
train_global_batch_size ......................... 4096
use_fp16 ........................................ True
use_gpu_decode .................................. True
val_batch_size .................................. 50
val_batches_per_epoch ........................... 125
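The derived values in this dump are internally consistent with the usual linear learning-rate scaling convention for ResNet-50. A minimal sketch of the arithmetic, assuming a base learning rate of 0.256 per 256 samples (a common ResNet-50 convention, not something printed in this log):

```python
# Consistency check of the derived values in the argument dump above.
# The 0.256-per-256-samples base learning rate is an assumption (a common
# ResNet-50 convention); the script's actual scaling rule is not shown here.
train_batch_size = 512          # per-rank batch size (from the dump)
num_devices_per_node = 8
num_nodes = 1

train_global_batch_size = train_batch_size * num_devices_per_node * num_nodes
assert train_global_batch_size == 4096          # matches the dump

base_lr, base_batch = 0.256, 256                # assumed linear-scaling reference
assert abs(base_lr * train_global_batch_size / base_batch - 4.096) < 1e-9

samples_per_epoch = 1281167                     # ImageNet-1k training set
assert samples_per_epoch // train_global_batch_size == 312   # batches_per_epoch
```

The same bookkeeping holds on the validation side: 50 samples per rank times 8 ranks gives val_global_batch_size 400, and 50000 / 400 gives the 125 validation batches per epoch.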
val_global_batch_size ........................... 400
val_samples_per_epoch ........................... 50000
warmup_epochs ................................... 5
weight_decay .................................... 3.0517578125e-05
zero_init_residual .............................. True
-------------------- end of arguments ---------------------
***** Model Init *****
***** Model Init Finish, time elapsed: 2.86573 s *****
[rank:7] [train], epoch: 0/1, iter: 100/312, loss: 0.86703, top1: 0.00092, throughput: 433.41 | 2022-04-28 01:43:52.269
[rank:5] [train], epoch: 0/1, iter: 100/312, loss: 0.86691, top1: 0.00104, throughput: 433.40 | 2022-04-28 01:43:52.268
[rank:2] [train], epoch: 0/1, iter: 100/312, loss: 0.86681, top1: 0.00092, throughput: 433.42 | 2022-04-28 01:43:52.268
[rank:0] [train], epoch: 0/1, iter: 100/312, loss: 0.86692, top1: 0.00088, throughput: 433.41 | 2022-04-28 01:43:52.271
[rank:1] [train], epoch: 0/1, iter: 100/312, loss: 0.86706, top1: 0.00086, throughput: 433.40 | 2022-04-28 01:43:52.271
[rank:3] [train], epoch: 0/1, iter: 100/312, loss: 0.86708, top1: 0.00084, throughput: 433.42 | 2022-04-28 01:43:52.269
[rank:4] [train], epoch: 0/1, iter: 100/312, loss: 0.86697, top1: 0.00111, throughput: 433.35 | 2022-04-28 01:43:52.283
[rank:6] [train], epoch: 0/1, iter: 100/312, loss: 0.86682, top1: 0.00100, throughput: 433.31 | 2022-04-28 01:43:52.297
timestamp, name, driver_version, utilization.gpu [%], utilization.memory [%], memory.total [MiB], memory.free [MiB], memory.used [MiB]
2022/04/28 01:43:52.492, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 74 %, 32510 MiB, 7793 MiB, 24717 MiB
2022/04/28 01:43:52.498, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 86 %, 32510 MiB, 7802 MiB, 24708 MiB
2022/04/28 01:43:52.501, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 49 %, 32510 MiB, 7970 MiB, 24540 MiB
2022/04/28 01:43:52.508, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 74 %, 32510 MiB, 7898 MiB, 24612 MiB
2022/04/28 01:43:52.528, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 72 %, 32510 MiB, 7914 MiB, 24596 MiB
2022/04/28 01:43:52.536, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 81 %, 32510 MiB, 7814 MiB, 24696 MiB
2022/04/28 01:43:52.543, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 88 %, 32510 MiB, 7666 MiB, 24844 MiB
2022/04/28 01:43:52.551, Tesla V100-SXM2-32GB, 470.57.02, 100 %, 52 %, 32510 MiB, 7850 MiB, 24660 MiB
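The CSV block above matches the output format of an nvidia-smi GPU query. The exact monitoring command used by the training run is not recorded in this log; the sketch below is an assumed invocation that produces the same header and columns.

```python
# Hedged sketch: reproduce the GPU-stat CSV format seen above with nvidia-smi.
# The actual monitoring mechanism used by the training script is not shown in
# this log; this is just one query that yields the same columns.
import subprocess

QUERY_FIELDS = (
    "timestamp,name,driver_version,"
    "utilization.gpu,utilization.memory,"
    "memory.total,memory.free,memory.used"
)

def query_gpu_stats() -> str:
    """Return one CSV header plus one row per visible GPU."""
    return subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY_FIELDS}", "--format=csv"],
        capture_output=True, text=True, check=True,
    ).stdout

if __name__ == "__main__":
    print(query_gpu_stats())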
[rank:4] [train], epoch: 0/1, iter: 200/312, loss: 0.86704, top1: 0.00098, throughput: 1304.64 | 2022-04-28 01:44:31.527
[rank:1] [train], epoch: 0/1, iter: 200/312, loss: 0.86707, top1: 0.00057, throughput: 1304.27 | 2022-04-28 01:44:31.526
[rank:2] [train], epoch: 0/1, iter: 200/312, loss: 0.86682, top1: 0.00107, throughput: 1304.17 | 2022-04-28 01:44:31.527
[rank:3] [train], epoch: 0/1, iter: 200/312, loss: 0.86699, top1: 0.00086, throughput: 1304.15 | 2022-04-28 01:44:31.528
[rank:6] [train], epoch: 0/1, iter: 200/312, loss: 0.86697, top1: 0.00098, throughput: 1305.14 | 2022-04-28 01:44:31.527
[rank:0] [train], epoch: 0/1, iter: 200/312, loss: 0.86710, top1: 0.00096, throughput: 1304.24 | 2022-04-28 01:44:31.528
[rank:5] [train], epoch: 0/1, iter: 200/312, loss: 0.86719, top1: 0.00084, throughput: 1304.02 | 2022-04-28 01:44:31.531
[rank:7] [train], epoch: 0/1, iter: 200/312, loss: 0.86711, top1: 0.00102, throughput: 1304.21 | 2022-04-28 01:44:31.527
[rank:4] [train], epoch: 0/1, iter: 300/312, loss: 0.86676, top1: 0.00107, throughput: 1313.88 | 2022-04-28 01:45:10.496
[rank:1] [train], epoch: 0/1, iter: 300/312, loss: 0.86696, top1: 0.00102, throughput: 1313.85 | 2022-04-28 01:45:10.496
[rank:3] [train], epoch: 0/1, iter: 300/312, loss: 0.86675, top1: 0.00105, throughput: 1313.97 | 2022-04-28 01:45:10.494
[rank:2] [train], epoch: 0/1, iter: 300/312, loss: 0.86691, top1: 0.00088, throughput: 1313.89 | 2022-04-28 01:45:10.495
[rank:6] [train], epoch: 0/1, iter: 300/312, loss: 0.86669, top1: 0.00100, throughput: 1313.81 | 2022-04-28 01:45:10.497
[rank:0] [train], epoch: 0/1, iter: 300/312, loss: 0.86671, top1: 0.00121, throughput: 1313.77 | 2022-04-28 01:45:10.499
[rank:7] [train], epoch: 0/1, iter: 300/312, loss: 0.86678, top1: 0.00092, throughput: 1313.74 | 2022-04-28 01:45:10.500
[rank:5] [train], epoch: 0/1, iter: 300/312, loss: 0.86672, top1: 0.00074, throughput: 1308.15 | 2022-04-28 01:45:10.671
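The per-rank throughput figures are consistent with samples processed per 100-iteration reporting window divided by that window's wall time; the first window is much slower (~433 samples/s per rank), presumably because it includes graph compilation and data-pipeline warmup. A rough cross-check, assuming that definition:

```python
# Rough cross-check of the reported per-rank throughput, assuming it is
# samples processed in a 100-iteration window divided by the window's wall time.
train_batch_size = 512      # per rank, from the argument dump
print_interval = 100

# rank:0 wall-clock timestamps from the log, as seconds past 01:44:00.
t_iter200 = 31.528          # 01:44:31.528
t_iter300 = 60 + 10.499     # 01:45:10.499
window_seconds = t_iter300 - t_iter200            # ~38.97 s

samples_per_window = print_interval * train_batch_size     # 51,200 per rank
print(samples_per_window / window_seconds)        # ~1313.8; log reports 1313.77
print(8 * samples_per_window / window_seconds)    # ~10,510 samples/s across 8 ranks
```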