The module torch.distributed.launch is deprecated and going to be removed in future. Migrate to torch.distributed.run.
WARNING:torch.distributed.run:--use_env is deprecated and will be removed in future releases. Please read local_rank from `os.environ["LOCAL_RANK"]` instead.
INFO:torch.distributed.launcher.api:Starting elastic_operator with launch configs:
  entrypoint       : pretrain_gpt.py
  min_nodes        : 1
  max_nodes        : 1
  nproc_per_node   : 8
  run_id           : none
  rdzv_backend     : static
  rdzv_endpoint    : 127.0.0.1:6000
  rdzv_configs     : {'rank': 0, 'timeout': 900}
  max_restarts     : 3
  monitor_interval : 5
  log_dir          : None
  metrics_cfg      : {}
INFO:torch.distributed.elastic.agent.server.local_elastic_agent:log directory set to: /tmp/torchelastic_zzsl5qrt/none_9evqq3g8
INFO:torch.distributed.elastic.agent.server.api:[default] starting workers for entrypoint: python
INFO:torch.distributed.elastic.agent.server.api:[default] Rendezvous'ing worker group
/opt/conda/lib/python3.8/site-packages/torch/distributed/elastic/utils/store.py:52: FutureWarning: This is an experimental API and will be changed in future.
  warnings.warn(
INFO:torch.distributed.elastic.agent.server.api:[default] Rendezvous complete for workers.
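The two deprecation warnings above are actionable: under torch.distributed.run (torchrun), the injected `--local_rank` argument goes away and each worker reads its rank information from the environment instead. A minimal sketch of the migrated pattern (variable names are illustrative; the `.get()` defaults let the same script also run as a single standalone process):

```python
import os

# torchrun / torch.distributed.run exports these variables to every worker.
local_rank = int(os.environ.get("LOCAL_RANK", "0"))   # GPU index on this node
global_rank = int(os.environ.get("RANK", "0"))        # rank across all nodes
world_size = int(os.environ.get("WORLD_SIZE", "1"))   # total number of workers
```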
Result:
  restart_count=0
  master_addr=127.0.0.1
  master_port=6000
  group_rank=0
  group_world_size=1
  local_ranks=[0, 1, 2, 3, 4, 5, 6, 7]
  role_ranks=[0, 1, 2, 3, 4, 5, 6, 7]
  global_ranks=[0, 1, 2, 3, 4, 5, 6, 7]
  role_world_sizes=[8, 8, 8, 8, 8, 8, 8, 8]
  global_world_sizes=[8, 8, 8, 8, 8, 8, 8, 8]
INFO:torch.distributed.elastic.agent.server.api:[default] Starting worker group
INFO:torch.distributed.elastic.multiprocessing:Setting worker0 reply file to: /tmp/torchelastic_zzsl5qrt/none_9evqq3g8/attempt_0/0/error.json
INFO:torch.distributed.elastic.multiprocessing:Setting worker1 reply file to: /tmp/torchelastic_zzsl5qrt/none_9evqq3g8/attempt_0/1/error.json
INFO:torch.distributed.elastic.multiprocessing:Setting worker2 reply file to: /tmp/torchelastic_zzsl5qrt/none_9evqq3g8/attempt_0/2/error.json
INFO:torch.distributed.elastic.multiprocessing:Setting worker3 reply file to: /tmp/torchelastic_zzsl5qrt/none_9evqq3g8/attempt_0/3/error.json
INFO:torch.distributed.elastic.multiprocessing:Setting worker4 reply file to: /tmp/torchelastic_zzsl5qrt/none_9evqq3g8/attempt_0/4/error.json
INFO:torch.distributed.elastic.multiprocessing:Setting worker5 reply file to: /tmp/torchelastic_zzsl5qrt/none_9evqq3g8/attempt_0/5/error.json
INFO:torch.distributed.elastic.multiprocessing:Setting worker6 reply file to: /tmp/torchelastic_zzsl5qrt/none_9evqq3g8/attempt_0/6/error.json
INFO:torch.distributed.elastic.multiprocessing:Setting worker7 reply file to: /tmp/torchelastic_zzsl5qrt/none_9evqq3g8/attempt_0/7/error.json
using world size: 8, data-parallel-size: 2, tensor-model-parallel size: 1, pipeline-model-parallel size: 4
using torch.float16 for parameters ...
Persistent fused layer norm kernel is supported from pytorch v1.11 (nvidia pytorch container paired with v1.11). Defaulting to no_persist_layer_norm=True
------------------------ arguments ------------------------
  accumulate_allreduce_grads_in_fp32 .............. False
  activations_checkpoint_method ................... uniform
  activations_checkpoint_num_layers ............... 1
  adam_beta1 ...................................... 0.9
  adam_beta2 ...................................... 0.999
  adam_eps ........................................ 1e-08
  adlr_autoresume ................................. False
  adlr_autoresume_interval ........................ 1000
  apply_query_key_layer_scaling ................... True
  apply_residual_connection_post_layernorm ........ False
  attention_dropout ............................... 0.1
  attention_softmax_in_fp32 ....................... False
  bert_binary_head ................................ True
  bert_load ....................................... None
  bf16 ............................................ False
  bias_dropout_fusion ............................. True
  bias_gelu_fusion ................................ True
  biencoder_projection_dim ........................ 0
  biencoder_shared_query_context_model ............ False
  block_data_path ................................. None
  classes_fraction ................................ 1.0
  clip_grad ....................................... 1.0
  consumed_train_samples .......................... 0
  consumed_valid_samples .......................... 0
  data_impl ....................................... mmap
  data_parallel_random_init ....................... False
  data_parallel_size .............................. 2
  data_path ....................................... ['/dataset/source/dataset/loss_compara_content_sentence']
  data_per_class_fraction ......................... 1.0
  data_sharding ................................... True
  dataloader_type ................................. single
  DDP_impl ........................................ local
  decoder_seq_length .............................. None
  distribute_checkpointed_activations ............. False
  distributed_backend ............................. nccl
  embedding_path .................................. None
  empty_unused_memory_level ....................... 0
  encoder_seq_length .............................. 1024
  eod_mask_loss ................................... False
  eval_interval ................................... 1000
  eval_iters ...................................... 10
  evidence_data_path .............................. None
  exit_duration_in_mins ........................... None
  exit_interval ................................... None
  exit_signal_handler ............................. False
  ffn_hidden_size ................................. 4096
  finetune ........................................ False
  fp16 ............................................ True
  fp16_lm_cross_entropy ........................... False
  fp32_residual_connection ........................ False
  global_batch_size ............................... 512
  hidden_dropout .................................. 0.1
  hidden_size ..................................... 1024
  hysteresis ...................................... 2
  ict_head_size ................................... None
  ict_load ........................................ None
  img_h ........................................... 224
  img_w ........................................... 224
  indexer_batch_size .............................. 128
  indexer_log_interval ............................ 1000
  inference_batch_times_seqlen_threshold .......... 512
  init_method_std ................................. 0.02
  init_method_xavier_uniform ...................... False
  initial_loss_scale .............................. 4294967296
  kv_channels ..................................... 64
  layernorm_epsilon ............................... 1e-05
  lazy_mpu_init ................................... None
  load ............................................ None
  local_rank ...................................... 0
  log_batch_size_to_tensorboard ................... False
  log_interval .................................... 100
  log_learning_rate_to_tensorboard ................ True
  log_loss_scale_to_tensorboard ................... True
  log_memory_to_tensorboard ....................... False
  log_num_zeros_in_grad ........................... False
  log_params_norm ................................. False
  log_timers_to_tensorboard ....................... False
  log_validation_ppl_to_tensorboard ............... False
  log_world_size_to_tensorboard ................... False
  loss_scale ...................................... None
  loss_scale_window ............................... 1000
  lr .............................................. 0.00015
  lr_decay_iters .................................. 320000
  lr_decay_samples ................................ None
  lr_decay_style .................................. cosine
  lr_warmup_fraction .............................. 0.01
  lr_warmup_iters ................................. 0
  lr_warmup_samples ............................... 0
  make_vocab_size_divisible_by .................... 128
  mask_prob ....................................... 0.15
  masked_softmax_fusion ........................... True
  max_position_embeddings ......................... 1024
  merge_file ...................................... /dataset/source/dataset/gpt2-merges.txt
  micro_batch_size ................................ 32
  min_loss_scale .................................. 1.0
  min_lr .......................................... 1e-05
  mmap_warmup ..................................... False
  no_async_tensor_model_parallel_allreduce ........ False
  no_load_optim ................................... None
  no_load_rng ..................................... None
  no_persist_layer_norm ........................... True
  no_save_optim ................................... None
  no_save_rng ..................................... None
  num_attention_heads ............................. 16
  num_channels .................................... 3
  num_classes ..................................... 1000
  num_layers ...................................... 24
  num_layers_per_virtual_pipeline_stage ........... None
  num_workers ..................................... 2
  onnx_safe ....................................... None
  openai_gelu ..................................... False
  optimizer ....................................... adam
  override_lr_scheduler ........................... False
  params_dtype .................................... torch.float16
  patch_dim ....................................... 16
  pipeline_model_parallel_size .................... 4
  pipeline_model_parallel_split_rank .............. None
  query_in_block_prob ............................. 0.1
  rampup_batch_size ............................... None
  rank ............................................ 0
  reset_attention_mask ............................ False
  reset_position_ids .............................. False
  retriever_report_topk_accuracies ................ []
  retriever_score_scaling ......................... False
  retriever_seq_length ............................ 256
  sample_rate ..................................... 1.0
  save ............................................ None
  save_interval ................................... 10000
  scatter_gather_tensors_in_pipeline .............. True
  seed ............................................ 1234
  seq_length ...................................... 1024
  sgd_momentum .................................... 0.9
  short_seq_prob .................................. 0.1
  split ........................................... 949,50,1
  tensor_model_parallel_size ...................... 1
  tensorboard_dir ................................. None
  tensorboard_log_interval ........................ 1
  tensorboard_queue_size .......................... 1000
  titles_data_path ................................ None
  tokenizer_type .................................. GPT2BPETokenizer
  train_iters ..................................... 220
  train_samples ................................... None
  use_checkpoint_lr_scheduler ..................... False
  use_contiguous_buffers_in_local_ddp ............. True
  use_cpu_initialization .......................... None
  use_one_sent_docs ............................... False
  virtual_pipeline_model_parallel_size ............ None
  vocab_extra_ids ................................. 0
  vocab_file ...................................... /dataset/source/dataset/gpt2-vocab.json
  weight_decay .................................... 0.01
  world_size ...................................... 8
-------------------- end of arguments ---------------------
setting number of micro-batches to constant 8
> building GPT2BPETokenizer tokenizer ...
> padded vocab (size: 50257) with 47 dummy tokens (new size: 50304)
> initializing torch distributed ...
> initializing tensor model parallel with size 1
> initializing pipeline model parallel with size 4
[W ProcessGroupNCCL.cpp:1671] Rank 7 using best-guess GPU 7 to perform barrier as devices used by this process are currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. Specify device_ids in barrier() to force use of a particular device.
[W ProcessGroupNCCL.cpp:1671] Rank 1 using best-guess GPU 1 to perform barrier as devices used by this process are currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. Specify device_ids in barrier() to force use of a particular device.
> setting random seeds to 1234 ...
> initializing model parallel cuda seeds on global rank 0, model parallel rank 0, and data parallel rank 0 with model parallel seed: 3952 and data parallel seed: 1234
> compiling dataset index builder ...
[W ProcessGroupNCCL.cpp:1671] Rank 4 using best-guess GPU 4 to perform barrier as devices used by this process are currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. Specify device_ids in barrier() to force use of a particular device.
[W ProcessGroupNCCL.cpp:1671] Rank 2 using best-guess GPU 2 to perform barrier as devices used by this process are currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. Specify device_ids in barrier() to force use of a particular device.
[W ProcessGroupNCCL.cpp:1671] Rank 6 using best-guess GPU 6 to perform barrier as devices used by this process are currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. Specify device_ids in barrier() to force use of a particular device.
[W ProcessGroupNCCL.cpp:1671] Rank 3 using best-guess GPU 3 to perform barrier as devices used by this process are currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. Specify device_ids in barrier() to force use of a particular device.
[W ProcessGroupNCCL.cpp:1671] Rank 5 using best-guess GPU 5 to perform barrier as devices used by this process are currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. Specify device_ids in barrier() to force use of a particular device.
make: Entering directory '/dataset/xyn/Megatron-LM/megatron/data'
make: Nothing to be done for 'default'.
make: Leaving directory '/dataset/xyn/Megatron-LM/megatron/data'
>>> done with dataset index builder. Compilation time: 0.039 seconds
> compiling and loading fused kernels ...
Detected CUDA files, patching ldflags
Emitting ninja build file /dataset/xyn/Megatron-LM/megatron/fused_kernels/build/build.ninja...
Building extension module scaled_upper_triang_masked_softmax_cuda...
Allowing ninja to set a default number of workers... (overridable by setting the environment variable MAX_JOBS=N)
ninja: no work to do.
Loading extension module scaled_upper_triang_masked_softmax_cuda...
Detected CUDA files, patching ldflags
Emitting ninja build file /dataset/xyn/Megatron-LM/megatron/fused_kernels/build/build.ninja...
Building extension module scaled_masked_softmax_cuda...
Allowing ninja to set a default number of workers... (overridable by setting the environment variable MAX_JOBS=N)
ninja: no work to do.
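The derived sizes the run prints (data-parallel size 2, 8 micro-batches per step, the padded vocab of 50304) all follow from the arguments table. A quick sketch of the arithmetic (not Megatron-LM's actual code) reproduces them:

```python
# Values taken from the arguments printed in the log.
world_size = 8
tensor_mp = 1        # tensor_model_parallel_size
pipeline_mp = 4      # pipeline_model_parallel_size
global_batch = 512   # global_batch_size
micro_batch = 32     # micro_batch_size
vocab_size = 50257
divisible_by = 128   # make_vocab_size_divisible_by

# Ranks left over after model parallelism become data-parallel replicas.
data_parallel = world_size // (tensor_mp * pipeline_mp)       # -> 2

# Micro-batches per optimizer step (gradient accumulation factor).
num_micro_batches = global_batch // (micro_batch * data_parallel)  # -> 8

# Vocab is padded up to a multiple of make_vocab_size_divisible_by
# (times the tensor-parallel size) so embedding shards divide evenly.
multiple = divisible_by * tensor_mp
padded_vocab = ((vocab_size + multiple - 1) // multiple) * multiple  # -> 50304
dummy_tokens = padded_vocab - vocab_size                             # -> 47
```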
Loading extension module scaled_masked_softmax_cuda...
Detected CUDA files, patching ldflags
Emitting ninja build file /dataset/xyn/Megatron-LM/megatron/fused_kernels/build/build.ninja...
Building extension module scaled_softmax_cuda...
Allowing ninja to set a default number of workers... (overridable by setting the environment variable MAX_JOBS=N)
ninja: no work to do.
Loading extension module scaled_softmax_cuda...
Detected CUDA files, patching ldflags
Emitting ninja build file /dataset/xyn/Megatron-LM/megatron/fused_kernels/build/build.ninja...
Building extension module fused_mix_prec_layer_norm_cuda...
Allowing ninja to set a default number of workers... (overridable by setting the environment variable MAX_JOBS=N)
ninja: no work to do.
Loading extension module fused_mix_prec_layer_norm_cuda...
[W ProcessGroupNCCL.cpp:1671] Rank 0 using best-guess GPU 0 to perform barrier as devices used by this process are currently unknown. This can potentially cause a hang if this rank to GPU mapping is incorrect. Specify device_ids in barrier() to force use of a particular device.
iv-ybpu7pvmiu5m57lh5kdd:71297:71297 [0] NCCL INFO Bootstrap : Using eth0:192.168.11.230<0>
iv-ybpu7pvmiu5m57lh5kdd:71297:71297 [0] NCCL INFO Plugin Path : /opt/hpcx/nccl_rdma_sharp_plugin/lib/libnccl-net.so
iv-ybpu7pvmiu5m57lh5kdd:71297:71297 [0] NCCL INFO P2P plugin IBext
iv-ybpu7pvmiu5m57lh5kdd:71297:71297 [0] NCCL INFO NCCL_IB_PCI_RELAXED_ORDERING set by environment to 1.
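The ProcessGroupNCCL warning repeated for every rank suggests its own fix: pin each rank to its GPU before collectives and pass `device_ids` to `barrier()`. A sketch of that pattern (assuming the process group is already initialized and `LOCAL_RANK` is set by the launcher; `pinned_barrier` is an illustrative helper, not a Megatron-LM function):

```python
import os

import torch
import torch.distributed as dist

def pinned_barrier():
    """Barrier that tells NCCL which GPU this rank owns, avoiding the
    'best-guess GPU' warning (and potential hangs) seen in the log."""
    local_rank = int(os.environ.get("LOCAL_RANK", "0"))
    torch.cuda.set_device(local_rank)        # fix the rank -> GPU mapping
    dist.barrier(device_ids=[local_rank])    # explicit device for the barrier
```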
iv-ybpu7pvmiu5m57lh5kdd:71297:71297 [0] NCCL INFO NET/IB : Using [0]mlx5_1:1/RoCE ; OOB eth0:192.168.11.230<0>
iv-ybpu7pvmiu5m57lh5kdd:71297:71297 [0] NCCL INFO Using network IBext
NCCL version 2.10.3+cuda11.4
iv-ybpu7pvmiu5m57lh5kdd:71304:71304 [7] NCCL INFO Bootstrap : Using eth0:192.168.11.230<0>
iv-ybpu7pvmiu5m57lh5kdd:71299:71299 [2] NCCL INFO Bootstrap : Using eth0:192.168.11.230<0>
iv-ybpu7pvmiu5m57lh5kdd:71302:71302 [5] NCCL INFO Bootstrap : Using eth0:192.168.11.230<0>
iv-ybpu7pvmiu5m57lh5kdd:71303:71303 [6] NCCL INFO Bootstrap : Using eth0:192.168.11.230<0>
iv-ybpu7pvmiu5m57lh5kdd:71300:71300 [3] NCCL INFO Bootstrap : Using eth0:192.168.11.230<0>
iv-ybpu7pvmiu5m57lh5kdd:71298:71298 [1] NCCL INFO Bootstrap : Using eth0:192.168.11.230<0>
iv-ybpu7pvmiu5m57lh5kdd:71299:71299 [2] NCCL INFO Plugin Path : /opt/hpcx/nccl_rdma_sharp_plugin/lib/libnccl-net.so
iv-ybpu7pvmiu5m57lh5kdd:71304:71304 [7] NCCL INFO Plugin Path : /opt/hpcx/nccl_rdma_sharp_plugin/lib/libnccl-net.so
iv-ybpu7pvmiu5m57lh5kdd:71302:71302 [5] NCCL INFO Plugin Path : /opt/hpcx/nccl_rdma_sharp_plugin/lib/libnccl-net.so
iv-ybpu7pvmiu5m57lh5kdd:71302:71302 [5] NCCL INFO P2P plugin IBext
iv-ybpu7pvmiu5m57lh5kdd:71299:71299 [2] NCCL INFO P2P plugin IBext
iv-ybpu7pvmiu5m57lh5kdd:71304:71304 [7] NCCL INFO P2P plugin IBext
iv-ybpu7pvmiu5m57lh5kdd:71302:71302 [5] NCCL INFO NCCL_IB_PCI_RELAXED_ORDERING set by environment to 1.
iv-ybpu7pvmiu5m57lh5kdd:71304:71304 [7] NCCL INFO NCCL_IB_PCI_RELAXED_ORDERING set by environment to 1.
iv-ybpu7pvmiu5m57lh5kdd:71299:71299 [2] NCCL INFO NCCL_IB_PCI_RELAXED_ORDERING set by environment to 1.
iv-ybpu7pvmiu5m57lh5kdd:71301:71301 [4] NCCL INFO Bootstrap : Using eth0:192.168.11.230<0>
iv-ybpu7pvmiu5m57lh5kdd:71303:71303 [6] NCCL INFO Plugin Path : /opt/hpcx/nccl_rdma_sharp_plugin/lib/libnccl-net.so
iv-ybpu7pvmiu5m57lh5kdd:71303:71303 [6] NCCL INFO P2P plugin IBext
iv-ybpu7pvmiu5m57lh5kdd:71303:71303 [6] NCCL INFO NCCL_IB_PCI_RELAXED_ORDERING set by environment to 1.
iv-ybpu7pvmiu5m57lh5kdd:71300:71300 [3] NCCL INFO Plugin Path : /opt/hpcx/nccl_rdma_sharp_plugin/lib/libnccl-net.so
iv-ybpu7pvmiu5m57lh5kdd:71300:71300 [3] NCCL INFO P2P plugin IBext
iv-ybpu7pvmiu5m57lh5kdd:71300:71300 [3] NCCL INFO NCCL_IB_PCI_RELAXED_ORDERING set by environment to 1.
iv-ybpu7pvmiu5m57lh5kdd:71298:71298 [1] NCCL INFO Plugin Path : /opt/hpcx/nccl_rdma_sharp_plugin/lib/libnccl-net.so
iv-ybpu7pvmiu5m57lh5kdd:71298:71298 [1] NCCL INFO P2P plugin IBext
iv-ybpu7pvmiu5m57lh5kdd:71298:71298 [1] NCCL INFO NCCL_IB_PCI_RELAXED_ORDERING set by environment to 1.
iv-ybpu7pvmiu5m57lh5kdd:71301:71301 [4] NCCL INFO Plugin Path : /opt/hpcx/nccl_rdma_sharp_plugin/lib/libnccl-net.so
iv-ybpu7pvmiu5m57lh5kdd:71301:71301 [4] NCCL INFO P2P plugin IBext
iv-ybpu7pvmiu5m57lh5kdd:71301:71301 [4] NCCL INFO NCCL_IB_PCI_RELAXED_ORDERING set by environment to 1.
iv-ybpu7pvmiu5m57lh5kdd:71304:71304 [7] NCCL INFO NET/IB : Using [0]mlx5_1:1/RoCE ; OOB eth0:192.168.11.230<0>
iv-ybpu7pvmiu5m57lh5kdd:71304:71304 [7] NCCL INFO Using network IBext
iv-ybpu7pvmiu5m57lh5kdd:71302:71302 [5] NCCL INFO NET/IB : Using [0]mlx5_1:1/RoCE ; OOB eth0:192.168.11.230<0>
iv-ybpu7pvmiu5m57lh5kdd:71302:71302 [5] NCCL INFO Using network IBext
iv-ybpu7pvmiu5m57lh5kdd:71303:71303 [6] NCCL INFO NET/IB : Using [0]mlx5_1:1/RoCE ; OOB eth0:192.168.11.230<0>
iv-ybpu7pvmiu5m57lh5kdd:71303:71303 [6] NCCL INFO Using network IBext
iv-ybpu7pvmiu5m57lh5kdd:71300:71300 [3] NCCL INFO NET/IB : Using [0]mlx5_1:1/RoCE ; OOB eth0:192.168.11.230<0>
iv-ybpu7pvmiu5m57lh5kdd:71300:71300 [3] NCCL INFO Using network IBext
iv-ybpu7pvmiu5m57lh5kdd:71298:71298 [1] NCCL INFO NET/IB : Using [0]mlx5_1:1/RoCE ; OOB eth0:192.168.11.230<0>
iv-ybpu7pvmiu5m57lh5kdd:71298:71298 [1] NCCL INFO Using network IBext
iv-ybpu7pvmiu5m57lh5kdd:71299:71299 [2] NCCL INFO NET/IB : Using [0]mlx5_1:1/RoCE ; OOB eth0:192.168.11.230<0>
iv-ybpu7pvmiu5m57lh5kdd:71301:71301 [4] NCCL INFO NET/IB : Using [0]mlx5_1:1/RoCE ; OOB eth0:192.168.11.230<0>
iv-ybpu7pvmiu5m57lh5kdd:71301:71301 [4] NCCL INFO Using network IBext
iv-ybpu7pvmiu5m57lh5kdd:71299:71299 [2] NCCL INFO Using network IBext
iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO NCCL_IB_GID_INDEX set by environment to 3.
iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO NCCL_IB_GID_INDEX set by environment to 3.
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO NCCL_IB_GID_INDEX set by environment to 3.
iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO NCCL_IB_GID_INDEX set by environment to 3.
iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO NCCL_IB_GID_INDEX set by environment to 3.
iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO NCCL_IB_GID_INDEX set by environment to 3.
iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO NCCL_IB_GID_INDEX set by environment to 3.
iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO NCCL_IB_GID_INDEX set by environment to 3.
iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO NCCL_IB_TIMEOUT set by environment to 23.
iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO NCCL_IB_RETRY_CNT set by environment to 7.
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO NCCL_IB_TIMEOUT set by environment to 23.
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO NCCL_IB_RETRY_CNT set by environment to 7.
iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO NCCL_IB_TIMEOUT set by environment to 23.
iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO NCCL_IB_RETRY_CNT set by environment to 7.
iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO NCCL_IB_TIMEOUT set by environment to 23.
iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO NCCL_IB_RETRY_CNT set by environment to 7.
iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO NCCL_IB_TIMEOUT set by environment to 23.
iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO NCCL_IB_RETRY_CNT set by environment to 7.
iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO NCCL_IB_TIMEOUT set by environment to 23.
iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO NCCL_IB_RETRY_CNT set by environment to 7.
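These "set by environment" lines show which NCCL tuning knobs the job inherited. They are plain environment variables read at NCCL init; a sketch of exporting them before launch, mirroring the values reported in the log:

```shell
# NCCL reads these at initialization (values match the log above).
export NCCL_IB_GID_INDEX=3              # GID index used for RoCE traffic
export NCCL_IB_TIMEOUT=23               # IB verbs transport timeout exponent
export NCCL_IB_RETRY_CNT=7              # IB verbs retry count
export NCCL_IB_PCI_RELAXED_ORDERING=1   # allow PCI relaxed ordering
echo "$NCCL_IB_GID_INDEX $NCCL_IB_TIMEOUT $NCCL_IB_RETRY_CNT"
```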
iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO NCCL_IB_TIMEOUT set by environment to 23.
iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO NCCL_IB_RETRY_CNT set by environment to 7.
iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO NCCL_IB_TIMEOUT set by environment to 23.
iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO NCCL_IB_RETRY_CNT set by environment to 7.
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 00/12 : 0 2 3 1 6 4 5 7
iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Trees [0] -1/-1/-1->7->5 [1] -1/-1/-1->7->5 [2] 5/-1/-1->7->0 [3] 5/-1/-1->7->0 [4] 4/-1/-1->7->6 [5] 6/-1/-1->7->4 [6] -1/-1/-1->7->5 [7] -1/-1/-1->7->5 [8] 5/-1/-1->7->0 [9] 5/-1/-1->7->0 [10] 4/-1/-1->7->6 [11] 6/-1/-1->7->4
iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Trees [0] 5/-1/-1->4->6 [1] 5/-1/-1->4->6 [2] 6/-1/-1->4->5 [3] 6/-1/-1->4->5 [4] 3/-1/-1->4->7 [5] 7/-1/-1->4->3 [6] 5/-1/-1->4->6 [7] 5/-1/-1->4->6 [8] 6/-1/-1->4->5 [9] 6/-1/-1->4->5 [10] 3/-1/-1->4->7 [11] 7/-1/-1->4->3
iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Trees [0] 6/-1/-1->1->3 [1] 6/-1/-1->1->3 [2] 3/-1/-1->1->6 [3] 3/-1/-1->1->6 [4] 2/-1/-1->1->0 [5] -1/-1/-1->1->2 [6] 6/-1/-1->1->3 [7] 6/-1/-1->1->3 [8] 3/-1/-1->1->6 [9] 3/-1/-1->1->6 [10] 2/-1/-1->1->0 [11] -1/-1/-1->1->2
iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Trees [0] 3/-1/-1->2->0 [1] 3/-1/-1->2->0 [2] -1/-1/-1->2->3 [3] -1/-1/-1->2->3 [4] 5/-1/-1->2->1 [5] 1/-1/-1->2->5 [6] 3/-1/-1->2->0 [7] 3/-1/-1->2->0 [8] -1/-1/-1->2->3 [9] -1/-1/-1->2->3 [10] 5/-1/-1->2->1 [11] 1/-1/-1->2->5
iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Trees [0] 1/-1/-1->3->2 [1] 1/-1/-1->3->2 [2] 2/-1/-1->3->1 [3] 2/-1/-1->3->1 [4] -1/-1/-1->3->4 [5] 4/-1/-1->3->0 [6] 1/-1/-1->3->2 [7] 1/-1/-1->3->2 [8] 2/-1/-1->3->1 [9] 2/-1/-1->3->1 [10] -1/-1/-1->3->4 [11] 4/-1/-1->3->0
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 01/12 : 0 2 3 1 6 4 5 7
iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Trees [0] 4/-1/-1->6->1 [1] 4/-1/-1->6->1 [2] 1/-1/-1->6->4 [3] 1/-1/-1->6->4 [4] 7/-1/-1->6->5 [5] 5/-1/-1->6->7 [6] 4/-1/-1->6->1 [7] 4/-1/-1->6->1 [8] 1/-1/-1->6->4 [9] 1/-1/-1->6->4 [10] 7/-1/-1->6->5 [11] 5/-1/-1->6->7
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 02/12 : 0 7 5 4 6 1 3 2
iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Trees [0] 7/-1/-1->5->4 [1] 7/-1/-1->5->4 [2] 4/-1/-1->5->7 [3] 4/-1/-1->5->7 [4] 6/-1/-1->5->2 [5] 2/-1/-1->5->6 [6] 7/-1/-1->5->4 [7] 7/-1/-1->5->4 [8] 4/-1/-1->5->7 [9] 4/-1/-1->5->7 [10] 6/-1/-1->5->2 [11] 2/-1/-1->5->6
iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Setting affinity for GPU 4 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Setting affinity for GPU 7 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Setting affinity for GPU 1 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Setting affinity for GPU 2 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 03/12 : 0 7 5 4 6 1 3 2
iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Setting affinity for GPU 3 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Setting affinity for GPU 6 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 04/12 : 0 1 2 5 6 7 4 3
iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Setting affinity for GPU 5 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 05/12 : 0 3 4 7 6 5 2 1
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 06/12 : 0 2 3 1 6 4 5 7
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 07/12 : 0 2 3 1 6 4 5 7
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 08/12 : 0 7 5 4 6 1 3 2
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 09/12 : 0 7 5 4 6 1 3 2
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 10/12 : 0 1 2 5 6 7 4 3
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 11/12 : 0 3 4 7 6 5 2 1
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Trees [0] 2/-1/-1->0->-1 [1] 2/-1/-1->0->-1 [2] 7/-1/-1->0->-1 [3] 7/-1/-1->0->-1 [4] 1/-1/-1->0->-1 [5] 3/-1/-1->0->-1 [6] 2/-1/-1->0->-1 [7] 2/-1/-1->0->-1 [8] 7/-1/-1->0->-1 [9] 7/-1/-1->0->-1 [10] 1/-1/-1->0->-1 [11] 3/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Setting affinity for GPU 0 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 00 : 7[6b020] -> 0[65010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 04 : 1[65020] -> 2[67010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 00 : 4[69010] -> 5[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 04 : 6[6b010] -> 7[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 01 : 7[6b020] -> 0[65010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 10 : 1[65020] -> 2[67010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 00 : 2[67010] -> 3[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 01 : 4[69010] -> 5[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 10 : 6[6b010] -> 7[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 06 : 7[6b020] -> 0[65010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 04 : 5[69020] -> 6[6b010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 01 : 2[67010] -> 3[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 06 : 4[69010] -> 5[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 04 : 0[65010] -> 1[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 07 : 7[6b020] -> 0[65010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 10 : 5[69020] -> 6[6b010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 06 : 2[67010] -> 3[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 07 : 4[69010] -> 5[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 10 : 0[65010] -> 1[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 05 : 3[67020] -> 4[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 07 : 2[67010] -> 3[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 11 : 3[67020] -> 4[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 02 : 1[65020] -> 3[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 00 : 5[69020] -> 7[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 00 : 0[65010] -> 2[67010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 03 : 1[65020] -> 3[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 01 : 5[69020] -> 7[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 01 : 0[65010] -> 2[67010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 02 : 4[69010] -> 6[6b010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 08 : 1[65020] -> 3[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 06 : 5[69020] -> 7[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 06 : 0[65010] -> 2[67010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 03 : 4[69010] -> 6[6b010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 09 : 1[65020] -> 3[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 07 : 5[69020] -> 7[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 07 : 0[65010] -> 2[67010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 08 : 4[69010] -> 6[6b010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 09 : 4[69010] -> 6[6b010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 02 : 6[6b010] -> 1[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 05 : 4[69010] -> 7[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 04 : 2[67010] -> 5[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 05 : 0[65010] -> 3[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 03 : 6[6b010] -> 1[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 11 : 4[69010] -> 7[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 10 : 2[67010] -> 5[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 11 : 0[65010] -> 3[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 08 : 6[6b010] -> 1[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 09 : 6[6b010] -> 1[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 05 : 5[69020] -> 2[67010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 04 : 7[6b020] -> 4[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 04 : 3[67020] -> 0[65010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 11 : 5[69020] -> 2[67010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 10 : 7[6b020] -> 4[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 10 : 3[67020] -> 0[65010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 00 : 1[65020] -> 6[6b010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 02 : 2[67010] -> 0[65010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 02 : 7[6b020] -> 5[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 00 : 3[67020] -> 1[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 01 : 1[65020] -> 6[6b010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 03 : 2[67010] -> 0[65010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 03 : 7[6b020] -> 5[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 01 : 3[67020] -> 1[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 06 : 1[65020] -> 6[6b010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 08 : 2[67010] -> 0[65010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 08 : 7[6b020] -> 5[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 06 : 3[67020] -> 1[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 07 : 1[65020] -> 6[6b010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 09 : 2[67010] -> 0[65010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 09 : 7[6b020] -> 5[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 07 : 3[67020] -> 1[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 00 : 6[6b010] -> 4[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 01 : 6[6b010] -> 4[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 06 : 6[6b010] -> 4[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 02 : 5[69020] -> 4[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 02 : 0[65010] -> 7[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 07 : 6[6b010] -> 4[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 03 : 5[69020] -> 4[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 03 : 0[65010] -> 7[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 08 : 5[69020] -> 4[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 05 : 7[6b020] -> 6[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 08 : 0[65010] -> 7[6b020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 05 : 2[67010] -> 1[65020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 09 : 5[69020] -> 4[69010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 11 : 7[6b020] -> 6[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 09 : 0[65010] -> 7[6b020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 11 : 2[67010] -> 1[65020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 05 : 6[6b010] -> 5[69020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 05 : 1[65020] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 02 : 3[67020] -> 2[67010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 11 : 6[6b010] -> 5[69020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 11 : 1[65020] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 03 : 3[67020] -> 2[67010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 04 : 4[69010] -> 3[67020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 08 : 3[67020] -> 2[67010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 10 : 4[69010] -> 3[67020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 09 : 3[67020] -> 2[67010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 05 : 1[65020] -> 2[67010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Connected 
all rings iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 11 : 1[65020] -> 2[67010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 05 : 6[6b010] -> 7[6b020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 11 : 6[6b010] -> 7[6b020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 02 : 7[6b020] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 03 : 7[6b020] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 08 : 7[6b020] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 02 : 4[69010] -> 5[69020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 02 : 2[67010] -> 3[67020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 09 : 7[6b020] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 03 : 2[67010] -> 3[67020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 03 : 4[69010] -> 5[69020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 05 : 5[69020] -> 6[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 04 : 3[67020] -> 4[69010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 08 : 2[67010] -> 3[67020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 08 : 4[69010] -> 5[69020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 11 : 5[69020] -> 6[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 09 : 2[67010] -> 3[67020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 10 : 3[67020] -> 
4[69010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 09 : 4[69010] -> 5[69020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 00 : 1[65020] -> 3[67020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 01 : 1[65020] -> 3[67020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 06 : 1[65020] -> 3[67020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 05 : 2[67010] -> 5[69020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 02 : 5[69020] -> 7[6b020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 00 : 4[69010] -> 6[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 07 : 1[65020] -> 3[67020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 11 : 2[67010] -> 5[69020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 03 : 5[69020] -> 7[6b020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 01 : 4[69010] -> 6[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 08 : 5[69020] -> 7[6b020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 06 : 4[69010] -> 6[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 09 : 5[69020] -> 7[6b020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 07 : 4[69010] -> 6[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 05 : 3[67020] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 00 : 6[6b010] -> 1[65020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 04 : 4[69010] -> 7[6b020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 11 : 3[67020] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 01 : 6[6b010] -> 1[65020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 
10 : 4[69010] -> 7[6b020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 06 : 6[6b010] -> 1[65020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 07 : 6[6b010] -> 1[65020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 04 : 5[69020] -> 2[67010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 02 : 3[67020] -> 1[65020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 05 : 7[6b020] -> 4[69010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 10 : 5[69020] -> 2[67010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 03 : 3[67020] -> 1[65020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 11 : 7[6b020] -> 4[69010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 08 : 3[67020] -> 1[65020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 09 : 3[67020] -> 1[65020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 02 : 1[65020] -> 6[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 00 : 2[67010] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 00 : 7[6b020] -> 5[69020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 03 : 1[65020] -> 6[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 01 : 2[67010] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 01 : 7[6b020] -> 5[69020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 08 : 1[65020] -> 6[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 06 : 2[67010] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 06 : 7[6b020] -> 5[69020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 09 : 1[65020] -> 6[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] 
NCCL INFO Channel 07 : 2[67010] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 07 : 7[6b020] -> 5[69020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 02 : 6[6b010] -> 4[69010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 04 : 7[6b020] -> 6[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 03 : 6[6b010] -> 4[69010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 10 : 7[6b020] -> 6[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 08 : 6[6b010] -> 4[69010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 00 : 5[69020] -> 4[69010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 09 : 6[6b010] -> 4[69010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 01 : 5[69020] -> 4[69010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 06 : 5[69020] -> 4[69010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 04 : 2[67010] -> 1[65020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 07 : 5[69020] -> 4[69010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 10 : 2[67010] -> 1[65020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 04 : 1[65020] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 00 : 3[67020] -> 2[67010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 04 : 6[6b010] -> 5[69020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 10 : 1[65020] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 01 : 3[67020] -> 2[67010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 10 : 6[6b010] -> 5[69020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 05 : 4[69010] -> 3[67020] via P2P/IPC 
iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 06 : 3[67020] -> 2[67010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO threadThresholds 8/8/64 | 64/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO threadThresholds 8/8/64 | 64/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 11 : 4[69010] -> 3[67020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 07 : 3[67020] -> 2[67010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO threadThresholds 8/8/64 | 64/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO threadThresholds 8/8/64 | 64/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO 12 coll channels, 16 p2p channels, 2 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO 12 coll channels, 16 p2p channels, 2 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 02 : 7[6b020] -> 1[65020] via P2P/indirect/6[6b010] iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO 12 coll channels, 16 p2p channels, 2 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO threadThresholds 8/8/64 | 64/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO 12 coll channels, 16 p2p channels, 2 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO threadThresholds 8/8/64 | 64/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO threadThresholds 8/8/64 | 64/8/64 | 8/8/512 
iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO threadThresholds 8/8/64 | 64/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 10 : 7[6b020] -> 1[65020] via P2P/indirect/6[6b010] iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO 12 coll channels, 16 p2p channels, 2 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 02 : 6[6b010] -> 0[65010] via P2P/indirect/7[6b020] iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO 12 coll channels, 16 p2p channels, 2 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO 12 coll channels, 16 p2p channels, 2 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO 12 coll channels, 16 p2p channels, 2 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 10 : 6[6b010] -> 0[65010] via P2P/indirect/7[6b020] iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 02 : 2[67010] -> 4[69010] via P2P/indirect/5[69020] iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 02 : 3[67020] -> 5[69020] via P2P/indirect/4[69010] iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 03 : 7[6b020] -> 2[67010] via P2P/indirect/0[65010] iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 03 : 1[65020] -> 4[69010] via P2P/indirect/6[6b010] iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 10 : 2[67010] -> 4[69010] via P2P/indirect/5[69020] iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 11 : 7[6b020] -> 2[67010] via P2P/indirect/0[65010] iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 11 : 1[65020] -> 4[69010] via P2P/indirect/6[6b010] iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 10 : 3[67020] -> 5[69020] via P2P/indirect/4[69010] iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 03 : 5[69020] -> 0[65010] via P2P/indirect/7[6b020] iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 03 : 3[67020] -> 
6[6b010] via P2P/indirect/1[65020] iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 11 : 5[69020] -> 0[65010] via P2P/indirect/7[6b020] iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 11 : 3[67020] -> 6[6b010] via P2P/indirect/1[65020] iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 04 : 4[69010] -> 0[65010] via P2P/indirect/7[6b020] iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 04 : 2[67010] -> 6[6b010] via P2P/indirect/1[65020] iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 04 : 0[65010] -> 4[69010] via P2P/indirect/3[67020] iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 12 : 2[67010] -> 6[6b010] via P2P/indirect/1[65020] iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 12 : 4[69010] -> 0[65010] via P2P/indirect/7[6b020] iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 04 : 1[65020] -> 5[69020] via P2P/indirect/6[6b010] iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 04 : 6[6b010] -> 2[67010] via P2P/indirect/5[69020] iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 04 : 3[67020] -> 7[6b020] via P2P/indirect/0[65010] iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 04 : 7[6b020] -> 3[67020] via P2P/indirect/0[65010] iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 04 : 5[69020] -> 1[65020] via P2P/indirect/6[6b010] iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 12 : 0[65010] -> 4[69010] via P2P/indirect/3[67020] iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 12 : 1[65020] -> 5[69020] via P2P/indirect/6[6b010] iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 12 : 6[6b010] -> 2[67010] via P2P/indirect/5[69020] iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO Channel 12 : 3[67020] -> 7[6b020] via P2P/indirect/0[65010] iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 12 : 5[69020] -> 1[65020] via P2P/indirect/6[6b010] iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO Channel 12 : 7[6b020] -> 3[67020] via 
P2P/indirect/0[65010] iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 05 : 4[69010] -> 1[65020] via P2P/indirect/6[6b010] iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 05 : 2[67010] -> 7[6b020] via P2P/indirect/0[65010] iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 05 : 6[6b010] -> 3[67020] via P2P/indirect/1[65020] iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 05 : 0[65010] -> 5[69020] via P2P/indirect/7[6b020] iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO Channel 13 : 2[67010] -> 7[6b020] via P2P/indirect/0[65010] iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 13 : 4[69010] -> 1[65020] via P2P/indirect/6[6b010] iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO Channel 13 : 6[6b010] -> 3[67020] via P2P/indirect/1[65020] iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 13 : 0[65010] -> 5[69020] via P2P/indirect/7[6b020] iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 06 : 4[69010] -> 2[67010] via P2P/indirect/3[67020] iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 06 : 1[65020] -> 7[6b020] via P2P/indirect/0[65010] iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 06 : 5[69020] -> 3[67020] via P2P/indirect/2[67010] iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO Channel 14 : 4[69010] -> 2[67010] via P2P/indirect/3[67020] iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 06 : 0[65010] -> 6[6b010] via P2P/indirect/1[65020] iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO Channel 14 : 1[65020] -> 7[6b020] via P2P/indirect/0[65010] iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO Channel 14 : 5[69020] -> 3[67020] via P2P/indirect/2[67010] iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO Channel 14 : 0[65010] -> 6[6b010] via P2P/indirect/1[65020] iv-ybpu7pvmiu5m57lh5kdd:71302:71556 [5] NCCL INFO comm 0x7f07f8008fb0 rank 5 nranks 8 cudaDev 5 busId 69020 - Init COMPLETE iv-ybpu7pvmiu5m57lh5kdd:71303:71557 [6] NCCL INFO comm 0x7fa0b0008fb0 rank 6 nranks 8 cudaDev 
6 busId 6b010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71297:71547 [0] NCCL INFO comm 0x7f98bc008fb0 rank 0 nranks 8 cudaDev 0 busId 65010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71299:71561 [2] NCCL INFO comm 0x7fd6c0008fb0 rank 2 nranks 8 cudaDev 2 busId 67010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71298:71559 [1] NCCL INFO comm 0x7f7418008fb0 rank 1 nranks 8 cudaDev 1 busId 65020 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71304:71555 [7] NCCL INFO comm 0x7f3a50008fb0 rank 7 nranks 8 cudaDev 7 busId 6b020 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71301:71560 [4] NCCL INFO comm 0x7f626c008fb0 rank 4 nranks 8 cudaDev 4 busId 69010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71300:71558 [3] NCCL INFO comm 0x7f3668008fb0 rank 3 nranks 8 cudaDev 3 busId 67020 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71297:71297 [0] NCCL INFO Launch mode Parallel
>>> done with compiling and loading fused kernels. Compilation time: 5.935 seconds
time to initialize megatron (seconds): 6.468
[after megatron is initialized] datetime: 2022-07-05 16:29:08
building GPT model ...
> number of parameters on (tensor, pipeline) model parallel rank (0, 2): 75577344
> number of parameters on (tensor, pipeline) model parallel rank (0, 1): 75577344
NCCL version 2.10.3+cuda11.4
iv-ybpu7pvmiu5m57lh5kdd:71304:71609 [7] NCCL INFO Trees [0] -1/-1/-1->1->0 [1] -1/-1/-1->1->0
iv-ybpu7pvmiu5m57lh5kdd:71298:71608 [1] NCCL INFO Channel 00/02 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71298:71608 [1] NCCL INFO Channel 01/02 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71304:71609 [7] NCCL INFO Setting affinity for GPU 7 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71298:71608 [1] NCCL INFO Trees [0] 1/-1/-1->0->-1 [1] 1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71298:71608 [1] NCCL INFO Setting affinity for GPU 1 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71297:71613 [0] NCCL INFO Channel 00/02 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71303:71614 [6] NCCL INFO Trees [0] -1/-1/-1->1->0 [1] -1/-1/-1->1->0
iv-ybpu7pvmiu5m57lh5kdd:71297:71613 [0] NCCL INFO Channel 01/02 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71303:71614 [6] NCCL INFO Setting affinity for GPU 6 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71297:71613 [0] NCCL INFO Trees [0] 1/-1/-1->0->-1 [1] 1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71297:71613 [0] NCCL INFO Setting affinity for GPU 0 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71298:71608 [1] NCCL INFO Channel 00 : 0[65020] -> 1[6b020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71304:71609 [7] NCCL INFO Channel 00 : 1[6b020] -> 0[65020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71298:71608 [1] NCCL INFO Channel 01 : 0[65020] -> 1[6b020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71304:71609 [7] NCCL INFO Channel 01 : 1[6b020] -> 0[65020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71303:71614 [6] NCCL INFO Channel 00 : 1[6b010] -> 0[65010] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71297:71613 [0] NCCL INFO Channel 00 : 0[65010] -> 1[6b010] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71303:71614 [6] NCCL INFO Channel 01 : 1[6b010] -> 0[65010] via direct
shared memory iv-ybpu7pvmiu5m57lh5kdd:71297:71613 [0] NCCL INFO Channel 01 : 0[65010] -> 1[6b010] via direct shared memory iv-ybpu7pvmiu5m57lh5kdd:71298:71608 [1] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71298:71608 [1] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71298:71608 [1] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71298:71608 [1] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71304:71609 [7] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71304:71609 [7] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71304:71609 [7] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71304:71609 [7] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71298:71608 [1] NCCL INFO comm 0x7f73bc008fb0 rank 0 nranks 2 cudaDev 1 busId 65020 - Init COMPLETE iv-ybpu7pvmiu5m57lh5kdd:71304:71609 [7] NCCL INFO comm 0x7f39f8008fb0 rank 1 nranks 2 cudaDev 7 busId 6b020 - Init COMPLETE iv-ybpu7pvmiu5m57lh5kdd:71298:71298 [1] NCCL INFO Launch mode Parallel iv-ybpu7pvmiu5m57lh5kdd:71297:71613 [0] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71297:71613 [0] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71297:71613 [0] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71297:71613 [0] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71303:71614 [6] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71303:71614 [6] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71303:71614 [6] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71303:71614 [6] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71303:71614 [6] NCCL INFO comm 0x7fa060008fb0 rank 1 nranks 2 cudaDev 6 busId 6b010 - Init COMPLETE iv-ybpu7pvmiu5m57lh5kdd:71297:71613 [0] NCCL INFO comm 0x7f9870008fb0 
rank 0 nranks 2 cudaDev 0 busId 65010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71297:71297 [0] NCCL INFO Launch mode Parallel
> number of parameters on (tensor, pipeline) model parallel rank (0, 3): 127090688
> number of parameters on (tensor, pipeline) model parallel rank (0, 0): 128137216
> learning rate decay style: cosine
[after model, optimizer, and learning rate scheduler are built] datetime: 2022-07-05 16:29:08
> building train, validation, and test datasets ...
> datasets target sizes (minimum size):
    train:      112640
    validation: 5120
    test:       5120
> building train, validation, and test datasets for GPT ...
 > building dataset index ...
    reading sizes...
    reading pointers...
    reading document index...
    creating numpy buffer of mmap...
    creating memory view of numpy buffer...
 > finished creating indexed dataset in 0.003603 seconds
    number of documents: 1249934
 > dataset split:
    train:
     document indices in [0, 1186187) total of 1186187 documents
    validation:
     document indices in [1186187, 1248684) total of 62497 documents
    test:
     document indices in [1248684, 1249934) total of 1250 documents
NCCL version 2.10.3+cuda11.4
NCCL version 2.10.3+cuda11.4
NCCL version 2.10.3+cuda11.4
 > WARNING: could not find index map files, building the indices on rank 0 ...
 > last epoch number of samples (3794) is smaller than 80% of number of samples per epoch (54423), setting separate_last_epoch to True
 > elasped time to build and save doc-idx mapping (seconds): 0.147422
    using:
     number of documents: 1186187
     number of epochs: 3
     sequence length: 1024
     total number of samples: 163270
 > elasped time to build and save sample-idx mapping (seconds): 0.021275
 > building shuffle index with split [0, 108846) and [108846, 163270) ...
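[Editor's note] The per-pipeline-stage parameter counts printed above (75577344 for the middle stages, 128137216 for stage 0, 127090688 for stage 3) can be reproduced with standard transformer parameter arithmetic. The following sketch assumes a config inferred from the numbers, not stated in this chunk: hidden size 1024, 24 layers, sequence length 1024, padded vocabulary 50304, pipeline-model-parallel size 4. It is an illustration of why the first and last stages are larger (they hold the embedding tables and final layernorm), not Megatron-LM's actual counting code.

```python
def megatron_gpt_stage_params(h=1024, layers=24, seq=1024, vocab=50304, pp=4):
    """Approximate per-pipeline-stage parameter counts for a GPT model.

    All default values are assumptions inferred from the log, not taken
    from Megatron-LM source.
    """
    # One transformer layer: QKV (3h^2) + attn proj (h^2) + MLP (4h^2 + 4h^2)
    # weights, plus biases (13h) and two layernorms (included in the 13h? no:
    # biases 3h+h+4h+h = 9h, layernorm weights+biases 4h -> 13h total).
    per_layer = 12 * h * h + 13 * h
    body = (layers // pp) * per_layer  # 6 layers per stage when pp=4
    counts = []
    for stage in range(pp):
        n = body
        if stage == 0:
            n += vocab * h + seq * h  # word + position embeddings
        if stage == pp - 1:
            n += vocab * h + 2 * h    # output embedding copy + final layernorm
        counts.append(n)
    return counts
```

Under these assumptions the function returns exactly the four counts reported in the log, which is what suggests a 24-layer, hidden-1024 model with vocabulary padded to 50304.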
> elasped time to build and save shuffle-idx mapping (seconds): 0.004883 iv-ybpu7pvmiu5m57lh5kdd:71304:71625 [7] NCCL INFO Trees [0] -1/-1/-1->1->0 [1] -1/-1/-1->1->0 iv-ybpu7pvmiu5m57lh5kdd:71303:71622 [6] NCCL INFO Channel 00/02 : 0 1 iv-ybpu7pvmiu5m57lh5kdd:71303:71622 [6] NCCL INFO Channel 01/02 : 0 1 iv-ybpu7pvmiu5m57lh5kdd:71304:71625 [7] NCCL INFO Setting affinity for GPU 7 to 0fffff,fffffc00,00000000 iv-ybpu7pvmiu5m57lh5kdd:71303:71622 [6] NCCL INFO Trees [0] 1/-1/-1->0->-1 [1] 1/-1/-1->0->-1 iv-ybpu7pvmiu5m57lh5kdd:71303:71622 [6] NCCL INFO Setting affinity for GPU 6 to 0fffff,fffffc00,00000000 iv-ybpu7pvmiu5m57lh5kdd:71301:71626 [4] NCCL INFO Channel 00/04 : 0 1 iv-ybpu7pvmiu5m57lh5kdd:71302:71628 [5] NCCL INFO Trees [0] -1/-1/-1->1->0 [1] -1/-1/-1->1->0 [2] -1/-1/-1->1->0 [3] -1/-1/-1->1->0 iv-ybpu7pvmiu5m57lh5kdd:71301:71626 [4] NCCL INFO Channel 01/04 : 0 1 iv-ybpu7pvmiu5m57lh5kdd:71302:71628 [5] NCCL INFO Setting affinity for GPU 5 to 0fffff,fffffc00,00000000 iv-ybpu7pvmiu5m57lh5kdd:71301:71626 [4] NCCL INFO Channel 02/04 : 0 1 iv-ybpu7pvmiu5m57lh5kdd:71301:71626 [4] NCCL INFO Channel 03/04 : 0 1 iv-ybpu7pvmiu5m57lh5kdd:71301:71626 [4] NCCL INFO Trees [0] 1/-1/-1->0->-1 [1] 1/-1/-1->0->-1 [2] 1/-1/-1->0->-1 [3] 1/-1/-1->0->-1 iv-ybpu7pvmiu5m57lh5kdd:71301:71626 [4] NCCL INFO Setting affinity for GPU 4 to 0fffff,fffffc00,00000000 iv-ybpu7pvmiu5m57lh5kdd:71299:71627 [2] NCCL INFO Channel 00/04 : 0 1 iv-ybpu7pvmiu5m57lh5kdd:71299:71627 [2] NCCL INFO Channel 01/04 : 0 1 iv-ybpu7pvmiu5m57lh5kdd:71300:71633 [3] NCCL INFO Trees [0] -1/-1/-1->1->0 [1] -1/-1/-1->1->0 [2] -1/-1/-1->1->0 [3] -1/-1/-1->1->0 iv-ybpu7pvmiu5m57lh5kdd:71299:71627 [2] NCCL INFO Channel 02/04 : 0 1 iv-ybpu7pvmiu5m57lh5kdd:71299:71627 [2] NCCL INFO Channel 03/04 : 0 1 iv-ybpu7pvmiu5m57lh5kdd:71300:71633 [3] NCCL INFO Setting affinity for GPU 3 to 03ff,ffffffff iv-ybpu7pvmiu5m57lh5kdd:71299:71627 [2] NCCL INFO Trees [0] 1/-1/-1->0->-1 [1] 1/-1/-1->0->-1 [2] 1/-1/-1->0->-1 [3] 
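[Editor's note] The train/validation/test document boundaries printed above ([0, 1186187), [1186187, 1248684), [1248684, 1249934)) are consistent with Megatron-LM's default `--split 949,50,1` applied to 1249934 documents. A minimal sketch of that split arithmetic, assuming weighted rounding of cumulative fractions (a simplification of Megatron's `get_train_valid_test_split_`, not its verbatim code):

```python
def split_indices(num_docs, weights=(949, 50, 1)):
    """Cumulative document-index boundaries for a weighted dataset split.

    weights=(949, 50, 1) is the assumed default split ratio; the function
    returns [0, train_end, valid_end, test_end].
    """
    total = sum(weights)
    bounds = [0]
    for w in weights:
        bounds.append(bounds[-1] + int(round(w / total * num_docs)))
    return bounds
```

For the document count in this log, `split_indices(1249934)` yields the same three boundaries the run prints, which is a quick way to sanity-check that the split flags took effect.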
1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71299:71627 [2] NCCL INFO Setting affinity for GPU 2 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71303:71622 [6] NCCL INFO Channel 00 : 0[6b010] -> 1[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:71625 [7] NCCL INFO Channel 00 : 1[6b020] -> 0[6b010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71303:71622 [6] NCCL INFO Channel 01 : 0[6b010] -> 1[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:71625 [7] NCCL INFO Channel 01 : 1[6b020] -> 0[6b010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71298:71638 [1] NCCL INFO Trees [0] -1/-1/-1->1->0 [1] -1/-1/-1->1->0
iv-ybpu7pvmiu5m57lh5kdd:71297:71637 [0] NCCL INFO Channel 00/02 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71297:71637 [0] NCCL INFO Channel 01/02 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71298:71638 [1] NCCL INFO Setting affinity for GPU 1 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71297:71637 [0] NCCL INFO Trees [0] 1/-1/-1->0->-1 [1] 1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71297:71637 [0] NCCL INFO Setting affinity for GPU 0 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71301:71626 [4] NCCL INFO Channel 00 : 0[69010] -> 1[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71628 [5] NCCL INFO Channel 00 : 1[69020] -> 0[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:71625 [7] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71304:71625 [7] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71304:71625 [7] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71304:71625 [7] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71303:71622 [6] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71303:71622 [6] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71303:71622 [6] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71303:71622 [6] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71303:71622 [6] NCCL INFO comm 0x7fa018008fb0 rank 0 nranks 2 cudaDev 6 busId 6b010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71304:71625 [7] NCCL INFO comm 0x7f39f80de4b0 rank 1 nranks 2 cudaDev 7 busId 6b020 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71303:71303 [6] NCCL INFO Launch mode Parallel
iv-ybpu7pvmiu5m57lh5kdd:71301:71626 [4] NCCL INFO Channel 01 : 0[69010] -> 1[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71628 [5] NCCL INFO Channel 01 : 1[69020] -> 0[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71300:71633 [3] NCCL INFO Channel 00 : 1[67020] -> 0[67010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:71626 [4] NCCL INFO Channel 02 : 0[69010] -> 1[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71628 [5] NCCL INFO Channel 02 : 1[69020] -> 0[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:71627 [2] NCCL INFO Channel 00 : 0[67010] -> 1[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71297:71637 [0] NCCL INFO Channel 00 : 0[65010] -> 1[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71300:71633 [3] NCCL INFO Channel 01 : 1[67020] -> 0[67010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71298:71638 [1] NCCL INFO Channel 00 : 1[65020] -> 0[65010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:71626 [4] NCCL INFO Channel 03 : 0[69010] -> 1[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71628 [5] NCCL INFO Channel 03 : 1[69020] -> 0[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:71627 [2] NCCL INFO Channel 01 : 0[67010] -> 1[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71297:71637 [0] NCCL INFO Channel 01 : 0[65010] -> 1[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71300:71633 [3] NCCL INFO Channel 02 : 1[67020] -> 0[67010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71298:71638 [1] NCCL INFO Channel 01 : 1[65020] -> 0[65010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:71627 [2] NCCL INFO Channel 02 : 0[67010] -> 1[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71300:71633 [3] NCCL INFO Channel 03 : 1[67020] -> 0[67010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:71627 [2] NCCL INFO Channel 03 : 0[67010] -> 1[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71297:71637 [0] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71297:71637 [0] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71297:71637 [0] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71297:71637 [0] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71298:71638 [1] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71298:71638 [1] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71298:71638 [1] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71298:71638 [1] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71298:71638 [1] NCCL INFO comm 0x7f73bc0de4b0 rank 1 nranks 2 cudaDev 1 busId 65020 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71297:71637 [0] NCCL INFO comm 0x7f9834008fb0 rank 0 nranks 2 cudaDev 0 busId 65010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71297:71297 [0] NCCL INFO Launch mode Parallel
iv-ybpu7pvmiu5m57lh5kdd:71301:71626 [4] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71301:71626 [4] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71301:71626 [4] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71301:71626 [4] NCCL INFO 4 coll channels, 4 p2p channels, 4 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71302:71628 [5] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71302:71628 [5] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71302:71628 [5] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71302:71628 [5] NCCL INFO 4 coll channels, 4 p2p channels, 4 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71301:71626 [4] NCCL INFO comm 0x7f6204008fb0 rank 0 nranks 2 cudaDev 4 busId 69010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71302:71628 [5] NCCL INFO comm 0x7f0784008fb0 rank 1 nranks 2 cudaDev 5 busId 69020 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71301:71301 [4] NCCL INFO Launch mode Parallel
iv-ybpu7pvmiu5m57lh5kdd:71300:71633 [3] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71300:71633 [3] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71300:71633 [3] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71300:71633 [3] NCCL INFO 4 coll channels, 4 p2p channels, 4 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71299:71627 [2] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71299:71627 [2] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71299:71627 [2] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71299:71627 [2] NCCL INFO 4 coll channels, 4 p2p channels, 4 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71299:71627 [2] NCCL INFO comm 0x7fd64c008fb0 rank 0 nranks 2 cudaDev 2 busId 67010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71300:71633 [3] NCCL INFO comm 0x7f35fc008fb0 rank 1 nranks 2 cudaDev 3 busId 67020 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71299:71299 [2] NCCL INFO Launch mode Parallel
iv-ybpu7pvmiu5m57lh5kdd:71300:71657 [3] NCCL INFO Trees [0] 3/-1/-1->1->0 [1] 3/-1/-1->1->0
iv-ybpu7pvmiu5m57lh5kdd:71300:71657 [3] NCCL INFO Setting affinity for GPU 3 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71302:71654 [5] NCCL INFO Trees [0] -1/-1/-1->2->3 [1] -1/-1/-1->2->3
iv-ybpu7pvmiu5m57lh5kdd:71302:71654 [5] NCCL INFO Setting affinity for GPU 5 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71298:71647 [1] NCCL INFO Channel 00/02 : 0 1 3 2
iv-ybpu7pvmiu5m57lh5kdd:71298:71647 [1] NCCL INFO Channel 01/02 : 0 1 3 2
iv-ybpu7pvmiu5m57lh5kdd:71304:71650 [7] NCCL INFO Trees [0] 2/-1/-1->3->1 [1] 2/-1/-1->3->1
iv-ybpu7pvmiu5m57lh5kdd:71298:71647 [1] NCCL INFO Trees [0] 1/-1/-1->0->-1 [1] 1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71304:71650 [7] NCCL INFO Setting affinity for GPU 7 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71298:71647 [1] NCCL INFO Setting affinity for GPU 1 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71298:71647 [1] NCCL INFO Channel 00 : 0[65020] -> 1[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71654 [5] NCCL INFO Channel 00 : 2[69020] -> 0[65020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71298:71647 [1] NCCL INFO Channel 01 : 0[65020] -> 1[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71654 [5] NCCL INFO Channel 01 : 2[69020] -> 0[65020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71303:71649 [6] NCCL INFO Trees [0] 2/-1/-1->3->1 [1] 2/-1/-1->3->1
iv-ybpu7pvmiu5m57lh5kdd:71301:71653 [4] NCCL INFO Trees [0] -1/-1/-1->2->3 [1] -1/-1/-1->2->3
iv-ybpu7pvmiu5m57lh5kdd:71303:71649 [6] NCCL INFO Setting affinity for GPU 6 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71301:71653 [4] NCCL INFO Setting affinity for GPU 4 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71300:71657 [3] NCCL INFO Channel 00 : 1[67020] -> 3[6b020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71299:71658 [2] NCCL INFO Trees [0] 3/-1/-1->1->0 [1] 3/-1/-1->1->0
iv-ybpu7pvmiu5m57lh5kdd:71297:71648 [0] NCCL INFO Channel 00/02 : 0 1 3 2
iv-ybpu7pvmiu5m57lh5kdd:71299:71658 [2] NCCL INFO Setting affinity for GPU 2 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71297:71648 [0] NCCL INFO Channel 01/02 : 0 1 3 2
iv-ybpu7pvmiu5m57lh5kdd:71297:71648 [0] NCCL INFO Trees [0] 1/-1/-1->0->-1 [1] 1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71297:71648 [0] NCCL INFO Setting affinity for GPU 0 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71300:71657 [3] NCCL INFO Channel 01 : 1[67020] -> 3[6b020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71297:71648 [0] NCCL INFO Channel 00 : 0[65010] -> 1[67010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71297:71648 [0] NCCL INFO Channel 01 : 0[65010] -> 1[67010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:71653 [4] NCCL INFO Channel 00 : 2[69010] -> 0[65010] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71301:71653 [4] NCCL INFO Channel 01 : 2[69010] -> 0[65010] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71299:71658 [2] NCCL INFO Channel 00 : 1[67010] -> 3[6b010] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71299:71658 [2] NCCL INFO Channel 01 : 1[67010] -> 3[6b010] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71304:71650 [7] NCCL INFO Channel 00 : 3[6b020] -> 2[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:71650 [7] NCCL INFO Channel 01 : 3[6b020] -> 2[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71300:71657 [3] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71298:71647 [1] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71303:71649 [6] NCCL INFO Channel 00 : 3[6b010] -> 2[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71303:71649 [6] NCCL INFO Channel 01 : 3[6b010] -> 2[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:71658 [2] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71302:71654 [5] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71304:71650 [7] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71302:71654 [5] NCCL INFO Channel 00 : 2[69020] -> 3[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71297:71648 [0] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71302:71654 [5] NCCL INFO Channel 01 : 2[69020] -> 3[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:71654 [5] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71302:71654 [5] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71302:71654 [5] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71304:71650 [7] NCCL INFO Channel 00 : 3[6b020] -> 1[67020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71304:71650 [7] NCCL INFO Channel 01 : 3[6b020] -> 1[67020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71303:71649 [6] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71301:71653 [4] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71300:71657 [3] NCCL INFO Channel 00 : 1[67020] -> 0[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:71653 [4] NCCL INFO Channel 00 : 2[69010] -> 3[6b010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:71650 [7] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71304:71650 [7] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71304:71650 [7] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71300:71657 [3] NCCL INFO Channel 01 : 1[67020] -> 0[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:71653 [4] NCCL INFO Channel 01 : 2[69010] -> 3[6b010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71298:71647 [1] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71298:71647 [1] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71298:71647 [1] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71300:71657 [3] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71300:71657 [3] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71300:71657 [3] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71301:71653 [4] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71304:71650 [7] NCCL INFO comm 0x7f39f82bd2f0 rank 3 nranks 4 cudaDev 7 busId 6b020 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71298:71647 [1] NCCL INFO comm 0x7f736c008fb0 rank 0 nranks 4 cudaDev 1 busId 65020 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71300:71657 [3] NCCL INFO comm 0x7f35fc0d1690 rank 1 nranks 4 cudaDev 3 busId 67020 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71301:71653 [4] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71301:71653 [4] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71302:71654 [5] NCCL INFO comm 0x7f07840a35e0 rank 2 nranks 4 cudaDev 5 busId 69020 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71303:71649 [6] NCCL INFO Channel 00 : 3[6b010] -> 1[67010] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71298:71298 [1] NCCL INFO Launch mode Parallel
iv-ybpu7pvmiu5m57lh5kdd:71303:71649 [6] NCCL INFO Channel 01 : 3[6b010] -> 1[67010] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71299:71658 [2] NCCL INFO Channel 00 : 1[67010] -> 0[65010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:71658 [2] NCCL INFO Channel 01 : 1[67010] -> 0[65010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71297:71648 [0] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71297:71648 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71297:71648 [0] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71299:71658 [2] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71299:71658 [2] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71299:71658 [2] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71303:71649 [6] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71303:71649 [6] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71303:71649 [6] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71303:71649 [6] NCCL INFO comm 0x7fa0180ad380 rank 3 nranks 4 cudaDev 6 busId 6b010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71299:71658 [2] NCCL INFO comm 0x7fd64c0d0bd0 rank 1 nranks 4 cudaDev 2 busId 67010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71297:71648 [0] NCCL INFO comm 0x7f982c008fb0 rank 0 nranks 4 cudaDev 0 busId 65010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71301:71653 [4] NCCL INFO comm 0x7f62040a3d70 rank 2 nranks 4 cudaDev 4 busId 69010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71297:71297 [0] NCCL INFO Launch mode Parallel
> loading doc-idx mapping from /dataset/source/dataset/loss_compara_content_sentence_train_indexmap_112640ns_1024sl_1234s_doc_idx.npy
> loading sample-idx mapping from /dataset/source/dataset/loss_compara_content_sentence_train_indexmap_112640ns_1024sl_1234s_sample_idx.npy
> loading shuffle-idx mapping from /dataset/source/dataset/loss_compara_content_sentence_train_indexmap_112640ns_1024sl_1234s_shuffle_idx.npy
loaded indexed file in 0.004 seconds
total number of samples: 163271
total number of epochs: 3
> loading doc-idx mapping from /dataset/source/dataset/loss_compara_content_sentence_valid_indexmap_5120ns_1024sl_1234s_doc_idx.npy
> loading sample-idx mapping from /dataset/source/dataset/loss_compara_content_sentence_valid_indexmap_5120ns_1024sl_1234s_sample_idx.npy
> loading shuffle-idx mapping from /dataset/source/dataset/loss_compara_content_sentence_valid_indexmap_5120ns_1024sl_1234s_shuffle_idx.npy
loaded indexed file in 0.008 seconds
total number of samples: 5718
total number of epochs: 2
> loading doc-idx mapping from /dataset/source/dataset/loss_compara_content_sentence_test_indexmap_5120ns_1024sl_1234s_doc_idx.npy
> loading sample-idx mapping from /dataset/source/dataset/loss_compara_content_sentence_test_indexmap_5120ns_1024sl_1234s_sample_idx.npy
> loading shuffle-idx mapping from /dataset/source/dataset/loss_compara_content_sentence_test_indexmap_5120ns_1024sl_1234s_shuffle_idx.npy
loaded indexed file in 0.010 seconds
total number of samples: 5128
total number of epochs: 102
> finished creating GPT datasets ...
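(Annotation.) The index-map filenames above encode the requested sample count, sequence length, and seed (e.g. `112640ns_1024sl_1234s`), and the "total number of epochs" lines report how many full passes over the split are needed to cover the requested samples. A minimal sketch of that relationship follows; the `samples_per_epoch` values are hypothetical, chosen only to illustrate the figures in this log (Megatron-LM derives the real values from the corpus itself):

```python
import math

def num_epochs(requested_samples: int, samples_per_epoch: int) -> int:
    # Smallest number of complete passes over a split that yields
    # at least `requested_samples` samples.
    return math.ceil(requested_samples / samples_per_epoch)

# Hypothetical per-epoch sample counts, picked to match the log above.
print(num_epochs(112640, 54424))  # train split -> 3 epochs
print(num_epochs(5120, 2859))     # valid split -> 2 epochs
```

The large epoch count on the test split (102) simply reflects a small split being cycled many times to reach the same 5120 requested samples.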
NCCL version 2.10.3+cuda11.4
NCCL version 2.10.3+cuda11.4
NCCL version 2.10.3+cuda11.4
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 00/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 01/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 02/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 03/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 04/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 05/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 06/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 07/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 08/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 09/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 10/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 11/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 12/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 13/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 14/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 15/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 16/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 17/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 18/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 19/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 20/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 21/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 22/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 23/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 24/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 25/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 26/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 27/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 28/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 29/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 30/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Channel 31/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Trees [0] -1/-1/-1->0->-1 [1] -1/-1/-1->0->-1 [2] -1/-1/-1->0->-1 [3] -1/-1/-1->0->-1 [4] -1/-1/-1->0->-1 [5] -1/-1/-1->0->-1 [6] -1/-1/-1->0->-1 [7] -1/-1/-1->0->-1 [8] -1/-1/-1->0->-1 [9] -1/-1/-1->0->-1 [10] -1/-1/-1->0->-1 [11] -1/-1/-1->0->-1 [12] -1/-1/-1->0->-1 [13] -1/-1/-1->0->-1 [14] -1/-1/-1->0->-1 [15] -1/-1/-1->0->-1 [16] -1/-1/-1->0->-1 [17] -1/-1/-1->0->-1 [18] -1/-1/-1->0->-1 [19] -1/-1/-1->0->-1 [20] -1/-1/-1->0->-1 [21] -1/-1/-1->0->-1 [22] -1/-1/-1->0->-1 [23] -1/-1/-1->0->-1 [24] -1/-1/-1->0->-1 [25] -1/-1/-1->0->-1 [26] -1/-1/-1->0->-1 [27] -1/-1/-1->0->-1 [28] -1/-1/-1->0->-1 [29] -1/-1/-1->0->-1 [30] -1/-1/-1->0->-1 [31] -1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Setting affinity for GPU 2 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 00/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 01/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 02/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 03/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 04/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 05/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 06/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 07/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 08/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 09/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 10/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 11/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 12/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 13/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 14/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 15/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 16/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 17/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 18/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 19/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 20/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 21/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 22/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 23/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 24/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 25/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 26/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 27/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 28/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 29/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 30/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Channel 31/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Trees [0] -1/-1/-1->0->-1 [1] -1/-1/-1->0->-1 [2] -1/-1/-1->0->-1 [3] -1/-1/-1->0->-1 [4] -1/-1/-1->0->-1 [5] -1/-1/-1->0->-1 [6] -1/-1/-1->0->-1 [7] -1/-1/-1->0->-1 [8] -1/-1/-1->0->-1 [9] -1/-1/-1->0->-1 [10] -1/-1/-1->0->-1 [11] -1/-1/-1->0->-1 [12] -1/-1/-1->0->-1 [13] -1/-1/-1->0->-1 [14] -1/-1/-1->0->-1 [15] -1/-1/-1->0->-1 [16] -1/-1/-1->0->-1 [17] -1/-1/-1->0->-1 [18] -1/-1/-1->0->-1 [19] -1/-1/-1->0->-1 [20] -1/-1/-1->0->-1 [21] -1/-1/-1->0->-1 [22] -1/-1/-1->0->-1 [23] -1/-1/-1->0->-1 [24] -1/-1/-1->0->-1 [25] -1/-1/-1->0->-1 [26] -1/-1/-1->0->-1 [27] -1/-1/-1->0->-1 [28] -1/-1/-1->0->-1 [29] -1/-1/-1->0->-1 [30] -1/-1/-1->0->-1 [31] -1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Setting affinity for GPU 1 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 00/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 01/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 02/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 03/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 04/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 05/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 06/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 07/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 08/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 09/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 10/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 11/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 12/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 13/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 14/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 15/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 16/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 17/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 18/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 19/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 20/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 21/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 22/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 23/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 24/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 25/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 26/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 27/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 28/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 29/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 30/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Channel 31/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Trees [0] -1/-1/-1->0->-1 [1] -1/-1/-1->0->-1 [2] -1/-1/-1->0->-1 [3] -1/-1/-1->0->-1 [4] -1/-1/-1->0->-1 [5] -1/-1/-1->0->-1 [6] -1/-1/-1->0->-1 [7] -1/-1/-1->0->-1 [8] -1/-1/-1->0->-1 [9] -1/-1/-1->0->-1 [10] -1/-1/-1->0->-1 [11] -1/-1/-1->0->-1 [12] -1/-1/-1->0->-1 [13] -1/-1/-1->0->-1 [14] -1/-1/-1->0->-1 [15] -1/-1/-1->0->-1 [16] -1/-1/-1->0->-1 [17] -1/-1/-1->0->-1 [18] -1/-1/-1->0->-1 [19] -1/-1/-1->0->-1 [20] -1/-1/-1->0->-1 [21] -1/-1/-1->0->-1 [22] -1/-1/-1->0->-1 [23] -1/-1/-1->0->-1 [24] -1/-1/-1->0->-1 [25] -1/-1/-1->0->-1 [26] -1/-1/-1->0->-1 [27] -1/-1/-1->0->-1 [28] -1/-1/-1->0->-1 [29] -1/-1/-1->0->-1 [30] -1/-1/-1->0->-1 [31] -1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Setting affinity for GPU 6 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 00/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 01/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 02/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 03/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 04/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 05/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 06/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 07/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 08/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 09/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 10/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 11/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 12/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 13/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 14/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 15/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 16/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 17/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 18/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 19/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 20/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 21/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 22/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 23/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 24/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 25/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 26/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 27/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 28/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 29/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 30/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Channel 31/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Trees [0] -1/-1/-1->0->-1 [1] -1/-1/-1->0->-1 [2] -1/-1/-1->0->-1 [3] -1/-1/-1->0->-1 [4] -1/-1/-1->0->-1 [5] -1/-1/-1->0->-1 [6] -1/-1/-1->0->-1 [7] -1/-1/-1->0->-1 [8] -1/-1/-1->0->-1 [9] -1/-1/-1->0->-1 [10] -1/-1/-1->0->-1 [11] -1/-1/-1->0->-1 [12] -1/-1/-1->0->-1 [13] -1/-1/-1->0->-1 [14] -1/-1/-1->0->-1 [15] -1/-1/-1->0->-1 [16] -1/-1/-1->0->-1 [17] -1/-1/-1->0->-1 [18] -1/-1/-1->0->-1 [19] -1/-1/-1->0->-1 [20] -1/-1/-1->0->-1 [21] -1/-1/-1->0->-1 [22] -1/-1/-1->0->-1 [23] -1/-1/-1->0->-1 [24] -1/-1/-1->0->-1 [25] -1/-1/-1->0->-1 [26] -1/-1/-1->0->-1 [27] -1/-1/-1->0->-1 [28] -1/-1/-1->0->-1 [29] -1/-1/-1->0->-1 [30] -1/-1/-1->0->-1 [31] -1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Setting affinity for GPU 4 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 00/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 01/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 02/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 03/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 04/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 05/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 06/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 07/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 08/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 09/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 10/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 11/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 12/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 13/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 14/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 15/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 16/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 17/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 18/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 19/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 20/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 21/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 22/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 23/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 24/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 25/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 26/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 27/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 28/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 29/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 30/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Channel 31/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Trees [0] -1/-1/-1->0->-1 [1] -1/-1/-1->0->-1 [2] -1/-1/-1->0->-1 [3] -1/-1/-1->0->-1 [4] -1/-1/-1->0->-1 [5] -1/-1/-1->0->-1 [6] -1/-1/-1->0->-1 [7] -1/-1/-1->0->-1 [8] -1/-1/-1->0->-1 [9] -1/-1/-1->0->-1 [10] -1/-1/-1->0->-1 [11] -1/-1/-1->0->-1 [12] -1/-1/-1->0->-1 [13] -1/-1/-1->0->-1 [14] -1/-1/-1->0->-1 [15] -1/-1/-1->0->-1 [16] -1/-1/-1->0->-1 [17] -1/-1/-1->0->-1 [18] -1/-1/-1->0->-1 [19] -1/-1/-1->0->-1 [20] -1/-1/-1->0->-1 [21] -1/-1/-1->0->-1 [22] -1/-1/-1->0->-1 [23] -1/-1/-1->0->-1 [24] -1/-1/-1->0->-1 [25] -1/-1/-1->0->-1 [26] -1/-1/-1->0->-1 [27] -1/-1/-1->0->-1 [28] -1/-1/-1->0->-1 [29] -1/-1/-1->0->-1 [30] -1/-1/-1->0->-1 [31] -1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Setting affinity for GPU 5 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 00/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 01/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 02/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 03/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 04/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 05/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 06/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 07/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 08/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 09/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 10/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 11/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 12/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 13/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 14/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 15/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 16/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 17/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 18/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 19/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 20/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 21/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 22/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 23/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 24/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 25/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 26/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 27/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 28/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 29/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 30/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Channel 31/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Trees [0] -1/-1/-1->0->-1 [1] -1/-1/-1->0->-1 [2] -1/-1/-1->0->-1 [3] -1/-1/-1->0->-1 [4] -1/-1/-1->0->-1 [5] -1/-1/-1->0->-1 [6] -1/-1/-1->0->-1 [7] -1/-1/-1->0->-1 [8] -1/-1/-1->0->-1 [9] -1/-1/-1->0->-1 [10] -1/-1/-1->0->-1 [11] -1/-1/-1->0->-1 [12] -1/-1/-1->0->-1 [13] -1/-1/-1->0->-1 [14] -1/-1/-1->0->-1 [15] -1/-1/-1->0->-1 [16] -1/-1/-1->0->-1 [17] -1/-1/-1->0->-1 [18] -1/-1/-1->0->-1 [19] -1/-1/-1->0->-1 [20] -1/-1/-1->0->-1 [21] -1/-1/-1->0->-1 [22] -1/-1/-1->0->-1 [23] -1/-1/-1->0->-1 [24] -1/-1/-1->0->-1 [25] -1/-1/-1->0->-1 [26] -1/-1/-1->0->-1 [27] -1/-1/-1->0->-1 [28] -1/-1/-1->0->-1 [29] -1/-1/-1->0->-1 [30] -1/-1/-1->0->-1 [31] -1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Setting affinity for GPU 0 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 00/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 01/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 02/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 03/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 04/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 05/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 06/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 07/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 08/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 09/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 10/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 11/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 12/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 13/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 14/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 15/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 16/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 17/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 18/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 19/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 20/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 21/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 22/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 23/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 24/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 25/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 26/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 27/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 28/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 29/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 30/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Channel 31/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Trees [0] -1/-1/-1->0->-1 [1] -1/-1/-1->0->-1 [2] -1/-1/-1->0->-1 [3] -1/-1/-1->0->-1 [4] -1/-1/-1->0->-1 [5] -1/-1/-1->0->-1 [6] -1/-1/-1->0->-1 [7] -1/-1/-1->0->-1 [8] -1/-1/-1->0->-1 [9] -1/-1/-1->0->-1 [10] -1/-1/-1->0->-1 [11] -1/-1/-1->0->-1 [12] -1/-1/-1->0->-1 [13] -1/-1/-1->0->-1 [14] -1/-1/-1->0->-1 [15] -1/-1/-1->0->-1 [16] -1/-1/-1->0->-1 [17] -1/-1/-1->0->-1 [18] -1/-1/-1->0->-1 [19] -1/-1/-1->0->-1 [20] -1/-1/-1->0->-1 [21] -1/-1/-1->0->-1 [22] -1/-1/-1->0->-1 [23] -1/-1/-1->0->-1 [24] -1/-1/-1->0->-1 [25] -1/-1/-1->0->-1 [26] -1/-1/-1->0->-1 [27] -1/-1/-1->0->-1 [28] -1/-1/-1->0->-1 [29] -1/-1/-1->0->-1 [30] -1/-1/-1->0->-1 [31] -1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Setting affinity for GPU 7 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 00/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 01/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 02/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 03/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 04/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 05/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 06/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 07/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 08/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 09/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 10/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 11/32 : 0
iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 12/32 :
0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 13/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 14/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 15/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 16/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 17/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 18/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 19/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 20/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 21/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 22/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 23/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 24/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 25/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 26/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 27/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 28/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 29/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 30/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Channel 31/32 : 0 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Trees [0] -1/-1/-1->0->-1 [1] -1/-1/-1->0->-1 [2] -1/-1/-1->0->-1 [3] -1/-1/-1->0->-1 [4] -1/-1/-1->0->-1 [5] -1/-1/-1->0->-1 [6] -1/-1/-1->0->-1 [7] -1/-1/-1->0->-1 [8] -1/-1/-1->0->-1 [9] -1/-1/-1->0->-1 [10] -1/-1/-1->0->-1 [11] -1/-1/-1->0->-1 [12] -1/-1/-1->0->-1 [13] -1/-1/-1->0->-1 [14] -1/-1/-1->0->-1 [15] -1/-1/-1->0->-1 [16] -1/-1/-1->0->-1 [17] -1/-1/-1->0->-1 [18] -1/-1/-1->0->-1 [19] -1/-1/-1->0->-1 [20] -1/-1/-1->0->-1 [21] -1/-1/-1->0->-1 [22] -1/-1/-1->0->-1 [23] -1/-1/-1->0->-1 [24] -1/-1/-1->0->-1 [25] -1/-1/-1->0->-1 [26] -1/-1/-1->0->-1 [27] -1/-1/-1->0->-1 [28] -1/-1/-1->0->-1 [29] -1/-1/-1->0->-1 [30] -1/-1/-1->0->-1 [31] 
-1/-1/-1->0->-1 iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Setting affinity for GPU 3 to 03ff,ffffffff iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO 32 coll channels, 32 p2p channels, 32 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO 32 coll channels, 32 p2p channels, 32 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO comm 0x7fd640008fb0 rank 0 nranks 1 cudaDev 2 busId 67010 - Init COMPLETE iv-ybpu7pvmiu5m57lh5kdd:71298:71695 [1] NCCL INFO comm 0x7f7364008fb0 rank 0 nranks 1 cudaDev 1 busId 65020 - Init COMPLETE iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO 32 coll channels, 32 p2p channels, 32 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71303:71682 [6] NCCL INFO comm 0x7fa004008fb0 rank 0 nranks 1 cudaDev 6 busId 6b010 - Init COMPLETE iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO 32 coll channels, 32 p2p channels, 32 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71301:71679 [4] NCCL INFO comm 0x7f61f0008fb0 rank 0 nranks 1 cudaDev 4 busId 69010 - Init COMPLETE iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO 32 coll channels, 32 p2p channels, 32 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71302:71680 [5] NCCL INFO comm 0x7f076c008fb0 rank 0 nranks 1 cudaDev 5 busId 69020 - Init COMPLETE 
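Every communicator that finishes setup emits a `comm ... - Init COMPLETE` record like the ones above. As a reading aid, a minimal sketch (assuming only this log's textual format, not any NCCL API) that extracts the rank, size, device, and bus id from such records in a captured log:

```python
import re

# Matches records like:
#   "... NCCL INFO comm 0x7fd640008fb0 rank 0 nranks 1 cudaDev 2 busId 67010 - Init COMPLETE"
INIT_RE = re.compile(
    r"comm (0x[0-9a-f]+) rank (\d+) nranks (\d+) cudaDev (\d+) busId ([0-9a-f]+) - Init COMPLETE"
)

def parse_init_complete(line):
    """Return (comm, rank, nranks, cudaDev, busId) or None for non-matching lines."""
    m = INIT_RE.search(line)
    if m is None:
        return None
    comm, rank, nranks, dev, bus = m.groups()
    return comm, int(rank), int(nranks), int(dev), bus

line = ("iv-ybpu7pvmiu5m57lh5kdd:71299:71681 [2] NCCL INFO comm 0x7fd640008fb0 "
        "rank 0 nranks 1 cudaDev 2 busId 67010 - Init COMPLETE")
print(parse_init_complete(line))  # ('0x7fd640008fb0', 0, 1, 2, '67010')
```

Running this over the whole log makes it easy to check that every expected communicator (here, eight single-rank comms, then the two-rank groups below) actually completed init.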
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO 32 coll channels, 32 p2p channels, 32 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71297:71685 [0] NCCL INFO comm 0x7f9824008fb0 rank 0 nranks 1 cudaDev 0 busId 65010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO 32 coll channels, 32 p2p channels, 32 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71304:71687 [7] NCCL INFO comm 0x7f399c008fb0 rank 0 nranks 1 cudaDev 7 busId 6b020 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO 32 coll channels, 32 p2p channels, 32 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71300:71692 [3] NCCL INFO comm 0x7f35e4008fb0 rank 0 nranks 1 cudaDev 3 busId 67020 - Init COMPLETE
[W pthreadpool-cpp.cc:99] Warning: Leaking Caffe2 thread-pool after fork. (function pthreadpool)
time (ms) | model-and-optimizer-setup: 342.90 | train/valid/test-data-iterators-setup: 1803.87
[after dataloaders are built] datetime: 2022-07-05 16:29:10
done with setup ...
training ...
[before the start of training step] datetime: 2022-07-05 16:29:10
/dataset/xyn/Megatron-LM/megatron/model/transformer.py:536: UserWarning: AutoNonVariableTypeMode is deprecated and will be removed in 1.10 release. For kernel implementations please use AutoDispatchBelowADInplaceOrView instead, If you are looking for a user facing API to enable running your inference-only workload, please use c10::InferenceMode. Using AutoDispatchBelowADInplaceOrView in user code is under risk of producing silent wrong result in some edge cases. See Note [AutoDispatchBelowAutograd] for more details. (Triggered internally at /opt/pytorch/pytorch/aten/src/ATen/core/LegacyTypeDispatch.h:74.)
  output = bias_dropout_add_func(
iv-ybpu7pvmiu5m57lh5kdd:71298:73929 [1] NCCL INFO Channel 00/04 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71300:73930 [3] NCCL INFO Trees [0] -1/-1/-1->1->0 [1] -1/-1/-1->1->0 [2] -1/-1/-1->1->0 [3] -1/-1/-1->1->0
iv-ybpu7pvmiu5m57lh5kdd:71298:73929 [1] NCCL INFO Channel 01/04 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71300:73930 [3] NCCL INFO Setting affinity for GPU 3 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71298:73929 [1] NCCL INFO Channel 02/04 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71298:73929 [1] NCCL INFO Channel 03/04 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71298:73929 [1] NCCL INFO Trees [0] 1/-1/-1->0->-1 [1] 1/-1/-1->0->-1 [2] 1/-1/-1->0->-1 [3] 1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71298:73929 [1] NCCL INFO Setting affinity for GPU 1 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71297:73934 [0] NCCL INFO Channel 00/04 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71299:73935 [2] NCCL INFO Trees [0] -1/-1/-1->1->0 [1] -1/-1/-1->1->0 [2] -1/-1/-1->1->0 [3] -1/-1/-1->1->0
iv-ybpu7pvmiu5m57lh5kdd:71297:73934 [0] NCCL INFO Channel 01/04 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71299:73935 [2] NCCL INFO Setting affinity for GPU 2 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71297:73934 [0] NCCL INFO Channel 02/04 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71297:73934 [0] NCCL INFO Channel 03/04 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71297:73934 [0] NCCL INFO Trees [0] 1/-1/-1->0->-1 [1] 1/-1/-1->0->-1 [2] 1/-1/-1->0->-1 [3] 1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71297:73934 [0] NCCL INFO Setting affinity for GPU 0 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71298:73929 [1] NCCL INFO Channel 00 : 0[65020] -> 1[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71300:73930 [3] NCCL INFO Channel 00 : 1[67020] -> 0[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71298:73929 [1] NCCL INFO Channel 01 : 0[65020] -> 1[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71300:73930 [3] NCCL INFO Channel 01 : 1[67020] -> 0[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71298:73929 [1] NCCL INFO Channel 02 : 0[65020] -> 1[67020] via P2P/IPC
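The two-rank communicators being built here are consistent with the run's parallel layout from the launch banner (world size 8, tensor-model-parallel 1, pipeline-model-parallel 4, data-parallel 2). As a hedged sketch, assuming the grid ordering Megatron-LM's mpu conventionally uses (rank = pp_rank * dp * tp + dp_rank * tp + tp_rank; an assumption, not read from this log), the groups can be enumerated in plain Python:

```python
def parallel_groups(world_size=8, tp=1, pp=4, dp=2):
    """Enumerate data-parallel and pipeline-parallel rank groups under the
    assumed ordering rank = p * (dp * tp) + d * tp + t."""
    assert world_size == tp * pp * dp
    dp_groups = [[p * dp * tp + d * tp + t for d in range(dp)]
                 for p in range(pp) for t in range(tp)]
    pp_groups = [[p * dp * tp + d * tp + t for p in range(pp)]
                 for d in range(dp) for t in range(tp)]
    return dp_groups, pp_groups

dp_groups, pp_groups = parallel_groups()
print(dp_groups)  # [[0, 1], [2, 3], [4, 5], [6, 7]]
print(pp_groups)  # [[0, 2, 4, 6], [1, 3, 5, 7]]
```

Under that assumption the P2P pairs seen in the records above (e.g. busIds 65020<->67020, 69010<->6b010) line up with adjacent ranks inside the two pipeline groups.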
iv-ybpu7pvmiu5m57lh5kdd:71300:73930 [3] NCCL INFO Channel 02 : 1[67020] -> 0[65020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71298:73929 [1] NCCL INFO Channel 03 : 0[65020] -> 1[67020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:73930 [3] NCCL INFO Channel 03 : 1[67020] -> 0[65020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71297:73934 [0] NCCL INFO Channel 00 : 0[65010] -> 1[67010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:73935 [2] NCCL INFO Channel 00 : 1[67010] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71297:73934 [0] NCCL INFO Channel 01 : 0[65010] -> 1[67010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:73935 [2] NCCL INFO Channel 01 : 1[67010] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71297:73934 [0] NCCL INFO Channel 02 : 0[65010] -> 1[67010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:73935 [2] NCCL INFO Channel 02 : 1[67010] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:73930 [3] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71300:73930 [3] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71300:73930 [3] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71300:73930 [3] NCCL INFO 4 coll channels, 4 p2p channels, 4 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71298:73929 [1] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71298:73929 [1] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71298:73929 [1] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71298:73929 [1] NCCL INFO 4 coll channels, 4 p2p channels, 4 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71297:73934 [0] NCCL INFO Channel 03 : 0[65010] -> 1[67010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71299:73935 [2] NCCL INFO Channel 03 : 1[67010] -> 0[65010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71300:73930 [3] NCCL INFO comm 0x7f35a4008fb0 rank 1 nranks 2 cudaDev 3 busId 67020 - Init COMPLETE iv-ybpu7pvmiu5m57lh5kdd:71298:73929 [1] NCCL INFO comm 0x7f71fc008fb0 rank 0 nranks 2 cudaDev 1 busId 65020 - Init COMPLETE 
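The AutoNonVariableTypeMode UserWarning recurs once per rank and per callsite and adds a lot of noise to the log. A minimal sketch (stdlib `warnings` only, no PyTorch required) of silencing it by message pattern, where the demo warning strings are stand-ins for the real emissions:

```python
import warnings

# "message" is a regex matched against the start of the warning text;
# applying this filter once at startup drops every later emission of it.
warnings.filterwarnings("ignore", message="AutoNonVariableTypeMode is deprecated")

# Demonstration: the deprecation text is dropped, anything else still surfaces.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    warnings.filterwarnings("ignore", message="AutoNonVariableTypeMode is deprecated")
    warnings.warn("AutoNonVariableTypeMode is deprecated and will be removed", UserWarning)
    warnings.warn("unrelated warning", UserWarning)

print([str(w.message) for w in caught])  # ['unrelated warning']
```

Filtering by message keeps other, potentially meaningful UserWarnings visible, unlike a blanket `-W ignore`.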
iv-ybpu7pvmiu5m57lh5kdd:71298:71298 [1] NCCL INFO Launch mode Parallel
iv-ybpu7pvmiu5m57lh5kdd:71299:73935 [2] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71299:73935 [2] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71299:73935 [2] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71299:73935 [2] NCCL INFO 4 coll channels, 4 p2p channels, 4 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71297:73934 [0] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71297:73934 [0] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71297:73934 [0] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71297:73934 [0] NCCL INFO 4 coll channels, 4 p2p channels, 4 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71299:73935 [2] NCCL INFO comm 0x7fd610008fb0 rank 1 nranks 2 cudaDev 2 busId 67010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71297:73934 [0] NCCL INFO comm 0x7f96b8008fb0 rank 0 nranks 2 cudaDev 0 busId 65010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71297:71297 [0] NCCL INFO Launch mode Parallel
iv-ybpu7pvmiu5m57lh5kdd:71302:73944 [5] NCCL INFO Trees [0] -1/-1/-1->1->0 [1] -1/-1/-1->1->0
iv-ybpu7pvmiu5m57lh5kdd:71300:73943 [3] NCCL INFO Channel 00/02 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71300:73943 [3] NCCL INFO Channel 01/02 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71302:73944 [5] NCCL INFO Setting affinity for GPU 5 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71300:73943 [3] NCCL INFO Trees [0] 1/-1/-1->0->-1 [1] 1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71300:73943 [3] NCCL INFO Setting affinity for GPU 3 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71299:73948 [2] NCCL INFO Channel 00/02 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71301:73949 [4] NCCL INFO Trees [0] -1/-1/-1->1->0 [1] -1/-1/-1->1->0
iv-ybpu7pvmiu5m57lh5kdd:71299:73948 [2] NCCL INFO Channel 01/02 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71301:73949 [4] NCCL INFO Setting affinity for GPU 4 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71299:73948 [2] NCCL INFO Trees [0] 1/-1/-1->0->-1 [1] 1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71299:73948 [2] NCCL INFO Setting affinity for GPU 2 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71300:73943 [3] NCCL INFO Channel 00 : 0[67020] -> 1[69020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71302:73944 [5] NCCL INFO Channel 00 : 1[69020] -> 0[67020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71300:73943 [3] NCCL INFO Channel 01 : 0[67020] -> 1[69020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71302:73944 [5] NCCL INFO Channel 01 : 1[69020] -> 0[67020] via direct shared memory
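The launcher banner at the top of this log notes that `--use_env` is deprecated and workers should read their local rank from the environment instead. A minimal sketch of that pattern (the `LOCAL_RANK` value set below is simulated for demonstration; in a real run torchrun / torch.distributed.run exports it):

```python
import os

def get_local_rank(default=0):
    """Read the local rank that torchrun / torch.distributed.run exports."""
    return int(os.environ.get("LOCAL_RANK", default))

os.environ["LOCAL_RANK"] = "3"  # simulate a launched worker for demonstration
local_rank = get_local_rank()
print(local_rank)  # 3
# In a real worker you would then bind the device before init, e.g.:
#   torch.cuda.set_device(local_rank)
#   torch.distributed.init_process_group(backend="nccl")
```

Reading the env var instead of parsing `--local_rank` keeps the entrypoint compatible with both the old launcher and torch.distributed.run.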
iv-ybpu7pvmiu5m57lh5kdd:71301:73949 [4] NCCL INFO Channel 00 : 1[69010] -> 0[67010] via direct shared memory iv-ybpu7pvmiu5m57lh5kdd:71299:73948 [2] NCCL INFO Channel 00 : 0[67010] -> 1[69010] via direct shared memory iv-ybpu7pvmiu5m57lh5kdd:71301:73949 [4] NCCL INFO Channel 01 : 1[69010] -> 0[67010] via direct shared memory iv-ybpu7pvmiu5m57lh5kdd:71299:73948 [2] NCCL INFO Channel 01 : 0[67010] -> 1[69010] via direct shared memory iv-ybpu7pvmiu5m57lh5kdd:71300:73943 [3] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71300:73943 [3] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71300:73943 [3] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71300:73943 [3] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71302:73944 [5] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71302:73944 [5] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71302:73944 [5] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71302:73944 [5] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71300:73943 [3] NCCL INFO comm 0x7f3480008fb0 rank 0 nranks 2 cudaDev 3 busId 67020 - Init COMPLETE iv-ybpu7pvmiu5m57lh5kdd:71302:73944 [5] NCCL INFO comm 0x7f0734008fb0 rank 1 nranks 2 cudaDev 5 busId 69020 - Init COMPLETE iv-ybpu7pvmiu5m57lh5kdd:71300:71300 [3] NCCL INFO Launch mode Parallel iv-ybpu7pvmiu5m57lh5kdd:71301:73949 [4] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71301:73949 [4] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71301:73949 [4] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71301:73949 [4] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71299:73948 [2] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71299:73948 [2] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71299:73948 [2] NCCL INFO threadThresholds 8/8/64 | 
16/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71299:73948 [2] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71301:73949 [4] NCCL INFO comm 0x7f61b8008fb0 rank 1 nranks 2 cudaDev 4 busId 69010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71299:73948 [2] NCCL INFO comm 0x7fd4d4008fb0 rank 0 nranks 2 cudaDev 2 busId 67010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71299:71299 [2] NCCL INFO Launch mode Parallel
iv-ybpu7pvmiu5m57lh5kdd:71302:73957 [5] NCCL INFO Channel 00/04 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71304:73958 [7] NCCL INFO Trees [0] 0/-1/-1->1->-1 [1] 0/-1/-1->1->-1 [2] 0/-1/-1->1->-1 [3] 0/-1/-1->1->-1
iv-ybpu7pvmiu5m57lh5kdd:71302:73957 [5] NCCL INFO Channel 01/04 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71304:73958 [7] NCCL INFO Setting affinity for GPU 7 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71302:73957 [5] NCCL INFO Channel 02/04 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71302:73957 [5] NCCL INFO Channel 03/04 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71302:73957 [5] NCCL INFO Trees [0] -1/-1/-1->0->1 [1] -1/-1/-1->0->1 [2] -1/-1/-1->0->1 [3] -1/-1/-1->0->1
iv-ybpu7pvmiu5m57lh5kdd:71302:73957 [5] NCCL INFO Setting affinity for GPU 5 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71301:73960 [4] NCCL INFO Channel 00/04 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71303:73961 [6] NCCL INFO Trees [0] 0/-1/-1->1->-1 [1] 0/-1/-1->1->-1 [2] 0/-1/-1->1->-1 [3] 0/-1/-1->1->-1
iv-ybpu7pvmiu5m57lh5kdd:71301:73960 [4] NCCL INFO Channel 01/04 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71301:73960 [4] NCCL INFO Channel 02/04 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71303:73961 [6] NCCL INFO Setting affinity for GPU 6 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71301:73960 [4] NCCL INFO Channel 03/04 : 0 1
iv-ybpu7pvmiu5m57lh5kdd:71301:73960 [4] NCCL INFO Trees [0] -1/-1/-1->0->1 [1] -1/-1/-1->0->1 [2] -1/-1/-1->0->1 [3] -1/-1/-1->0->1
iv-ybpu7pvmiu5m57lh5kdd:71301:73960 [4] NCCL INFO Setting affinity for GPU 4 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71302:73957 [5] NCCL INFO Channel 00 : 0[69020] -> 1[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:73957 [5] NCCL INFO Channel 01 : 0[69020] -> 1[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:73958 [7] NCCL INFO Channel 00 : 1[6b020] -> 0[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:73958 [7] NCCL INFO Channel 01 : 1[6b020] -> 0[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:73957 [5] NCCL INFO Channel 02 : 0[69020] ->
1[6b020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71304:73958 [7] NCCL INFO Channel 02 : 1[6b020] -> 0[69020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71302:73957 [5] NCCL INFO Channel 03 : 0[69020] -> 1[6b020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71304:73958 [7] NCCL INFO Channel 03 : 1[6b020] -> 0[69020] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71301:73960 [4] NCCL INFO Channel 00 : 0[69010] -> 1[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71303:73961 [6] NCCL INFO Channel 00 : 1[6b010] -> 0[69010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71301:73960 [4] NCCL INFO Channel 01 : 0[69010] -> 1[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71303:73961 [6] NCCL INFO Channel 01 : 1[6b010] -> 0[69010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71301:73960 [4] NCCL INFO Channel 02 : 0[69010] -> 1[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71303:73961 [6] NCCL INFO Channel 02 : 1[6b010] -> 0[69010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71301:73960 [4] NCCL INFO Channel 03 : 0[69010] -> 1[6b010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71303:73961 [6] NCCL INFO Channel 03 : 1[6b010] -> 0[69010] via P2P/IPC iv-ybpu7pvmiu5m57lh5kdd:71302:73957 [5] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71302:73957 [5] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71302:73957 [5] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71302:73957 [5] NCCL INFO 4 coll channels, 4 p2p channels, 4 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71304:73958 [7] NCCL INFO Connected all rings iv-ybpu7pvmiu5m57lh5kdd:71304:73958 [7] NCCL INFO Connected all trees iv-ybpu7pvmiu5m57lh5kdd:71304:73958 [7] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512 iv-ybpu7pvmiu5m57lh5kdd:71304:73958 [7] NCCL INFO 4 coll channels, 4 p2p channels, 4 p2p channels per peer iv-ybpu7pvmiu5m57lh5kdd:71302:73957 [5] NCCL INFO comm 0x7f0610008fb0 rank 0 nranks 2 cudaDev 5 busId 69020 - Init COMPLETE iv-ybpu7pvmiu5m57lh5kdd:71304:73958 [7] NCCL INFO comm 0x7f3968008fb0 rank 1 nranks 2 cudaDev 7 busId 6b020 - Init 
COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71302:71302 [5] NCCL INFO Launch mode Parallel
iv-ybpu7pvmiu5m57lh5kdd:71301:73960 [4] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71301:73960 [4] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71301:73960 [4] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71301:73960 [4] NCCL INFO 4 coll channels, 4 p2p channels, 4 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71303:73961 [6] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71303:73961 [6] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71303:73961 [6] NCCL INFO threadThresholds 8/8/64 | 16/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71303:73961 [6] NCCL INFO 4 coll channels, 4 p2p channels, 4 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71301:73960 [4] NCCL INFO comm 0x7f6094008fb0 rank 0 nranks 2 cudaDev 4 busId 69010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71303:73961 [6] NCCL INFO comm 0x7f9fd0008fb0 rank 1 nranks 2 cudaDev 6 busId 6b010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71301:71301 [4] NCCL INFO Launch mode Parallel
iv-ybpu7pvmiu5m57lh5kdd:71302:74068 [5] NCCL INFO Trees [0] -1/-1/-1->2->3 [1] -1/-1/-1->2->3
iv-ybpu7pvmiu5m57lh5kdd:71302:74068 [5] NCCL INFO Setting affinity for GPU 5 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71304:74066 [7] NCCL INFO Trees [0] 2/-1/-1->3->1 [1] 2/-1/-1->3->1
iv-ybpu7pvmiu5m57lh5kdd:71304:74066 [7] NCCL INFO Setting affinity for GPU 7 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71298:74062 [1] NCCL INFO Channel 00/02 : 0 1 3 2
iv-ybpu7pvmiu5m57lh5kdd:71298:74062 [1] NCCL INFO Channel 01/02 : 0 1 3 2
iv-ybpu7pvmiu5m57lh5kdd:71300:74064 [3] NCCL INFO Trees [0] 3/-1/-1->1->0 [1] 3/-1/-1->1->0
iv-ybpu7pvmiu5m57lh5kdd:71298:74062 [1] NCCL INFO Trees [0] 1/-1/-1->0->-1 [1] 1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71300:74064 [3] NCCL INFO Setting affinity for GPU 3 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71298:74062 [1] NCCL INFO Setting affinity for GPU 1 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71298:74062 [1] NCCL INFO Channel 00 : 0[65020] -> 1[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:74068 [5] NCCL INFO Channel 00 : 2[69020] -> 0[65020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71298:74062 [1] NCCL INFO Channel 01 : 0[65020] -> 1[67020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:74068 [5] NCCL INFO Channel 01 : 2[69020] -> 0[65020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71301:74067 [4] NCCL INFO Trees [0] -1/-1/-1->2->3 [1] -1/-1/-1->2->3
iv-ybpu7pvmiu5m57lh5kdd:71303:74069 [6] NCCL INFO Trees [0] 2/-1/-1->3->1 [1]
2/-1/-1->3->1
iv-ybpu7pvmiu5m57lh5kdd:71303:74069 [6] NCCL INFO Setting affinity for GPU 6 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71301:74067 [4] NCCL INFO Setting affinity for GPU 4 to 0fffff,fffffc00,00000000
iv-ybpu7pvmiu5m57lh5kdd:71299:74065 [2] NCCL INFO Trees [0] 3/-1/-1->1->0 [1] 3/-1/-1->1->0
iv-ybpu7pvmiu5m57lh5kdd:71297:74063 [0] NCCL INFO Channel 00/02 : 0 1 3 2
iv-ybpu7pvmiu5m57lh5kdd:71299:74065 [2] NCCL INFO Setting affinity for GPU 2 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71297:74063 [0] NCCL INFO Channel 01/02 : 0 1 3 2
iv-ybpu7pvmiu5m57lh5kdd:71297:74063 [0] NCCL INFO Trees [0] 1/-1/-1->0->-1 [1] 1/-1/-1->0->-1
iv-ybpu7pvmiu5m57lh5kdd:71297:74063 [0] NCCL INFO Setting affinity for GPU 0 to 03ff,ffffffff
iv-ybpu7pvmiu5m57lh5kdd:71300:74064 [3] NCCL INFO Channel 00 : 1[67020] -> 3[6b020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71297:74063 [0] NCCL INFO Channel 00 : 0[65010] -> 1[67010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71300:74064 [3] NCCL INFO Channel 01 : 1[67020] -> 3[6b020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71301:74067 [4] NCCL INFO Channel 00 : 2[69010] -> 0[65010] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71297:74063 [0] NCCL INFO Channel 01 : 0[65010] -> 1[67010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:74067 [4] NCCL INFO Channel 01 : 2[69010] -> 0[65010] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71299:74065 [2] NCCL INFO Channel 00 : 1[67010] -> 3[6b010] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71299:74065 [2] NCCL INFO Channel 01 : 1[67010] -> 3[6b010] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71304:74066 [7] NCCL INFO Channel 00 : 3[6b020] -> 2[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71300:74064 [3] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71304:74066 [7] NCCL INFO Channel 01 : 3[6b020] -> 2[69020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71298:74062 [1] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71303:74069 [6] NCCL INFO Channel 00 : 3[6b010] -> 2[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:74065 [2] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71303:74069 [6] NCCL INFO Channel 01 : 3[6b010] -> 2[69010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71297:74063 [0] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71302:74068 [5] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71304:74066 [7] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71302:74068 [5] NCCL INFO Channel 00 : 2[69020] -> 3[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:74068 [5] NCCL INFO Channel 01 : 2[69020] -> 3[6b020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71302:74068 [5] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71302:74068 [5] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71302:74068 [5] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71304:74066 [7] NCCL INFO Channel 00 : 3[6b020] -> 1[67020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71301:74067 [4] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71303:74069 [6] NCCL INFO Connected all rings
iv-ybpu7pvmiu5m57lh5kdd:71304:74066 [7] NCCL INFO Channel 01 : 3[6b020] -> 1[67020] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71301:74067 [4] NCCL INFO Channel 00 : 2[69010] -> 3[6b010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:74067 [4] NCCL INFO Channel 01 : 2[69010] -> 3[6b010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71304:74066 [7] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71304:74066 [7] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71304:74066 [7] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71300:74064 [3] NCCL INFO Channel 00 : 1[67020] -> 0[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71301:74067 [4] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71301:74067 [4] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71301:74067 [4] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71300:74064 [3] NCCL INFO Channel 01 : 1[67020] -> 0[65020] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71303:74069 [6] NCCL INFO Channel 00 : 3[6b010] -> 1[67010] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71303:74069 [6] NCCL INFO Channel 01 : 3[6b010] -> 1[67010] via direct shared memory
iv-ybpu7pvmiu5m57lh5kdd:71298:74062 [1] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71298:74062 [1] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71298:74062 [1] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71300:74064 [3] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71300:74064 [3] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71300:74064 [3] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71304:74066 [7] NCCL INFO comm 0x7f35cc008fb0 rank 3 nranks 4 cudaDev 7 busId 6b020 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71300:74064 [3] NCCL INFO comm 0x7f33d8008fb0 rank 1 nranks 4 cudaDev 3 busId 67020 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71298:74062 [1] NCCL INFO comm 0x7f70f0008fb0 rank 0 nranks 4 cudaDev 1 busId 65020 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71302:74068 [5] NCCL INFO comm 0x7f0534008fb0 rank 2 nranks 4 cudaDev 5 busId 69020 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71298:71298 [1] NCCL INFO Launch mode Parallel
iv-ybpu7pvmiu5m57lh5kdd:71299:74065 [2] NCCL INFO Channel 00 : 1[67010] -> 0[65010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71299:74065 [2] NCCL INFO Channel 01 : 1[67010] -> 0[65010] via P2P/IPC
iv-ybpu7pvmiu5m57lh5kdd:71303:74069 [6] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71303:74069 [6] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71303:74069 [6] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71297:74063 [0] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71297:74063 [0] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71297:74063 [0] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71299:74065 [2] NCCL INFO Connected all trees
iv-ybpu7pvmiu5m57lh5kdd:71299:74065 [2] NCCL INFO threadThresholds 8/8/64 | 32/8/64 | 8/8/512
iv-ybpu7pvmiu5m57lh5kdd:71299:74065 [2] NCCL INFO 2 coll channels, 2 p2p channels, 2 p2p channels per peer
iv-ybpu7pvmiu5m57lh5kdd:71297:74063 [0] NCCL INFO comm 0x7f95ac008fb0 rank 0 nranks 4 cudaDev 0 busId 65010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71303:74069 [6] NCCL INFO comm 0x7f9c30008fb0 rank 3 nranks 4 cudaDev 6 busId 6b010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71299:74065 [2] NCCL INFO comm 0x7fd470008fb0 rank 1 nranks 4 cudaDev 2 busId 67010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71301:74067 [4] NCCL INFO comm 0x7f5fb8008fb0 rank 2 nranks 4 cudaDev 4 busId 69010 - Init COMPLETE
iv-ybpu7pvmiu5m57lh5kdd:71297:71297 [0] NCCL INFO Launch mode Parallel
[Rank 2] (after 100 iterations) memory (MB) | allocated: 1569.52734375 | max allocated: 6948.11474609375 | reserved: 8152.0 | max reserved: 8152.0
[Rank 4] (after 100 iterations) memory (MB) | allocated: 1569.52734375 | max allocated: 6563.1142578125 | reserved: 7896.0 | max reserved: 7896.0
iteration 100/ 220 | consumed samples: 51200 | elapsed time per iteration (ms): 7911.2 | learning rate: 3.984E-06 | tpt: 64.7 samples/s | global batch size: 512 | lm loss: 9.996378E+00 | loss scale: 262144.0 | grad norm: 1.758 | number of skipped iterations: 15 | number of nan iterations: 0 |
[Rank 6] (after 100 iterations) memory (MB) | allocated: 2552.5673828125 | max allocated: 12015.5166015625 | reserved: 14944.0 | max reserved: 14944.0
[Rank 0] (after 100 iterations) memory (MB) | allocated: 2444.02734375 | max allocated: 7863.146484375 | reserved: 9164.0 | max reserved: 9164.0
time (ms) | forward-compute: 2158.09 | forward-recv: 493.80 | backward-compute: 3998.88 | backward-send: 1.68 | backward-send-forward-recv: 24.17 | backward-params-all-reduce: 14.12 | backward-embedding-all-reduce: 1200.09 | optimizer-copy-to-main-grad: 1.48 | optimizer-unscale-and-check-inf: 7.75 | optimizer-clip-main-grad: 2.18 | optimizer-copy-main-to-model-params: 1.33 | optimizer: 17.21 | batch-generator: 8.32
timestamp, name, driver_version, utilization.gpu [%], utilization.memory [%], memory.total [MiB], memory.free [MiB], memory.used [MiB]
2022/07/05 16:42:29.345, Tesla V100-SXM2-32GB, 470.57.02, 59 %, 27 %, 32510 MiB, 21714 MiB, 10796 MiB
2022/07/05 16:42:29.348, Tesla V100-SXM2-32GB, 470.57.02, 87 %, 51 %, 32510 MiB, 21734 MiB, 10776 MiB
2022/07/05 16:42:29.350, Tesla V100-SXM2-32GB, 470.57.02, 44 %, 3 %, 32510 MiB, 22814 MiB, 9696 MiB
2022/07/05 16:42:29.350, Tesla V100-SXM2-32GB, 470.57.02, 15 %, 3 %, 32510 MiB, 22770 MiB, 9740 MiB
2022/07/05 16:42:29.351, Tesla V100-SXM2-32GB, 470.57.02, 16 %, 3 %, 32510 MiB, 23030 MiB, 9480 MiB
2022/07/05 16:42:29.353, Tesla V100-SXM2-32GB, 470.57.02, 91 %, 3 %, 32510 MiB, 23006 MiB, 9504 MiB
2022/07/05 16:42:29.357, Tesla V100-SXM2-32GB, 470.57.02, 10 %, 7 %, 32510 MiB, 15910 MiB, 16600 MiB
2022/07/05 16:42:29.363, Tesla V100-SXM2-32GB, 470.57.02, 42 %, 7 %, 32510 MiB, 15974 MiB, 16536 MiB
iteration 200/ 220 | consumed samples: 102400 | elapsed time per iteration (ms): 7838.2 | learning rate: 8.672E-06 | tpt: 65.3 samples/s | global batch size: 512 | lm loss: 8.663109E+00 | loss scale: 262144.0 | grad norm: 2.817 | number of skipped iterations: 0 | number of nan iterations: 0 |
time (ms) | forward-compute: 2145.35 | forward-recv: 448.76 | backward-compute: 3995.38 | backward-send: 1.68 | backward-send-forward-recv: 15.13 | backward-params-all-reduce: 14.16 | backward-embedding-all-reduce: 1200.04 | optimizer-copy-to-main-grad: 1.48 | optimizer-unscale-and-check-inf: 1.66 | optimizer-clip-main-grad: 2.55 | optimizer-copy-main-to-model-params: 1.56 | optimizer: 12.47 | batch-generator: 8.18
[after training is done] datetime: 2022-07-05 16:58:01
------------------------------------------------------------------------------------------------------------------
 validation loss at the end of training for val data | lm loss value: 7.763425E+00 | lm loss PPL: 2.352949E+03 |
------------------------------------------------------------------------------------------------------------------
-------------------------------------------------------------------------------------------------------------------
 validation loss at the end of training for test data | lm loss value: 7.594659E+00 | lm loss PPL: 1.987552E+03 |
-------------------------------------------------------------------------------------------------------------------
INFO:torch.distributed.elastic.agent.server.api:[default] worker group successfully finished. Waiting 300 seconds for other agents to finish.
INFO:torch.distributed.elastic.agent.server.api:Local worker group finished (SUCCEEDED). Waiting 300 seconds for other agents to finish
/opt/conda/lib/python3.8/site-packages/torch/distributed/elastic/utils/store.py:70: FutureWarning: This is an experimental API and will be changed in future.
  warnings.warn(
INFO:torch.distributed.elastic.agent.server.api:Done waiting for other agents.
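The per-iteration training lines above use a fixed `key: value |` layout, which makes them easy to post-process (e.g. to plot loss or throughput). A minimal sketch, with a hypothetical helper name `parse_iteration_line`, that turns one such line into a dict:

```python
import re

def parse_iteration_line(line):
    """Parse one 'iteration N/ M | key: value | ...' log line into a dict.

    Hypothetical helper for illustration; values are kept as strings
    except the iteration counters.
    """
    m = re.match(r"\s*iteration\s+(\d+)/\s*(\d+)\s*\|(.*)", line)
    if not m:
        return None
    fields = {
        "iteration": int(m.group(1)),
        "total_iterations": int(m.group(2)),
    }
    # Remaining fields are '|'-separated 'key: value' pairs.
    for part in m.group(3).split("|"):
        part = part.strip()
        if not part or ":" not in part:
            continue
        key, value = part.split(":", 1)
        fields[key.strip()] = value.strip()
    return fields

# Example: the iteration-100 line from the log above.
line = ("iteration 100/ 220 | consumed samples: 51200 | "
        "elapsed time per iteration (ms): 7911.2 | learning rate: 3.984E-06 | "
        "tpt: 64.7 samples/s | global batch size: 512 | lm loss: 9.996378E+00 | "
        "loss scale: 262144.0 | grad norm: 1.758 | "
        "number of skipped iterations: 15 | number of nan iterations: 0 |")
stats = parse_iteration_line(line)
print(stats["iteration"], stats["lm loss"], stats["tpt"])
# prints: 100 9.996378E+00 64.7 samples/s
```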
Elapsed: 0.0003008842468261719 seconds
{"name": "torchelastic.worker.status.SUCCEEDED", "source": "WORKER", "timestamp": 0, "metadata": {"run_id": "none", "global_rank": 0, "group_rank": 0, "worker_id": "71297", "role": "default", "hostname": "iv-ybpu7pvmiu5m57lh5kdd", "state": "SUCCEEDED", "total_run_time": 1786, "rdzv_backend": "static", "raw_error": null, "metadata": "{\"group_world_size\": 1, \"entry_point\": \"python\", \"local_rank\": [0], \"role_rank\": [0], \"role_world_size\": [8]}", "agent_restarts": 0}}
{"name": "torchelastic.worker.status.SUCCEEDED", "source": "WORKER", "timestamp": 0, "metadata": {"run_id": "none", "global_rank": 1, "group_rank": 0, "worker_id": "71298", "role": "default", "hostname": "iv-ybpu7pvmiu5m57lh5kdd", "state": "SUCCEEDED", "total_run_time": 1786, "rdzv_backend": "static", "raw_error": null, "metadata": "{\"group_world_size\": 1, \"entry_point\": \"python\", \"local_rank\": [1], \"role_rank\": [1], \"role_world_size\": [8]}", "agent_restarts": 0}}
{"name": "torchelastic.worker.status.SUCCEEDED", "source": "WORKER", "timestamp": 0, "metadata": {"run_id": "none", "global_rank": 2, "group_rank": 0, "worker_id": "71299", "role": "default", "hostname": "iv-ybpu7pvmiu5m57lh5kdd", "state": "SUCCEEDED", "total_run_time": 1786, "rdzv_backend": "static", "raw_error": null, "metadata": "{\"group_world_size\": 1, \"entry_point\": \"python\", \"local_rank\": [2], \"role_rank\": [2], \"role_world_size\": [8]}", "agent_restarts": 0}}
{"name": "torchelastic.worker.status.SUCCEEDED", "source": "WORKER", "timestamp": 0, "metadata": {"run_id": "none", "global_rank": 3, "group_rank": 0, "worker_id": "71300", "role": "default", "hostname": "iv-ybpu7pvmiu5m57lh5kdd", "state": "SUCCEEDED", "total_run_time": 1786, "rdzv_backend": "static", "raw_error": null, "metadata": "{\"group_world_size\": 1, \"entry_point\": \"python\", \"local_rank\": [3], \"role_rank\": [3], \"role_world_size\": [8]}", "agent_restarts": 0}}
{"name": "torchelastic.worker.status.SUCCEEDED", "source": "WORKER", "timestamp": 0, "metadata": {"run_id": "none", "global_rank": 4, "group_rank": 0, "worker_id": "71301", "role": "default", "hostname": "iv-ybpu7pvmiu5m57lh5kdd", "state": "SUCCEEDED", "total_run_time": 1786, "rdzv_backend": "static", "raw_error": null, "metadata": "{\"group_world_size\": 1, \"entry_point\": \"python\", \"local_rank\": [4], \"role_rank\": [4], \"role_world_size\": [8]}", "agent_restarts": 0}}
{"name": "torchelastic.worker.status.SUCCEEDED", "source": "WORKER", "timestamp": 0, "metadata": {"run_id": "none", "global_rank": 5, "group_rank": 0, "worker_id": "71302", "role": "default", "hostname": "iv-ybpu7pvmiu5m57lh5kdd", "state": "SUCCEEDED", "total_run_time": 1786, "rdzv_backend": "static", "raw_error": null, "metadata": "{\"group_world_size\": 1, \"entry_point\": \"python\", \"local_rank\": [5], \"role_rank\": [5], \"role_world_size\": [8]}", "agent_restarts": 0}}
{"name": "torchelastic.worker.status.SUCCEEDED", "source": "WORKER", "timestamp": 0, "metadata": {"run_id": "none", "global_rank": 6, "group_rank": 0, "worker_id": "71303", "role": "default", "hostname": "iv-ybpu7pvmiu5m57lh5kdd", "state": "SUCCEEDED", "total_run_time": 1786, "rdzv_backend": "static", "raw_error": null, "metadata": "{\"group_world_size\": 1, \"entry_point\": \"python\", \"local_rank\": [6], \"role_rank\": [6], \"role_world_size\": [8]}", "agent_restarts": 0}}
{"name": "torchelastic.worker.status.SUCCEEDED", "source": "WORKER", "timestamp": 0, "metadata": {"run_id": "none", "global_rank": 7, "group_rank": 0, "worker_id": "71304", "role": "default", "hostname": "iv-ybpu7pvmiu5m57lh5kdd", "state": "SUCCEEDED", "total_run_time": 1786, "rdzv_backend": "static", "raw_error": null, "metadata": "{\"group_world_size\": 1, \"entry_point\": \"python\", \"local_rank\": [7], \"role_rank\": [7], \"role_world_size\": [8]}", "agent_restarts": 0}}
{"name": "torchelastic.worker.status.SUCCEEDED", "source": "AGENT", "timestamp": 0, "metadata": {"run_id": "none", "global_rank": null, "group_rank": 0, "worker_id": null, "role": "default", "hostname": "iv-ybpu7pvmiu5m57lh5kdd", "state": "SUCCEEDED", "total_run_time": 1786, "rdzv_backend": "static", "raw_error": null, "metadata": "{\"group_world_size\": 1, \"entry_point\": \"python\"}", "agent_restarts": 0}}
*****************************************
Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
*****************************************
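Each torchelastic status record above is a standalone JSON object, so a run's outcome can be checked programmatically. A minimal sketch, with a hypothetical helper name `all_workers_succeeded`, assuming the records have been collected one per string:

```python
import json

def all_workers_succeeded(records):
    """Return True if every torchelastic status record reports SUCCEEDED.

    Hypothetical helper for illustration: `records` is an iterable of JSON
    strings shaped like the torchelastic.worker.status events in the log.
    """
    events = [json.loads(r) for r in records]
    return all(
        e["name"] == "torchelastic.worker.status.SUCCEEDED"
        and e["metadata"]["state"] == "SUCCEEDED"
        for e in events
    )

# Example record, abbreviated from the log above.
records = [
    '{"name": "torchelastic.worker.status.SUCCEEDED", "source": "WORKER", '
    '"timestamp": 0, "metadata": {"run_id": "none", "global_rank": 0, '
    '"state": "SUCCEEDED", "agent_restarts": 0}}',
]
print(all_workers_succeeded(records))  # prints: True
```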