pylibwholegraph.torch.initialize.init_torch_env_and_create_wm_comm#

pylibwholegraph.torch.initialize.init_torch_env_and_create_wm_comm(world_rank: int, world_size: int, local_rank: int, local_size: int, distributed_backend_type='nccl', wm_log_level='info')#

Initialize the WholeGraph environment for PyTorch and create a single communicator spanning all ranks.

Parameters:
    world_rank (int) – world rank of the current process
    world_size (int) – total number of processes
    local_rank (int) – local rank of the current process on its node
    local_size (int) – number of processes on the local node
Returns:
    the global Communicator and the local (per-node) Communicator
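A minimal usage sketch is shown below. It assumes the process was launched with `torchrun`, which sets the `RANK`, `WORLD_SIZE`, `LOCAL_RANK`, and `LOCAL_WORLD_SIZE` environment variables; the fallbacks and the import guard are only there so the snippet also runs standalone, and are not part of the library's API.

```python
import os

# torchrun exports these variables for every spawned worker process;
# the defaults below cover a plain single-process run for illustration.
world_rank = int(os.environ.get("RANK", "0"))
world_size = int(os.environ.get("WORLD_SIZE", "1"))
local_rank = int(os.environ.get("LOCAL_RANK", "0"))
local_size = int(os.environ.get("LOCAL_WORLD_SIZE", "1"))

try:
    from pylibwholegraph.torch.initialize import (
        init_torch_env_and_create_wm_comm,
    )

    # Returns the global communicator and the local (per-node) one.
    global_comm, local_comm = init_torch_env_and_create_wm_comm(
        world_rank,
        world_size,
        local_rank,
        local_size,
        distributed_backend_type="nccl",
    )
except ImportError:
    # pylibwholegraph is not installed in this environment.
    global_comm = local_comm = None
```

After initialization, the returned communicators are typically passed to WholeGraph tensor- and embedding-creation APIs to place data across the participating ranks.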