pylibwholegraph.torch.initialize.init_torch_env_and_create_wm_comm
pylibwholegraph.torch.initialize.init_torch_env_and_create_wm_comm(world_rank: int, world_size: int, local_rank: int, local_size: int, distributed_backend_type='nccl', wm_log_level='info')
Initialize the WholeGraph environment for PyTorch and create a single global communicator for all ranks, along with a local communicator for the ranks on the current node.
Parameters:
world_rank – world rank of the current process
world_size – total number of processes across all nodes
local_rank – local rank of the current process on its node
local_size – number of processes on the current node
distributed_backend_type – backend used for the distributed communicator; defaults to 'nccl'
wm_log_level – WholeMemory log level; defaults to 'info'
Returns:
The global communicator and the local (per-node) communicator
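A minimal usage sketch is shown below. It assumes the process is launched with torchrun, which sets the RANK, WORLD_SIZE, LOCAL_RANK, and LOCAL_WORLD_SIZE environment variables; the variable names global_comm and local_comm are illustrative, not part of the API.

```python
# Sketch: initialize WholeGraph for PyTorch using rank/size information
# provided by torchrun (assumed launch method, not required by the API).
import os

from pylibwholegraph.torch.initialize import init_torch_env_and_create_wm_comm

world_rank = int(os.environ["RANK"])              # rank of this process across all nodes
world_size = int(os.environ["WORLD_SIZE"])        # total number of processes
local_rank = int(os.environ["LOCAL_RANK"])        # rank of this process on its node
local_size = int(os.environ["LOCAL_WORLD_SIZE"])  # number of processes on this node

# Returns the global communicator (all ranks) and the local communicator
# (ranks on the current node); variable names here are illustrative.
global_comm, local_comm = init_torch_env_and_create_wm_comm(
    world_rank,
    world_size,
    local_rank,
    local_size,
    distributed_backend_type="nccl",  # default shown explicitly
    wm_log_level="info",              # default shown explicitly
)
```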