pylibwholegraph.torch.comm.WholeMemoryCommunicator#

class pylibwholegraph.torch.comm.WholeMemoryCommunicator(wmb_comm: PyWholeMemoryComm)#

WholeMemory Communicator. You should not create objects of this class directly; use create_group_communicator, get_global_communicator, get_local_node_communicator, or get_local_device_communicator instead.
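A minimal usage sketch (not part of the generated reference): it assumes the WholeGraph environment has already been initialized through the helpers in pylibwholegraph.torch, so that the global communicator exists.

    from pylibwholegraph.torch.comm import get_global_communicator

    # Obtain the communicator spanning all ranks in the job; do not
    # construct WholeMemoryCommunicator directly.
    comm = get_global_communicator()

    rank = comm.get_rank()  # rank of this process within the communicator
    size = comm.get_size()  # total number of ranks in the communicator
    print(f"rank {rank} of {size}")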

Attributes:
distributed_backend

Methods

barrier()

Barrier on WholeMemory Communicator.

get_rank()

Get the rank of the current process in this communicator.

get_size()

Get the world size of this communicator.

support_type_location(memory_type, ...)

Return True if the communicator supports the given combination of memory_type and memory_location.

destroy()

Destroy the WholeMemory Communicator.
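Continuing the sketch above, support_type_location can be probed before allocating WholeMemory. The string values "continuous", "chunked", and "distributed" for memory_type, and "cpu" and "cuda" for memory_location, follow the conventions used elsewhere in pylibwholegraph.torch; treat them as assumptions and verify against your installed version.

    # Probe which (memory_type, memory_location) pairs this communicator
    # can back. The string values are assumptions, as noted above.
    for memory_type in ("continuous", "chunked", "distributed"):
        for memory_location in ("cpu", "cuda"):
            ok = comm.support_type_location(memory_type, memory_location)
            print(memory_type, memory_location, "supported" if ok else "unsupported")

    comm.barrier()  # synchronize all ranks before continuing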

__init__(wmb_comm: PyWholeMemoryComm)#
