pylibwholegraph.torch.embedding.create_embedding_from_filelist(comm: WholeMemoryCommunicator, memory_type: str, memory_location: str, filelist: Union[List[str], str], dtype: dtype, last_dim_size: int, *, optimizer: Optional[WholeMemoryOptimizer] = None, cache_policy: Optional[WholeMemoryCachePolicy] = None, gather_sms: int = -1, round_robin_size: int = 0)#

Create an embedding from a list of binary files.

:param comm: WholeMemoryCommunicator to create the embedding on
:param memory_type: WholeMemory type, should be continuous, chunked or distributed
:param memory_location: WholeMemory location, should be cpu or cuda
:param filelist: list of binary files (or a single file) holding the embedding table
:param dtype: data type of the embedding entries
:param last_dim_size: size of the last (embedding) dimension
:param optimizer: optimizer attached to the embedding, if it is trainable
:param cache_policy: cache policy for the embedding
:param gather_sms: number of SMs used in the gather process, -1 for the default
:param round_robin_size: continuous embedding size assigned to each rank when using the round-robin shard strategy, 0 to disable
:return: the created WholeMemoryEmbedding
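The files in ``filelist`` are expected to be raw row-major dumps of embedding rows, so each file's byte size must be a whole multiple of ``last_dim_size`` times the element size, and the total row count can be inferred from the file sizes. The sketch below prepares such a file list with the standard library only; the row-count check mirrors the constraint implied by the ``dtype``/``last_dim_size`` parameters, and the commented-out call at the end shows how the files would then be loaded (the communicator setup via ``init_torch_env_and_create_wm_comm`` is an assumption about the surrounding setup, not part of this function).

```python
import array
import os
import tempfile

last_dim_size = 4   # embedding dimension (illustrative value)
dtype_bytes = 4     # bytes per element for float32

# Write two binary shard files, each a raw row-major float32 dump.
tmpdir = tempfile.mkdtemp()
filelist = []
rows_per_file = [3, 5]
for i, rows in enumerate(rows_per_file):
    path = os.path.join(tmpdir, f"part_{i}.bin")
    with open(path, "wb") as f:
        array.array("f", [0.0] * (rows * last_dim_size)).tofile(f)
    filelist.append(path)

# Each file must contain a whole number of rows; the total row count
# of the embedding table follows from the file sizes.
total_rows = 0
for path in filelist:
    size = os.path.getsize(path)
    assert size % (last_dim_size * dtype_bytes) == 0, "partial row in file"
    total_rows += size // (last_dim_size * dtype_bytes)
print(total_rows)

# With a communicator in hand, the files would be loaded like this
# (requires a CUDA environment and an initialized WholeMemory runtime):
#
# import torch
# from pylibwholegraph.torch import create_embedding_from_filelist
# embedding = create_embedding_from_filelist(
#     comm, "distributed", "cuda", filelist, torch.float32, last_dim_size
# )
```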