pylibwholegraph.torch.embedding.WholeMemoryEmbedding
- class pylibwholegraph.torch.embedding.WholeMemoryEmbedding(wmb_embedding: PyWholeMemoryEmbedding, wmb_cache_policy: WholeMemoryCachePolicy | None)
WholeMemory Embedding.
- Attributes:
- shape
Methods
add_gradients
apply_gradients
dim
drop_all_cache
gather
get_embedding_tensor
get_optimizer_state
get_optimizer_state_names
load
need_grad
save
set_adjust_cache
writeback_all_cache
- __init__(wmb_embedding: PyWholeMemoryEmbedding, wmb_cache_policy: WholeMemoryCachePolicy | None)
Methods

__init__(wmb_embedding, wmb_cache_policy)
add_gradients(indice, grad_outputs)
apply_gradients(lr)
dim()
drop_all_cache()
gather(indice, *[, is_training, force_dtype])
get_embedding_tensor()
get_optimizer_state(state_name)
get_optimizer_state_names()
load(file_prefix, *[, ignore_embedding, ...])
need_grad()
save(file_prefix)
set_adjust_cache(adjust_cache)
writeback_all_cache()

Attributes

shape
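As a rough illustration of the semantics behind `gather(indice)`: it selects the rows of the embedding table at the given indices, much like an index lookup on a 2-D array. The sketch below mimics this behavior with plain Python lists; it is a conceptual illustration only, not the real API, which operates on GPU tensors backed by distributed WholeMemory storage and is constructed through the library's factory functions rather than by calling `__init__` directly.

```python
# Conceptual sketch (NOT the pylibwholegraph implementation):
# gather(indice) returns the embedding rows at the requested indices.

def gather(embedding_table, indices):
    """Return the rows of embedding_table selected by indices."""
    return [embedding_table[i] for i in indices]

# A toy 4-row, 2-dimensional embedding table.
table = [
    [0.0, 0.1],
    [1.0, 1.1],
    [2.0, 2.1],
    [3.0, 3.1],
]

rows = gather(table, [2, 0])
print(rows)  # [[2.0, 2.1], [0.0, 0.1]]
```

In the real class, `gather` additionally accepts keyword-only arguments (`is_training`, `force_dtype`), and when a `WholeMemoryCachePolicy` is set, lookups may be served from cache (managed via `drop_all_cache`, `writeback_all_cache`, and `set_adjust_cache`).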