add l2_cache_size config path from model config (pytorch#2336)
Summary:
Pull Request resolved: pytorch#2336

Support L2 cache size as a config passed from the model side.

Differential Revision: D61418000
duduyi2013 authored and facebook-github-bot committed Aug 28, 2024
1 parent b0a56ef commit a2db5b1
Showing 1 changed file with 5 additions and 2 deletions.
7 changes: 5 additions & 2 deletions torchrec/distributed/types.py
@@ -610,6 +610,7 @@ class KeyValueParams:
     gather_ssd_cache_stats: Optional[bool] = None
     stats_reporter_config: Optional[TBEStatsReporterConfig] = None
     use_passed_in_path: bool = True
+    l2_cache_size: Optional[int] = None

     # Parameter Server (PS) Attributes
     ps_hosts: Optional[Tuple[Tuple[str, int], ...]] = None
@@ -623,13 +624,15 @@ def __hash__(self) -> int:
                 self.ssd_storage_directory,
                 self.ssd_rocksdb_write_buffer_size,
                 self.ssd_rocksdb_shards,
-                self.gather_ssd_cache_stats,
-                self.stats_reporter_config,
                 # Parameter Server (PS) Attributes
                 self.ps_hosts,
                 self.ps_client_thread_num,
                 self.ps_max_key_per_request,
                 self.ps_max_local_index_length,
+                # tbe attributes
+                self.gather_ssd_cache_stats,
+                self.stats_reporter_config,
+                self.l2_cache_size,
             )
         )
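The pattern in the diff — a dataclass field plus an explicit `__hash__` over a tuple of fields — can be sketched as follows. This is a simplified stand-in for `KeyValueParams`, not the actual torchrec class; the field names other than `l2_cache_size` are illustrative, and the unit of `l2_cache_size` is an assumption. The point is that any new config field must also be added to the hash tuple, or two configs differing only in that field would collide as dict/set keys.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class KVParamsSketch:
    """Simplified sketch of the KeyValueParams hashing pattern."""

    ssd_rocksdb_shards: Optional[int] = None
    gather_ssd_cache_stats: Optional[bool] = None
    l2_cache_size: Optional[int] = None  # new field; unit is an assumption

    def __hash__(self) -> int:
        # Hash over a tuple of all fields, mirroring the diff: the new
        # l2_cache_size must appear here alongside the existing fields.
        return hash(
            (
                self.ssd_rocksdb_shards,
                self.gather_ssd_cache_stats,
                self.l2_cache_size,
            )
        )


a = KVParamsSketch(l2_cache_size=8)
b = KVParamsSketch(l2_cache_size=16)
assert hash(a) == hash(KVParamsSketch(l2_cache_size=8))
assert hash(a) != hash(b)  # differing cache sizes no longer collide
```

Defining `__hash__` by hand is needed here because a mutable `@dataclass` sets `__hash__` to `None` by default; the tuple-of-fields form keeps the hash consistent with the fields that matter for identity.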
