
Remove obsolete codes from PARAM #107

Status: Closed

@TaekyungHeo (Contributor) commented on May 2, 2024

Summary

Remove obsolete code from PARAM. The README will be updated in follow-up PRs after the refactoring.

  • Remove inference/compute/pt/pytorch_linear.py
  • Remove train/compute/pt/*
  • Remove dlrm
  • Remove train/comms/pt/comms.py
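The removals above can be sketched as a few `rm` commands. This is a hypothetical illustration run against a throwaway directory tree (not the real repository); the `dlrm.py` path is an assumption based on the grep output in the test plan.

```shell
set -e
# Throwaway stand-in for the PARAM repo root (illustration only).
repo=$(mktemp -d)
mkdir -p "$repo/inference/compute/pt" "$repo/train/compute/pt" "$repo/train/comms/pt"
touch "$repo/inference/compute/pt/pytorch_linear.py"
touch "$repo/train/compute/pt/pytorch_gemm.py"
touch "$repo/train/comms/pt/comms.py" "$repo/train/comms/pt/dlrm.py"

rm -f  "$repo/inference/compute/pt/pytorch_linear.py"  # obsolete inference benchmark
rm -rf "$repo/train/compute/pt"                        # train/compute/pt/*
rm -f  "$repo/train/comms/pt/dlrm.py"                  # dlrm (path assumed from grep output)
rm -f  "$repo/train/comms/pt/comms.py"                 # obsolete comms entry point
echo "removed"
```

In the actual PR these would be `git rm` invocations so the deletions are staged for commit.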

Test Plan

$ grep "dlrm" -r
./README.md:3. DLRM: tracks the `ext_dist` branch of DRLM benchmark use Facebook's DLRM benchmark (https://github.com/facebookresearch/dlrm). In short, PARAM fully relies on DLRM benchmark for end-to-end workload evaluation; with additional extensions as required for scale-out AI training platforms.
./train/comms/pt/README.md:The DLRM-Comms benchmark (`dlrm.py`) is similar to the open-source DLRM benchmark except it
./train/comms/pt/README.md:### DLRM-Comms benchmark (`dlrm.py`)
./train/comms/pt/README.md:mpirun -np <num-processes> -N <processes per node> --hostfile <file contains host list> ./dlrm.py \
./train/comms/pt/README.md:mpirun -np 16 -N 8 --hostfile ./hfile ./dlrm.py --master-ip $(head -n 1 ./hfile.txt) --mini-batch-size 32 \
./train/comms/pt/pytorch_dist_backend.py:        # Open-source extend_distributed.py can be found in https://github.com/facebookresearch/dlrm
./train/comms/pt/pytorch_dist_backend.py:        # TODO: this is a temporary workaround; need to unify the type of commsParams in comms and dlrm

$ find . -name 'pytorch_linear.py'

$ grep "pytorch_linear" -r

$ grep "compute.pt" -r

$ grep commsCollBench -r

$ grep MultilineFormatter -r
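The grep checks above can be folded into one function that fails if any stale reference survives. This is a sketch, not part of the PR; the function name `check_clean` is made up for illustration.

```shell
# check_clean DIR: succeed (and print "clean") only if DIR contains no
# references to the removed identifiers; otherwise report the first hit.
check_clean() {
  dir=$1
  for pat in pytorch_linear "compute.pt" commsCollBench MultilineFormatter; do
    if grep -rq "$pat" "$dir"; then
      echo "stale: $pat"
      return 1
    fi
  done
  echo clean
}
```

Running it from the repository root after the removals gives a single pass/fail signal instead of eyeballing several empty grep outputs.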

@facebook-github-bot added the "CLA Signed" label on May 2, 2024 (the label is managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed).
@TaekyungHeo force-pushed the refactor-obsolete branch 2 times, most recently from fd333de to 25f0870 on May 3, 2024 at 13:28.