
Remove unused device kernels #2509

Merged · 3 commits into develop from remove_unused_device · Dec 4, 2023
Conversation

umangyadav (Member)

The Gather and Pad operators now have JIT kernels. Their device kernels are no longer in use, so this change removes them.
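For context, a gather operator copies elements of a data tensor selected by an index tensor. Below is a minimal CPU reference sketch of 1-D gather semantics, for illustration only; it is hypothetical example code, not MIGraphX's JIT or device kernel source (those run on the GPU, with the JIT variant generated and compiled at runtime).

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

// Reference 1-D gather: out[i] = data[indices[i]].
// Hypothetical illustration of the operator's semantics only;
// not the MIGraphX kernel implementation.
std::vector<float> gather(const std::vector<float>& data,
                          const std::vector<std::size_t>& indices)
{
    std::vector<float> out(indices.size());
    for(std::size_t i = 0; i < indices.size(); ++i)
        out[i] = data[indices[i]];
    return out;
}

int main()
{
    std::vector<float> data{10.f, 20.f, 30.f, 40.f};
    for(float v : gather(data, {3, 0, 2}))
        std::cout << v << ' '; // prints: 40 10 30
    std::cout << '\n';
}
```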

@umangyadav self-assigned this on Dec 4, 2023
@umangyadav added the `simple` label (small or simple changes) on Dec 4, 2023
@TedThemistokleous added the `Cleanup` label (cleans up code from stale bits/warnings/previous changes for a previous feature PR) on Dec 4, 2023
@TedThemistokleous (Collaborator)

I get cleaning these up but do we ever want to keep this around for debug purposes?

@umangyadav (Member, Author) commented on Dec 4, 2023

> I get cleaning these up but do we ever want to keep this around for debug purposes?

I see value in that, but we can use git to retrieve the older files. Also, the gather and pad JIT kernels have been in place for a while now, so it is unlikely they still have bugs.
I am working on FP8 support for device kernels; keeping these unused files around would otherwise add maintenance burden.

@umangyadav added the `skip bot checks` label (skips the Performance and Accuracy CI tests) on Dec 4, 2023
@migraphx-bot (Collaborator)

| Test | Batch | Rate new (09e857) | Rate old (02f740) | Diff |
| --- | --- | --- | --- | --- |
| torchvision-resnet50 | 64 | 2,830.66 | 2,832.03 | -0.05% |
| torchvision-resnet50_fp16 | 64 | 6,501.27 | 6,501.16 | 0.00% |
| torchvision-densenet121 | 32 | 2,091.38 | 2,098.99 | -0.36% |
| torchvision-densenet121_fp16 | 32 | 3,650.81 | 3,666.47 | -0.43% |
| torchvision-inceptionv3 | 32 | 1,585.85 | 1,584.68 | 0.07% |
| torchvision-inceptionv3_fp16 | 32 | 2,569.46 | 2,569.37 | 0.00% |
| cadene-inceptionv4 | 16 | 704.54 | 704.57 | -0.00% |
| cadene-resnext64x4 | 16 | 692.72 | 691.97 | 0.11% |
| slim-mobilenet | 64 | 8,336.64 | 8,324.07 | 0.15% |
| slim-nasnetalarge | 64 | 225.25 | 225.67 | -0.18% |
| slim-resnet50v2 | 64 | 2,663.85 | 2,665.32 | -0.06% |
| bert-mrpc-onnx | 8 | 823.32 | 822.26 | 0.13% |
| bert-mrpc-tf | 1 | 389.15 | 386.92 | 0.58% |
| pytorch-examples-wlang-gru | 1 | 298.57 | 300.38 | -0.60% |
| pytorch-examples-wlang-lstm | 1 | 312.48 | 310.78 | 0.55% |
| torchvision-resnet50_1 | 1 | 603.36 | 598.68 | 0.78% |
| torchvision-inceptionv3_1 | 1 | 342.17 | 343.19 | -0.30% |
| cadene-dpn92_1 | 1 | 398.72 | 401.44 | -0.68% |
| cadene-resnext101_1 | 1 | 328.49 | 324.52 | 1.22% |
| slim-vgg16_1 | 1 | 459.63 | 459.52 | 0.02% |
| slim-mobilenet_1 | 1 | 2,099.23 | 2,116.32 | -0.81% |
| slim-inceptionv4_1 | 1 | 218.38 | 220.17 | -0.81% |
| onnx-taau-downsample | 1 | 304.65 | 304.43 | 0.07% |
| dlrm-criteoterabyte | 1 | 21.60 | 21.60 | 0.00% |
| dlrm-criteoterabyte_fp16 | 1 | 40.65 | 40.66 | -0.04% |
| agentmodel | 1 | 6,010.23 | 5,930.88 | 1.34% |
| unet_fp16 | 2 | 54.75 | 54.75 | 0.01% |
| resnet50v1_fp16 | 1 | 937.86 | 959.61 | -2.27% |
| bert_base_cased_fp16 | 64 | 903.03 | 903.28 | -0.03% |
| bert_large_uncased_fp16 | 32 | 285.75 | 285.65 | 0.04% |
| bert_large_fp16 | 1 | 166.47 | 166.48 | -0.01% |
| distilgpt2_fp16 | 16 | 1,280.74 | 1,282.20 | -0.11% |

This build is OK for merge ✅

@migraphx-bot (Collaborator)


✅ bert-mrpc-onnx: PASSED: MIGraphX meets tolerance
✅ bert-mrpc-tf: PASSED: MIGraphX meets tolerance
✅ pytorch-examples-wlang-gru: PASSED: MIGraphX meets tolerance
✅ pytorch-examples-wlang-lstm: PASSED: MIGraphX meets tolerance
✅ torchvision-resnet50_1: PASSED: MIGraphX meets tolerance
✅ torchvision-inceptionv3_1: PASSED: MIGraphX meets tolerance
✅ cadene-dpn92_1: PASSED: MIGraphX meets tolerance
✅ cadene-resnext101_1: PASSED: MIGraphX meets tolerance
✅ slim-vgg16_1: PASSED: MIGraphX meets tolerance
✅ slim-mobilenet_1: PASSED: MIGraphX meets tolerance
✅ slim-inceptionv4_1: PASSED: MIGraphX meets tolerance
✅ dlrm-criteoterabyte: PASSED: MIGraphX meets tolerance
✅ agentmodel: PASSED: MIGraphX meets tolerance
✅ unet: PASSED: MIGraphX meets tolerance
✅ resnet50v1: PASSED: MIGraphX meets tolerance
🔴 bert_base_cased_fp16: FAILED: MIGraphX is not within tolerance - check verbose output
✅ bert_large_uncased_fp16: PASSED: MIGraphX meets tolerance
✅ bert_large: PASSED: MIGraphX meets tolerance
🔴 distilgpt2_fp16: FAILED: MIGraphX is not within tolerance - check verbose output
@causten merged commit 609ad96 into develop on Dec 4, 2023
15 checks passed
@causten deleted the remove_unused_device branch on December 4, 2023 at 22:56