Code Contribution: 【Lv3】【Operator Development】batch_norm_backward #315

Open
StrongSpoon opened this issue Nov 22, 2024 · 1 comment · May be fixed by #362

StrongSpoon commented Nov 22, 2024

Description

Develop the backward function for the batch_norm operator.

Requirements

Interface
batch_norm_backward(Tensor grad_out, Tensor input, Tensor weight, Tensor? running_mean, Tensor? running_var, Tensor? save_mean, Tensor? save_var, bool update, float eps, bool[3] output_mask, Tensor reserve) -> (Tensor, Tensor, Tensor)
Function reference
https://pytorch.org/docs/stable/generated/torch.nn.functional.batch_norm.html
Implementation reference
https://github.com/FlagOpen/FlagGems/blob/master/src/flag_gems/ops/groupnorm.py
https://github.com/FlagOpen/FlagGems/blob/master/src/flag_gems/ops/layernorm.py

The operator should support all optional arguments defined in the interface and return the corresponding number of values.
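
For orientation, here is a minimal pure-PyTorch sketch of the backward math the kernel must reproduce. It is not the required Triton implementation; the function name, the handling of `update` and `output_mask`, and the assumption that `save_var` stores the biased batch variance (rather than the inverse standard deviation) are assumptions, and the `reserve` argument is ignored.

```python
import torch

def batch_norm_backward_reference(
    grad_out, input, weight, running_mean, running_var,
    save_mean, save_var, update, eps, output_mask,
):
    """Pure-PyTorch reference for checking numerics, not a Triton kernel.

    Assumes save_var holds the biased batch variance; if the real
    interface stores the inverse standard deviation instead, drop the
    rsqrt below. The reserve argument is omitted here.
    """
    # Statistics are per channel (dim 1); reduce over batch and spatial dims.
    dims = [0] + list(range(2, input.dim()))
    shape = [1, -1] + [1] * (input.dim() - 2)

    if update:
        # Training mode: gradients flow through the batch statistics.
        mean, var = save_mean, save_var
    else:
        # Eval mode: running statistics are constants w.r.t. the input.
        mean, var = running_mean, running_var

    invstd = torch.rsqrt(var + eps)
    x_hat = (input - mean.reshape(shape)) * invstd.reshape(shape)
    w = weight if weight is not None else torch.ones_like(invstd)

    grad_input = grad_weight = grad_bias = None
    if output_mask[0]:
        if update:
            g_mean = grad_out.mean(dims, keepdim=True)
            gx_mean = (grad_out * x_hat).mean(dims, keepdim=True)
            grad_input = (w * invstd).reshape(shape) * (
                grad_out - g_mean - x_hat * gx_mean
            )
        else:
            grad_input = grad_out * (w * invstd).reshape(shape)
    if output_mask[1]:
        grad_weight = (grad_out * x_hat).sum(dims)
    if output_mask[2]:
        grad_bias = grad_out.sum(dims)
    return grad_input, grad_weight, grad_bias
```

In the groupnorm and layernorm kernels referenced above, the per-channel reductions are fused into Triton kernels; the sketch only pins down the expected numerics.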

Deadline

Please submit a pull request within three weeks of accepting the assignment.
Please provide both accuracy and performance test code.
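
A minimal accuracy test could compare the new op against gradients obtained via torch.autograd on F.batch_norm, along the lines of the sketch below. The import path flag_gems.batch_norm_backward, the CUDA device, the tolerances, and the empty reserve placeholder are assumptions to adapt to the actual registration and the repository's existing test harness.

```python
import torch
import torch.nn.functional as F

def test_batch_norm_backward_accuracy():
    from flag_gems import batch_norm_backward  # assumed entry point

    x = torch.randn(8, 16, 32, 32, device="cuda", requires_grad=True)
    weight = torch.randn(16, device="cuda", requires_grad=True)
    bias = torch.randn(16, device="cuda", requires_grad=True)
    running_mean = torch.zeros(16, device="cuda")
    running_var = torch.ones(16, device="cuda")
    eps = 1e-5

    # Batch statistics that the forward pass would have saved.
    save_mean = x.detach().mean(dim=(0, 2, 3))
    save_var = x.detach().var(dim=(0, 2, 3), unbiased=False)

    out = F.batch_norm(x, running_mean, running_var, weight, bias,
                       training=True, momentum=0.1, eps=eps)
    grad_out = torch.randn_like(out)
    ref_gx, ref_gw, ref_gb = torch.autograd.grad(out, (x, weight, bias), grad_out)

    res_gx, res_gw, res_gb = batch_norm_backward(
        grad_out, x, weight, running_mean, running_var,
        save_mean, save_var, True, eps, [True, True, True],
        torch.empty(0, device="cuda"),  # reserve: placeholder, unused here
    )

    torch.testing.assert_close(res_gx, ref_gx, rtol=1e-3, atol=1e-3)
    torch.testing.assert_close(res_gw, ref_gw, rtol=1e-3, atol=1e-3)
    torch.testing.assert_close(res_gb, ref_gb, rtol=1e-3, atol=1e-3)
```

For the performance test, timing the op against the corresponding ATen call could follow the pattern of the existing benchmarks in the repository.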

StrongSpoon converted this from a draft issue on Nov 22, 2024

2niuhe commented Nov 22, 2024

2niuhe will take this task.

2niuhe linked a pull request on Dec 13, 2024 that will close this issue.