-
Following SPIR-V backend design for CUDA/HIP makes sense to me. 👍
Just a minor clarification: if the low-level runtime supports the corresponding "native device lib" extension, then the DPC++ compiler will skip this step. So there is an option to provide a math library optimized by the HW vendor.
-
@jinge90, could you please review?
-
The first stage of the implementation is roughly working; however, there are some issues around using common functions like `cos`, `sin`, `exp`, etc. These are recognized by clang as builtins that can be replaced by LLVM instructions later on, so the calls are replaced before the libdevice functions are linked. This causes errors later on when, say, [...]. Another option is to change the [...]. Let us know your thoughts, or if you think there is another approach we can take. We think the first option is the best approach as it stands.
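To make the failure mode concrete, here is a minimal SYCL 2020 sketch (our own illustration, not code from the discussion) of the kind of kernel that runs into this; the behaviour described in the comments is the one reported above:

```cpp
// Illustrative only: a kernel calling common cmath functions on the CUDA
// backend. Per the issue described above, clang recognizes these calls as
// builtins and may replace them with LLVM intrinsics (e.g. llvm.cos.f32)
// before the libdevice definitions are linked, leaving the intrinsics with
// nothing to lower to for NVPTX.
#include <sycl/sycl.hpp>
#include <cmath>

int main() {
  sycl::queue q;
  float *out = sycl::malloc_shared<float>(2, q);
  q.single_task([=] {
    out[0] = std::cos(0.5f);  // may become @llvm.cos.f32 in IR
    out[1] = std::exp(1.0f);  // may become @llvm.exp.f32 in IR
  }).wait();
  sycl::free(out, q);
  return 0;
}
```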
-
We have a libm.a with embedded code for amdhsa and nvptx. The problem is where to upstream it.
-
Proposal for CXX standard library support for the CUDA backend.

CXX standard library support for the SPIR-V backends is provided by IR bundled in .o files such as `libsycl-fallback-cmath.o`. The IR for each SPIR-V backend is unbundled and linked at compile time, before being further compiled to SPIR-V or device code along with the rest of the source code.

We propose that the NVPTX target should enable CXX standard library functions in the same way. This involves bundling IR for the NVPTX target into the same fallback .o files (such as `libsycl-fallback-cmath.o`) at the building stage of DPC++.

There is also the small matter of stripping the `cuda-gpu-arch` information from the LLVM IR when the .o files are being created; we think we have an approach for this, but will only know for certain later down the line.

Please let me know if anyone has any suggestions, and whether you think this is a suitable way of implementing CXX stdlib support for the CUDA backend.