Python MacOS arm64 release binaries #6633
Comments
What kind of tag should we publish? Does PyPI have a standard for that?
@ronaldoussoren is there some official guidance? Guessing that
Is it in the latest pip release? What pip version is required to support it?
pip 21.0.1 can install universal2 wheels (I've just verified this on my machine). For my own projects I currently publish an x86_64 wheel as before and a universal2 wheel for folks with M1 systems. I intend to drop the x86_64 wheels from releases later this year. My personal preference is to have universal2 wheels. That reduces the number of different wheels needed and, more importantly, makes it a lot easier to build applications that can be used on both Intel and M1 Macs using tools like py2app and pyinstaller. For projects with huge wheels, such as tensorflow (whose wheels are well over 150 MB), having per-architecture wheels is likely better due to disk usage considerations.
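A minimal sketch (not from the original comment) for checking which of these wheel flavors a given interpreter will accept; it uses the `packaging` library, which implements the same tag matching pip relies on:

```python
# List the macOS wheel tags this interpreter accepts.
from packaging.tags import sys_tags

for tag in sys_tags():
    if "macosx" in tag.platform:
        # On an M1 Mac with a native CPython 3.9 this includes both
        # ...arm64 and ...universal2 tags; on Intel it includes x86_64.
        print(tag)
```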
@ronaldoussoren Thank you for the information, it is really helpful. We will look into it. Currently we are preparing the 1.7 release and the deadline is soon, so this feature likely won't be in that release. Sorry.
Any update on this?
I'm working on it.
Reference:
I think we can cross-compile on Mac x86 hardware. Does anybody know more details? What should I do?
Let me try to leverage https://cibuildwheel.readthedocs.io/en/stable/options/
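A sketch of what that might look like, assuming cibuildwheel's `CIBW_ARCHS_MACOS` setting is the right knob for selecting macOS architectures (worth double-checking against the docs linked above):

```python
# Sketch only: drive cibuildwheel from Python, selecting macOS architectures
# via the CIBW_ARCHS_MACOS environment variable from its documentation.
import os
import subprocess

env = dict(os.environ, CIBW_ARCHS_MACOS="x86_64 universal2 arm64")
subprocess.run(
    ["python", "-m", "cibuildwheel", "--platform", "macos", "."],
    env=env,
    check=True,
)
```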
What is the plan to support ONNX Runtime on the M1 chip?
Being able to run ONNX on M1 Macs is highly desired. Currently, there is no way to do development with ONNX on these laptops, so instead I have to use a virtual machine or rent a cloud x64 instance, which impedes productivity or creates additional costs.
I will submit a PR and publish a test package tomorrow. |
https://test.pypi.org/project/onnxruntime/1.8.2.dev20210816004/
I followed your instructions and I am getting the following error:
What's your pip version?
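For reference, a quick way to check from inside the failing environment (per the earlier comment, universal2 wheel support needs a reasonably new pip; 21.0.1 is known to work):

```python
# Print the pip version of the environment where the install fails.
import pip

print(pip.__version__)  # universal2/arm64 wheels need a recent pip (21.0.1 works)
```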
Looks like something is wrong. Let me debug it.
I found the problem. Fixing. |
It needs quite a lot of code changes. CMAKE_OSX_ARCHITECTURES is a list, so we shouldn't use it in a check like: if (CMAKE_OSX_ARCHITECTURES STREQUAL "arm64"). See: #3298
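A small illustration (in Python rather than CMake, just to show the reasoning) of why an exact-string comparison breaks once more than one architecture is requested:

```python
# CMake stores CMAKE_OSX_ARCHITECTURES as a semicolon-separated list, so a
# universal2 build sets something like "x86_64;arm64". An exact comparison
# against "arm64" then fails even though arm64 is being built.
osx_architectures = "x86_64;arm64"

print(osx_architectures == "arm64")              # False: the STREQUAL-style check misses it
print("arm64" in osx_architectures.split(";"))   # True: a list-membership check is what's needed
```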
Now I have a new one, please try: Currently it only has Python 3.8 and 3.9, because only CPython 3.8 and newer support universal2 and arm64 wheels. It was built from commit id 617f0f5.
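After installing one of these test wheels, a quick sanity check (a sketch, not part of the original instructions) that the interpreter is running natively on arm64 and the package imports:

```python
# Verify the interpreter architecture and that the test wheel loads.
import platform
import onnxruntime as ort

print(platform.machine())             # expect "arm64" on a natively running Python on M1
print(ort.__version__)
print(ort.get_available_providers())  # CPUExecutionProvider at minimum
```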
Works perfectly! Thanks a lot!
I'm getting
I have
Hey @snnn, when can we expect this to be regularly available, at least in the nightly? (Right now only very specific commits have arm64 builds.)
Hi @snnn! It seems that universal binary (or at least a separate binary for
Is it something that has been missed? |
@snnn Is the onnxruntime M1 version ready to publish to PyPI?
Not yet. |
@snnn I found that the ARM build on my side is even slower than the x86 version (Conda x86 Python vs. Miniforge arm64 Python). I don't know why.
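One thing worth ruling out in a comparison like this is that the "arm64" interpreter is really running natively rather than under Rosetta 2 translation; a small check, assuming the usual macOS sysctl flag:

```python
# Check whether the current process is native arm64 or translated by Rosetta 2.
import platform
import subprocess

print(platform.machine())  # "arm64" for a native run, "x86_64" under Rosetta

# sysctl.proc_translated is "1" for Rosetta-translated processes; -i keeps
# sysctl quiet if the key does not exist (e.g. on Intel machines).
out = subprocess.run(
    ["sysctl", "-in", "sysctl.proc_translated"],
    capture_output=True, text=True,
)
print(out.stdout.strip() or "0")
```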
Any ETA when |
Hi, it's now one month later. Is there any chance this will happen soon? Please. 😀
Hi, the first quarter of 2022 is coming to an end; are there any updates on the ARM M1 version?
I would also love to see an updated ARM M1 version on PyPI.
Same, a lot of errors when trying to install on M1.
Same, cannot find a way to deal with it. |
Hi, 100 years later, is there an M1 version of onnxruntime? If not, how can I build it myself? I don't know what command line to use to build with CMake with arm64 enabled.
See: https://github.com/microsoft/onnxruntime/pull/8710/files If you want to cross-compile for Apple Silicon on an Intel-based macOS machine, please add the corresponding argument; if you want to generate a Universal 2 binary, please use the corresponding option (the exact flags are shown in the linked diff).
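The exact flags are missing from the comment above, so the sketch below is only a guess at what the build invocation might look like; the argument names are assumptions and should be checked against the linked PR before use:

```python
# Hedged sketch only: flag names below are assumptions based on PR #8710.
import subprocess

subprocess.run(
    [
        "./build.sh", "--config", "Release", "--build_wheel", "--parallel",
        # Assumed flag for cross-compiling to Apple Silicon from an Intel Mac:
        "--osx_arch", "arm64",
        # For a Universal 2 binary, both architectures would presumably be
        # requested, e.g. via an (assumed) CMake define like:
        # "--cmake_extra_defines", "CMAKE_OSX_ARCHITECTURES=x86_64;arm64",
    ],
    check=True,
)
```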
Please bear in mind that OSS in general, and ONNX in particular, is a project run by volunteers (in this case Microsoft volunteering their resources) and is available freely to the world. Smirky comments like "100 years later" are not only rude, they will discourage others from contributing, essentially harming everyone including yourself. Please refrain from such an attitude in the future.
That being said, @snnn, there are working wheels in the nightly version of this repository. What would be necessary to include them in ALL of the future nightly releases? Can we help by adding a PR? Once the coverage is sufficient it can then go from nightly to stable. What's preventing this from happening? A little insight here would be much appreciated.
Thanks for the understanding. I just had a discussion with our product team. The feature will be included in the ONNX Runtime 1.12 release. @mszhanyi will work on it. He will update the "MacOS_py_Wheels" build job in https://github.com/microsoft/onnxruntime/blob/master/tools/ci_build/github/azure-pipelines/templates/py-packaging-stage.yml to include arm64. We should provide two types of wheels: one for x86_64 and another for universal2.
@faxu, would you please create a 1.12 project at https://github.com/microsoft/onnxruntime/projects?
https://github.com/microsoft/onnxruntime/projects/9 Will populate soon with discussed release plans. |
@snnn The packages you linked do not exist anymore; is there another way to install onnxruntime? Or could you maybe provide updated packages?
If you are interested in pre-built onnxruntime wheels for Mac M1 (Apple Silicon), I have created a replacement package called onnxruntime-silicon.
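A hedged usage sketch, assuming the wheel installs the regular `onnxruntime` module so existing code keeps working (install with `pip install onnxruntime-silicon`):

```python
# After installing onnxruntime-silicon, the module is imported as "onnxruntime".
import onnxruntime as ort

sess = ort.InferenceSession("model.onnx")  # any ONNX model you have locally
print(ort.__version__)
print(sess.get_providers())                # CPUExecutionProvider on M1 as of this thread
```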
@cansik Great, works on the first try, thank you! :-) |
@cansik Out of curiosity, do you have any idea whether this can be leveraged for quantization on the Apple M1 chip? Or is there still an execution provider missing for this kind of hardware? Thanks!
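Not an authoritative answer, but dynamic quantization is a model transformation rather than an execution-provider feature, so something like the sketch below may be worth trying on an arm64 build; untested here:

```python
# Sketch: dynamic (weight-only) quantization with onnxruntime's quantization
# tooling. Whether the resulting int8 ops run fast on M1 depends on the CPU
# execution provider in the arm64 build.
from onnxruntime.quantization import quantize_dynamic, QuantType

quantize_dynamic("model.onnx", "model.int8.onnx", weight_type=QuantType.QInt8)
```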
Describe the bug
ONNX Runtime does not install using pip on M1.
System information
To Reproduce