After several days of research, I have started learning deep learning from scratch, knowing nothing about it beforehand, and I can now reason about questions like the following.
When exploring deep learning frameworks implemented in Rust, such as Candle and Burn, a question arises: could these Rust-based frameworks be a better alternative to ONNX Runtime for building projects like SurrealML?
SurrealML essentially takes trained models and wraps them with metadata about the model. This enables the database, or the SurrealML engine, to run the model given some inputs. What this means is that you can train your models however you want. ONNX is the universally agreed-upon format: it is essentially protobuf describing the weights and the computation graph needed to execute inference for the model. We then have an ONNX inference engine in Rust that works in the database, but the model also stays portable. For instance, there are projects working on making ONNX WASM-compatible so it can run in the browser. We are also working on a C wrapper for the core module here:
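To illustrate the "trained model plus metadata" idea described above, here is a minimal sketch of packaging arbitrary model bytes behind a length-framed JSON metadata header. The header layout and field names here are assumptions for illustration only, not SurrealML's actual file format.

```python
import json
import struct

def pack_model(model_bytes: bytes, metadata: dict) -> bytes:
    """Prefix raw model bytes with a length-framed JSON metadata header.

    Hypothetical layout, for illustration only:
      [4-byte big-endian header length][JSON header][model bytes]
    """
    header = json.dumps(metadata).encode("utf-8")
    return struct.pack(">I", len(header)) + header + model_bytes

def unpack_model(blob: bytes) -> tuple[dict, bytes]:
    """Reverse of pack_model: read the header length, then split header from body."""
    (header_len,) = struct.unpack(">I", blob[:4])
    header = json.loads(blob[4:4 + header_len].decode("utf-8"))
    return header, blob[4 + header_len:]

# Example: wrap some placeholder bytes (standing in for an ONNX graph)
# together with the metadata an engine would need to run inference.
meta = {"name": "mnist-classifier", "inputs": [[1, 1, 28, 28]], "format": "onnx"}
blob = pack_model(b"fake-onnx-bytes", meta)
recovered_meta, recovered_model = unpack_model(blob)
```

Keeping the metadata in a self-describing header like this is what lets an engine validate input shapes before handing the raw graph to whichever ONNX runtime executes it.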
Can this be used to perform neural network inference, for example on the MNIST dataset? (https://pytorch.org/vision/0.20/generated/torchvision.datasets.MNIST.html)
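To make the MNIST question concrete: whichever backend executes the graph (ONNX Runtime, Candle, or Burn), inference ultimately means running the stored weights over a flattened 28x28 input and taking the argmax over 10 class scores. A dependency-free sketch of that final step, using randomly initialized weights rather than a trained model:

```python
import random

def dense_forward(x: list[float], weights: list[list[float]], bias: list[float]) -> list[float]:
    """One fully connected layer: scores[j] = bias[j] + sum_i x[i] * weights[j][i]."""
    return [b + sum(xi * wi for xi, wi in zip(x, row)) for row, b in zip(weights, bias)]

# A flattened 28x28 "image" and made-up weights (illustration only; a real
# engine would load trained parameters from the ONNX computation graph).
random.seed(0)
image = [random.random() for _ in range(28 * 28)]
weights = [[random.gauss(0, 0.01) for _ in range(28 * 28)] for _ in range(10)]
bias = [0.0] * 10

scores = dense_forward(image, weights, bias)
predicted_digit = max(range(10), key=lambda j: scores[j])  # argmax over the 10 classes
```

A real MNIST model would stack convolution and activation nodes before this layer, but every runtime under discussion reduces them to the same kind of tensor arithmetic shown here.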