autograph v0.1.0 #45
Replies: 3 comments 4 replies
-
Would it make sense to use wgpu instead of gfx-hal, as gfx-rs no longer seems to be actively developed?
-
Sorry for the delay, I was really busy last week but do have some free time again now.
Each node is a Rust function (with a bit of extra magic, like potentially some state):

```rust
fn node_graph(input: u32) -> u32 {
    let n0 = graphene_core::value::ValueNode::new(input);
    let n1 = graphene_core::value::ValueNode::new(1u32);
    let n2 = graphene_core::ops::AddNode::new((&n0, &n1));
    n2.eval()
}
```
This would be generated from the visual representation of the node graph. This is obviously a pretty complicated example, but it should be possible to reach Turing completeness just by visually chaining nodes, because many of our users won't be programmers but rather artists, and forcing them to write actual code might be a bit intimidating.
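To show how the generated function above could compile on its own, the node types might look roughly like this. This is a minimal sketch assuming a simple `Node` trait with an `eval` method; `graphene_core`'s actual definitions may well differ.

```rust
// Sketch of node types like those used in the generated function above.
// The Node trait, ValueNode, and AddNode here are assumptions for
// illustration, not graphene_core's real definitions.
pub trait Node {
    type Output;
    fn eval(&self) -> Self::Output;
}

/// A node that simply yields a stored value.
pub struct ValueNode<T: Copy>(T);

impl<T: Copy> ValueNode<T> {
    pub fn new(value: T) -> Self {
        ValueNode(value)
    }
}

impl<T: Copy> Node for ValueNode<T> {
    type Output = T;
    fn eval(&self) -> T {
        self.0
    }
}

/// A node that evaluates both of its inputs and adds the results.
pub struct AddNode<'a, L, R>(&'a L, &'a R);

impl<'a, L: Node, R: Node> AddNode<'a, L, R> {
    pub fn new((lhs, rhs): (&'a L, &'a R)) -> Self {
        AddNode(lhs, rhs)
    }
}

impl<'a, L, R> Node for AddNode<'a, L, R>
where
    L: Node<Output = u32>,
    R: Node<Output = u32>,
{
    type Output = u32;
    fn eval(&self) -> u32 {
        self.0.eval() + self.1.eval()
    }
}

/// The generated graph from the example, using the sketch types above.
pub fn node_graph(input: u32) -> u32 {
    let n0 = ValueNode::new(input);
    let n1 = ValueNode::new(1u32);
    let n2 = AddNode::new((&n0, &n1));
    n2.eval()
}
```

The point of the trait is that each node only needs `eval`, so a code generator can wire arbitrary graphs by nesting constructor calls.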
Yeah, I stumbled upon that and wasn't sure how best to deal with it. I could just allocate one big buffer from wgpu and use the Rust alloc API plus any of the already implemented Rust allocators (e.g. wee_alloc) to manage the memory. I'd really like to avoid implementing my own allocator, because it would probably be much buggier and less efficient than tested solutions. On another note, do you potentially have time for a call at some point?
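For context, suballocating out of one big GPU buffer just means handing out offsets into it. The simplest possible version is a bump allocator over the offset space; the names and alignment policy below are made up for illustration, and the fact that it can never free individual blocks is exactly why reusing a tested allocator, as discussed above, is preferable.

```rust
// Toy bump suballocator over one large buffer's offset space.
// Illustration of the idea only; it never frees individual allocations.
pub struct BufferSuballocator {
    capacity: u64,
    cursor: u64,
    alignment: u64,
}

impl BufferSuballocator {
    pub fn new(capacity: u64, alignment: u64) -> Self {
        assert!(alignment.is_power_of_two());
        Self { capacity, cursor: 0, alignment }
    }

    /// Returns an offset into the big buffer, or None when full.
    pub fn alloc(&mut self, size: u64) -> Option<u64> {
        // Round the cursor up to the required alignment.
        let offset = (self.cursor + self.alignment - 1) & !(self.alignment - 1);
        if offset.checked_add(size)? > self.capacity {
            return None;
        }
        self.cursor = offset + size;
        Some(offset)
    }
}
```

A real solution additionally needs freeing and coalescing, which is where an off-the-shelf allocator earns its keep.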
-
This is the first release of autograph rebuilt on SPIR-V compute shaders that can be compiled from Rust source with rust-gpu!
Compute Shaders
All computations are implemented in either Rust or GLSL (to be replaced by Rust), and this API is publicly exposed so that external crates can develop their own routines. Shader code targeting SPIR-V is portable and is compiled at runtime for devices supporting the Vulkan, Metal, and DX12 APIs.
Datasets
The library includes the MNIST and Iris datasets to make it easy to get started, and these are used in the examples.
Machine Learning
High level traits like Train, Test, and Infer are provided to create a common interface for different algorithms.
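As a rough illustration of what such a common interface enables, the trait shapes below are hypothetical stand-ins (autograph's actual `Train`, `Test`, and `Infer` signatures are not reproduced here), with a trivial threshold "model" showing how an algorithm plugs in:

```rust
// Hypothetical trait shapes for a common ML interface; these are NOT
// autograph's real signatures, just a sketch of the pattern.
pub trait Infer<X> {
    type Output;
    fn infer(&self, input: &X) -> Self::Output;
}

pub trait Test<X, Y>: Infer<X> {
    /// Returns a score (e.g. accuracy) over a labeled dataset.
    fn test(&self, data: &[(X, Y)]) -> f32;
}

/// A trivial threshold "model" used to demonstrate the interface.
pub struct Threshold {
    pub t: f32,
}

impl Infer<f32> for Threshold {
    type Output = bool;
    fn infer(&self, input: &f32) -> bool {
        *input > self.t
    }
}

impl Test<f32, bool> for Threshold {
    fn test(&self, data: &[(f32, bool)]) -> f32 {
        let correct = data.iter().filter(|(x, y)| self.infer(x) == *y).count();
        correct as f32 / data.len() as f32
    }
}
```

Code written against these traits works with any algorithm implementing them, which is the motivation for a common interface.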
KMeans
An implementation of the KMeans classifier, demonstrated in the examples.
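For readers new to the algorithm, the core assignment step of k-means (mapping each point to its nearest centroid) can be sketched in a few lines of plain Rust. This is an illustration of the math only, not autograph's shader-based implementation.

```rust
// Squared Euclidean distance between two 2D points.
fn dist2(a: &[f32; 2], b: &[f32; 2]) -> f32 {
    (a[0] - b[0]).powi(2) + (a[1] - b[1]).powi(2)
}

/// Assignment step of k-means: the index of the nearest centroid per point.
pub fn assign(points: &[[f32; 2]], centroids: &[[f32; 2]]) -> Vec<usize> {
    points
        .iter()
        .map(|p| {
            (0..centroids.len())
                .min_by(|&a, &b| {
                    dist2(p, &centroids[a])
                        .partial_cmp(&dist2(p, &centroids[b]))
                        .unwrap()
                })
                .expect("at least one centroid")
        })
        .collect()
}
```

The full algorithm alternates this step with recomputing each centroid as the mean of its assigned points.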
Neural Networks
Networks can be constructed as a structure of Layers.
Each of these layers implements the Layer and Forward traits, which can be derived to reduce boilerplate.
Similarly, backward ops can be defined using the Autograd and Backward traits, where Autograd can be derived in much the same way that Layer is.
The intent is that users can write their own custom, modular layers and functions which can be defined from the high level down to custom shader code, all implemented in Rust.
Status
The crate is fairly minimal: implementations are missing for some data types, bf16 is not supported for convolution and pooling layers, and many functions, like matrix multiplication, are internal and not publicly exposed. Things that are potential work items:
Contributors
Thank you to those who have contributed to the project!
This discussion was created from the release autograph v0.1.0.