gltf interop #18
In the simplest case, if you have raw bytes packed in a vector or a bytestring and a description of the attribute layout, you don't even need to process the content of the data. Just specify the layout the same way as in Lib/Vulkan/Vertex.hs and copy the content of the vector as in Lib/Vulkan/VertexBuffer.hs. You just need a pointer to the raw data; for example, if you use vanilla bytestrings, then the …
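A minimal sketch of the copy step described above, assuming you already have a pointer to host-visible memory (e.g. from `vkMapMemory`); the module and function names here are illustrative, not part of either library:

```haskell
module CopyVertices where

import qualified Data.ByteString        as BS
import qualified Data.ByteString.Unsafe as BSU
import           Data.Word              (Word8)
import           Foreign.Marshal.Utils  (copyBytes)
import           Foreign.Ptr            (Ptr, castPtr)

-- | Copy the raw contents of a ByteString into a destination pointer,
--   such as a mapped Vulkan staging/vertex buffer. No per-vertex
--   processing is needed if the layout already matches the pipeline's
--   vertex input description.
copyByteStringTo :: Ptr Word8 -> BS.ByteString -> IO ()
copyByteStringTo mappedPtr bs =
  -- unsafeUseAsCStringLen gives zero-copy access to the pinned bytes;
  -- this is safe here because we only read from src inside the callback.
  BSU.unsafeUseAsCStringLen bs $ \(src, len) ->
    copyBytes mappedPtr (castPtr src) len
```

The point of `unsafeUseAsCStringLen` is that it exposes the bytestring's existing storage without copying, so the only copy that happens is the one into the mapped GPU-visible memory.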
Great, that's helpful, thanks. I'll forgo …
I'm more than a little familiar with the state of (tensor-based) numerics in the Haskell ecosystem, so I have a few answers to this question, but why not something that's backed by BLAS for linear algebra?
Because I want to implement a proper pure-Haskell tensor library in the end :) And in …
I'm interested in how you'll handle the SIMD operations, but that's for another thread, I think. I look forward to its development. :)
Hi,

I recently started a library to load assets from the glTF spec (here) with the aim of being able to efficiently get assets and animations to the Vulkan API (using this C++ project as a reference).

I wondered if you could provide some advice: if I have a buffer of raw binary vertex data (i.e., a `Mesh`) parsed as a little-endian `ByteString`, and some `accessor`s telling me what components this buffer represents, the number of bytes per component, the stride, etc., what's the best way to get that geometry into a form that I can pass to a `vkVertexBuffer` efficiently?

I've looked at the `DataFrame` `PrimBytes` instance, but I'm not exactly sure where to start, as the `easytensor`/`vulkan-api` design goals are not totally clear to me. Thanks!
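For concreteness, the accessor metadata the question refers to could be modeled with a record like the following; the field names are hypothetical and mirror the glTF accessor/bufferView fields, not any existing library type:

```haskell
module GltfLayout where

-- | Hypothetical description of one vertex attribute stream, as a glTF
--   accessor plus its bufferView would describe it.
data Accessor = Accessor
  { accByteOffset    :: Int  -- ^ offset of the first element in the buffer
  , accComponentSize :: Int  -- ^ bytes per component, e.g. 4 for FLOAT
  , accNumComponents :: Int  -- ^ components per element, e.g. 3 for VEC3
  , accCount         :: Int  -- ^ number of elements (vertices)
  , accByteStride    :: Int  -- ^ bytes between consecutive elements
  }

-- | Byte offset of the i-th element of this attribute within the buffer.
--   With an interleaved layout, the stride exceeds the element size and
--   this still lands on the right bytes.
elemOffset :: Accessor -> Int -> Int
elemOffset a i = accByteOffset a + i * accByteStride a

-- | Size in bytes of one element of this attribute.
elemSize :: Accessor -> Int
elemSize a = accComponentSize a * accNumComponents a
```

If the stride and offsets already match the vertex input description you hand to Vulkan, the whole buffer can be copied verbatim; otherwise these offsets tell you where each attribute's bytes live so you can repack them.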