
962 use mid level render apis for render #964

Conversation

tychedelia
Collaborator

Mid-level Render APIs

This pull request moves us from a roughly 1:1 conversion of the rendering technique on master, which uses a custom ViewNode that runs on Bevy's 3D render graph, to a variety of "mid-level" render APIs that bring us closer in line with Bevy's idioms for render code.

High level overview

Rather than executing a series of render commands, we queue a number of render items into the Transparent3d phase of Bevy's existing render graph. These items are then executed with a "draw function", which specifies a series of render operations to perform on each item: setting the pipeline, setting bind groups, and finally issuing a draw call.
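
To make the pattern concrete, here is a minimal, self-contained Rust sketch of the idea. It is illustrative only: these are not Bevy's actual trait signatures, nor the types added in this PR. Each item queued into the phase is handed to a draw function that runs a fixed chain of render commands against it.

```rust
use std::ops::Range;

/// One queued item in a render phase, e.g. a slice of this frame's mesh
/// plus the texture it should be drawn with.
struct PhaseItem {
    index_range: Range<u32>,
    texture_id: u64,
}

/// A single render operation performed for every item.
trait RenderCommand {
    fn render(&self, item: &PhaseItem);
}

struct SetPipeline;
struct SetTextureBindGroup;
struct DrawIndexed;

impl RenderCommand for SetPipeline {
    fn render(&self, _item: &PhaseItem) {
        // In Bevy this would set the specialized render pipeline on the pass.
        println!("set pipeline");
    }
}

impl RenderCommand for SetTextureBindGroup {
    fn render(&self, item: &PhaseItem) {
        // Bind the per-item resources (texture, vertex mode, ...).
        println!("bind texture {}", item.texture_id);
    }
}

impl RenderCommand for DrawIndexed {
    fn render(&self, item: &PhaseItem) {
        // Issue the draw call for just this item's slice of the mesh.
        println!("draw indices {:?}", item.index_range);
    }
}

/// The "draw function": a fixed sequence of commands run once per item.
fn draw_function(commands: &[&dyn RenderCommand], items: &[PhaseItem]) {
    for item in items {
        for command in commands {
            command.render(item);
        }
    }
}

fn main() {
    let items = vec![
        PhaseItem { index_range: 0..300, texture_id: 1 },
        PhaseItem { index_range: 300..450, texture_id: 2 },
    ];
    draw_function(&[&SetPipeline, &SetTextureBindGroup, &DrawIndexed], &items);
}
```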

This initially poses a problem, because our existing rendering code executes a variety of draw calls against a single mesh, and so isn't easily broken into "items." Unlike other rendering code Bevy handles, we don't have the benefit of assuming that logical rendering entities persist across frames. In the worst case, each frame is maximally different in every dimension from the one before it. This makes it harder to fit into Bevy patterns that are designed around rendering stable assets.

The solution in this PR is the following:

  • Keep our existing mesh, which has been renamed from ViewMesh to DrawMesh and converted to a Bevy asset. We still associate draw instances with a view, but we let Bevy manage the lifetime of the asset and interact with it through a handle. This better reflects that the mesh owns certain GPU resources, which are created in the mesh's lifecycle rather than explicitly in the render code.
  • When the mesh is filled, spawn a number of DrawMeshItems containing the metadata that drives our new draw function, such as which texture should be used for rendering this portion of the mesh (see the sketch after this list). These are created every frame and extracted into the render world, where they are added to the render phase.
  • Move vertex mode information into a separate uniform, which was necessary to decouple the new asset pattern from our previous imperative rendering logic.
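
As a rough sketch of the per-frame items described above (field names are hypothetical; the actual types in this PR may differ), each DrawMeshItem carries a handle to the shared mesh plus just enough metadata to drive the draw function. A new item is cut whenever state that affects the draw call changes, such as the bound texture:

```rust
use std::ops::Range;

/// Stand-in for an asset handle to the frame's DrawMesh (in the real code
/// this would be a Bevy handle to the mesh asset).
#[derive(Clone, Copy, Debug)]
struct DrawMeshHandle(u64);

/// Hypothetical per-frame item: spawned in the main world while the mesh is
/// filled, extracted into the render world, and queued into the render phase.
#[derive(Debug)]
struct DrawMeshItem {
    /// Which DrawMesh asset this item draws from.
    mesh: DrawMeshHandle,
    /// The index range within the mesh that this item covers.
    index_range: Range<u32>,
    /// Texture to bind for this portion of the mesh, if any.
    texture: Option<u64>,
}

fn main() {
    let mesh = DrawMeshHandle(42);
    // Two items from one mesh: the second starts where the texture changes.
    let items = vec![
        DrawMeshItem { mesh, index_range: 0..300, texture: None },
        DrawMeshItem { mesh, index_range: 300..450, texture: Some(7) },
    ];
    for item in &items {
        println!("{item:?}");
    }
}
```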

Learning examples in Bevy

These patterns are used all over the renderer, but it's worth calling out a few examples:

  • Gizmos, which are an immediate-mode drawing API conceptually somewhat similar to Nannou.
  • The instancing example, which is pretty verbose right now but potentially relevant to our users.
  • The manual 2d mesh example, which is probably the most straightforward example for understanding some of these patterns.

How to review

Unfortunately, the diff here got pretty messy. I hope it's still somewhat comprehensible, but I can move things around into new files or otherwise restructure if that would make it clearer or easier to review.

  • Please bikeshed names. In particular, things like DrawDrawMesh are too overloaded. I was considering keeping Draw for our high-level API but calling things Drawing in the render code to distinguish them from Bevy's draw/mesh abstractions.
  • In bevy_nannou_render/src/lib.rs, notice how much code could be removed from update_draw_mesh (previously prepare_view_mesh). Also note the RenderAsset mechanism for DrawMesh/GpuDrawMesh.
  • bevy_nannou_render/src/pipeline.rs should be more straightforward.

Conclusion and next steps

Some of these patterns are more verbose and a bit abstract, but I think they are worthwhile in bringing us closer to Bevy. There's also definitely room for improvement here, including more general cleanup and tidiness. In particular, it's still worth investigating whether we can make more use of Bevy's mesh infrastructure rather than re-inventing the wheel here. This might not be possible due to our immediate-mode needs, but it is worth ruling out.

@tychedelia tychedelia linked an issue Mar 1, 2024 that may be closed by this pull request
@tychedelia tychedelia added the bevy label Mar 1, 2024
@tychedelia
Collaborator Author

Realizing that one issue with this design is that we are setting the same vertex buffer over and over for each phase item. Wondering if we should either:

  • Set the correct ranges of vertex data for each item.
  • Rethink this approach and spawn an arbitrary set of render assets matching our phase items per frame.

@tychedelia
Collaborator Author

Realizing that one issue with this design is that we are setting the same vertex buffer over and over for each phase item. Wondering if we should either:

Actually, this appears to be a no-op, so we're fine! https://github.com/bevyengine/bevy/blob/main/crates/bevy_render/src/render_phase/draw_state.rs#L191-L202
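
For anyone curious, the gist of the linked code is state tracking on the render pass: rebinding a buffer that is already bound gets skipped. A simplified sketch of that idea (not Bevy's actual implementation):

```rust
/// Simplified sketch of per-pass draw-state tracking (illustrative only).
#[derive(Default)]
struct DrawState {
    /// (buffer id, offset) currently bound for each vertex-buffer slot.
    vertex_buffers: Vec<Option<(u64, u64)>>,
}

impl DrawState {
    /// Returns true only when the slot actually needs to be rebound.
    fn set_vertex_buffer(&mut self, slot: usize, buffer: u64, offset: u64) -> bool {
        if self.vertex_buffers.len() <= slot {
            self.vertex_buffers.resize(slot + 1, None);
        }
        if self.vertex_buffers[slot] == Some((buffer, offset)) {
            // Same buffer already bound: the repeated call is a no-op.
            return false;
        }
        self.vertex_buffers[slot] = Some((buffer, offset));
        true
    }
}

fn main() {
    let mut state = DrawState::default();
    assert!(state.set_vertex_buffer(0, 1, 0)); // first bind really happens
    assert!(!state.set_vertex_buffer(0, 1, 0)); // repeated bind is skipped
    println!("redundant vertex buffer binds are skipped");
}
```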

Member

@mitchmindtree mitchmindtree left a comment

@tychedelia all I can say is - this is looking awesome! Seems like someone is becoming a bevy pro 🕊️ 🤓

Please bikeshed names

Honestly I think I don't mind it too much, as it kinda makes sense when reading those commands in context 🤔

In bevy_nannou_render/src/lib.rs, notice how much code was able to be removed from update_draw_mesh (previously prepare_view_mesh.

Huge 🚀 In general, it's so cool to see the draw logic broken up into bevy systems.

In particular, it's still worth investigating whether we can make more use of Bevy's mesh infrastructure, rather than re-inventing the wheel here

So curious about this - would be amazing if we could leverage bevy's mesh abstractions too. Have you had a chance to play with bevy's mesh stuff yet / get an intuition for how feasible this might be?

Comment on lines +69 to +71
let view_key = MeshPipelineKey::from_msaa_samples(sample_count)
| MeshPipelineKey::from_hdr(color_format == ViewTarget::TEXTURE_FORMAT_HDR)
| MeshPipelineKey::from_primitive_topology(topology);
Member

Oooo fancy 🤩

@@ -0,0 +1 @@
/nix/store/l6qwc0cii9g6hzrqcb79wr61jg32nhx6-nannou-0.19.0
Member

Looks like result snuck in here 😅

@tychedelia
Collaborator Author

Going to merge this and begin working on the follow up outlined here: #966

@tychedelia tychedelia merged commit 4a88a15 into nannou-org:bevy-refactor Mar 14, 2024
15 checks passed