design UI integration #34

Closed · atsushieno opened this issue Feb 3, 2020 · 10 comments

atsushieno commented Feb 3, 2020

We need plugin UI support. LV2 cannot be a reference model here because we do not live in the desktop environments that the LV2 UI extension specification assumes.

Also, unlike other audio plugin frameworks, AAP raises an interesting issue here. In general, an audio plugin's UI is implemented within the plugin application, so when a user wants to manipulate a plugin UI, it has to be launched from within the plugin process, and the host goes to the background. We probably want to avoid that in general, as the host would lose access to foreground AAP services. We might also want to control plugins while making edits on the host. (That is hardly doable even on desktop nowadays, as plugin UIs usually show up as modal dialogs.)

Therefore, in short, I'm exploring whether we can implement a UI foundation that can be loaded within the host process. Here are my thoughts.

There can be an in-plugin-process editor and an in-host-process editor. Both edit service parameters: locally for the in-plugin-process editor, and remotely for the in-host-process editor. For JuceAAPAudioPluginHost, remote access is already achieved through ports.

An in-plugin-process editor can be anything (for example, we already have juce_gui_basics). To give it access to ports we will have to prepare supplemental plugin "parameters" or "properties", which don't exist right now. We will have to define them, as a getter for a port definitely doesn't exist. (TBD: check how the LV2 UI extension handles them.)
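
To make the idea concrete, here is a minimal sketch of what such a parameter/property accessor could look like, assuming a Kotlin-level API. None of these names (PluginParameterAccess, getParameterValue, etc.) exist in AAP yet; they are only placeholders for the getter/setter that the current port model lacks.

```kotlin
// Hypothetical sketch only -- these names do not exist in AAP yet.
// An in-plugin-process editor would need something like this to read and
// write parameter state, since the current port model has no getter.
interface PluginParameterAccess {
    // Number of parameters the plugin exposes to editors.
    val parameterCount: Int

    // Stable identifier and display name for a parameter index.
    fun getParameterId(index: Int): Int
    fun getParameterName(index: Int): String

    // The getter/setter pair that ports alone do not provide.
    fun getParameterValue(parameterId: Int): Float
    fun setParameterValue(parameterId: Int, value: Float)
}
```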

Considering desktop integration, my current idea for UI development is to use Flutter (with some necessary dart:ffi-backed manipulators), but since it's all up to the developer, it's just a matter of taste.

An in-host-process editor cannot be just any runnable program. What we can do instead is bring in a UI controller container that has access to parameters/properties. The simplest solution here is Web UI components: they only have to be browser-loadable components (they could even be hosted on the web, if we grant permission).
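
As a rough illustration of the "browser-loadable component" idea, here is a sketch of how a host could embed such a Web UI with a standard Android WebView and a JavaScript bridge. WebView, addJavascriptInterface, and @JavascriptInterface are standard platform APIs; everything else (WebUiHost, the AAPHost bridge name, onParameterChange) is made up for this sketch and is not part of AAP.

```kotlin
import android.content.Context
import android.webkit.JavascriptInterface
import android.webkit.WebView

// Sketch only: hosts a browser-loadable UI component inside the host process
// and forwards its control changes to whatever applies them to the plugin.
class WebUiHost(context: Context, private val onParameterChange: (Int, Float) -> Unit) {
    val webView = WebView(context).apply {
        settings.javaScriptEnabled = true
        // The web component would call AAPHost.setParameter(id, value) from JS.
        addJavascriptInterface(ParameterBridge(), "AAPHost")
    }

    // The URL could be a local asset or, if we grant the permission, a remote page.
    fun loadUi(url: String) = webView.loadUrl(url)

    inner class ParameterBridge {
        @JavascriptInterface
        fun setParameter(parameterId: Int, value: Float) {
            onParameterChange(parameterId, value)
        }
    }
}
```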

There have been earlier attempts to implement audio UI controls for the Web and separate UI concerns from audio processing, such as webaudio-controls (based on WebComponents).

Also, we have juce_emscripten, which can even bring the JUCE UI into an in-host-process editor.

I wouldn't prefer having a desktop UI on mobile, but a simple UI can be portable with less awkwardness.

What we need here is to define a data transmission method between the service and the client in our AIDL (internal, not exposed to plugin developers).
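
As a sketch of what that internal contract could carry, here is the Kotlin shape that a hypothetical AIDL-declared transport might generate. AudioPluginUiTransport, sendUiEvent, and notifyStateChanged are placeholder names, not part of the current AIDL.

```kotlin
// Hypothetical sketch of the internal host<->service UI transport discussed
// above, written as the Kotlin interface an AIDL declaration might generate.
// None of these names exist in the current AIDL; they are placeholders.
interface AudioPluginUiTransport {
    // Client (host UI) -> service: push a parameter/property change.
    fun sendUiEvent(instanceId: Int, parameterId: Int, value: Float)

    // Service -> client: report state changes so the in-host UI stays in sync.
    fun notifyStateChanged(instanceId: Int, parameterId: Int, value: Float)
}
```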

@atsushieno

There is a problem regarding Emscripten support: Chrome on Android (and supposedly Android WebView) does not support JavaScript Atomics, and therefore juce_emscripten is not yet functional there (in terms of compiled apps, e.g. the apps on https://juce-demos.atsushieno.dev). Until Atomics gets implemented and juce_emscripten works on the Android platform, the host-instantiated plugin UI idea is on hold.

@atsushieno

Or should we rather mock the Atomics API when it does not exist (i.e. a thread-unsafe compare-and-exchange)? Maybe an option. (Though there is no proof that Atomics is the only missing API on Android.)

@atsushieno

The mock implementation did not help; the runtime returns/expects SharedArrayBuffer, which is not supported on Android for now.

cbix commented Apr 15, 2020

Hey @atsushieno, really impressive work you've put up here; I found this recently after discussing Android pro audio with a friend and doing some research...
A bit off topic, but you should get in touch with @dturner and @philburk, who have done all the low-level pro audio work for Android (and maybe some of the FAUST developers like @sletz, as they also have two quite mature Android build targets in place). Let's get an audio plugin standard built into Android! ;)

About plugin/host UI integration, I'm imagining something like Android's widget concept, where apps can provide a flexible UI widget to launchers/hosts, including multiple instances. Also, I wouldn't dismiss the LV2 model completely, as one of its modular design intentions was network transparency (i.e. the plugin might run on one host and the UI on another). It doesn't include a protocol for this, but if AAP UIs had the same scheme of ports and plugin URIs, it would allow using Android as a remote for any cross-platform audio plugin with its own UI (well, currently people do this with TouchOSC).

@philburk

This project looks really interesting. I'd like to learn more about it. I will look at the readme files.

@atsushieno

@cbix thanks for the kind notes. I actually met Phil, Don, and other Android Audio team members last year in London at ADC'19 and asked questions for this project (I hadn't open-sourced it at that time). The entire project is quite low quality so far (#18) and not really ready for actual trial use, with no realtime-ish player example yet, so I haven't really announced anything yet.

On the UI integration, I definitely agree that some plugin-framework-agnostic design would be great and would help with reusing UI components (as long as the plugin is okay with a "general purpose" level of control), which also leads to MIDI 2.0 Profile Configuration ideas. The webaudio-controls project I mentioned in the issue description (a bit outdated, but the developer, whom I know, is still active on Web Audio stuff) is indeed based on WebComponents. I also agree that the LV2 separation of concerns is still useful (I was just more interested in reusing existing bits from it). There is, though, still no actual code/product yet.

For other Android concepts like Widgets and Slices, I set those options aside because we'd need more UI interaction than they provide - I don't remember exactly why (maybe I thought of direct interaction between host app A and slices from app B). If they prove to actually work, then great. Any other native Android controls would have to reside in either the host or the plugin (with no other choice). While that would still work for open source components, it would impact how commercial plugin products get distributed - the only viable solution for them would be Google Play (or similar Chinese app stores), if that happens.

On FAUST: I indeed thought of it, just like I did for SOUL (I haven't played with either of them). Unlike SOUL, which still lacks the required binaries for Android, FAUST is already a possibility here. Recently I also noticed that Guitarix uses FAUST for its audio processing (partially?), with some Web UI stuff for embedded uses, which looks interesting.

Before going forward I want to try some LV2 stuff like sfizz and guitarix, first on Linux (I have still had no luck getting the LV2 UI experience in Tracktion Waveform, which I aim to partly bring in (i.e. tracktion_engine) on Android), then port them to this framework. I still have a handful of tasks to get done and am not sure when I can tackle this UI integration stuff in depth, but I am grateful for the thoughts and feedback!

@atsushieno

In case it helps, I have some unpublished slides that introduce AAP and give the high-level concepts (I was planning to use them once the project gets more ready, i.e. no actual talk has been given based on them yet).

@atsushieno

@cbix I have an issue related to latency here: #35 (comment), which would answer the questions you gave me today (link).

I have been without my development machine (sent out for repair), but have been working on some plugin UI bits, rather on the LV2 side, these days.

@atsushieno

After some experiments with a web-based UI at https://github.com/atsushieno/aria2web, I wrote up a draft idea of how UI integration could be achieved: https://gist.github.com/atsushieno/eb8155cbde052ded330ff9667b51e937

atsushieno mentioned this issue Jul 11, 2020
atsushieno added a commit that referenced this issue Feb 9, 2021
context: #34

It still has no mechanism for how the activity takes an instance (it is actually
to be rewritten; we would first create the instance and then always pass its id),
but it is a starter.
atsushieno added a commit that referenced this issue Mar 20, 2023
This is the first revision that could successfully dispatch Web UI inputs
to the PluginInstance on aapinstrumentsample (to some extent).

The existing ui-compose PluginDetails is not very capable of dealing with
the UI inputs (as it does not constantly process audio), but the parameter
changes on the Web UI are reflected in the processing results.

There is more work to do, but we are finally getting close to being able to
close issue #34.
@atsushieno

Finally closing this issue as it is now in the implementation phase. #150
