system (effects and synths) for audio apps, on mobile platforms, taking the best from open technology and FLOSS, along with audio developers' tech trends (kinda)
Images on the slides mostly have links to the originals (website, product, etc.)
• Player: user may change effects interactively
• Games: apply effects as the user controls
• DAW: apply effects as the user plays
• Live streaming: change effects while the user speaks
They are all LIVE (some are even real-time, some are even low-latency) (also their slides)
Use any language! (JS, Java, Python, Go...) vs. use C++ (or C or Rust or Zig...):
• GC / JIT / dynamic types / interpreter: YES vs. NO
• glitches and delays? YES vs. NO
• is it live? NO vs. YES
safety
• in RT-safe code: RTNeural etc.
• in non-RT-safe code: OnnxRuntime etc. with an RT thread
C++: how we can achieve RT safety
• RT-safe algorithms
• memory operations (allocation, atomic shared pointers / hazard pointers)
• related toolchains and stdlibs, e.g. RealtimeSanitizer
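The memory-operations point above can be sketched in a few lines: the non-RT thread allocates and publishes new data with an atomic pointer swap, while the RT thread only performs a lock-free load, never allocating or locking. This is a minimal sketch; the names (`CoeffSet`, `publish`, `processSample`) are illustrative, and the immediate `delete` is a simplification of what hazard pointers or similar deferred reclamation would handle safely in real code.

```cpp
// Sketch: handing newly built data to a real-time audio thread
// without locking or allocating on the RT side.
#include <atomic>
#include <memory>
#include <vector>

struct CoeffSet { std::vector<float> gains; };

std::atomic<CoeffSet*> activeCoeffs{nullptr};

// Non-RT thread: allocate freely, then publish with one atomic swap.
void publish(std::unique_ptr<CoeffSet> next) {
    CoeffSet* old = activeCoeffs.exchange(next.release(),
                                          std::memory_order_acq_rel);
    // NOTE: real code must defer this delete until no RT reader can
    // still hold `old` (hazard pointers, RCU, etc.); simplified here.
    delete old;
}

// RT thread: only a lock-free atomic load; no allocation, no locks.
float processSample(float in) {
    CoeffSet* c = activeCoeffs.load(std::memory_order_acquire);
    return c ? in * c->gains[0] : in;
}
```

Tools like RealtimeSanitizer exist precisely to catch the opposite pattern, i.e. allocation or locking sneaking into the RT callback.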
"equalizer", "reverb", "denoise" ...
• only platform and driver vendors provide them (mobile: the phone vendor provides)
• can be hardware-accelerated
• can be specific to platform features, e.g. VR headset (spatializer)
• can be controlled by parameters, but no exact/precise audio outputs
• no GUI
• consumers: recorder, video and voice calls, streaming apps, etc. (not very suitable for DAWs)
e.g. android.media.audiofx API for apps, AOSP Audio Effect HAL for devices
Trend
load any plugin
Popular formats:
• Steinberg VST3
• Apple AudioUnit
• Linux community LV2
• Bitwig CLAP
No one can break the ABI.
Music software always drives the audio plugins ecosystem.
still ship a DAW, but...
"The plug-ins Steinberg offer do not compare to the plethora of VSTs, AUs that iOS users can install. This needs to change in all fairness to make it a complete DAW."
"Unfortunately the given limitations on Android, prevents the app to show up with the same features compared to iOS"
(more of those voices at FL Studio Mobile reviews)
(except for UI)
• Users want plugins available on multiple platforms + multiple plugin formats
◦ with consistent behavior
• Plugins are (usually) native apps based on the C ABI (application binary interface)
• Developers tend to write cross-platform code for audio plugins (in C++)
◦ using JUCE, DPF, iPlug2 etc., or using clap-wrapper ("CLAP first")
• Many existing plugin sources are cross-platform code using the above
Trend
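The C ABI point is the key to cross-platform plugins: a host only needs a plain struct of function pointers, which stays stable across compilers and languages. A minimal sketch of the idea; the names (`plugin_v1`, `my_plugin_entry`) are illustrative and do not belong to any real format (VST3, AUv3, LV2, and CLAP each define their own entry points).

```cpp
// Sketch: a C ABI boundary between a host and a plugin.
#include <cstdint>

extern "C" {
// What the host sees: a C struct of function pointers, no C++ types.
typedef struct plugin_v1 {
    const char* id;
    void (*process)(float* audio, uint32_t frames);
} plugin_v1;
}

// Inside the plugin binary: the implementation can be any C++.
static void gain_process(float* audio, uint32_t frames) {
    for (uint32_t i = 0; i < frames; ++i)
        audio[i] *= 0.5f;  // a trivial -6 dB gain
}

// The single exported C symbol the host resolves (via dlsym etc.).
extern "C" const plugin_v1* my_plugin_entry() {
    static const plugin_v1 p{"example.gain", gain_process};
    return &p;
}
```

On desktop, the host would `dlopen` the shared object and look up the entry symbol; here everything is in one translation unit for brevity.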
desktop vs. mobile:
• run apps: many windows vs. one app on the entire screen
• show plugin: within the DAW vs. remote UI is restricted
• screen size: big, show everything vs. small, cannot show all
• load and save files: usual vs. awkward
• input events: mouse + keyboard vs. touches
desktop vs. mobile:
• sloppy: download from anywhere, app can load third-party libs vs. strict: usually via app stores, each app has its user context
• how plugins are used: load as a library vs. connect as a different app
• JIT compilation: possible vs. prohibited on iOS
• standard plugin formats: a few options vs. only AUv3 on iOS, nothing on Android
(On macOS, Mac App Store apps are similarly strict)
(interoperable) with desktop apps
• build and run a subset of the desktop version
◦ not to achieve the same workflow, but to make project data round-trip (shareable with desktop)
example: Logic Pro
CoreMIDI, AudioToolbox etc.)
• an audio plugin format exists: AudioUnit v3
◦ AUv3 works on both macOS and iOS
◦ interoperable with AUv2 (desktop, in-process only): migration without a rewrite
• with audio plugins, audio apps gain more features
It is not loved by desktop devs, but on a secure platform everyone needs to behave.
(CoreMIDI) to AudioUnits. The earliest party to have adopted MIDI 2.0. Internally CoreMIDI is based on MIDI 2.0 and down-translates to 1.0 only if needed. Still not complete, but the most advanced MIDI platform.
(using SurfaceView)
It is doable because the host app instantiates them (a security risk otherwise)
Option 2: Web UI loaded by the host's WebView (there may be communication limitations)
Will people write code for it? => Very unlikely 😥
Chicken and egg: no plugins, then no hosts; no hosts, then no plugins.
Why not reuse existing plugins then!
of plugins:
• aap-juce: build JUCE plugins (and hosts) as AAP
• aap-lv2: wrap LV2 plugins (for non-JUCE plugins) as AAP
In the future, a CLAP wrapper is also doable (when many CLAP-API-based OSS plugins show up)
UI does not support Android
• JUCE UI does not work as a native UI (yet)
◦ it likely needs a significant rewrite of JUCE
IMO there should be a different UI for mobiles
Ideally this should work...
UMP
• Events (note-on/off etc.) are fully based on UMP
• Parameter changes are MIDI 2.0 Assignable Controllers (NRPNs)
• Supports per-note expression and pitchbend
• Extension API is built on top of SysEx messages
• Future: state binaries and param./prog. metadata in MIDI-CI, for sharing with potential MIDI-CI-compatible desktop plugins
AudioUnit supports UMP as well (the parameters API is different)
JUCE 9 (the next major version) will support UMP in its plugin API too
Trend
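To make the "parameter changes as Assignable Controllers" point concrete, here is a sketch of encoding one such event following the published UMP layout: a 64-bit MIDI 2.0 Channel Voice message (message type 0x4) with status 0x3, a 7-bit bank and index, and 32-bit data. The helper name and struct are illustrative, not any plugin API's actual types.

```cpp
// Sketch: building a MIDI 2.0 Assignable Controller (NRPN) UMP.
#include <cstdint>

struct Ump64 { uint32_t word0; uint32_t word1; };

Ump64 assignableController(uint8_t group, uint8_t channel,
                           uint8_t bank, uint8_t index,
                           uint32_t data) {
    uint32_t word0 = (0x4u << 28)              // message type: MIDI 2.0 Channel Voice
                   | ((group & 0xFu) << 24)    // UMP group
                   | (0x3u << 20)              // status: Assignable Controller (NRPN)
                   | ((channel & 0xFu) << 16)  // channel
                   | ((bank & 0x7Fu) << 8)     // 7-bit bank
                   | (index & 0x7Fu);          // 7-bit index
    return {word0, data};                      // word1 = 32-bit data
}
```

Compared with MIDI 1.0 NRPNs (four separate CC messages, 14-bit data), this is a single atomic message with 32-bit resolution, which is what makes it usable for plugin parameter changes.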
? (assuming we can JIT compile)
• pros
◦ can run DSP in-process without privilege inheritance
◦ host can be a simple WebView
• cons
◦ WASM may not perform well
◦ interop between DSP and a WebView in a separate process can be tricky
◦ less possibility to port existing plugins (must be emscripten-compatible)
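The "must be emscripten-compatible" constraint roughly means DSP exposed as plain `extern "C"` functions over linear memory, with no platform APIs. A sketch under that assumption; the function name is illustrative, and the `#ifdef` stub lets the same code compile natively as well as under emscripten (where `EMSCRIPTEN_KEEPALIVE` keeps the exported symbol from being stripped).

```cpp
// Sketch: an emscripten-compatible DSP entry point for a WASM plugin.
#ifdef __EMSCRIPTEN__
#include <emscripten/emscripten.h>
#else
#define EMSCRIPTEN_KEEPALIVE  // no-op stub for native builds
#endif
#include <cstdint>

extern "C" EMSCRIPTEN_KEEPALIVE
void wasm_process(float* buffer, uint32_t frames, float gain) {
    // A WebView host would call this through the module's WebAssembly
    // exports, passing an offset into the module's linear memory.
    for (uint32_t i = 0; i < frames; ++i)
        buffer[i] *= gain;
}
```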
Bringing the audio plugins ecosystem to mobile is hard (app process model, UI)
Port existing plugin apps to build up our own plugin API ecosystem
Make use of open technology to make things easier and useful