When the “Plugin Revolution” happened in computer-based musicking, I was on a 20-year hiatus from the scene. From 1989 to 1995, computer music was a rather important part of my life. I left it behind around 1995, mostly because of grad school, and only reconnected with it around 2015, in large part thanks to the Raspberry Pi.
So I don’t know firsthand what happened when Steinberg came out with the VST format and different developers started creating plugins using it.
Still, the effects of that shift are quite obvious to anyone who’s observed the before and after. Before, each piece of music software (or hardware) was its own little bubble or island. You could transfer sounds and other data between them, but it was an arduous process, similar to old-school publishing workflows. Nowadays, almost any piece of music software on the desktop supports some form of plugin architecture. People will warn you when software is meant to be used standalone, which requires moving files around and making sure nothing is lost along the way. In high-budget professional workflows, it makes a lot of sense to send “stems” around so that specialists can each work on their parts. Even in those contexts, a lot of plugins are used without a second thought given to the power of this whole architecture.
Plugins on the desktop have all sorts of issues. For instance, there are plenty of plugins which are still only available in a 32-bit version, supported by only some “DAWs” (Digital Audio Workstations) and other plugin hosts. Steinberg has been forcing developers to move to the VST3 format, but plenty of plugins remain in the VST 2.4 format. Other plugin formats exist apart from VST. Pro Tools, which probably remains the most prominent DAW in recording studios, only accepts AAX plugins. Apple’s music software (Logic Pro X, MainStage, GarageBand) can only accept Audio Unit “components”, which are themselves only accepted by a few other DAWs. On Linux, some VST plugins may work under certain conditions, and there are other plugin formats which are mostly found on Linux (LV2 and LADSPA, for instance). No plugin format works everywhere, whether across platforms or even within a single one.
Because of incompatibility between plugin formats, there are some “bridges” and other workarounds. For instance, the ReWire technology allowed someone to basically “run a DAW in another DAW” by connecting the audio inputs and outputs of the two programs.
Still, things are stabilizing in the world of desktop plugins. Reason Studios (then Propellerhead Software) started accepting VST plugins in a recent version of their flagship DAW (they had their own “Rack Extensions” before that). The most recent version, Reason 11, can itself run as a plugin in another DAW, thereby making ReWire irrelevant in this specific case. The JUCE framework, used by many developers, makes it easier to create plugins which will run on different platforms. And some DAWs accept more than one plugin format.
Musicking on smartphones and tablets is going through a large transition, especially on the key devices made by Apple (iPad and iPhone). These devices now contain processors which rival high-end desktop computers in some respects. In terms of audio, it’s quite possible to use very sophisticated DSP (Digital Signal Processing) to accomplish things which were mere dreams when desktop plugins first appeared. And there are enough of these devices around the world that the market for music software, despite being a small niche, is relatively healthy there.
And Apple made it possible to use a special plugin format on these devices: AUv3 (Audio Unit version 3).
The format came out some years back, and some developers have since worked through its many flaws and quirks. More recently, a broader range of developers has taken the plunge into AUv3.
This includes devs who already had standalone apps on the platform and were convinced to support the plugin format. It also includes manufacturers of desktop plugins who had a very limited presence on the platform until recently and came out with very respectable AUv3 plugins (namely, FabFilter and Eventide).
Some of us have been focusing our iPad or iPhone musicking on AUv3. Not that we never use an app which doesn’t support AUv3. It’s just that missing AUv3 support can easily be a dealbreaker. When a new app comes out without it, the developers will quite likely address the situation later, either on their own or under pressure from users. AUv3 is a must.
“But, why? Can’t you appreciate music apps for what they give you? Aren’t there other ways to use music apps together?”
Well, yes, sure. And we can also develop creative things in monolithic software when it comes to images, videos, and writing. That doesn’t mean we should accept those types of limitations.
There are several reasons the lack of AUv3 might be a dealbreaker, for some of us.
- Standards
- Workflow
- Modulation
- Modularity
- Performance
- Coding practices
Apart from AUv3, existing approaches for music apps to communicate with one another on iPadOS and iOS revolve around a clunky technology which is officially on its way out. Before Apple added AUv3 support to iOS, Audiobus came out with a way to let music apps communicate. It required rather deep support on the part of the developers involved, yet hundreds of apps came out with Audiobus support. It was clever, convenient… and somewhat kludgy. In the meantime, Apple came up with its own approach (Inter-App Audio, or IAA) which made things somewhat more stable, and Audiobus started using IAA for its own communication across apps. Problem is, IAA is on the way out. Apple has officially deprecated it, meaning it will eventually stop working, even if that point is still some way off. In fact, IAA broke in a recent iOS update and only started working again in a later update to iOS and iPadOS. On the Audiobus side, they could (and probably will) come up with a new de facto standard for communication between music apps once IAA disappears. Which will require more adaptation. In other words, apps based on IAA will have to make a switch at some point. AUv3 is mature enough now that reluctance to switch is a rather negative sign.
Some apps have ways to move files around or even export projects. Some even come with their own protocol for apps from the same vendor. Some apps don’t even support IAA. With those apps, a musicker is expected to do everything in the same environment, from creating the music to recording it and making it available elsewhere.
AUv3 is a radical contrast to all of these situations. The AUv3 workflow is remarkably easy: you can instantiate multiple plugins from a DAW or plugin host, move them around, route sound or performance controls from one to another, mix tracks seamlessly, tweak multiple plugin parameters at the same time using the touchscreen or external controllers, and record everything as multitrack audio and/or as a full project with saved plugin state. This enhanced workflow is where people may feel that “once you go AUv3, you can’t go back”. Not that AUv3 workflows are obvious to everyone the moment they first try one plugin in one host. Still, there are enough videos around to show why this type of workflow is a “gamechanger”. In my experience, AUv3 workflows based on a plugin host on an iPad are often far more convenient than the equivalent on a desktop DAW for jamming and other forms of live musicking, including DAWs designed for live performance. Part of the reason is that there are several AUv3 MIDI plugins which can seamlessly be used together to generate, modify, or refine musical movement. In a desktop DAW, such plugins require quite a bit of finagling to work together and produce the desired results. Plus, desktop DAWs have limited ways to represent what is going on with MIDI across multiple channels.
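To make that chaining idea concrete, here is a minimal sketch in Swift. The types and names are made up for illustration (they are not Apple’s MIDI or AUv3 APIs); the point is simply that a host routes one MIDI plugin’s output into the next, so generators, modifiers, and refiners compose freely.

```swift
// Hypothetical model of chained MIDI "plugins": each one is just a
// transformation over a stream of note events, and the host composes them.

struct NoteEvent {
    var pitch: Int      // MIDI note number, 0–127
    var velocity: Int   // 0–127
    var beat: Double    // position in beats
}

typealias MIDIPlugin = ([NoteEvent]) -> [NoteEvent]

// Generator: emits a simple four-note pattern, ignoring its input.
let generator: MIDIPlugin = { _ in
    (0..<4).map { NoteEvent(pitch: 60 + $0 * 2, velocity: 100, beat: Double($0)) }
}

// Modifier: transposes everything up a fifth.
let transpose: MIDIPlugin = { events in
    events.map { e in
        var e = e
        e.pitch += 7
        return e
    }
}

// Refiner: softens velocities on off-beats.
let humanize: MIDIPlugin = { events in
    events.map { e in
        var e = e
        if e.beat.truncatingRemainder(dividingBy: 2) != 0 { e.velocity -= 20 }
        return e
    }
}

// The "host": routes one plugin's MIDI output into the next, as in an AUv3 chain.
func chain(_ plugins: [MIDIPlugin]) -> MIDIPlugin {
    { input in plugins.reduce(input) { events, plugin in plugin(events) } }
}

let pipeline = chain([generator, transpose, humanize])
let output = pipeline([])
// output pitches: [67, 69, 71, 73]; off-beat velocities softened to 80
```

Reordering or swapping any stage is just a matter of editing the array passed to `chain`, which is roughly the flexibility an AUv3 host offers through its routing UI.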
In terms of workflow, part of the reason IAA is so cumbersome is the limited multitasking on iPadOS (and even more so on iOS). With IAA, you need to switch between apps if you want to play with sounds from one IAA app alongside anything else. With a keyboard, switching isn’t that difficult. It just means that the controls of that IAA app aren’t visible while you’re playing in another app, and vice versa. Since a lot of computer musicking is about those controls, this can rapidly become infuriating or, at the very least, break the flow. Having controls from multiple plugins visible at the same time (along with “transport” controls like play, pause, and record) makes a lot more sense on a multitouch device like a tablet than on the mouse-and-keyboard desktop. Apple still dismisses multitouch input for macOS, and there’s a lot which could be improved with touch control on Windows or Linux. Controlling music plugins is one of the cases where a multitouch interface makes the most sense.
Plugins typically expose their parameters to the host (a DAW or plugin host). A musicker can then control these parameters in sophisticated ways through the host. So, instead of twiddling virtual knobs on a touchscreen, one can use modulation to have one plugin (or a feature in the host) control the parameters of another plugin. Not only does this make it easy to keep things synchronized, it also affords a lot of creativity. Most desktop DAWs have sophisticated ways to “modulate” plugin parameters, and some are widely recognized for the power of their modulation. On iPadOS and iOS, DAWs and plugin hosts arguably have above-average modulation power.
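As a rough illustration of the mechanism: real AUv3 plugins publish their parameters through an `AUParameterTree`, but the simplified types below are hypothetical. The sketch shows the core idea of modulation, with a host-side LFO writing values into a parameter the plugin exposes, instead of the musicker twiddling the knob by hand.

```swift
// Hypothetical, simplified model of a plugin parameter exposed to the host.
import Foundation

final class PluginParameter {
    let name: String
    let range: ClosedRange<Double>
    var value: Double
    init(name: String, range: ClosedRange<Double>, value: Double) {
        self.name = name
        self.range = range
        self.value = value
    }
}

// A modulation source the host provides: a sine LFO scaled to a target range.
struct LFO {
    var frequency: Double // in Hz
    func value(at time: Double, into range: ClosedRange<Double>) -> Double {
        let unipolar = (sin(2 * .pi * frequency * time) + 1) / 2  // 0…1
        return range.lowerBound + unipolar * (range.upperBound - range.lowerBound)
    }
}

// The host ties a modulation source to an exposed parameter.
struct Modulation {
    let source: LFO
    let target: PluginParameter
    func apply(at time: Double) {
        target.value = source.value(at: time, into: target.range)
    }
}

let cutoff = PluginParameter(name: "cutoff", range: 20...20_000, value: 1_000)
let mod = Modulation(source: LFO(frequency: 0.5), target: cutoff)

// A quarter period into a 0.5 Hz LFO, the sine peaks, so the cutoff
// sweeps to the top of its range.
mod.apply(at: 0.5)
```

Because the host calls `apply` on a clock shared by every modulation, everything stays synchronized for free, which is the point made above.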
A plugin architecture is a matter of “modularity”. You prefer this filter over that one? You can swap one for the other, as they’re equivalent “modules” in a broader context. Many plugins are effects added to a sound source (which may itself be a plugin). When apps communicate with one another, you can hack together a similar setup, using one app as a sound source and another as an effect. But there’s little flexibility in such a setup, as the apps aren’t modular in matching ways. You may end up wanting one component of one app and one component of another, with no easy way to pick and choose and no quick way to switch between them.
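The modularity point can be sketched with a toy protocol (illustrative only, not a real plugin API): because both effects below conform to the same interface, swapping one for the other leaves the rest of the chain untouched, which is exactly what whole-app interconnection can’t offer.

```swift
// A common interface makes effects interchangeable "modules".
protocol AudioModule {
    func process(_ samples: [Double]) -> [Double]
}

// Two modules which can fill the same slot in a chain.
struct Gain: AudioModule {
    var amount: Double
    func process(_ samples: [Double]) -> [Double] {
        samples.map { $0 * amount }
    }
}

struct HardClip: AudioModule {
    var ceiling: Double
    func process(_ samples: [Double]) -> [Double] {
        samples.map { min(max($0, -ceiling), ceiling) }
    }
}

// A host chain: a sound source followed by whatever effect is plugged in.
struct Chain {
    var source: [Double]
    var effect: AudioModule
    func render() -> [Double] { effect.process(source) }
}

var chain = Chain(source: [0.2, -0.8, 0.5], effect: Gain(amount: 2.0))
let loud = chain.render()        // [0.4, -1.6, 1.0]

// Prefer a different module? Swap it; nothing else in the setup changes.
chain.effect = HardClip(ceiling: 0.6)
let clipped = chain.render()     // [0.2, -0.6, 0.5]
```

With two standalone apps wired together, the equivalent of that one-line swap would mean reconfiguring the whole audio routing between them.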
In terms of performance, AUv3 plugins are difficult to beat. Apart from memory management (which is also an issue with standalone apps), AUv3s on a relatively recent iPad or iPhone are very efficient in terms of processor (and battery) usage. Conveniently, plugins consume almost nothing when they’re not processing sound. A standalone app running in the background, by contrast, takes quite a bit of processor power and is very likely to drain the battery quickly. In fact, because of background audio, music apps are an exception to the rule that you shouldn’t force-quit apps on iOS and iPadOS. It’s easy to run multiple instances of a given plugin (and keep track of what each instance does), while it’s quite difficult to run more than two or three standalone apps without performance issues. And because IAA is so “clunky”, a standalone app is more likely to crash in the background than a plugin running in a DAW or host. Sure, there are some wonky plugins and some very stable standalone apps. It’s just that IAA can make even a stable standalone app crash when it’s fine on its own.
Since making AUv3 plugins requires adherence to a fairly strict framework, apps with AUv3 support are likely to follow relatively strict coding practices. There are very efficient standalone apps, especially those coded in JUCE (Korg’s Gadget system being a prime example). There are also apps which have been put together rather sloppily. Those may produce decent results on their own but likely eat up too many of the device’s resources. So, on top of better performance in aggregate, a plugin workflow tends to bring together pieces which were crafted more carefully. There are exceptions (which are promptly discussed in the community), and it’d be a gross overstatement to say that AUv3 support guarantees that an app is of higher quality than one without it.
Still, it should be obvious by now that implementing AUv3 support in a music app is a strong signal that the dev is forward-looking.