Semi-modular synth application
Native + browser-based synth application with custom UI framework
An audio synth application that can run natively (without Electron or web dependencies) or in the browser using canvas. The goal is to facilitate creativity and experimentation with sound synthesis: easy to use upfront, flexible enough to explore more complex ideas, and fun and performance-friendly throughout.
I call it a semi-modular synth because, by default, it can produce a wide range of sounds, but it can also be patched and modulated in many different ways to generate extreme complexity and variation. In the future I will be adding random modulation, function generators, rhythm generators, and more - similar to a fully modular system, but with a focus on ease of use and simplicity out of the box. The same goes for the visual effects - it will be possible to patch together different effects that react to different parts of the audio and to user interaction.
It has 8 tracks, each of which can play a different synth engine with its own effects, filter options and envelope options. The synth is split into 3 sections - Sequencer, Synth engine, Modulations.
Sequencer
The sequencer is a step sequencer divided into steps that can run at independent rates. The steps are routed into the audio engine for that track. Different lengths can be chosen to make polyrhythms, and there are ratchet and random-chance modes.
Each track actually has 6 sequencers that all run at once by default. This can be customised with wait ratios, which you can arrange to make more complex sequences. There are also experimental algorithm modes that can feed one sequence into another - this makes some really complex arrangements possible.
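To make the polyrhythm idea concrete, here is a minimal sketch of step sequencers with independent lengths and per-step chance, in TypeScript. The `Sequencer` class, `Step` shape and `tick` method are hypothetical stand-ins, not this app's actual API.

```typescript
// A single step: whether it triggers, how many ratchet repeats it fires,
// and the probability (0..1) that it triggers at all.
type Step = { on: boolean; ratchets: number; chance: number };

class Sequencer {
  private pos = 0;
  constructor(public steps: Step[]) {}

  // Advance one clock tick; return the step to trigger, or null.
  // `rand` is injectable so chance behaviour can be tested deterministically.
  tick(rand: () => number = Math.random): Step | null {
    const step = this.steps[this.pos];
    this.pos = (this.pos + 1) % this.steps.length; // wraps at its own length
    return step.on && rand() < step.chance ? step : null;
  }
}

// Two sequencers of different lengths driven by the same clock drift
// against each other and realign every lcm(3, 4) = 12 ticks - a 3:4 polyrhythm.
const makeSteps = (n: number): Step[] =>
  Array.from({ length: n }, () => ({ on: true, ratchets: 1, chance: 1 }));
const seqA = new Sequencer(makeSteps(3));
const seqB = new Sequencer(makeSteps(4));
```

Because each sequencer only tracks its own position and length, running 6 of them per track is cheap, and "wait ratios" can be layered on top by skipping ticks.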
Synth engine
The middle section is the synth engine. The backend audio engine uses Elementary Audio, a native + WebAssembly DSP library that uses a declarative node network to route audio signals and effects. I use a custom fork of the library with some extra features that I've added.
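To illustrate the declarative-graph style (this is a self-contained sketch, not Elementary Audio's real API - node names and the `node` helper are invented for illustration):

```typescript
// In a declarative audio graph, nodes are plain values describing the
// signal chain; the renderer walks the structure and realises the DSP.
type AudioNodeDesc = {
  kind: string;
  props: Record<string, number>;
  inputs: AudioNodeDesc[];
};

const node = (
  kind: string,
  props: Record<string, number>,
  ...inputs: AudioNodeDesc[]
): AudioNodeDesc => ({ kind, props, inputs });

// A saw oscillator through a resonant lowpass, scaled by an envelope value.
const voice = (freq: number, cutoff: number, env: number): AudioNodeDesc =>
  node("mul", { factor: env },
    node("lowpass", { cutoff, q: 1.2 },
      node("saw", { freq })));
```

Because the graph is just data, rebuilding it with new parameter values is cheap, and the entire structure can be inspected after the fact - which is what makes the node debugger described later possible.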
It has various synth engines to choose from, some of them ported from the Mutable Instruments source code. This includes various percussion engines, string and modal synthesis, a chord engine and more. Each engine has different parameters that can be tweaked (and modulated) to change its tone and character. There is lots of scope for me to develop these engines further.
The envelope is triggered by the sequencer and can be customised. Each track also has a filter that can be morphed between different types, with resonance and cutoff parameters.
Modulations
Every parameter in the synth can be modulated, and each track has 6 modulation sources which can feed into any other parameter, even on other tracks. This means LFOs and envelopes can be triggered by the sequencer to make interesting sound changes.
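A sketch of what "any source into any parameter on any track" might look like as a routing table. `ModSource`, `Route` and `applyMods` are illustrative names, not the synth's real internals:

```typescript
// A modulation source is a function of time returning a value in -1..1.
type ModSource = (timeSec: number) => number;

const lfo = (rateHz: number): ModSource =>
  (t) => Math.sin(2 * Math.PI * rateHz * t);

// One routing: a source modulates one named parameter on one track.
type Route = { source: ModSource; track: number; param: string; amount: number };

// Apply every route on top of the base parameter values at time t.
// Cross-track routing falls out for free: a route just names any track index.
function applyMods(
  base: Record<string, number>[],
  routes: Route[],
  t: number
): Record<string, number>[] {
  const out = base.map((params) => ({ ...params })); // don't mutate base
  for (const r of routes) {
    out[r.track][r.param] += r.source(t) * r.amount;
  }
  return out;
}
```

Sequencer-triggered envelopes fit the same shape: a triggered envelope is just another `ModSource` whose clock is reset on each step.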
Effects
All tracks are fed into the master effects chain, which currently has reverb and delay. More effects are planned, but this needs a lot more work.
UI framework
The UI framework itself is something I could talk about for hours. It's something I've been working on for a while. It has the following high-level features -
- Custom layout engine with a simple recursive constraint system that reacts to the screen size
- Declarative (immediate mode) user API with single function component interface
- Immediate mode means you don't have to manually hook up reactive state changes to the UI; the UI updates when the state changes
- Animation system is simple since you can just interpolate between values
- Composable state built from pure functions (similar to React hooks)
- Built-in layout primitives for flexbox-like layout
- Full hierarchical input event system with event bubbling, similar to the browser DOM
- Hot reloading
- Hardware accelerated rendering lets me add postprocessing effects like chromatic aberration and bloom easily that can respond to the audio
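The immediate-mode point is the core idea, so here is a tiny sketch of it. The `ui`, `button` and `Widget` names are hypothetical, not this framework's API: the whole interface is a pure function of state, re-run every frame, so nothing has to subscribe to state changes.

```typescript
// A widget is plain data produced fresh each frame.
type Widget = { kind: string; label: string; children: Widget[] };

function button(label: string): Widget {
  return { kind: "button", label, children: [] };
}

// One function describes the whole tree from state. When `state` changes,
// the next frame simply produces a different tree - no manual wiring.
function ui(state: { playing: boolean }): Widget {
  return {
    kind: "column",
    label: "transport",
    children: [button(state.playing ? "Stop" : "Play")],
  };
}
```

This is also why animation stays simple: since the tree is rebuilt every frame, animating is just interpolating a value between frames before passing it in.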
Post-processing effects
In the future I will add more visual effects for the background and UI elements, which can be modulated by the audio and/or sequencer. I have many ideas for this, including 3D-rendered elements and trippy feedback effects.
Debugging
The layout algorithm runs in a WebAssembly module that produces rectangles for each element. There is a debugging view that overlays these rectangles on top of the UI.
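The "layout produces rectangles" idea can be sketched like this - a row container splitting its width among weighted children, with the resulting rects being exactly what the debug overlay would draw. The flex-weight scheme here is a simplification of the real constraint system, and `layoutRow` is an invented name:

```typescript
// The output of layout: one rectangle per element, ready to draw
// (or to render as a debug overlay).
type Rect = { x: number; y: number; w: number; h: number };

// Divide a container's width among children according to flex weights,
// flexbox-style. Recursing into each child rect would give nested layout.
function layoutRow(container: Rect, weights: number[]): Rect[] {
  const total = weights.reduce((a, b) => a + b, 0);
  const rects: Rect[] = [];
  let x = container.x;
  for (const weight of weights) {
    const w = (container.w * weight) / total;
    rects.push({ x, y: container.y, w, h: container.h });
    x += w;
  }
  return rects;
}
```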
I also created a node debugger for the signals. Since Elementary Audio uses a declarative audio graph, I have access to the entire signal graph and can inspect the values of its nodes. I can record a segment of time and produce a dataflow view of the nodes with a timeline of values. This uses a layout algorithm similar to the one used for visualising Git branches, to display the graph in a linear way.