Ableton 11 Devices – ‘Inspired by Nature’ Interview with creator Dillon Bastan
Ableton 11 has a lot of great new features coming out in 2021. We had the pleasure of speaking with device designer Dillon Bastan about his 6 new devices for Ableton Live 11 – “Inspired By Nature”. You can see Dillon’s demo for each device at the bottom of the page.
What is your background? Are you a physicist? What got you interested in making sounds based on natural phenomena?
I’m not a physicist at all; I come from a music background and taught myself programming, and my academic math level is low. For some of the devices I make, I try to get to a game-like, playful approach to sound generation and design. A couple of years ago I began to learn more about coding simulations of natural phenomena, particularly simple particle systems in the case of these devices, so it was very natural to bring those ideas to production and performance tools. When I learn something, or want to learn something (like how to simulate 2D water ripples, for example), I start imagining ways to connect it to sound parameters or processes. Sometimes the connection is very direct and intuitive, and sometimes it isn’t.
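To make that idea concrete, here is a minimal, hypothetical sketch of the kind of mapping Dillon describes: a simple 2D particle stepped with basic physics, whose height is mapped to a MIDI pitch. This is purely illustrative and not code from any of his devices; the names (`Particle`, `position_to_pitch`) and the linear pitch mapping are assumptions.

```python
import math

class Particle:
    """A point mass with position and velocity inside a 2D unit box."""
    def __init__(self, x, y, vx, vy):
        self.x, self.y = x, y
        self.vx, self.vy = vx, vy

    def step(self, dt=0.01, gravity=-9.8):
        # Basic Euler integration with gravity on the y axis
        self.vy += gravity * dt
        self.x += self.vx * dt
        self.y += self.vy * dt
        # Bounce off the floor and side walls, losing a little energy
        if self.y < 0:
            self.y, self.vy = -self.y, -self.vy * 0.9
        if not 0.0 <= self.x <= 1.0:
            self.x = min(max(self.x, 0.0), 1.0)
            self.vx = -self.vx * 0.9

def position_to_pitch(p, low=36, high=84):
    """Map the particle's height (0..1) to a MIDI note number."""
    y = min(max(p.y, 0.0), 1.0)
    return round(low + y * (high - low))

# Drop a particle from the top of the box and read off a pitch
p = Particle(0.5, 1.0, 0.2, 0.0)
for _ in range(100):
    p.step()
pitch = position_to_pitch(p)
```

In a real device the particle state would be polled at audio or control rate and could just as easily drive filter cutoff, grain position, or pan instead of pitch.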
(Image of Dillon from SoundPedro, an annual ear-oriented multi-sensory event, presenting artists whose work addresses sound and aural perception in combination with other senses.)
In addition to sound, I noticed your creations incorporate a lot of complex/elegant visual elements, e.g. your Iota MaxForLive Granular Synth, Emit, or even Tree Tone from Ableton 11. How do you conceptualize the relationship between sound and sight as you create your devices? Is there a visual relationship you keep in mind, i.e. is sight necessary for our understanding of sound?
I would say I just focus on each device without any overarching intentions, and allow all of its elements to emerge through perceived necessity and playing with prototypes. The role of the visual element varies from device to device: some are just a neat way of monitoring and understanding the system, while others are critical to the functionality of the device. Usually I had some idea of the visual interface first, but from there it varied a lot. Some devices stuck to that original idea, and the rest of the parameters naturally fell into place (because they were obvious to me in the moment). For other devices, the role of the visual element changed dramatically over time, and the same goes for the synthesis or processing.
(Emit Device, coming to Ableton 11 in 2021.)
Tree Tone is one of my favorite devices of yours. Could you tell us a little about how this device came to be?
When I was learning ways of simulating particle systems, I also learned some ways of rendering fractals, in particular L-System fractals for rendering trees. When I was pitching ideas for devices to the Packs team, one of them was to make some kind of sound generator from L-System fractal trees, with wavetables or resonators or something. Later on I started making the device, but realized the L-System didn’t give me what I wanted, which was to have each branch of the tree tuned and attenuated in a way that relates to how a real physical object would behave (I was thinking of metallic or wooden objects, or even just strings), where longer, thicker objects have a lower pitch and a longer sustain. Then I searched the internet for a while until I found examples using a Space Colonization algorithm to simulate plant growth, which is what Tree Tone uses to “grow” the tree. In general my devices remove some control from the user (in Tree Tone, that would be the relationship of the tunings and the randomness inherent in the exciters), which gives you that wonderful limitation and happy-accident thing. At the same time, they make it possible for complex results or relationships to occur that we wouldn’t normally navigate to, and they keep things fun, playful, and exploratory. Even I don’t know many of the ways to navigate these devices.
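For readers curious about the algorithm, here is a minimal, hypothetical sketch of the space colonization idea in Python. It illustrates the published technique, not Tree Tone’s actual code: scattered attraction points each pull their nearest branch node, new nodes grow one step along the averaged pull direction, and attractors are consumed once a branch reaches them. A node’s depth from the root could then drive tuning, with longer branches mapping to lower pitches, in the spirit Dillon describes. All function names and parameter values here are assumptions.

```python
import math
import random

def space_colonization(attractors, root, step=0.05, influence=0.5, kill=0.05, iters=200):
    """Grow a tree of branch nodes toward scattered attraction points."""
    nodes = [root]          # (x, y) positions of branch nodes
    parents = {0: None}     # node index -> parent index (defines the tree)
    attractors = list(attractors)
    for _ in range(iters):
        if not attractors:
            break
        pulls = {}  # node index -> list of unit vectors toward attractors
        for ax, ay in attractors:
            # each attractor influences only its nearest node, within a radius
            best, best_d = None, influence
            for i, (nx, ny) in enumerate(nodes):
                d = math.hypot(ax - nx, ay - ny)
                if 0 < d < best_d:
                    best, best_d = i, d
            if best is not None:
                nx, ny = nodes[best]
                pulls.setdefault(best, []).append(((ax - nx) / best_d, (ay - ny) / best_d))
        if not pulls:
            break  # no attractor is close enough to any node
        for i, dirs in pulls.items():
            # grow a child node one step along the averaged pull direction
            dx = sum(d[0] for d in dirs)
            dy = sum(d[1] for d in dirs)
            norm = math.hypot(dx, dy) or 1.0
            nx, ny = nodes[i]
            nodes.append((nx + step * dx / norm, ny + step * dy / norm))
            parents[len(nodes) - 1] = i
        # attractors reached by any node are consumed
        attractors = [(ax, ay) for ax, ay in attractors
                      if min(math.hypot(ax - x, ay - y) for x, y in nodes) > kill]
    return nodes, parents

def branch_depth(parents, i):
    """Steps from node i back to the root; deeper might mean lower pitch."""
    depth = 0
    while parents[i] is not None:
        i, depth = parents[i], depth + 1
    return depth

# Grow a small tree from a root at the bottom toward random attractors above
random.seed(7)
attractors = [(random.uniform(0.2, 0.8), random.uniform(0.1, 1.0)) for _ in range(40)]
nodes, parents = space_colonization(attractors, root=(0.5, 0.0))
```

The appeal of this approach for a resonator bank is that branch geometry falls out of the attractor placement, so tunings derived from branch lengths inherit an organic, correlated randomness rather than being set one by one.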
(Tree Tone Device, coming to Ableton 11 in 2021.)
A lot of our readers have heard of Max For Live but might not have ever experienced programming in it. How did you discover Max For Live?
A friend showed it to me when I was first learning Ableton Live, but I didn’t really use it much for 2-3 years after that, other than dropping in some Max for Live devices such as LFO and Convolution Reverb. When I started scripting more in the Lemur app (by Liine), I eventually ran into limitations and started to make hybrid Lemur + Max for Live devices (which you can find in the Lemur user library, or on my account on maxforlive.com under the name “ndivuyo”; keep in mind they are my early devices and not performing great/very idiosyncratic. The original version of Iota is one of those hybrid devices). At that time I still hadn’t taught myself to code in text languages, so I was very blindly copying things from other Max patches and the forums and hoping they would work. And they did! That boosted my confidence to keep making more devices, which I always just made because I thought they would be nice to have (and still do, honestly), versus thinking about what other users would like or what would be marketable. Eventually I started to learn how everything works, thanks to the documentation, the forums, and many hours of trying things out, which I spent virtually all of my free time doing. When I learned how to code in other text languages and learned object-oriented programming, Max became a lot easier for me to navigate, but depending on the level at which you want to use it, it’s not necessary to go that route. I remember at some point I would make different types of devices to learn how to code different types of synthesis and effects. I still have an endless amount to learn in terms of audio DSP (digital signal processing), and I will always consider myself an amateur.
What are your next inquiries? What are your next frontiers in sound, programming, and music?
I always have many projects. I do a lot with sound art, music, performance, DIY electronics/devices for music, art, or therapy, and these days experimental films. If you check out my website or Instagram you can see all the other stuff I do besides Max for Live devices, if that interests you. As for audio devices: I am making devices using Chaos Theory for school credit. I recently put one out called Strange Mod, and since I need more credits I will probably put out another 1-2 Chaos Theory devices before March (my deadline!). Eventually I will put out a beta of Iota 2, I think, if I ever get around to finishing it; it’ll be much nicer than the first. Currently I am collaborating with someone on a really exciting multi-device project using AI and machine learning from the convenient workflow of a DAW (no need to be a programmer or do it over a browser), but I can’t reveal more, as we are still deciding where to take the project.
Thank you so much, Dillon, for taking the time to answer our questions. We can’t wait to see what you come up with next.
Beat Lab had the opportunity to work with Dillon last August to release the device Instant Sampler. You can check it out here, and get free devices when you subscribe to our blog.
Ableton 11 – ‘Inspired by Nature’ By Dillon Bastan
Emit. A granular phase vocoder with a spectrogram interface.
Vector Delay. A poly delay line driven by a physics simulation/particle system.
Tree Tone. A resonator bank tuned by a generated tree, with weather-like exciters.
Bouncy Notes. Balls bounce on a keyboard for a peculiar MIDI note generator.
Vector Grain. A granular sampler driven by a particle system/physics simulation.
Vector FM. A granular FM synth driven by a particle system/physics simulation.
If you’d like to learn more about music production with Ableton Live in a Live Interactive approach, find out which of our programs is right for you.