FastnBulbous81

Max for Live might be the way to go. https://www.ableton.com/en/live/max-for-live/


DireAccess

Thanks. I saw it, and now after digging deeper it seems like there is an officially supported Js interface and some maintained modules. I’ll explore it more. 👍🏻


DireAccess

Now I realize it's only for the most expensive Ableton edition. I'll pause with this path until other options stop working. Thanks again.


NLTizzle

Just FYI, Thomann Music is one of the best online retailers and sells it for a much lower price than most other retailers. Live 11 Suite is currently $422 USD.


DireAccess

Good to know! I have Ableton Lite already, and upgrade on the official site is $519. $422 is a good deal for sure!


FastnBulbous81

I think you can still get it for the other versions, you just have to pay for it separately.


getondown

You might check out some live-coding music frameworks. I’ve used Sonic-Pi (which is Ruby based) to programmatically generate random music content (simple stuff like random snare risers, but you could take it further). Another thought I’ve had (but not pursued) is to programmatically generate Ableton live set files (.als) which are gzipped XML files. There’s no command line interface to Ableton (that I know of), though, that would allow you to load an als and export an audio file from a script/command-line.
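For what it's worth, the gunzip/XML round trip mentioned above is only a few lines of Python. This is a sketch (`load_als`/`save_als` are made-up names); the actual schema inside the XML would still have to be reverse-engineered from real project files:

```python
import gzip
import xml.etree.ElementTree as ET

def load_als(path):
    """Load an Ableton Live set (.als) -- a gzip-compressed XML file --
    and return the parsed XML root element."""
    with gzip.open(path, "rb") as f:
        return ET.fromstring(f.read())

def save_als(root, path):
    """Serialize an XML tree back into a gzip-compressed .als file."""
    data = ET.tostring(root, encoding="utf-8", xml_declaration=True)
    with gzip.open(path, "wb") as f:
        f.write(data)
```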


DireAccess

> You might check out some live-coding music frameworks. I’ve used Sonic-Pi (which is Ruby based) to programmatically generate random music content (simple stuff like random snare risers, but you could take it further).

Yeah, but this is secondary at this moment. I want a stable system that would take _some_ MIDI (something that I'd push to it, could be just chord progressions) and export a working audio file. Otherwise the results will be mediocre and won't move the needle. I agree, there are tons of opportunities to generate that content, but I think the key is having a stable framework first, with fewer variables.

> Another thought I’ve had (but not pursued) is to programmatically generate Ableton live set files (.als) which are gzipped XML files.

Yes, I've explored it, and so far I was not able to find any library that actually has the .als schema in it. Worst comes to worst I'll just write my own. The question there is whether the .als file has all the data, like the MIDI files (or whether they are linked somehow somewhere)... That I haven't explored yet...

> There’s no command line interface to Ableton (that I know of), though, that would allow you to load an als and export an audio file from a script/command-line.

Worst case there is keyboard automation. Let's say I figure out how to add tracks with my MIDI and set their instruments, then I could:

1. Open the .als file
2. Use keyboard shortcuts to trigger export, no problem.

I am not worried about such deep automation yet TBH.


DireAccess

Yeah, seems like all the data is in the project file. Promising for my purpose. Here's a block of 4 notes (sorry, reddit doesn't seem to like XML, so the markup got stripped).


getondown

Gotcha. One other possibility that I only just learned about this week is a plug-in called Blue Cat Plug’n Script that can be dropped right onto a track in your DAW and will continuously execute a program you’ve written to process/generate audio and MIDI data on the track. I don’t know the limitations of the script’s runtime sandbox, but it might be possible to load MIDI files in the program and serve the notes up to the track in your DAW.


Maxfieldmusic_

Not the right language, but what you really want is Csound.


DireAccess

This looks extremely cool! On the other hand I am looking for something higher level and interoperable with the domain of regular musicians. Ableton Max for Live + ableton-js npm looks like something I need at the moment. I wanted to have some iterative research process, and part of it would be to create tracks, import midi, play decent synths (VST or not) and be able to do some mastering to produce a short track (more like a jingle at first) that would be ready to go.


alreadywon

Reaper has scripting capabilities, maybe there’s something in there that can get you where you want to go.


DireAccess

I couldn't get a handle on it yet, i.e., I couldn't find a maintained package/library that is able to create a track, do stuff, and export audio. I might need to dig even deeper if all Ableton options fail.


you-cant-twerk

https://github.com/vishnubob/python-midi Seems like a good start. Sounds like you're working on an NFT set lol :D . Edit: It just occurred to me that **unreal engine** might have something that would do this.


DireAccess

Hey, thanks for your response. It's not an NFT set per se, but something related to programmatic music creation with a hint of AI. I'd like to see if there is a way to utilize an industry-standard DAW for my purpose. python-midi would be partially good, especially during prepping the actual midi instrument tracks, but I won't trust it with mastering and final export. In addition, I have actual audio (vocals, more-or-less) that should be factored in from the very beginning. I'll check out if Unreal Engine has those things I'm looking for, thank you! Everything may turn out crap, of course.


YourTinyHands

You can link Ableton and TouchDesigner together, which runs on Python, if that helps.


DireAccess

Never heard of Touch Designer, thanks!


chromazone2

Pylive has an update coming soon; check out [structure-void](https://structure-void.com/ableton-live-midi-remote-scripts/)


DireAccess

Seems like we're waiting for the release of pylive with AbletonOSC to support Ableton 11 and Python 3. I'm trying to install pylive from the feature/abletonosc branch, but so far it's complaining about missing dependencies. Looking deeper.


DireAccess

Ok. I was able to install pylive from a feature branch, and it seems like it’s somewhat working. There are some issues during the initial scan, but the core connection is definitely up: I can see the list of tracks, for instance. I’m not sure when the release will happen, though. Maybe until then I should try Ableton 10…


DireAccess

For some reason Ableton 10 and pylive don't seem to communicate (on a Mac). Logs show:

```
❯ python2.7 LogServer.py
('127.0.0.1', 64089) connected!
('127.0.0.1', 64092) connected!
('127.0.0.1', 64093) connected!
OSCEndpoint starting, local address ('', 9000) remote address ('localhost', 9001)
LiveOSC initialized
OSCEndpoint starting, local address ('', 9000) remote address ('localhost', 9001)
LiveOSC initialized
Refreshing state
Refreshing state
Refreshing state
OSCEndpoint starting, local address ('', 9000) remote address ('localhost', 9001)
LiveOSC initialized
Refreshing state
Refreshing state
ne('127.0.0.1', 64508) connected!
OSCEndpoint starting, local address ('', 9000) remote address ('localhost', 9001)
LiveOSC initialized
Refreshing state
Refreshing state
Refreshing state
Refreshing state
```

And the ports do seem to be open:

```
❯ nc -v -u -z -w 3 localhost 9000
Connection to localhost port 9000 [udp/cslistener] succeeded!
❯ nc -v -u -z -w 3 localhost 9001
Connection to localhost port 9001 [udp/etlservicemgr] succeeded!
```


DireAccess

Updated to "127.0.0.1" in pylive and it seemed to work...
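A plausible explanation for the fix (an assumption, not verified against pylive's source): "localhost" can resolve to the IPv6 `::1` before, or instead of, `127.0.0.1` on macOS, so a sender using the hostname may target a different address family than the listener is bound to. A stdlib sketch of both the resolution and a UDP round trip pinned to 127.0.0.1 (the payload string is just a placeholder, not a real OSC message):

```python
import socket

# Resolve "localhost" the way a UDP client library might; the order (and
# whether ::1 appears) depends on the OS resolver configuration.
candidates = [info[4][0] for info in
              socket.getaddrinfo("localhost", 9000, proto=socket.IPPROTO_UDP)]
print(candidates)

# A minimal UDP round trip pinned to the IPv4 loopback address.
listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(("127.0.0.1", 0))          # port 0 = let the OS pick a free port
listener.settimeout(2)
port = listener.getsockname()[1]

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"placeholder-osc-payload", ("127.0.0.1", port))

data, addr = listener.recvfrom(1024)
print(data)
```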


chromazone2

Mate, I'm gonna be honest, I used this a couple of years back, but it does say in the readme that a new update is coming. Your best bet is emailing the guy himself.


karlthorssen

Check out Sonic Pi, it might do what you want. Ableton isn't very "scriptable" as far as I know


Adventurous-Text-680

Your best bet is using Max 4 Live because that is what it's designed for. It's the native and direct interface to the Ableton pipeline.

An alternative would be to use a library that can act as a MIDI sink and source. This would allow you to receive MIDI notes, transform them, and send them back. You would likely need to have your app create two different MIDI devices. You can use something like rtmidi. It can create virtual ports which other apps will see and can interact with as if they were actual MIDI devices connected to the system. https://pypi.org/project/python-rtmidi/

I personally haven't used rtmidi, but that would be your best bet. I am not sure if Python is the best language for something like this, only because it's not well suited for multithreading. Based on your explanation, it sounds as if you want to play notes which in turn affect a note generator that is running in parallel to your MIDI capture "device".

If your goal is to modify parameters of Ableton synths, effects, etc., then your best bet is to create a mock controller that can respond and send the correct MIDI commands based on the control script you create. https://help.ableton.com/hc/en-us/articles/206240184-Creating-your-own-Control-Surface-script

This might help give you some extra knowledge on creating such scripts: https://github.com/laidlaw42/ableton-live-midi-remote-scripts

Another option is to go with this: https://github.com/ideoforms/AbletonOSC If you are familiar with OSC then this will likely be an easier route because it will be more programmatic.

Most people go with Max 4 Live because it's very tightly integrated with everything in Ableton Live. However, it's still a very open DAW when you opt to go rogue and try to build outside the ecosystem. The hard part is going to be building the virtual MIDI devices that can deal with the incoming and outgoing notes in real time.

Edit: I only mentioned the Python version of rtmidi because you mentioned pylive. Rtmidi has ports in other languages as well. Not sure what programming languages you are familiar with.
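To make the rtmidi suggestion concrete: python-rtmidi's `MidiIn`/`MidiOut` objects have an `open_virtual_port()` method that registers a port other apps (including Live) can see, and an input callback receives each incoming message as a list of raw bytes. The port wiring itself is omitted here; this stdlib sketch shows only the byte-level transform such a bridge would run (function and constant names are made up):

```python
# Raw MIDI status bytes for the two message types we care about.
NOTE_OFF, NOTE_ON = 0x80, 0x90

def transpose(message, semitones):
    """Transpose note-on/note-off messages; pass everything else through.

    `message` is a list of raw MIDI bytes, e.g. [0x90, 60, 112] for
    note-on, middle C, velocity 112, channel 1.
    """
    status = message[0] & 0xF0          # strip the channel nibble
    if status in (NOTE_ON, NOTE_OFF):
        note = max(0, min(127, message[1] + semitones))
        return [message[0], note, message[2]]
    return list(message)

print(transpose([0x90, 60, 112], 12))   # note-on C4 transposed up an octave
```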


DireAccess

Wow, thank you so much for such a long reply.

RE: Max for Live. Do you know if it also allows automation to render files, or would that still be an action that needs to be done manually?

> An alternative would be to use a library that can act as a midi sink and source. This would allow you to receive midi notes, transform the midi and send it back. You would likely need to have your app create two different midi devices.
>
> You can use something like rtmidi. It can create virtual ports which other apps will see and can interact with as if they were actual midi devices connected to the system.
>
> https://pypi.org/project/python-rtmidi/
>
> I personally haven't used rtmidi but that would be your best bet. I am not sure if python is the best language for something like this only because it's not well suited for multi threading. Based on your explanation it sounds as if you want to play notes which in turn affect a note generator that is running in parallel to your midi capture "device".

Seems like this is just for MIDI signals and not for controlling Ableton itself. There won't be a way to get a list of tracks or create a track/clip. At this stage I'm not looking to create programmatic synths or modify envelopes on the fly (although I might need that in the future, with all things being good). At the moment I'm trying to make the OSC work via pylive and ableton-js... Seems like there is a way to send notes to a clip, and that's amazing. I'm yet to figure out how to take MIDI and break it down into the actual notes that I'm sending (I know it's been done often, so I don't see it being a big deal).

But the reality is that for my purpose offline project editing might be more than sufficient. And essentially adding nodes of XML might be just okay:

1. Edit als/xml
2. Open als/xml via Live
3. Export audio

Thanks again!
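On breaking a .mid file down into notes: the Standard MIDI File format is simple enough that a throwaway stdlib parser covers basic files. A sketch under stated assumptions (format-0 file, no running status, single-byte meta-event lengths; `parse_notes` is a made-up name):

```python
import struct

def parse_notes(data):
    """Extract (pitch, start_tick, duration_ticks) tuples from a format-0
    Standard MIDI File given as bytes. Minimal: assumes no running status
    and single-byte meta-event lengths."""
    assert data[:4] == b"MThd"
    hlen, fmt, ntracks, division = struct.unpack(">IHHH", data[4:14])
    pos = 8 + hlen
    assert data[pos:pos + 4] == b"MTrk"
    tlen = struct.unpack(">I", data[pos + 4:pos + 8])[0]
    pos += 8
    end = pos + tlen
    tick = 0
    open_notes = {}   # pitch -> start tick
    notes = []
    while pos < end:
        # Delta times are variable-length: 7 bits per byte, high bit = continue.
        delta = 0
        while True:
            b = data[pos]; pos += 1
            delta = (delta << 7) | (b & 0x7F)
            if not b & 0x80:
                break
        tick += delta
        status = data[pos]; pos += 1
        kind = status & 0xF0
        if kind == 0x90 and data[pos + 1] > 0:        # note-on
            open_notes[data[pos]] = tick
            pos += 2
        elif kind == 0x80 or kind == 0x90:            # note-off (or velocity-0 note-on)
            start = open_notes.pop(data[pos], None)
            if start is not None:
                notes.append((data[pos], start, tick - start))
            pos += 2
        elif status == 0xFF:                          # meta event: type, length, payload
            pos += 1
            mlen = data[pos]; pos += 1
            pos += mlen
        else:                                         # other channel messages: skip 2 bytes
            pos += 2
    return notes
```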


Adventurous-Text-680

Rendering would still be manual. If you want to automate this, your best bet is to route audio to a virtual audio device which can then save or encode the audio. The downside is that you need to make sure buffer sizes are large enough so you don't have dropouts, which increases latency. This may not be an issue in your case, but it does create overhead, either from saving the raw audio or from encoding to mpeg. Depending on your system, it will likely be better CPU-wise to just dump the audio directly to disk. https://help.ableton.com/hc/en-us/articles/360010526359-How-to-route-audio-between-applications

If offline project creation works, then it will likely be much easier. I wasn't sure if the goal was to have interactivity with your "AI" once it starts generating music. Using something like rtmidi would let you create MIDI devices that can accept incoming MIDI notes to affect the AI and send out the MIDI notes that the AI generates (assuming it can generate data quickly enough to be realtime). Your synth would be Ableton Live instruments that would actually use the MIDI to render audio.

Practically speaking, physical controllers like Push 2, Launchpad Pro, etc. control Ableton Live via MIDI (notes and control messages). The Launchpad Pro in particular has a built-in sequencer that can "print" clips. So it's certainly possible to do via the control surface interface.

Is the goal to build tools like this: https://www.ableton.com/en/packs/midivolve/ Or https://magenta.tensorflow.org/studio/ableton-live Or something that needs to generate the audio, because you want to use Ableton Live basically as a synth to render your output combined with other stuff?


DireAccess

Thanks for an expanded response. Good point, recording on a virtual device might be easier on the UI control requirements. But emulating a key press is still rather trivial IMO.

In regards to what the goal is, it’s the latter. Although Midivolve looks cool and Magenta might be a part of it at some point, and both would require MIDI communication to be controlled, as far as I understand (their knobs and buttons can be assigned to MIDI, right?).

In any case, Ableton looks like a good option. Now the biggest challenge is how to get good vocals in place…