Control Ableton Live with AI
An MCP server that connects AI assistants like Claude to Ableton Live. Read and write session properties, trigger actions, and observe changes in real time through the Live Object Model.
How it works
Two components, one bridge
A Max for Live device runs inside Ableton, exposing the Live Object Model over a local TCP connection. The MCP server connects to it and translates AI tool calls into Live API operations. Drop the device on a track, start the server, and your AI assistant can see and control your session.
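As a sketch of the server side of that bridge — the JSON-lines framing and the default port here are illustrative assumptions, not the device's actual wire protocol:

```typescript
import * as net from "net";

// One request per line keeps TCP framing trivial. The shape of this
// message (op + path + optional value) is an assumption for illustration;
// the real format is defined by the Max for Live device.
type LiveRequest = {
  op: "get" | "set" | "call" | "observe";
  path: string;        // a Live Object Model path, e.g. "live_set tempo"
  value?: unknown;     // present for "set"
};

function encodeRequest(req: LiveRequest): string {
  return JSON.stringify(req) + "\n";
}

function connectToLive(port = 39031): net.Socket {
  // Hypothetical port; the device reports the real one when it loads.
  const socket = net.createConnection({ host: "127.0.0.1", port });
  socket.setEncoding("utf8");
  return socket;
}
```

Each incoming MCP tool call would be encoded this way and written to the socket, with responses read back line by line.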
What you can do
Read, write, observe, call
Get and set any property in the Live Object Model — tempo, track names, clip slots, device parameters. Call Live API functions to fire clips or create scenes. Observe properties for real-time change notifications. Navigate the object tree to discover what's available.
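A small sketch of how those four operations map onto Live Object Model paths. The helper functions are hypothetical; the path grammar (`live_set tracks N clip_slots M`) and function names (`fire`, `create_scene`) come from the Live API itself:

```typescript
// Build canonical Live Object Model paths from indices.
function trackPath(track: number): string {
  return `live_set tracks ${track}`;
}

function clipSlotPath(track: number, slot: number): string {
  return `${trackPath(track)} clip_slots ${slot}`;
}

// The four operation kinds against those paths:
//   get      live_set tempo                        -> read the tempo in BPM
//   set      live_set tracks 0 name                -> rename the first track
//   call     live_set tracks 0 clip_slots 0  fire  -> launch that clip
//   observe  live_set tempo                        -> stream change notifications
```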
Requirements
System requirements
- Ableton Live 11 or 12 (Suite or Standard with Max for Live)
- Node.js 18+
- An MCP-compatible AI client (e.g. Claude Code)