Every element in your Ableton Live session — every track, clip, device, parameter, scene, and routing configuration — exists as an object in the Live Object Model (LOM). Understanding the LOM at a high level will help you get the most out of Live MCP and communicate more effectively with your AI assistant.

A tree of objects

The LOM organizes everything in your session into a hierarchy. At the root sits the Song (accessed via the path live_set), and from there, branches extend to tracks, clips, devices, and beyond.

Song (live_set)
├── tracks[]
│   ├── clip_slots[]
│   │   └── clip
│   ├── devices[]
│   │   ├── parameters[]
│   │   ├── chains[] (rack devices)
│   │   │   └── devices[]
│   │   └── drum_pads[] (drum racks)
│   │       └── chains[]
│   └── mixer_device
│       ├── volume
│       ├── panning
│       └── sends[]
├── return_tracks[]
├── master_track
├── scenes[]
└── groove_pool

Each node in this tree is an object with properties (values you can read and sometimes write), functions (actions you can call), and children (sub-objects you can navigate into).

Why a tree matters for AI

The tree structure is what makes AI control of Ableton practical. Instead of needing to understand a complex graphical interface, an AI assistant can navigate the LOM like a file system: start at the root, list what is available, and drill down to exactly the right object.

For example, to reach the third parameter of the second device on the first track, the path is simply live_set tracks 0 devices 1 parameters 2. No menus, no mouse clicks, no visual parsing. Just a path.
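Path resolution of this kind is easy to model. The sketch below is illustrative only — a toy resolver walking a space-separated, zero-indexed path against a nested Python structure that stands in for the real LOM, which lives inside Live itself:

```python
# Illustrative only: walk a LOM-style path against a nested dict/list tree.
# Name tokens select a child branch; numeric tokens index into a list.
def resolve(root, path):
    node = root
    for token in path.split():
        if token.isdigit():
            node = node[int(token)]   # numeric token: list index
        else:
            node = node[token]        # name token: child branch
    return node

# A stand-in session, not real Live data.
session = {
    "live_set": {
        "tracks": [
            {"devices": [
                {"parameters": [{"name": "Device On"}]},
                {"parameters": [
                    {"name": "Gain"},
                    {"name": "Width"},
                    {"name": "Mute"},
                ]},
            ]},
        ],
    },
}

param = resolve(session, "live_set tracks 0 devices 1 parameters 2")
print(param["name"])  # Mute
```

The same resolver handles any depth of nesting, which is why a single flat path string is enough to address any object in the tree.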

This structured access is also what makes batch operations natural. When your AI assistant adds a Utility device to every track, it iterates through live_set tracks 0, live_set tracks 1, live_set tracks 2, and so on — the same way you might loop through files in a directory.
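The iteration pattern itself is trivial once paths are just strings. A minimal sketch (the track count and paths here are hypothetical — in practice the assistant discovers the real track list first):

```python
# Illustrative only: enumerate per-track LOM paths the way an assistant
# would when applying the same change to every track in a session.
track_count = 3  # a real assistant reads this from live_set tracks

paths = [f"live_set tracks {i} devices" for i in range(track_count)]
for path in paths:
    # a real assistant would invoke a device-insertion tool at each path;
    # here we only show the paths being generated
    print(path)
```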

Properties and functions

Every LOM object has properties and functions appropriate to its type.

Properties describe the object’s current state. A Track has a name, a color, a mute state. The Song has a tempo, a signature_numerator, an is_playing flag. Some properties are read-only: a track’s playing_slot_index tells you which clip slot is playing, but you change it by firing clips, not by setting the property.

Functions perform actions. The Song has start_playing() and stop_playing(). A Clip has fire() and stop(). The Song has create_midi_track() and create_audio_track(). Functions are how you make things happen, while properties are how you read and configure state.
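As a mental model only — this is not the real API surface — the property/function split looks like this in code:

```python
# Illustrative toy model of a LOM object: properties hold state,
# functions perform actions. Not the actual Live API.
class Song:
    def __init__(self):
        self.tempo = 120.0        # read/write property
        self.is_playing = False   # state you read to check playback

    def start_playing(self):      # function: performs an action
        self.is_playing = True

    def stop_playing(self):
        self.is_playing = False

song = Song()
song.tempo = 128.0        # configure state through a property
song.start_playing()      # make something happen through a function
print(song.is_playing)    # True
```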

The two root objects

The LOM has two root entry points:

  • live_set — the Song. This is where almost everything lives: tracks, clips, devices, scenes, the transport. This is the root you will use 99% of the time.
  • live_app — the Application. This gives access to application-level information like the current view, control surfaces, and the Live version. Useful for UI control but less common in typical workflows.

Discovering the tree

You do not need to memorize the LOM. Live MCP provides the get_children tool, which lets your AI assistant explore the tree dynamically. Your assistant can start at live_set, see what is available, and navigate deeper as needed.

This means you can ask questions like “What devices are on my third track?” and the assistant will figure out the path by walking the tree — you do not need to know that the path is live_set tracks 2 devices.
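The discovery process amounts to a recursive tree walk. A sketch under stated assumptions — the tree here is a stand-in dict, and the walk imitates what an assistant does with a get_children-style tool, without being the tool itself:

```python
# Illustrative only: explore an unknown tree by asking each node for its
# children, collecting the path of every object discovered along the way.
def walk(node, path="live_set"):
    found = [path]
    for name, child in node.get("children", {}).items():
        if isinstance(child, list):
            # list branches get numeric indices in the path
            for i, item in enumerate(child):
                found += walk(item, f"{path} {name} {i}")
        else:
            found += walk(child, f"{path} {name}")
    return found

# A tiny stand-in session: two tracks (one with a device) and a master track.
tree = {
    "children": {
        "tracks": [
            {"children": {"devices": [{"children": {}}]}},
            {"children": {}},
        ],
        "master_track": {"children": {}},
    }
}

for p in walk(tree):
    print(p)
```

Starting from the root with no prior knowledge, the walk surfaces every reachable path — which is exactly why memorizing the LOM is unnecessary.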

Further reading

The LOM is documented in detail by Cycling ‘74. For the complete list of classes, properties, and functions, see the Cycling ‘74 LOM reference. That reference is the authoritative source for what is available on each object type.