the reason this works and link doesn't is that link is fundamentally not a transport sync system. link shares tempo and phase relationships, but every participant keeps its own independent timeline and its own quantum, and start/stop is not guaranteed to be aligned across peers. it's designed for jam sessions, not deterministic playback. what i'm doing here is the opposite: ableton is the single source of truth for transport, and everything else just reacts to explicit events.

the nice side effect is that this is fully cross-platform without any weird hacks. macos and windows (via the rtpMIDI driver) both already speak RTP-MIDI properly, and linux joins the session through rtpmidid. the video machine can be completely headless, minimal, and stable, which is honestly a better fit for stage visuals anyway. you don't need resolume or touchdesigner or any heavyweight visual stack if your requirement is basically "play this video exactly when i press play".

so all in all, the whole thing ends up being less about "syncing" and more about "stop pretending and just send the right signals, for fuck's sake". ableton already knows exactly when playback starts, so just export that event over the network and treat it as gospel. everything downstream becomes trivial once you do that.
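to make "just react to explicit events" concrete, here's a minimal sketch of what the receiving side boils down to. the class and callback names are mine, not from any existing tool; it assumes you get the raw MIDI byte stream (e.g. from rtpmidid via an ALSA port) handed to you byte by byte. the status bytes themselves are standard MIDI 1.0 real-time messages:

```python
# MIDI 1.0 real-time status bytes (from the MIDI spec, not invented here)
START, CONTINUE, STOP, CLOCK = 0xFA, 0xFB, 0xFC, 0xF8

class TransportFollower:
    """react to explicit transport events instead of inferring phase.

    hypothetical sketch: wire on_start/on_stop to whatever launches
    and kills your video player on the headless machine.
    """

    def __init__(self, on_start, on_stop):
        self.on_start = on_start
        self.on_stop = on_stop

    def feed(self, byte):
        # real-time messages are single bytes and may arrive at any time,
        # interleaved with other traffic -- dispatch and move on
        if byte == START:
            self.on_start()
        elif byte == STOP:
            self.on_stop()
        # CLOCK (24 ppqn) and everything else is ignored: we only care
        # about the explicit start/stop edges, not tempo
```

the point of the sketch is the asymmetry: there's no phase estimation, no quantum, no negotiation — ableton says "start", the follower starts. that's the whole protocol.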