MIDI has never been a userland concern in multitasking environments. It's a real-time, low-latency interface between some number of running processes and some number of hardware ports: the timing glitches and weird buffer behaviors you get from mixing different hardware together are squarely the kernel's wheelhouse.
If you meant General MIDI synthesis, that's a different spec and related only in that the MIDI message encoding is reused to describe sequences.
That's ok. You can talk to USB devices from userspace without issues. That's how many game controllers work. Also most (all?) SDR devices.
On most systems that hurts the minimum latency you can achieve, though, so unless the system is designed around it, you typically don't want real-time traffic crossing into userspace too many times.
What's a good method for semi real-time, multi-process, multi-hardware-port synchronization in user space? All of the real-time kernel code I've worked on wasn't possible to do in userspace 15 or so years ago. Are there some new user-space hooks?
The kernel ain't hard real-time anyway. In userspace, I guess you can pin the task to one or two cores, give it the highest SCHED_RR priority, and get more or less the same soft real-time we have in the kernel?