I meant the input and output of nodes. The main thing a VST-oriented VPL does is manipulate audio samples, so the graph is easier to use and reason about than in a generic VPL (domain-specific VPLs seem to thrive; I wouldn't say the same for generic VPLs).
No, that would have been very limited. It's very important to be able to turn execution flow on and off, and of course to branch it. That applies both in graphics and in sound. Why did you get that impression?
Audio VPLs are far more powerful than what you describe.
Two materials combined into one over a 3D surface, where instead of mixing in certain areas you can have one or the other depending on a texture's transparency acting as a mask. Or a gradient color customized to act as a bit mask, with on and off values as black and white. Non-continuous data in sound can be MIDI or any form of modulation source, like a square wave that can also act as a bit, and tons of other methods.
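To make both patterns concrete, here's a minimal sketch in plain Python (hypothetical values, not tied to any particular VPL) of a black/white mask selecting between two "materials" per pixel, and a square wave gating an audio signal on and off:

```python
# Minimal sketch of the two masking patterns described above
# (hypothetical toy values, not any specific node graph).

def blend(a, b, mask):
    """Per-element mix of two sources: mask=1 picks a, mask=0 picks b,
    intermediate values cross-fade."""
    return [m * x + (1.0 - m) * y for x, y, m in zip(a, b, mask)]

def square_wave(n, period):
    """A square wave used as a bit: on for half the period, off for the rest."""
    return [1.0 if (i % period) < period // 2 else 0.0 for i in range(n)]

# Graphics: a black/white mask choosing between two materials over a surface.
material_a = [0.8] * 8
material_b = [0.2] * 8
mask = [0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0]
surface = blend(material_a, material_b, mask)

# Sound: the same idea, a square wave gating a signal on and off.
signal = [0.5] * 8
gated = blend(signal, [0.0] * 8, square_wave(8, 4))
```

The point is that the same blend-through-a-mask node works identically on pixels and on samples; only the data source changes.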
Ok, so these VPLs have two kinds of primitives: 2D textures (a bit mask is a 2D texture) and 3D objects / sound samples and MIDI streams.
Can you generate an image from sound samples? Can you create 3D object particles from a MIDI stream?
I use Blender. Blender has multiple VPLs that each specialize in one area: textures, materials, and compositing. Blender's VPLs are fully extensible with Python code, which allows the creation of new nodes. This has let developers create new VPLs with animation and modeling nodes. So even though Blender's VPLs are not meant to be used as generic languages, the API is already in place to allow this, essentially exposing the entire Python scripting interface as nodes.
I don't know whether Reaktor supports 3D physics particles, but you could do what you're asking for from Blender. Blender supports MIDI for controlling animation.
Blueprints, on the other hand, are a generic VPL and can do anything you describe. They're also the best choice because Unreal excels at real-time graphics.