Cool! Understood - that explains the syncing to host.
But how does the plugin know how to start doing its metronome magic?
Do you know which protocol is used for that?
If you are looking for simple bookend go/stop functions, that’s not really how it works, at least AFAIK. When it is time to process audio, the plugin will receive a constant stream of processAudio calls to determine what to do. Note that these may be going forward or backward in time (consider reverse scrubbing).
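To make that concrete, here is a minimal sketch of the idea. The struct and field names below are simplified stand-ins for the transport info a VST3 host passes each block (the real SDK uses `Steinberg::Vst::ProcessContext` inside `ProcessData`); this is not actual SDK code, just the shape of the mechanism:

```cpp
#include <cassert>

// Simplified stand-in for per-block transport info from the host.
// Names are illustrative, not the real Steinberg SDK types.
struct TransportInfo {
    bool   playing;          // host transport is running
    double tempoBpm;         // current tempo
    double projectTimeBeats; // musical position; may go backward when scrubbing
};

// A metronome-style plugin never receives a "start" call. It inspects the
// transport state on every process() call and decides what to do then.
struct MetronomePlugin {
    double lastBeat = 0.0;
    int clicksEmitted = 0;

    void process(const TransportInfo& t) {
        if (!t.playing) return;  // stay silent unless the host is rolling
        // Emit a click whenever the position crosses an integer beat boundary.
        if (static_cast<long>(t.projectTimeBeats) != static_cast<long>(lastBeat))
            ++clicksEmitted;
        lastBeat = t.projectTimeBeats;
    }
};
```

So "starting" the metronome is really just the host flipping its transport flag; the plugin notices on the next process call.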
Here you go, if interested:
I’m not an expert on the VST API (I am working with CLAP instead and using wrappers to generate VST and AU). However, those are the canonical Steinberg docs.
You just made me very curious about the inner workings of VSTs.
I did some MPEG & AVI development when I was young & stupid. The timecode syncing seems to be similar (and doesn't need to be solved here, as it already works).
In fact, the “only” thing necessary is to kick MMetronome to start, I guess. Shouldn’t be too hard to create a VST plugin that does only that!?
The only thing I don’t like is that it seems to need dreaded C++. Always hated that!
Gimme C#.
But - challenge accepted!
Again, that’s not really how it works. I suspect the original plugin host (Tonelib) is simply not implementing part of the VST API as a host.
But that “part” could be emulated, I guess.
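As a rough illustration of what "emulating that part" could mean: if a host never fills in transport info, a wrapper sitting between host and plugin could synthesize it by counting the samples it has processed itself. This is purely a hypothetical sketch with made-up names, not the real VST3 API:

```cpp
#include <cassert>

// Illustrative stand-in for per-block transport info (not the real SDK).
struct TransportInfo {
    bool   playing;
    double tempoBpm;
    double projectTimeBeats;
};

// Hypothetical wrapper-side emulator: derives a musical position from a
// running sample count and a fixed tempo, pretending the transport is
// always playing.
struct TransportEmulator {
    double sampleRate;
    double tempoBpm;
    long long samplesElapsed = 0;

    // Returns the transport state at the start of the next audio block,
    // then advances the internal position by blockSize samples.
    TransportInfo next(int blockSize) {
        double seconds = static_cast<double>(samplesElapsed) / sampleRate;
        TransportInfo t{true, tempoBpm, seconds * tempoBpm / 60.0};
        samplesElapsed += blockSize;
        return t;
    }
};
```

A wrapper like this could hand the synthesized `TransportInfo` to the plugin on every process call, so a metronome would tick even though the host never reports a transport of its own. Whether that is practical for Tonelib Jam is another question.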
That is why I hoped that one of the plug-in wrappers above would work. They don't.
Let’s see what MeldaProduction and Tonelib say.
If they can’t correct it, maybe they can give me a hint…
99% sure Tonelib needs to fix Jam’s plugin hosting.