I think you will be able to offer some valuable advice, so let me describe the 'problem' in more detail.
As you know, still images can be captured with a high degree of synchronisation, typically 1/5000 sec.
Movies are captured with far less precision.
For stereo movies involving 'action', the two cameras need to be synchronised to within one millisecond.
The first question is: when is the movie frame captured?
If that is determined by a hardware timer, then we cannot delay that time to achieve synch.
If we could flash an LED in synch with that timer, we could at least indicate the synch difference on an external device.
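To be concrete, this is the sort of thing I mean by 'flash an LED'. It is only a minimal sketch; the register address and the on/off values are placeholders, since the real memory-mapped LED register is camera-specific and has to come from the port code for the model in question.

/* Minimal LED-pulse sketch.  LED_REG, LED_ON and LED_OFF are placeholders;
   the real memory-mapped register and values differ per camera and must be
   taken from that camera's port files. */
#define LED_REG  (*(volatile unsigned long *)0xC0220000)   /* placeholder address */
#define LED_ON   0x46                                      /* placeholder 'on' value  */
#define LED_OFF  0x44                                      /* placeholder 'off' value */

static void blink_led(void)
{
    volatile int i;
    LED_REG = LED_ON;
    for (i = 0; i < 1000; i++)
        ;                        /* short busy-wait so the pulse shows on a scope */
    LED_REG = LED_OFF;
}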
After the movie frame is captured, the MovieRecord task will be executed.
Does that happen a fixed time after the frame is captured?
If so, we can flash the LED in that task and it will be in synch with the frame capture.
If the MovieRecord and AviWrite tasks do not run a constant time after frame capture, then we cannot measure synch.
So, is each task executed at a fixed rate, or can a task be interrupted by another task?
I have just added a jump to 'blink_led' at the start of the MovieRecord task, and a friend will look at the signal on an oscilloscope to see how regular it is.
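Conceptually the hook looks like this (the function names are mine, not firmware symbols; in the real port the 'jump' is just a couple of instructions added to the copied task code):

extern void blink_led(void);                /* the pulse routine sketched above */
extern void orig_movie_record_task(void);   /* stands for the unmodified firmware task body */

void hooked_movie_record_task(void)
{
    blink_led();                 /* pulse the LED when the task code is entered */
    orig_movie_record_task();    /* then carry on with the normal task code */
}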
On my A620, that task can either exit, loop continuously, or branch through one of ten other chunks of code.
In one branch, CHDK unlocks optical zoom.
In another branch (called for every movie frame), the mute-on-zoom code is executed.
Amongst all this there is also a branch that handles a hardware failure caused by a JPEG time-out.
Again, I can flash an LED at the start of the mute-on-zoom branch and see how regular the timing is.
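For anyone following along, the structure I am describing looks roughly like this. All of the names and message values are my own labels rather than firmware symbols, and which event triggers which branch is exactly the part I am unsure about:

extern int  wait_for_next_message(void);     /* placeholder for the call the task blocks on */
extern void blink_led(void);
extern void do_mute_on_zoom(void);
extern void unlock_optical_zoom(void);
extern void handle_jpeg_timeout_failure(void);

enum { MSG_EXIT, MSG_EVERY_FRAME, MSG_START_RECORD, MSG_JPEG_TIMEOUT /* ...roughly seven more... */ };

static void movie_record_task_sketch(void)
{
    for (;;) {                               /* the task normally loops continuously */
        switch (wait_for_next_message()) {
        case MSG_EVERY_FRAME:
            blink_led();                     /* pulse here to time the per-frame branch */
            do_mute_on_zoom();               /* executed for every movie frame */
            break;
        case MSG_START_RECORD:
            unlock_optical_zoom();           /* the CHDK zoom-unlock change sits in one branch */
            break;
        case MSG_JPEG_TIMEOUT:
            handle_jpeg_timeout_failure();   /* hardware-failure path */
            break;
        /* ...the remaining branches... */
        case MSG_EXIT:
            return;                          /* the task can also simply exit */
        }
    }
}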
Any idea why the unlock-optical-zoom code is in the particular branch that it is?