
sync'ing 3D video

sync'ing 3D video
« on: 22 / June / 2013, 13:12:07 »
Quote
I have two input signals that change state every 30 msec.  Let us say they are high (or low) for one msec every 30 msec.
I'm curious about where these signals come from?  Some sort of vertical sync signal? How complicated is the electronics to generate this signal? And how do you adjust the phase diff between the two signals?

Quote
I need to indicate on a number of leds (maybe bicolour ones) the phase difference between the two signals.
It can vary from zero to 30 msec. We are especially interested in zero phase difference.
If I understand what you are trying to do correctly, I'd suggest two LEDs.  You make one or the other blink at a rate based on the phase difference.  One LED is for positive phase difference, the other for negative.  The LED blinks faster as you get closer to zero phase difference.  When the controller thinks they are in phase, they will both be on.
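
Something like this is what I have in mind for the phase-detector box - a minimal sketch only, in C, where micros(), set_led() and the edge-interrupt hooks are hypothetical stand-ins for whatever microcontroller gets used, and the thresholds and blink scaling are guesses rather than measured values:

Code: [Select]
#include <stdint.h>

/* hypothetical board support, provided by whatever micro is used */
extern uint32_t micros(void);              /* free-running microsecond counter */
extern void set_led(int led, int on);
enum { LED_A = 0, LED_B = 1 };

#define FRAME_US   30000L   /* nominal frame period from the original post     */
#define IN_SYNC_US   500L   /* treat anything under 0.5 msec as "in phase"     */

static volatile uint32_t last_a, last_b;   /* timestamps of the latest pulses  */

/* called from the edge interrupt on each camera's sync line */
void on_pulse_a(void) { last_a = micros(); }
void on_pulse_b(void) { last_b = micros(); }

/* called repeatedly from the main loop */
void update_leds(void)
{
    int32_t diff = (int32_t)(last_a - last_b);   /* signed offset in usec      */

    /* fold into -FRAME_US/2 .. +FRAME_US/2 so "almost a whole frame late"
       reads as "slightly early" instead                                       */
    while (diff >  FRAME_US / 2) diff -= FRAME_US;
    while (diff < -FRAME_US / 2) diff += FRAME_US;

    int32_t mag = (diff < 0) ? -diff : diff;
    if (mag <= IN_SYNC_US) {
        set_led(LED_A, 1);                       /* both solid: in phase       */
        set_led(LED_B, 1);
    } else {
        /* blink only the leading side, faster as the offset shrinks           */
        uint32_t period_ms = 50 + (uint32_t)mag / 20;
        int blink = ((micros() / 1000) % period_ms) < period_ms / 2;
        set_led(LED_A, (diff > 0) ? blink : 0);
        set_led(LED_B, (diff < 0) ? blink : 0);
    }
}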

Quote
Software wise, do you know how to do this ?
Yes.

Re: sync'ing 3D video
« Reply #1 on: 22 / June / 2013, 14:12:01 »
Quote
I'm curious about where these signals come from?

Do you know in detail how video capture works ?

That is what I thought, neither do I.
As far as I know, no-one understands this, not even srsa_4c.

A signal is required that is related to video synch.
We do not know of such a signal.

An LED can be flashed VERY briefly in the code that deals with each frame, or, alternatively, toggled on/off in each frame ... preferable because of lower processing 'overhead'.
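
Purely as a sketch of what I mean (the hook name is invented and camera_set_led() is only an assumed stand-in for whatever routine the port actually uses to drive an LED):

Code: [Select]
/* assumed signature for the port's LED helper */
extern void camera_set_led(int led, int state, int bright);

static int frame_led_state;

/* called from whatever per-frame code path ends up being hooked */
void per_frame_led_toggle(void)
{
    frame_led_state ^= 1;                      /* one edge per frame           */
    camera_set_led(9, frame_led_state, 200);   /* placeholder LED number/args  */
}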

For most cameras except recent ones, the A/V output can be monitored for vertical synch pulse.

Easily done with LM1881 or with more modern, expensive chips.
I believe some microcontrollers can also do this.

However, not much use if NOT related to movie synch.

Cannot get my head around how an A/V frame-rate can be generated that is different to movie capture frame-rate.

Quote
how do you adjust the phase diff between the two signals?


You tell me, this is what it is all about !
« Last Edit: 22 / June / 2013, 14:13:42 by Microfunguy »

Re: sync'ing 3D video
« Reply #2 on: 22 / June / 2013, 16:34:20 »
Oh - your original post read like you had this all worked out except for needing a uP for a phase detector.

When I first read it,  I thought you were trying to figure out how to post-process two video clips so that each frame is sync'd to its exact pair from the other clip.  However, this could have up to 33 msec of phase difference at 60 Hz - objectionable for stereoscopic viewing I assume?

I think I now realize that your real goal is to have two cameras be able to sync when they start shooting each frame ?

Quote
Do you know in detail how video capture works ? That is what I thought, neither do I. As far as I know, no-one understands this, not even srsa_4c.
I guess the fundamental question is what initiates each frame once shooting starts - the ARM chip, the DSP, or perhaps even some custom hardware? Unless it's the ARM chip,  this project is dead before it starts I think.

Quote
A signal is required that is related to video synch. We do not know of such a signal.
If I understand you,  the idea is to have one camera generate a sync pulse at the start of each frame and the other camera to respond by shooting the next frame when the signal is received?  So you would need to hack the f/w in one cam to produce "a signal" when it starts to shoot the next frame and then hack the f/w in the other camera to initiate its next frame only when that signal is received.

Quote
An LED can be flashed VERY briefly in the code that deals with each frame, or, alternatively, toggled on/off in each frame ... preferable because of lower processing 'overhead'.
So one camera generates an LED blink and the other cam uses a phototransistor on the USB remote input line to read the signal?

Quote
For most cameras except recent ones, the A/V output can be monitored for vertical synch pulse. Easily done with LM1881 or with more modern, expensive chips. I believe some microcontrollers can also do this.
However, not much use if NOT related to movie synch. Cannot get my head around how an A/V frame-rate can be generated that is different to movie capture frame-rate.
Ah,  you are hoping the A/V output is sync'd to the shooting frame rate.  Seems like a reasonable assumption I guess.   That gives you a "sync" signal but not the other half of the puzzle - what to do with it.

Quote
Quote
how do you adjust the phase diff between the two signals?
You tell me, this is what it is all about !
I guess I was confused about why you wanted the uP phase detector. Seems kind of useless if you can't use the output to adjust something.   I suppose for a static subject you could use the USB remote to start both cameras shooting, observe the sync detection signal and stop shooting if it's too far apart.  The idea would be to keep restarting until the circuit tells you the cams started in sync and then keep shooting until the circuit tells you they have drifted too far apart ?

Re: sync'ing 3D video
« Reply #3 on: 22 / June / 2013, 17:35:44 »
Quote
this could have up to 33 msec of phase difference at 60 Hz - objectionable for stereoscopic viewing I assume?

Worse than useless, even though some 'pro' cinematography forums describe synching in 'post'.

Quote
I think I now realize that your real goal is to have two cameras be able to sync when they start shooting each frame ?

Ideally pixel perfect, or at least sub-msec error.


Quote
I guess the fundamental question is what initiates each frame once shooting starts - the ARM chip, the DSP, or perhaps even some custom hardware? Unless it's the ARM chip,  this project is dead before it starts I think.

We know from Magic Lantern and CHDK that there are timers that can have their frequency changed and thereby affect frame-rate.
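
For what it is worth, the model published on the Magic Lantern side (DSLRs, so treat it only as a hint for how the PowerShots might work) is roughly:

frame rate = timer clock / (timer A x timer B)

so trimming either divider by one count nudges the frame period by a small, predictable step.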

Quote
If I understand you,  the idea is to have one camera generate a sync pulse at the start of each frame and the other camera to respond by shooting the next frame when the signal is received?

I do not think any pausing of frame capture would be tolerated.
I was thinking more of displaying the synch error and, by half-pressing on one camera, gradually drifting into synch by changing the frame-rate.

The latter part should be possible from info already published, though I did not manage to get it working when I tried.
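
Rough arithmetic just to show the scale (my own numbers, nothing measured): nudging one camera from 30.000 fps to 30.030 fps shortens its frame period from 33.333 msec to 33.300 msec, so the phase slips about 33 microseconds per frame, roughly 1 msec per second of recording. A worst-case error of half a frame (about 17 msec) would then close in well under half a minute of half-press, after which the timer goes back to nominal.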

Quote
So one camera generates an LED blink and the other cam uses a phototransistor on the USB remote input line to read the signal?

No, both cameras generate a blink or continuous square wave and "your" circuit displays phase difference.
We have no idea if that is really related to capture synch.
Quote
you are hoping the A/V output is sync'd to the shooting frame rate.  Seems like a reasonable assumption I guess.

Was certainly the case in the old days.

 
Quote
That gives you a "sync" signal but not the other half of the puzzle - what to do with it.

Just indicate magnitude of phase difference.

Quote
The idea would be to keep restarting until the circuit tells you the cams started in sync and then keep shooting until the circuit tells you they have drifted too far apart ?

Unless the frame-rate can be drifted, or unless switching VERY briefly to another video resolution, or even between the PAL and NTSC A/V output modes, somehow affects the phase difference of the frame-rate timers.



Re: sync'ing 3D video
« Reply #4 on: 22 / June / 2013, 19:43:15 »
Quote
We know from Magic Lantern and CHDK that there are timers that can have their frequency changed and thereby affect frame-rate.

I was thinking more of displaying the synch error and, by half-pressing on one camera, gradually drifting into synch by changing the frame-rate.

The latter part should be possible from info already published, though I did not manage to get it working when I tried.
So as long as we are dreaming here - if the external sync phase detector provides a feedback signal to the cameras ( USB remote style PWM data or PTP ) then the camera kbd.c code could adjust the "magic" register on the fly to achieve sync lock.   
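
To make the dream a little more concrete, a sketch of what that kbd.c hook might look like - every name here is hypothetical, the register address is a placeholder and the pulse-width encoding is invented; the real frame-rate register would have to come out of the disassembly:

Code: [Select]
/* Sketch only.  Idea: the external phase-detector box encodes
 * "hold" / "speed up" / "slow down" as three different pulse widths on the
 * USB remote line, and the kbd task nudges a frame-rate timer divider
 * until the box reports lock. */
#define FRAMERATE_TIMER_REG  ((volatile unsigned short *)0xC0FF0000) /* placeholder, not a real address */

enum { CMD_HOLD, CMD_SPEED_UP, CMD_SLOW_DOWN };

static int decode_usb_pulse(int width_ms)
{
    if (width_ms < 20) return CMD_HOLD;        /* short pulse: in sync, leave it alone */
    if (width_ms < 60) return CMD_SPEED_UP;    /* medium pulse: we are lagging         */
    return CMD_SLOW_DOWN;                      /* long pulse: we are leading           */
}

/* called from the kbd polling loop with the width of the last USB pulse */
void sync_feedback_step(int usb_pulse_width_ms)
{
    unsigned short div = *FRAMERATE_TIMER_REG;
    switch (decode_usb_pulse(usb_pulse_width_ms)) {
        case CMD_SPEED_UP:  *FRAMERATE_TIMER_REG = div - 1; break;  /* shorter frame period */
        case CMD_SLOW_DOWN: *FRAMERATE_TIMER_REG = div + 1; break;  /* longer frame period  */
        default:            break;                                  /* hold                 */
    }
}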

Re: sync'ing 3D video
« Reply #5 on: 22 / June / 2013, 20:49:51 »
Quote
So as long as we are dreaming here - if the external sync phase detector provides a feedback signal to the cameras ( USB remote style PWM data or PTP ) then the camera kbd.c code could adjust the "magic" register on the fly to achieve sync lock.

Yes.

I am assuming the timer registers determine the division ratio of a phase-locked loop and hence the timer output frequency.
Derived from that signal may be an interrupt which ultimately sends a particular message to the movie capture task.

Depends how near you can get to the source of that message.
Presumably the nearer you are the more it truly represents frame-capture time.

Re: sync'ing 3D video
« Reply #6 on: 22 / June / 2013, 21:06:45 »
Quote
I am assuming the timer registers determine the division ratio of a phase-locked loop and hence the timer output frequency.
Derived from that signal may be an interrupt which ultimately sends a particular message to the movie capture task. Depends how near you can get to the source of that message. Presumably the nearer you are the more it truly represents frame-capture time.
Or the timer registers are for some external hardware that clocks the video circuits - nothing to do with any code in the ARM.

Re: sync'ing 3D video
« Reply #7 on: 23 / June / 2013, 05:57:56 »
The CCD circuitry must report that a new frame is ready, surely it must use an interrupt to report that to the firmware ?

Is it possible to generate software interrupts ?



Re: sync'ing 3D video
« Reply #8 on: 23 / June / 2013, 10:24:07 »
Quote
The CCD circuitry must report that a new frame is ready, surely it must use an interrupt to report that to the firmware ?
If my conjecture of how this works is correct,  then the CCD / DMA circuitry may toggle some flag for each frame completed,  but I would expect that the <shoot - transfer to SD card - repeat> sequence is free-running and independent of the ARM chip, other than whatever h/w registers & timers the ARM chip sets up at the start of the sequence.

Exposure, focus and zoom adjustments during shooting are likely handled by the ARM but asynchronous to the actual shooting sequence.

My thinking is that when the ARM chip is involved,  the best cycle time CHDK or Canon can manage is about 1 second.   Many newer cameras will also shoot in "Sports" mode at 10 frames per second, but only for a little while.   The ability to stream 30 fps or 60 fps to the SD card seems at least an order of magnitude harder and thus is likely happening via the DSP (for compression) and some h/w circuitry.

But that could be completely wrong - an "arm chair" theory.    Do you have any links to Magic Lantern references?


Re: sync'ing 3D video
« Reply #9 on: 23 / June / 2013, 13:23:11 »
This is the main reference that I have used: http://chdk.setepontos.com/index.php?topic=7938.0

 
