As a feasibility test on my S5 IS, I wrote a small script that simply ran an empty loop from 1 to 1,000,000, and I timed it with a stopwatch... not very accurate, I know, but it came out right around 4 seconds every time, so I'm thinking I could use this for a delay adjustable in steps of about 4µs.
Of course this would only be good on my camera, so a better way would be to have the script read the camera's clock at startup, count to some big number in my loop, then read the timer again and calculate exactly how long one pass through the loop takes. Then ask the user how many µs of delay they want and calculate how many loops to run. Has this already been done somewhere?
Can you post the script? One million iterations - even in Lua - in 4 seconds seems unlikely.
here is what I used for my test:
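The script itself wasn't preserved in this thread; a minimal sketch of what the test described above would look like in CHDK Lua (an empty loop from 1 to 1,000,000, timed by hand with a stopwatch):

Code:
-- feasibility test: time an empty 1,000,000-pass loop with a stopwatch
print("start")               -- start the stopwatch here
for i = 1, 1000000 do end    -- empty body: measuring loop overhead only
print("done")                -- stop the stopwatch here

At roughly 4 seconds for a million passes, each pass costs about 4µs, which is where the 4µs step size above comes from.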
Is there a way to access the 10ms timer? That would be fine... I can just have my script calibrate itself when it first runs, by running a long loop and counting how many 10ms ticks it takes, then dividing. The longer my calibration loop, the more accurate the result will be, even if the time base is only 10ms.
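CHDK Lua does expose the tick counter as get_tick_count(), which returns milliseconds (though on some cameras it only advances in 10ms steps). A self-calibration sketch along the lines described above:

Code:
-- self-calibration sketch using CHDK Lua's get_tick_count()
local n = 1000000                  -- calibration loop length
local t0 = get_tick_count()
for i = 1, n do end                -- the same empty loop used for delays
local ms = get_tick_count() - t0   -- elapsed milliseconds
print("elapsed:", ms, "ms")
-- loops needed to delay 'us' microseconds: n * us / (ms * 1000)

The longer n is, the more the 10ms tick granularity averages out, exactly as suggested above.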
The first issue is: when I run this over and over, it keeps coming up with a different number of ticks for my delay loop. Sometimes it's 4300, sometimes 4010, sometimes 4600... it's all over the place.
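That scatter is probably the coarse tick granularity plus firmware housekeeping interrupting the loop. One way to tame it (a guess, not a tested fix) is to calibrate several times and keep the fastest pass, on the theory that the slower passes were the interrupted ones:

Code:
-- repeat the calibration and keep the fastest (least interrupted) pass
local best = nil
for pass = 1, 5 do
  local t0 = get_tick_count()
  for i = 1, 1000000 do end
  local ms = get_tick_count() - t0
  if best == nil or ms < best then best = ms end
end
print("best calibration:", best, "ms")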
Obviously it's rounding at each step of the way and coming up with 239000: it does 1000000/4170, which should give 239.808, but truncates that to 239 before doing the *1000.
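Since the arithmetic here is integer-only, the fix is to reorder so the multiply happens before the divide; 1000000 * 1000 is 10^9, which still fits in a signed 32-bit integer:

Code:
-- divide last so truncation happens only once, after scaling
-- wrong:  1000000 / 4170 --> 239, then * 1000 --> 239000
-- better: 1000000 * 1000 --> 1000000000, / 4170 --> 239808
local loops, ticks = 1000000, 4170
print(loops * 1000 / ticks)   -- 239808 on integer-only Lua builds

Just keep an eye on overflow: the intermediate product has to stay below 2^31 - 1.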
These questions could be irrelevant, however, because my intention is to use the motion detector and then light some LEDs in sequence with specific delays after motion is detected, and I suspect the motion detector depends on things happening during the yield, such as reading the sensor.
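For the LED half of that plan, CHDK Lua provides set_led(led, state); a rough sketch of the sequencing step using the calibrated busy-wait (the LED numbers and delay_loops value below are placeholders, and the md_detect_motion call is left as a comment since its long parameter list is camera-specific):

Code:
-- light LEDs in sequence with calibrated busy-wait delays
local leds = { 4, 5, 6 }        -- placeholder LED indices; camera-specific
local delay_loops = 239808      -- from calibration: about 1 second here

local function busy_wait(loops)
  for i = 1, loops do end       -- never yields, so timing stays deterministic
end

-- ... md_detect_motion(...) would block here until motion is seen ...
for _, led in ipairs(leds) do
  set_led(led, 1)               -- LED on
  busy_wait(delay_loops)
  set_led(led, 0)               -- LED off
end

The flip side of busy-waiting is that the script never yields back to the firmware during the delay, which is exactly the concern raised above, so whether the camera stays responsive during the waits would need testing.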