Tell me your camera model and firmware version. I can post a modified test version of CHDK for you to try out.
set_shot_histogram(1)
histo_size=get_histo_range(-1) -- will be 2048 for 12 bit camera
total_samples=get_histo_range(-2)
Then call it with:
nsamples=get_histo_range(from,to,scale)
If scale is 0, it returns the actual number of samples in the range; otherwise it returns nsamples/scale. It only works with Lua. The scale parameter is optional; if it is not present it defaults to scale=100, which gives you the old behaviour.
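To make that concrete, here is a rough sketch of how a script might use it, based only on the description above (the -1/-2 arguments and the scale parameter exist only in the modified test build, and the variable names and the choice of the top 1/16 of the range are just illustrations):

set_shot_histogram(1)              -- enable histogram collection for the shot
shoot()
histo_size=get_histo_range(-1)     -- histogram size
total_samples=get_histo_range(-2)  -- total number of samples
-- exact count of samples in the top 1/16 of the range (scale=0 gives the raw count)
top_count=get_histo_range(histo_size-histo_size/16, histo_size-1, 0)
-- same range with scale omitted falls back to the old behaviour (scale=100)
top_old=get_histo_range(histo_size-histo_size/16, histo_size-1)
print("top 1/16 count:", top_count, "old style:", top_old)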
==
Also, if you want to set exposure for ETTR, I should have a function that returns an exposure compensation value that you just add to the current shot's shutter time and use in the next shot. Maybe you could help me figure out the best way to calculate that.
My plan is to start at the top of the histogram and work my way down, adding up all the values until I get to a certain percentage of the total pixels, say 1% or whatever you want. Then I return an exposure compensation value to put that 1% level at the top of the histogram, or at a level you specify.
I could also return an ETTL value, so you could shoot one shot underexposed but without black level clipping, and one shot overexposed (ETTR) but without white clipping.
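A rough sketch of that walk-down in Lua, assuming direct access to a full-size histogram array hist[0..histo_size-1] (the function and parameter names are hypothetical, and black level handling is ignored here; in the real build this would run in C inside CHDK rather than in a script):

function ettr_ev(hist, histo_size, total, clip_percent, target_level)
  target_level = target_level or histo_size-1   -- default target: top of the histogram
  local allowed = total*clip_percent/100        -- number of pixels we allow to clip
  local sum = 0
  local level = histo_size-1
  -- walk down from the top, accumulating counts until we reach the limit
  while level > 1 and sum + hist[level] <= allowed do
    sum = sum + hist[level]
    level = level - 1
  end
  -- EV change that would move 'level' up to 'target_level'
  return math.log(target_level/level)/math.log(2)
end

Presumably the returned value would be converted to CHDK's 1/96 EV units so it can be added directly to the tv96 value used for the next shot.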
I don't understand this at all. "ugmrnts" "plowed"?
Me neither. I prefer them unplowed.
I would prefer to access the true RAW histogram, rather than the 10-bit scaled one.
Can you show a case where this actually makes a difference?
I'm not sure it will make much difference for ETTR, but on the low end the extra precision would make exposure change calculations more accurate. Anyway, I've already got it programmed for 12- and 14-bit full-size histograms, so we can try it and see whether it's necessary.
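To put rough numbers on the low-end point (all figures illustrative): on a 12-bit sensor with a black level of 128, a raw value of 140 carries a signal of only 12 counts above black. A 10-bit scaled histogram lumps raw levels 140 through 143 into a single bucket, so that signal could really be anywhere from 12 to 15 counts, which is log2(15/12), about 0.3 EV of uncertainty in an exposure change calculation, while the same 4-level spread near the top of the range amounts to a negligible fraction of an EV.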