Does downscaling actually decrease resolution? - General Chat - CHDK Forum

Does downscaling actually decrease resolution?

  • 22 Replies
  • 10752 Views

Offline LjL

  • ****
  • 266
  • A720IS
Does downscaling actually decrease resolution?
« on: 10 / July / 2008, 07:47:54 »
My camera is 8 megapixels. It can process images up to 3264x2448 RGB.

It never, strictly speaking, employs pixel binning (I'm aware that other cameras do when using a high ISO setting), but I can set image size to various lower resolutions, such as 2048x1536 (labeled as "3 megapixel").

Let's say the sensor is composed of 3264x2448 = 7,990,272 elements, each with 10-bit quantization. It's actually a bit more because the borders are cropped, but whatever.

The total amount of potential information is, of course, about 8 million x 10 = 80 million bits.

The total amount of potential information in a 2048x1536 8-bit RGB image is about 75 million bits.

I lose 5 million bits, ok, but that's because I don't have an intermediate resolution that ends up giving precisely 80 megabits.
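Spelling the arithmetic out as a quick Python check (the "5 million bits" above is this difference, rounded up):

```python
# Bit counts for the full-resolution sensor readout vs. the 3 MP RGB image.
sensor_bits = 3264 * 2448 * 10      # 79,902,720 ~ "80 million bits"
rgb_bits    = 2048 * 1536 * 3 * 8   # 75,497,472 ~ "75 million bits"
print(sensor_bits - rgb_bits)       # 4,405,248 bits lost, give or take
```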

Aside from this nitpick, is any information actually lost? Or is it more accurate to say that it's the 3264x2448 image that contains a very redundant amount of non-information, introduced by Bayer demosaicing, while the 2048x1536 image contains more or less the original amount of information from the sensor?



Offline RaduP

  • *****
  • 926
Re: Does downscaling actually decrease resolution?
« Reply #1 on: 10 / July / 2008, 11:56:33 »
Yes, it does. I debated this with a friend and he proved me wrong.
The idea is that the light is sampled at 8 million locations (one per photosite, before demosaicing), so you can interpolate the missing colors and obtain a good image.
After converting to RGB, each color channel is backed by real samples at only a fraction of the pixel locations (a quarter for red and blue, half for green), and if you reduce the resolution further, you can't properly interpolate back to what it was.

Of course, that's just theory, since JPEG compression also gets rid of some information. And, of course, the quality of the lens can limit the resolution to less than the resolution of the sensor.


Offline dzsemx

  • ****
  • 302
  • [SX40HS][A650][A630]
Re: Does downscaling actually decrease resolution?
« Reply #2 on: 12 / July / 2008, 20:03:28 »
it's just downsampling, nothing else


Offline LjL

  • ****
  • 266
  • A720IS
Re: Does downscaling actually decrease resolution?
« Reply #3 on: 12 / July / 2008, 20:13:21 »
That isn't helpful. If I have something at resolution X, then upsample to 2X, then downsample to X, I haven't lost anything.

So "it's just downsampling" doesn't answer anything.



Offline RaduP

  • *****
  • 926
Re: Does downscaling actually decrease resolution?
« Reply #4 on: 13 / July / 2008, 00:07:41 »
Ok, this is hard to explain.
Basically, it is not just upsampling.

As you know, the pixels are something like
RG
GB

Now, most pictures do not contain only R, G, or B, but all 3 colors. So at each photosite you do not only get a color sample, but also the luminosity of that particular spot. And the human eye is more sensitive to luminosity than to color.

So with an 8 MP sensor, you sample the luminosity at 8 million points. After demosaicing, the missing values are interpolated and you get more data (8 MP x 3 channels). However, if you downsample this to a lower resolution, you lose from the luminosity channel.

To understand my point, try the following:
Take a raw picture, don't do any demosaicing, and instead save each RGGB square as one pixel. This way you will get half the width and half the height. Now, upsample this image to 200% to get the original image size, and compare it with the raw file processed normally.
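A minimal sketch of that experiment in Python/numpy, assuming the Bayer mosaic is already loaded as a 2D array in RGGB layout (raw decoding, white balance, and gamma are left out):

```python
import numpy as np

def rggb_to_halfsize_rgb(bayer):
    """Collapse each 2x2 RGGB quad into one RGB pixel (half width, half height)."""
    r  = bayer[0::2, 0::2]                  # top-left photosite of each quad
    g1 = bayer[0::2, 1::2]                  # top-right
    g2 = bayer[1::2, 0::2]                  # bottom-left
    b  = bayer[1::2, 1::2]                  # bottom-right
    g  = (g1.astype(np.float32) + g2) / 2   # average the two green samples
    return np.dstack([r, g, b])             # (H/2, W/2, 3) image

def upsample_200pct(img):
    """Nearest-neighbour upsample back to the original size for comparison."""
    return img.repeat(2, axis=0).repeat(2, axis=1)
```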


Offline dzsemx

  • ****
  • 302
  • [SX40HS][A650][A630]
Re: Does downscaling actually decrease resolution?
« Reply #5 on: 13 / July / 2008, 01:41:38 »
the count of millions of bits doesn't matter, don't try to be so scientific (like math teachers)...
Image sensor - Wikipedia, the free encyclopedia
Bayer filter - Wikipedia, the free encyclopedia
Demosaicing - Wikipedia, the free encyclopedia
an 8 megapixel camera has 8 million sensor elements.
so this camera does not really have 3264*2448 pixels of resolution, but it has a bit more detail than a 2048*1536 image, for example, because each sensor element has a different location
btw i generally used this resolution (2048*1536) with my a630 (8Mpix); pixel-level sharpness at this resolution is quite good, since roughly 10 sensor elements go into every 4 output pixels (see the quick check below)
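A quick check of that ratio, using the sizes from this thread:

```python
# Sensor elements per 4 output pixels when shooting at 2048x1536
# on an 8 MP (3264x2448) sensor.
sensors = 3264 * 2448           # 7,990,272 photosites
pixels  = 2048 * 1536           # 3,145,728 output pixels
print(4 * sensors / pixels)     # ~10.2 photosites per 4 pixels
```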
« Last Edit: 13 / July / 2008, 02:18:20 by dzsemx »


Offline RaduP

  • *****
  • 926
Re: Does downscaling actually decrease resolution?
« Reply #6 on: 13 / July / 2008, 02:34:17 »
Did you even bother reading what I posted? Did you bother trying what I proposed?


Offline LjL

  • ****
  • 266
  • A720IS
Re: Does downscaling actually decrease resolution?
« Reply #7 on: 13 / July / 2008, 08:00:47 »
RaduP, two things confuse me a bit.

One is that in an RGGB cell, there's "just" one more green sample than in an RGB pixel, so yes, of course you lose information if you pack that into an RGB pixel, but somehow I feel there should be a "correct" size, and a way of interpolating up to that size, that allows keeping all the information.
But then, whatever I try, I seem to end up with a non-integer number for the width/height.
So I guess you're right there's no way to achieve that, it just feels counter-intuitive to me...
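Spelling out what I tried (a small Python sketch; the 4:3 aspect ratio is my assumption for what the "correct" size should look like):

```python
# How many 24-bit RGB pixels carry the same bit count as the 10-bit sensor?
sensor_bits = 3264 * 2448 * 10      # 79,902,720 bits
pixels = sensor_bits / 24           # 3,329,280 RGB pixels
# Impose a 4:3 aspect ratio: width * height = pixels, width / height = 4/3
height = (pixels * 3 / 4) ** 0.5    # ~1580.2 -- not an integer
width  = pixels / height            # ~2106.9 -- not an integer either
print(width, height)
```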

The other thing is that you say each pixel encodes luminance (you said luminosity, but I guess there's no real difference here). I've seen many other sources state the same thing.
However, does it really? A green-filtered element (for instance) will encode the luminance of the green component, won't it? It will have no clue whether or not that spot is actually brighter due to the presence of red and blue components.
Wikipedia defines luminance in sRGB as 0.2126 R + 0.7152 G + 0.0722 B, so it would appear you can't really do away with two of the wavelengths...
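To make that concrete (a tiny sketch using the sRGB weights above; the patch values are made up):

```python
# Two spots with the same green component but different red/blue: a
# green-filtered photosite reads them identically, yet their luminance
# differs, so a single photosite cannot encode true luminance.
def srgb_luminance(r, g, b):
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

print(srgb_luminance(0.0, 0.5, 0.0))  # ~0.36
print(srgb_luminance(1.0, 0.5, 1.0))  # ~0.64 -- same G reading, brighter spot
```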



Offline LjL

  • ****
  • 266
  • A720IS
Re: Does downscaling actually decrease resolution?
« Reply #8 on: 13 / July / 2008, 08:02:27 »
dzsemx, I started this topic in search of "scientifically" sound and mathematically convincing replies. If you don't want to talk scientifically, I fear this is not the topic for you.


Offline fudgey

  • *****
  • 1705
  • a570is
Re: Does downscaling actually decrease resolution?
« Reply #9 on: 13 / July / 2008, 09:26:54 »
Some of this has been discussed before; a major part of it was about trying to achieve grayscale superresolution. I'm too lazy to search, and it may not be exactly what you're thinking of, but I figured it was worth mentioning.

If you were taking a photo of a black & white printed A4 sheet, a single pixel in the sensor would act well enough as a luminance sensor no matter its color, since the scene (ideally) only has areas with "no color" and areas with "all colors". Any downscaling or interpolation from the RAW file to an 8-bit RGB such as the JPEG will lose resolution and thus information, won't it?

It's difficult to get a beautiful grayscale superresolution image because pixels of different colors don't respond identically to overall luminance (especially if the scene has colors!), but from a black & white scene this should be quite easily achieved.
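A rough sketch of that idea in Python/numpy, assuming a black & white scene and an RGGB mosaic already loaded as a 2D array (the per-channel gains are illustrative placeholders, not calibrated values):

```python
import numpy as np

def mosaic_to_grayscale(bayer, gains=(1.9, 1.0, 1.0, 1.5)):
    """Treat every photosite as a luminance sample: equalize the RGGB
    responses with per-channel gains, with no interpolation and hence
    no loss of spatial resolution."""
    out = bayer.astype(np.float32)
    out[0::2, 0::2] *= gains[0]   # R photosites
    out[0::2, 1::2] *= gains[1]   # G photosites (first row)
    out[1::2, 0::2] *= gains[2]   # G photosites (second row)
    out[1::2, 1::2] *= gains[3]   # B photosites
    return out
```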

Of course, when we reduce colors, we get rid of some information and the result will fit into a smaller file.

Without taking any mathematical point of view, what you originally asked could very well be true; the largest resolution we get from digital camera JPEGs is crappy by design when zoomed in on a PC, because it's been interpolated from neighboring pixels. A smaller resolution image downscaled from that highest resolution file won't be much worse, as you are saying. But obviously something is lost, so storing all those smudgy pixels isn't a total waste. A scientist truly requiring an 8 Mpixel sensor will just have to know the way currently available sensors work...

The quality of the image you get from the camera at a lower resolution setting is likely a bit worse than one you downscale yourself (not taking into account JPEG artifacts, which may tip the scale the other way), because it's a full resolution image (unless the digital telephoto feature is enabled) downscaled by the camera using an algorithm that can't possibly be as good as what your image processing software has to offer. Also, this downscaling slows down bursts.

 
