Pixel binning isn't just a Samsung S20 Ultra gimmick

Megapixels used to be so much simpler: a bigger number meant your camera could capture more photo detail, as long as the scene had enough light. But a technology called pixel binning, now sweeping across flagship smartphones, is changing the old photography rules for the better.


In short, pixel binning gives you a camera that offers lots of detail when it's bright out without becoming useless when it's dim. The necessary hardware changes bring some tradeoffs and interesting details, though, which is why we're taking a closer look.

You might have spotted pixel binning in earlier phones like LG's G7 ThinQ in 2018, but this year it's really catching on. Samsung, the top Android phone maker, built pixel binning into its flagship Galaxy S20 Ultra, and it's in LG's new V60 ThinQ. Other high-end phones launched last week, Xiaomi's Mi 10 and Mi 10 Pro and Huawei's P40 Pro and Pro+, also offer pixel binning.

Here's your guide to what's going on.

What is pixel binning?

Pixel binning is a technology that's designed to make an image sensor more adaptable to different conditions. On today's new phones, it involves changes to the image sensor that gathers the light in the first place and to the image-processing algorithms that transform the sensor's raw data into a photo or video. Pixel binning combines data from groups of pixels on the sensor to create, in effect, a smaller number of higher-quality pixels.
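For intuition, here's a minimal sketch of that arithmetic in Python, assuming a plain grayscale grid of sensor values; a real sensor bins same-color pixels, as described below.

    import numpy as np

    def bin_pixels(sensor, n):
        # Average each n x n block of raw sensor values into one virtual pixel.
        h, w = sensor.shape
        h, w = h - h % n, w - w % n          # trim edges so blocks divide evenly
        blocks = sensor[:h, :w].reshape(h // n, n, w // n, n)
        return blocks.mean(axis=(1, 3))

    # A scaled-down stand-in for a 12,000x9,000-pixel (108-megapixel) sensor.
    # Poisson noise mimics photon shot noise; averaging nine pixels cuts the
    # relative noise threefold while shrinking the resolution ninefold.
    raw = np.random.poisson(lam=5.0, size=(1200, 900)).astype(np.float32)
    binned = bin_pixels(raw, 3)              # 3x3 binning, as on the S20 Ultra
    print(raw.shape, binned.shape)           # (1200, 900) (400, 300)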

When there's plenty of light, you can shoot photos at an image sensor's true resolution -- 108 megapixels in the case of the Galaxy S20 Ultra. But when it's dim out, pixel binning lets a phone take a good but lower-resolution photo, which for the S20 Ultra is the still useful 12-megapixel resolution that's prevailed on smartphones for a few years now.

"We wanted to give the flexibility of having a large number of pixels as well as having large pixel size," said Ian Huang, who oversees LG's products in the US.

Is pixel binning a gimmick?

No. Well, mostly no. It does let phone makers brag about megapixel numbers that exceed what you'll see even on DSLRs and other professional-grade cameras. That's a bit silly, but pixel binning can deliver some true benefits if you want to get the most out of your smartphone camera.

How does pixel binning work?

To understand pixel binning better, you have to know what a digital camera's image sensor looks like. It's a silicon chip with a grid of millions of pixels (technically called photosites) that capture the light that comes through the camera lens. Each pixel registers only one color: red, green or blue. But the colors are staggered in a special checkerboard arrangement called a Bayer pattern that lets a digital camera reconstruct all three color values for each pixel, a key step in generating that JPEG you want to share on Instagram.


Pixel binning combines data from multiple small pixels on the image sensor into one larger virtual pixel. That's really useful for lower-light situations, where big pixels are better at keeping image noise at bay.

The technology commonly combines four real pixels into one virtual pixel "bin." But Samsung's S20 Ultra combines a 3x3 group of real pixels into one virtual pixel, an approach the Korean company calls nona binning.

When it's bright out and you don't have to worry about noise, the camera can take a photo with no binning, using instead the full resolution of the image sensor. That's useful for printing big photos or cropping in on an area of interest.

When should you use high resolution vs. pixel binning?

Most people will be happy with lower-resolution shots, and that's the default my colleagues Jessica Dolcourt and Patrick Holland recommend after testing the new Samsung Galaxy phones.

The biggest reason to use pixel binning is better low-light performance, but it also avoids the monster file sizes of full-resolution images, which can gobble up your phone's storage and your space on online services like Google Photos. For example, a sample shot my colleague Lexy Savvides took was 3.6MB at 12 megapixels with pixel binning and 24MB at 108 megapixels without.

Photo enthusiasts are more likely to want to use full resolution when it's bright. That could help you identify distant birds or crop in for more dramatic photos of faraway subjects. And if you like to print large photos (yes, some people still make prints), more megapixels matter.

Does a 108-megapixel Samsung Galaxy S20 Ultra take better photos than a 61-megapixel Sony A7r IV professional camera?

Seriously? No. The size of each pixel on the image sensor also matters, along with other factors like lenses and image processing. There's a reason the Sony A7r IV costs $3,500 and only takes photos while the S20 Ultra costs $1,400 and can also run thousands of apps and make phone calls.

Image sensor pixels are squares whose width is measured in millionths of a meter, or microns. A human hair is about 100 microns across. On the S20 Ultra, each pixel is 0.8 micron wide. With Samsung's 3x3 binning, a virtual pixel is 2.4 microns across. On a Sony A7r IV, a pixel is 3.8 microns across. That means the Sony can gather 2.5 times more light per pixel than the S20 Ultra in its 12-megapixel binning mode and 22 times more than in its 108-megapixel full-resolution mode -- a major advantage in image quality.
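You can check that arithmetic yourself: light gathered scales with a pixel's area, which is its width squared.

    sony_a7r_iv = 3.8   # pixel width in microns
    s20_binned  = 2.4   # S20 Ultra virtual pixel after 3x3 binning
    s20_full    = 0.8   # S20 Ultra native pixel

    print((sony_a7r_iv / s20_binned) ** 2)   # ~2.5x more light per pixel
    print((sony_a7r_iv / s20_full) ** 2)     # ~22.6x more light per pixel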

Phones are closing the image quality gap, though, especially with computational photography technology like combining multiple frames into one shot. And image sensors in smartphones are getting bigger and bigger to improve quality.

Why is pixel binning suddenly popular?

Because miniaturization has made ever-smaller pixels possible. "What has propelled binning is this new trend of submicron pixels," those less than 1 micron wide, said Devang Patel, a senior marketing manager at Omnivision, a top image sensor manufacturer. Having lots of those pixels lets phone makers -- desperate to make this year's phone stand out -- brag about huge megapixel ratings and 8K video. Binning lets them make that boast without sacrificing low-light sensitivity.

Do phones need special sensors for pixel binning?

The underlying sensor is the same, but a change to an element attached to it, called the color filter array, changes how the sensor gathers red, green and blue light. Conventional Bayer pattern checkerboards alternate colors for each neighboring pixel. But with pixel binning, sensors arrange same-color pixels into 2x2, 3x3 or 4x4 groups. (Pixel binning is possible without these groups, but it requires extra processing, and Patel said image quality suffers somewhat.)
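As a rough illustration, here's the difference for a single 4x4 patch of sensor, written out in Python:

    import numpy as np

    # Conventional Bayer pattern: colors alternate pixel by pixel.
    bayer = np.array([['R', 'G', 'R', 'G'],
                      ['G', 'B', 'G', 'B'],
                      ['R', 'G', 'R', 'G'],
                      ['G', 'B', 'G', 'B']])

    # Binning-friendly "quad" layout: each color fills a 2x2 block, so four
    # same-color neighbors can be combined directly. (Samsung's nona layout
    # does the same with 3x3 blocks, Huawei's SedecimPixel with 4x4.)
    quad = np.array([['R', 'R', 'G', 'G'],
                     ['R', 'R', 'G', 'G'],
                     ['G', 'G', 'B', 'B'],
                     ['G', 'G', 'B', 'B']])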


The Samsung Galaxy S20 Ultra uses 3x3 pixel binning groups. Xiaomi took a different 2x2 approach with the Mi 10, so photos are 108 megapixels at full resolution and 27 megapixels with pixel binning. Huawei's P40 Pro models emphasize low-light performance with 4x4 pixel binning ("SedecimPixel," named for the 16:1 ratio) for virtual pixels an enormous 4.5 microns across.

Grouping pixels into larger virtual pixels works nicely for image-processing hardware and software tuned for regular Bayer pattern pixel data. But for high-resolution photos, it adds another image-processing step (called remosaicking, if you're curious) that mathematically constructs a finer-detail Bayer pattern out of the coarser pixel color groups actually on the sensor.
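One way to build intuition for remosaicking: a 4x4 quad patch holds exactly the samples a 4x4 Bayer patch needs (four red, eight green, four blue). The toy shuffle below, continuing the quad patch from the sketch above, just relocates them; real remosaic algorithms interpolate instead, since a bare shuffle would distort fine detail.

    # Toy remosaic: rearrange quad samples into Bayer order. This only shows
    # that the color counts match; it is not how production algorithms work.
    out = np.empty_like(quad)
    out[0::2, 0::2] = quad[0:2, 0:2]         # red block -> Bayer red sites
    out[1::2, 1::2] = quad[2:4, 2:4]         # blue block -> Bayer blue sites
    g_sites = np.ones((4, 4), dtype=bool)    # every remaining site is green
    g_sites[0::2, 0::2] = False
    g_sites[1::2, 1::2] = False
    out[g_sites] = np.concatenate([quad[0:2, 2:4].ravel(),
                                   quad[2:4, 0:2].ravel()])
    print(out)                               # matches the Bayer layout above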

Can you shoot raw with pixel binning?

Photo enthusiasts like the flexibility and image quality of raw photos -- the unprocessed image sensor data, packaged as a DNG file. For them, pixel binning works just fine. But if you want the full-resolution data, sorry. Samsung and LG don't share it, and raw processing software like Adobe Lightroom expects a traditional Bayer pattern, not pixel cells grouped into 2x2 or 3x3 patches of the same color.

What are the downsides of pixel binning?

For the same size sensor, 12 real megapixels would perform a bit better than 12 binned megapixels, said Judd Heape, a senior director at mobile phone chipmaker Qualcomm. The sensor would likely be less expensive, too. And when you're shooting at full resolution, more image processing is required, which shortens your battery life.

For high-resolution photos, you'd get better sharpness with a regular Bayer pattern than with a binning sensor using 2x2 or 3x3 groups of same-color pixels. But that isn't too bad a problem. "With our algorithm, we're able to recover anywhere from 90% to 95% of the actual Bayer image quality," Patel said. Comparing the two approaches side by side, you probably couldn't tell a difference outside lab test scenes with difficult subjects like fine lines.

If you forget to switch your phone to binning mode and then take high-resolution shots in the dark, image quality suffers. In this situation, LG tries to compensate by combining multiple shots to try to reduce noise and improve dynamic range, Huang said.

Could regular cameras use pixel binning, too?

Yes, and judging by some full-frame sensor designs from Sony, the top image sensor maker right now, it may well happen.

What's the future of pixel binning?

Several developments are possible. More 4x4 pixel binning could encourage phone makers to move sensor resolutions well beyond 108 megapixels. Lower-end phones will get binning, too.


Another direction is better HDR, or high dynamic range, photography that captures a better span of bright and dark image data. Small phone sensors struggle to capture a broad dynamic range, which is why companies like Google and Apple combine multiple shots to computationally generate HDR photos.

But pixel binning means new pixel-level flexibility. In a 2x2 group, you could devote two pixels to regular exposure, one to a darker exposure to capture highlights like bright skies, and one to a brighter exposure to capture shadow details.
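Here's a toy version of that recipe for one 2x2 group, with made-up exposure ratios, thresholds and bit depth; no phone maker has published its actual blending logic.

    SATURATION = 1023.0   # assume a 10-bit sensor for illustration

    def hdr_value(norm_a, norm_b, short, long_, ratio=4.0):
        # Combine one 2x2 group shot at three exposures into a single value
        # on the normal-exposure scale. A sketch, not any vendor's algorithm.
        if norm_a >= SATURATION or norm_b >= SATURATION:
            return short * ratio          # clipped highlights: trust the dark shot
        if norm_a + norm_b < 32:
            return long_ / ratio          # deep shadows: trust the bright shot
        return 0.5 * (norm_a + norm_b)    # midtones: average the normal pair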

Omnivision also expects autofocus improvements. Today, each pixel has its own microlens designed to gather more light. But you could put a single microlens over each 2x2, 3x3, or 4x4 group, too. Each pixel under the same microlens gets a slightly different view of the scene depending on its position, and the difference lets a digital camera calculate focus distance. That should help your camera keep the photo subjects in sharp focus.
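A toy version of that calculation: treat pixels on the left and right halves of the microlenses as two one-dimensional views of the scene, then find the shift that best aligns them. The signals here are made up, and real phase-detect pipelines are considerably more involved.

    import numpy as np

    def phase_shift(left, right, max_shift=8):
        # Cross-correlate the two views; the best-scoring shift tells the
        # camera how far, and in which direction, focus is off.
        scores = [np.sum(left[max_shift:-max_shift] *
                         right[max_shift + s : len(right) - max_shift + s])
                  for s in range(-max_shift, max_shift + 1)]
        return int(np.argmax(scores)) - max_shift

    scene = np.zeros(64)
    scene[28:36] = 1.0                         # a bright bar in the scene
    left_view = np.roll(scene, +2)             # defocus shifts the two views
    right_view = np.roll(scene, -2)            # in opposite directions
    print(phase_shift(left_view, right_view))  # -4: adjust focus until it's 0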
