
Megapixels used to be so much simpler: a bigger number meant your camera could capture more photo detail, as long as the scene had enough light. But a technology called pixel binning, now sweeping across flagship smartphones, is changing the old photography rules for the better.

In short, pixel binning gives you a camera that offers lots of detail when it’s bright out without becoming useless when it’s dim. The necessary hardware changes bring some tradeoffs and interesting details, though, which is why we’re taking a closer look.

You might have spotted pixel binning in earlier phones like LG's G7 ThinQ in 2018 and this year's LG V60 ThinQ, but this year it's really catching on. Samsung, the top Android phone maker, built pixel binning into its flagship Galaxy S20 Ultra. Other high-end phones launched last week, including Xiaomi's Mi 10 and Mi 10 Pro and Huawei's P40 Pro and P40 Pro+, also offer pixel binning.

Here’s your guide to what’s going on.

What is pixel binning?

Pixel binning is a technology that’s designed to make an image sensor more adaptable to different conditions. On today’s new phones, it involves changes to the image sensor that gathers the light in the first place and to the image processing algorithms that transform the sensor’s raw data into a photo or video. Pixel binning combines data from groups of pixels on the sensor to create, in effect, a smaller number of higher-quality pixels.

When there’s plenty of light, you can shoot photos at an image sensor’s true resolution, 108 megapixels in the case of the Galaxy S20 Ultra. But when it’s dim out, pixel binning lets a phone take a good but lower-resolution photo, which for the S20 Ultra is the still-useful 12-megapixel resolution that’s prevailed on smartphones for a few years now.
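To see the arithmetic behind that 108-to-12-megapixel drop, here is a minimal sketch of binning in Python with NumPy. It averages each 3x3 block of sensor values into one output pixel; real image signal processors may sum charge or apply more sophisticated weighting, and the function name and demo array here are illustrative, not anything from Samsung's pipeline.

```python
import numpy as np

def bin_pixels(sensor: np.ndarray, factor: int = 3) -> np.ndarray:
    """Average each factor x factor block of pixels into one output pixel
    (a simplified stand-in for what a phone's image processor does)."""
    h, w = sensor.shape
    h -= h % factor  # trim edges that don't divide evenly
    w -= w % factor
    blocks = sensor[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# A 108-megapixel sensor (roughly 12000 x 9000 pixels) binned 3x3
# becomes 4000 x 3000, i.e. 12 megapixels. A tiny 6x6 demo array
# stands in for the full sensor here.
demo = np.arange(36, dtype=float).reshape(6, 6)
print(bin_pixels(demo, factor=3).shape)  # (2, 2)
```

Because nine noisy readings are averaged into one value, each binned pixel also carries less random noise than any single small pixel, which is why the technique helps most in dim light.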

“We wanted to give the flexibility of having a large number of pixels as well as having large pixel size,” said Ian Huang, who oversees LG’s products in the US.
