by Mario Colangelo
TL;DR: The new iPhone 6s shoots 4K video. Unfortunately, consumer 4K is still very much an unfulfilled promise.
The new iPhone shoots 4K video.
Give that a moment to sink in...
This is an amazing time to be a content producer because everything is so accessible. People are shooting 4K footage on their phones and for the most part it looks good. But this is bringing 4K into the public consciousness in a big way and I feel the need to throw a little water on everyone's enthusiasm.
First, some background: digital images are made up of square dots called pixels arranged in a grid, and resolution is simply the number of pixels that make up the entire image. Images come in many different resolutions, but in video we usually use HD, which is a grid of either 1280x720 or 1920x1080 pixels. 4K or "UltraHD" doubles HD's highest resolution in each dimension to 3840x2160 pixels, quadrupling the total pixel count, which in theory makes the image that much "better." In reality though, that is almost never the case.
[Graph: the relative frame sizes of the main video resolutions]
The resolution is a lie
The ugly truth about resolution is that the pixels you are promised and the pixels you actually get are completely different things. The actual effective resolution of most digital images is far lower than advertised. And the reason for this is that digital images are enormous. HD has about 2.1 million pixels in each frame and 4K has about 8.3 million. Capturing unique color values for each of those pixels many times a second (depending on frame rate) is a massive challenge.
For those of you who like to see the numbers: the actual uncompressed data rate of 1920x1080 video is between 100 and 240 megabytes per second (MBps), depending on frame rate and color fidelity. 4K is around four times larger, so you are looking at between 370 and 1000 megabytes per second! A file that is one minute long in HD could therefore be as much as 15 gigabytes of data. In 4K that same minute could be very close to 60 gigabytes.
The largest iPhone has 128 gigabytes of storage, so at most it could hold 2-3 minutes of full-quality 4K footage. Obviously this is completely unworkable, which is why every digital camera ever made uses some kind of algorithmic compression scheme to throw out some of the resolution and color fidelity and bring the file size down to a workable level. The goal of this compression is to discard information while remaining visually similar enough to the original to fool the average human eye. However, no compression is perfect, and all of it introduces artifacts that degrade the image's visual quality.
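To see where those numbers come from, here's a quick back-of-the-envelope sketch. It assumes 8-bit RGB (3 bytes per pixel) at 24 fps, which is the low end; higher frame rates and bit depths push the rates toward the top of the ranges quoted above and shave the storage figure down toward the 2-3 minutes mentioned:

```python
# Back-of-the-envelope uncompressed video data rates.
# Assumes 3 bytes per pixel (8-bit RGB) at 24 fps -- the low end;
# 10-bit color and 60 fps push the numbers much higher.

def uncompressed_rate_mbps(width, height, fps=24, bytes_per_pixel=3):
    """Uncompressed data rate in megabytes per second."""
    return width * height * bytes_per_pixel * fps / 1e6

hd_rate = uncompressed_rate_mbps(1920, 1080)   # ~149 MB/s
uhd_rate = uncompressed_rate_mbps(3840, 2160)  # ~597 MB/s, four times HD

gb_per_minute_4k = uhd_rate * 60 / 1000        # ~36 GB per minute
minutes_on_128gb = 128 / gb_per_minute_4k      # ~3.6 minutes

print(f"HD: {hd_rate:.0f} MB/s, 4K: {uhd_rate:.0f} MB/s")
print(f"128 GB holds about {minutes_on_128gb:.1f} min of uncompressed 4K")
```

Even at these conservative settings, a 128 GB phone fills up in a few minutes of uncompressed 4K, which is why aggressive compression is unavoidable.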
And this is the crux of the issue. There is a serious problem with the way consumer goods are marketed, and cameras especially. The truth is the average consumer can't tell the difference between one image and another unless the differences are extremely exaggerated. So camera manufacturers give us spec sheets where they can brag about supposedly concrete numbers that "objectively" prove one camera produces a better image than another.
However, resolution does not equal image quality any more than any other singular spec does. It's not a measure of sharpness, color accuracy, noise, or anything else that would actually increase the quality of the image. It's simply more pixels... And the world doesn't need more pixels. It needs better pixels.
The pixels we deserve
There are many factors that go into making pixels. The camera lens, sensor, filters, processing hardware, and compression all work together to make the image you see. But the most important factor in video quality isn't even part of the camera. The visual acuity of the viewer and the screen the final image is seen on are the real upper limits on visual quality. And while screens have been improving, most consumer displays today don't show images at their best.
But for the sake of argument let's assume a viewer with 20/20 vision and a magically perfect screen to look at; does the extra resolution that 4K offers actually make a difference? Well, it depends entirely on the size of your screen:
This chart shows the screen size you need at any given viewing distance to notice the increased resolution of 4K over a 1080p HD image. If you sit 10 feet away from your TV, for example, you would need an 85" screen to begin to notice any difference. Obviously, if we are talking about a tablet or a computer screen and you are 2-3 ft away, you are definitely going to see the extra pixels. Now that we understand the actual benefit of more resolution, let's look at the factor that is currently damaging visual quality at all resolutions: compression.
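The chart's threshold can be approximated with a simple visual-acuity model: a 20/20 eye resolves roughly one arcminute of detail, so 4K only starts to help once a 1080p pixel subtends more than that. A sketch under that assumption lands near 77" at 10 feet, in the same ballpark as the chart's 85" figure (which likely uses a slightly different acuity constant):

```python
import math

def min_diagonal_for_4k_benefit(distance_inches):
    """Smallest 16:9 screen diagonal (inches) at which 4K beats 1080p,
    assuming the viewer resolves one arcminute of detail."""
    one_arcmin = math.radians(1 / 60)
    # Largest pixel pitch the eye can't resolve at this distance:
    pitch = distance_inches * math.tan(one_arcmin)
    # Screen width at which 1080p pixels just hit that limit:
    width = pitch * 1920
    # Convert a 16:9 width to a diagonal:
    return width * math.hypot(16, 9) / 16

print(f'{min_diagonal_for_4k_benefit(120):.0f}" at 10 feet')
```

Run the same function at tablet distance (around 3 feet) and the threshold falls to roughly a 23" screen, which is why extra pixels are obvious up close.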
Modern compression schemes are very good at what they do, which is figuring out what data can be thrown out of an image without much visible quality loss. But throw out too much data and the compression artifacts become severe and start to really hurt the image. The 4K the new iPhone captures runs at about 5.6 megabytes per second. That's roughly 1-2% of the data that would be in an uncompressed 4K video file. Here's an example 4K video clip from the iPhone so you can see what that kind of compression ratio does to an image:
This clip is a pretty good test for the camera because it has very dark and very light areas as well as fast-moving objects to really stress the compression. And the truth is it doesn't hold up very well. Sure, it looks good in this small window, but if you full-screen it, or even click through to YouTube and watch at 4K there, you will see a lot of problems with the image. Here's a visual catalog of the kinds of issues this much compression creates, shown at 100% zoom:
• Loss of detail in highlights
• Aliasing and moiré on fine patterns
• Loss of color resolution around highly saturated elements
• Macro-blocking everywhere
Compression artifacts like these destroy the detail and fidelity of the image. What good is resolution without detail? In fact, more resolution can often exacerbate compression artifacts because manufacturers push the compression even harder to keep file sizes down. I can't help but wonder whether the clip above wouldn't actually have been sharper and clearer if it had been captured in HD instead of 4K.
True 4K does exist, and there are plenty of cameras that can produce very clear, high-quality images at that resolution. But they do so by using little to no compression, and as a result the files they create are only really useful to pros with access to a lot of computing power and storage. And since large 4K TVs and projectors aren't really available at consumer prices yet, who is going to notice the difference?
Ultimately, I guess everybody gets what they want. Manufacturers are happy because they get to sell cameras to consumers, who are happy to have the next thing, which is definitely better than the last thing. Just wait: in another few years the iPhone will shoot 8K and we will have to have this discussion all over again.