Thursday, September 17, 2015

iPhone 6s and 6s Plus Camera Launch Review

Apple iPhone 6s

With every new iPhone release, Apple has placed a greater emphasis on photography, though in truth, for each new feature that is announced there are a number of small iterative improvements that may or may not impact day-to-day use. This year's significant changes are:

  • 12mp still photos
  • Improved phase detection autofocus
  • Deep trench isolation sensor construction for a better signal-to-noise ratio
  • "Live Photo" capture
  • 4K video recording (3840x2160)

Obviously the fundamental changes are the bump in resolution and the addition of 4K. However, the key idea is "day-to-day" use: do these changes meaningfully benefit the consumer, or has Apple joined the traditional camera industry's predictable iterative parade?


History of the iPhone Camera


All of this depends on whether the features "work as intended." Though Apple portrays each generation as an evolution of photography, the actual benefits are not as stark as they would have you believe. Here is the progression of technology over the past few generations:

iPhone 4s: Camera resolution increased to 8mp from 5mp in the iPhone 4 (60% increase in pixels, ~25% increase in actual linear resolution). First time a back-side illuminated sensor was used in an iPhone. The image quality improved by a fair margin during this generation; not just more pixels, but better pixels overall.

iPhone 5: Same camera unit, but sapphire glass used on the external lens element. Synthetic sapphire is more scratch resistant than conventional glass. Its inclusion also had the unfortunate side-effect of introducing noticeable purple fringing in bright/contrasty situations. For many people the iPhone 5 seemed to give sharper and more contrasty images, but a large part of that was because of the different display used in the iPhone 5 compared to the iPhone 4s. When comparing images downloaded off-camera, the iPhone 5 images look cleaner but less crisp than the 4s because of heavier noise reduction.

iPhone 5s: New dual-LED flash, new electronic image stabilization system. Larger sensor. Overall image quality improved, but the new sensor-lens combination is less sharp in the corners. Digital image stabilization produces crisp edges, but fine detail can suffer. The dual-LED flash system provides generally better colour balance for flash exposures, but can be visibly inaccurate in some situations.

iPhone 6: Photodiode size increased from 1.2µm to 1.5µm, which benefited low-light quality and improved dynamic range between the shadow and bright portions of the image. Optical image stabilization was added to the 6 Plus. Both cameras received 240 fps slow motion at 720p.

There is only so much that you can do with the small optical system on a smartphone. iPhone photography has gotten better over the years, particularly for those who like pictures but aren't serious about the nitty-gritty of picture taking. However, the performance plateau began back at the 4s generation, and the gains since then have been small. The nature of the game is to add specs and usability as time goes forward, but basic stills photography makes up the majority of most people's activity, and for that there hasn't been much meaningful change over the years. Improvement, but not significant, meaningful change.

iPhone 6s: The increased number of pixels on the 6s versus the 6 isn't meaningful from a resolution standpoint, since the lens isn't of high enough quality to show the difference. It should also be remembered that the difference between 12mp and 8mp is 50% in the total number of pixels, but only a 22% difference in the total number of rows and columns across the picture. This is barely above what the typical human eye can differentiate. In other words, your photos might look better on Facebook if everything about the picture-taking process is perfect (good lighting, steady hands, etc.), but they still won't be anywhere near as good as those that your friends with DSLRs are posting.
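The arithmetic is easy to check (a quick sketch; it uses the nominal 12mp and 8mp counts, so exact output dimensions will shift the numbers slightly):

    import math

    pixels_new, pixels_old = 12_000_000, 8_000_000   # nominal 12mp vs 8mp

    pixel_gain = pixels_new / pixels_old - 1              # 0.50 -> 50% more pixels
    linear_gain = math.sqrt(pixels_new / pixels_old) - 1  # ~0.22 -> ~22% more rows/columns

    print(f"total pixels: +{pixel_gain:.0%}, linear resolution: +{linear_gain:.0%}")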

However, if you don't look at the resolution bump from a camera perspective, but rather from a smartphone perspective, then a modest bump in pixels is welcome given that the lens is a fixed wide-angle design. Every little bit helps when using the "zoom" feature, which is just cropping in and throwing away the peripheral pixels.
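For the curious, that "zoom" amounts to something like the following (a minimal Pillow sketch; the file name and 2x factor are purely illustrative):

    from PIL import Image

    def digital_zoom(img, factor=2.0):
        # Crop the centre 1/factor of the frame, then scale it back up.
        # No new detail is created; the peripheral pixels are simply discarded.
        w, h = img.size
        cw, ch = int(w / factor), int(h / factor)
        left, top = (w - cw) // 2, (h - ch) // 2
        crop = img.crop((left, top, left + cw, top + ch))
        return crop.resize((w, h), Image.LANCZOS)

    zoomed = digital_zoom(Image.open("photo.jpg"), factor=2.0)

A 2x "zoom" of a 12mp frame leaves about 3mp of real data versus 2mp from an 8mp frame, which is where every little bit helps.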

The other big thing is 4K video. 4K has been the camera development that few consumers have been able to benefit from: few have 4K TVs, and most find Blu-ray to be more than enough quality. That is true for most of the industry, but there are also the iPad and MacBook Retina displays to contend with; these can display resolution higher than 1080p (though it might not be so meaningful given the typical viewing distance). In other words, those deepest embedded in the Apple ecosystem will benefit from 4K.

Then there are the storage requirements of 4K video. The 6s seems to require roughly 15GB for 40min, assuming a 50 Mbit/s data stream. For comparison, 4K video on a GoPro Hero 4 Black Edition at 30fps will consume over 32GB for every hour. This is where the friction with the consumer experience becomes apparent, as Apple ostensibly wants to push more of its users to iCloud and has priced its phones to "encourage" users to step up from the 16GB models.
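The iPhone figure is easy to sanity-check (a sketch; the 50 Mbit/s stream is the assumption stated above, and real-world rates vary with scene content):

    def storage_gb(bitrate_mbit_s, minutes):
        # Constant-bitrate estimate: Mbit -> MB -> GB
        return bitrate_mbit_s * 60 * minutes / 8 / 1000

    print(storage_gb(50, 40))   # -> 15.0, i.e. ~15 GB for 40 minutes of 4K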

More Pixels


There was a practical reason why the iPhone remained at 8mp for some time while competitors went to higher counts. Given the constraints of the small optical system, this was safely under the diffraction threshold. Increasing the number of pixels (or narrowing the aperture) beyond a certain limit will result in either more pixels or more depth of field, but not necessarily a sharper overall image. In layman's terms, the amount of sharpness per pixel decreases in these situations... you record information on more pixels, but each pixel becomes mushier as well. 12mp on the iPhone 6s is still below the diffraction limit, but probably just barely.
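A rough way to see where that threshold sits is to compare the Airy disk (the smallest blur spot the aperture can form) against the pixel pitch. The numbers below are assumptions on my part: f/2.2 optics, 550 nm green light, and the ~1.22 µm pitch commonly reported for the 12mp sensor:

    # Rayleigh criterion: Airy disk diameter ~ 2.44 * wavelength * f-number
    wavelength_um = 0.55    # green light, 550 nm
    f_number = 2.2          # iPhone 6s lens
    pixel_pitch_um = 1.22   # reported pitch of the 12mp sensor

    airy_um = 2.44 * wavelength_um * f_number   # ~2.95 um blur spot
    two_pixels_um = 2 * pixel_pitch_um          # ~2.44 um: one line pair at Nyquist

    print(f"Airy disk {airy_um:.2f} um vs two-pixel span {two_pixels_um:.2f} um")

The blur spot is already about the width of two pixels, which is why "just barely" is the right way to put it.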

The increase in pixels means that the iPhone 6s doesn't use all of the pixels for video, but at 8mp-sized frames for 4K, a significant portion of the sensor is being used. For many sensors, 1080p video is actually recorded on a sub-sampled basis; not all of the pixels are used, but rather, pixels are sampled across the sensor to gather a 2mp image for each frame of 1080p video. More advanced cameras like the Panasonic GH4 pull data from the whole sensor and then downsample it to 1080p size to produce a crisper-looking video stream. In other words, even if you can't take advantage of 4K, resizing the footage to 1080p in post-processing will still make for better-looking video than from previous generations of the iPhone.
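The difference between the two readout strategies looks roughly like this (a toy NumPy sketch; real sensors skip and bin in more elaborate patterns, but the idea is the same):

    import numpy as np

    sensor = np.random.rand(2160, 3840)   # stand-in for an ~8mp sensor readout

    # Sub-sampling: keep every other row/column, discard the rest (fast, but aliased)
    subsampled = sensor[::2, ::2]                                     # (1080, 1920)

    # Full readout, then downsample: average each 2x2 block (more data, crisper)
    downsampled = sensor.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))  # (1080, 1920)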

On-Sensor Phase Detection


The previous iPhone 6 camera introduced on-sensor phase detection, and the current one improves on it. In simple terms, this is an advanced form of autofocus where the camera can quickly and accurately determine focus by calculating the distance from the camera to the subject. Previous iPhone cameras used a method of focus known as contrast detection, which works differently in that the camera judges focus by looking at the level of contrast in the scene... the more in focus the image is, the more contrast the imaging engine will see. Phase detection is the faster method because the camera not only knows whether a subject is in focus, it also knows how far out of focus the subject is and the exact amount the lens has to be adjusted to achieve focus lock. Contrast detection is different in that the camera moves the lens back and forth as a means of "guessing" focus.
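In pseudocode terms, the difference looks something like this (a conceptual sketch built on hypothetical lens and metering helpers, not Apple's actual implementation):

    def contrast_detect_af(lens, measure_contrast, step=1.0):
        # Hill-climb: nudge the lens while contrast improves; back off
        # and refine when it drops. This is the "guessing" loop.
        best = measure_contrast()
        while step > 0.01:
            lens.move(step)
            current = measure_contrast()
            if current < best:      # overshot the contrast peak
                lens.move(-step)    # step back...
                step /= 2           # ...and retry with a finer step
            else:
                best = current

    def phase_detect_af(lens, measure_phase_offset):
        # One measurement yields both direction and distance to focus;
        # no hunting required.
        lens.move(lens.offset_to_travel(measure_phase_offset()))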

The inclusion of on-sensor phase detection echoes what is happening in the dedicated camera world with devices like the Fujifilm X-T1 and Sony A6000, but its worth on a smartphone is questionable. Certainly autofocus will be faster, but since most iPhone photography is done with still subjects, the extra speed is more of a subjective improvement than a practical benefit. Phase detection on dedicated cameras is used to track moving subjects, but that also requires dedicated programming to drive the hardware. That certainly isn't happening on smart devices; the phase-detection technology is only there to make the camera faster, not smarter. To do the latter requires quite a lot more in terms of technology.

But that aside, improved focus performance is good, right? Yes, but the form factor of the iPhone camera itself dulls that advantage. Even though the sensor is larger, the (allegedly 1/2.6"-sized) sensor is still small by any measure. This means that for any given situation, the total depth of field is still enormous. Here is a plot of the near and far limits of acceptable sharpness compared to how far the subject is from the camera.

[Plot: near and far limits of acceptable sharpness versus subject distance]

Notice something? By 5 feet, the far limit of acceptable sharpness shoots off to infinity; this means that past 5 feet, virtually everything from 3 feet to infinity is "in focus." This is known as the hyperfocal distance: the distance/aperture setting that produces the greatest depth of field possible. In other words, the camera lens only needs to adjust for subjects between 0 and 5 feet; everything from 5 feet onward is actually shot with the same lens setting. The implication is that the phase-detection ability of the iPhone 6 and 6s only benefits subjects shot at close range.
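That 5-foot figure falls out of the standard hyperfocal formula H ≈ f²/(N·c) + f. Plugging in values plausible for this module (4.15 mm focal length, f/2.2, and a ~0.0046 mm circle of confusion for a 1/2.6" sensor; all assumptions on my part):

    def hyperfocal_mm(focal_mm, f_number, coc_mm):
        # Focused at H, everything from H/2 to infinity is acceptably sharp.
        return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

    H = hyperfocal_mm(4.15, 2.2, 0.0046)
    print(f"H = {H / 304.8:.1f} ft, near limit = {H / 2 / 304.8:.1f} ft")
    # -> H = 5.6 ft; everything from ~2.8 ft to infinity is acceptably sharp

That agrees with the plot: the far limit runs away to infinity right around 5 feet, with the near limit sitting just under 3 feet.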

Concluding Thoughts


For many people, the iPhone is their main and perhaps only camera. Because of the physical constraints, there is only so much that can be done to improve the photographic experience... from a hardware standpoint. There is still a ways to go in terms of software. Though features like HDR, sweep panoramas and slow-motion video have made their way onto the iPhone, the overall operation of the camera is still rudimentary. The basic camera app uses a simplistic spot-metering method that only determines exposure at the focus point. There are third-party apps that allow for separate control of focus and exposure metering, but there are no apps that allow for the scene-recognition abilities found on DSLRs. Another area where dedicated cameras excel... and DSLRs in particular... is in autofocus tracking of moving subjects.

As an inevitable upgrade, the iPhone 6s is an indispensable photography tool, but it isn't a dedicated camera. Certainly more pictures are taken with iPhones every day than with any other camera, but take another look at your Instagram feed... notice how many of those crystal-clear shots aren't taken with an iPhone? There's always a place for a camera and a phone that has a camera; so much the better if both improve over time.
