With every new iPhone release, Apple has placed a greater emphasis on photography. The iPhone 6 and iPhone 6 Plus seem to be a waning of that trend: though there are improvements, the 2014 editions are more iterative than innovative. In many ways, this mirrors the dedicated camera market as a whole; the improvements are there, but the actual benefit to the consumer diminishes with each gain. The headline specs are:
- Larger 1.5 µm photodiodes (iPhone 5s was 1.2 µm)
- f/2.2 aperture
- Optical image stabilization on the 6 Plus
- True Tone flash carried over from iPhone 5s
- 240 fps slow-motion at 720p
There are some interesting things going on, but the sum total isn't overwhelming this time around. Certainly not like the 5s introduction, which was almost as much a camera launch as it was a smartphone launch.
The new sensor is 25% larger than the one used in the iPhone 5s, which in turn was 15% larger than the one used in the iPhone 5. To put that in perspective, a 25% increase in light-gathering ability is the equivalent of a 1/3EV improvement in ISO noise characteristics. That's not much, but for their extremely small size, smartphone camera sensors produce remarkable quality. Note that the camera protrudes slightly from both versions of this phone: this almost certainly has to be the case if the sensor is bigger, as the lens system would also have to be lengthened to maintain the same field of view... assuming that Apple didn't change that aspect of the camera.
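To sanity-check that 1/3EV figure: doubling the light gathered is a 1EV gain, so the gain from a 25% increase is log2(1.25). A quick sketch (the helper name here is mine, for illustration only):

```python
import math

def ev_gain(light_ratio: float) -> float:
    """EV improvement from gathering `light_ratio` times more light.

    Doubling the light (ratio = 2.0) is defined as exactly 1 EV.
    """
    return math.log2(light_ratio)

print(round(ev_gain(1.25), 2))  # 25% more light -> ~0.32 EV, i.e. roughly 1/3 EV
print(ev_gain(2.0))             # sanity check: twice the light is exactly 1.0 EV
```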
If it works as intended, the optical image stabilization in the iPhone 6 Plus is a bigger benefit than having a larger sensor. Most image stabilization systems today give a 2EV hand-holding advantage, meaning that the camera can use a shutter speed at least four times longer than would otherwise be safely hand-holdable. (If you are keeping score, the first phone to use an optical image stabilization system was the LG G2, though with different sensor specs.)
It's a shame that the optical image stabilization isn't available on the smaller mass-market model. Presumably there is more room inside the 6 Plus for the stabilization hardware, but its absence here leaves open the possibility of optical image stabilization making its way to a hypothetical "iPhone 6s" in a year's time.
If it sounds as though Apple is letting the iPhone stagnate with an 8mp back-facing camera, there is a reason, though not everybody will agree in light of the higher-resolution options available in the Android world. Even the resolution of the iPhone 6 Plus, which is a true 1080p display, is smaller than the pixel dimensions of the iSight camera output. 1080p (1,920 x 1,080) is just over 2mp, roughly 1/4 of what the camera is outputting. Though there are those who will need to download pictures off of their phones, the vast majority of mobile photography comprises activities that only require the phone's display. In other words, if you are using your camera for Facebook, Twitter, Instagram, etc., then you (mostly) don't need more megapixels than what you have now. Certainly you can zoom in and crop more aggressively with more pixels, but there is also a cost in terms of storage. Think of all of the people you know who never off-load pictures from their phones and you will understand. Given how Apple sets its pricing tiers for the different amounts of flash memory, it's merciful that the camera file sizes aren't going up to consume what is a limited and expensive resource.
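The pixel arithmetic is easy to verify. The camera dimensions below are the standard 4:3 frame for an 8mp sensor, assumed here rather than taken from Apple's spec sheet:

```python
# Pixel counts behind the "more megapixels than you need" argument.
camera_px = 3264 * 2448    # standard 8mp 4:3 frame (assumed output dimensions)
display_px = 1920 * 1080   # iPhone 6 Plus 1080p display

print(camera_px)                        # 7,990,272 -> ~8 mp
print(display_px)                       # 2,073,600 -> just over 2 mp
print(round(camera_px / display_px, 1)) # ~3.9, i.e. the display shows ~1/4 of the captured pixels
```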
History of Iteration
All of this depends on whether the features "work as intended." Though Apple portrays each generation as an evolution of photography, the actual benefits are not as stark as the company would have you believe. Here is the progression of technology over the past few generations:
iPhone 4s: Camera resolution increased to 8mp from 5mp in the iPhone 4 (60% increase in pixels, ~25% increase in actual linear resolution). First time that a back-side illuminated sensor was used in an iPhone. The image quality improved by a fair margin during this generation; not just more pixels, but better pixels overall.
iPhone 5: Same camera unit, but sapphire glass used on the external lens element. Synthetic sapphire is more scratch-resistant than conventional glass. Its inclusion also had the unfortunate side-effect of introducing noticeable purple fringing in bright/contrasty situations. For many people the iPhone 5 seemed to give sharper and more contrasty images, but a large part of that was because of the different display used in the iPhone 5 compared to the iPhone 4s. When comparing images downloaded off-camera, the iPhone 5 images look cleaner but less crisp than the 4s because of heavier amounts of noise reduction.
iPhone 5s: New dual-LED flash, new electronic image stabilization system. Larger sensor. Overall image quality improved, but the new sensor-lens combination is less sharp in the corners. Digital image stabilization produces crisp edges but fine-detail can suffer. Dual-LED flash system provides generally better colour balance for flash exposures, but can be visibly inaccurate in some situations.
There is only so much that you can do with the small optical system on a smartphone. iPhone photography has gotten better over the years, particularly for those who like pictures but aren't serious about the nitty-gritty of picture taking. However, the performance plateau began back at the 4s generation and the gains since then have been small. If you are strictly comparing the iPhone 6 to the 5s, the improvements are subtle at best; this year it's all about the non-camera upgrades like the larger screen, NFC-enabled mobile payments (Apple Pay) and the accompanying Apple Watch.
On-Sensor Phase Detection
One surprising inclusion on the iPhone 6 camera is on-sensor phase detection. In simple terms, this is an advanced form of autofocus where the camera can quickly and accurately determine focus by calculating the distance from the camera to the subject. Previous iPhone cameras use a method of focus known as contrast detect, which works differently in that the camera judges focus by looking at the level of contrast in the scene... the more in focus the image is, the more contrast the imaging engine will see. Phase detection is the faster method because the camera not only knows whether a subject is in focus, it also knows how far out-of-focus the subject is and the exact amount that the lens has to be adjusted to achieve focus lock. Contrast detect is different in that the camera moves the lens back and forth as a means of "guessing" focus.
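That back-and-forth "guessing" can be sketched as a simple hill-climb. This is a toy model, not Apple's implementation: the contrast metric, step size, and function names are invented for illustration.

```python
def contrast(lens_pos: float, in_focus_at: float) -> float:
    """Stand-in contrast metric: peaks when the lens sits at the in-focus position."""
    return 1.0 / (1.0 + (lens_pos - in_focus_at) ** 2)

def contrast_detect_af(start: float, in_focus_at: float, step: float = 0.1) -> float:
    """Hill-climb: step the lens while contrast improves, reverse once after overshoot,
    and stop (focus lock) when neither direction improves contrast."""
    pos, best = start, contrast(start, in_focus_at)
    direction = step
    while True:
        nxt = contrast(pos + direction, in_focus_at)
        if nxt > best:
            pos, best = pos + direction, nxt  # contrast rose: keep moving this way
        elif direction > 0:
            direction = -direction            # contrast fell: overshot, reverse once
        else:
            return pos                        # no improvement either way: lock focus

print(round(contrast_detect_af(start=0.0, in_focus_at=2.0), 1))  # settles near 2.0
```

A phase-detect system would skip this whole search: one measurement tells it both the direction and the size of the required lens move.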
The inclusion of on-sensor phase detection echoes what is happening in the dedicated-camera world with devices like the Fujifilm X-T1 and Sony A6000, but its worth on a smartphone is questionable. Certainly autofocus will be faster on the iPhone 6 than on the 5s, but since most iPhone photography is done with still subjects, the extra speed is more of a subjective improvement than a practical benefit. Phase detection on dedicated cameras is used to track moving subjects, but that also requires dedicated programming to drive the hardware.
But that aside, improved focus performance is good, right? Yes, but the form factor of the iPhone camera itself dulls that advantage somewhat. Even though the sensor is larger, the (allegedly 1/2.6"-sized) sensor is still small by any measure. This means that for any given situation, the total depth of field is still sizable. Here is a plot of the near and far limits of acceptable sharpness compared to how far the subject is from the camera.
Notice something? By 5 feet, the far limit of acceptable sharpness shoots off to infinity; with the lens focused at 5 feet, virtually everything from 3 feet to infinity is "in focus." This is known as the hyperfocal distance: the focus distance that, for a given aperture, produces the greatest depth of field possible. In other words, the camera lens only needs to adjust for subjects between 0 and 5 feet; everything from 5 feet onward is actually shot with the same lens setting. The implication is that the phase-detection ability of the iPhone 6 only benefits subjects shot at close range.
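The plot's roughly 5-foot figure can be reproduced with the standard hyperfocal formula, H = f²/(N·c) + f. The numbers below are my assumptions, not Apple-published specs: a focal length of about 4.15 mm, the f/2.2 aperture, and a circle of confusion of roughly 0.005 mm for a sensor in this size class.

```python
# Hyperfocal-distance sketch behind the depth-of-field plot (assumed numbers).
f = 4.15    # focal length in mm (assumed)
N = 2.2     # f-number (from the spec list)
c = 0.005   # circle of confusion in mm (assumed for a sensor this small)

H = f**2 / (N * c) + f  # hyperfocal distance in mm
near = H / 2            # focused at H, acceptable sharpness runs from H/2 to infinity

print(round(H / 304.8, 1))     # hyperfocal distance in feet: ~5.2 ft
print(round(near / 304.8, 1))  # near limit at that setting: ~2.6 ft
```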
For many people, the iPhone is their main and perhaps only camera. Because of the physical constraints, there is only so much that can be done to improve the photographic experience... from a hardware standpoint. There is still a ways to go in terms of software. Though features like HDR, sweep-panoramas and slow-motion video have made their way onto the iPhone, the overall operation of the camera is still rudimentary. The basic camera app uses a simplistic spot-metering method that only determines exposure at the focus point. There are third-party apps that allow for separate control of focus and exposure metering, but there are no apps that allow for the scene-recognition abilities found on DSLRs. Another area where dedicated cameras excel... and DSLRs in particular... is in autofocus tracking of moving subjects.
As an upgrade, the focus shift away from photography for 2014 reminds us that the iPhone isn't a dedicated camera, no matter how much previous launch events tried to ingrain that concept in the public's mind. Certainly more pictures are taken with iPhones every day than with any other camera, but take another look at your Instagram feed... notice how many of those crystal-clear shots weren't taken with an iPhone?