With iOS 15, Apple Might Finally Fix the Worst Thing About the iPhone. It’s Been Broken for 14 Years


Since it was introduced 14 years ago, the iPhone has changed the way we do just about everything: the way we communicate, how we navigate the world around us, and how we engage with the people in our lives.

Personally, I love my iPhone 12 Pro. I use it all the time for everything from sending emails and messaging with my team to scrolling endlessly through social media, saving ideas for columns, and keeping in touch with my family. Honestly, though, one of the things I do most with my iPhone is take pictures.

We have four children and a few pets, and between soccer games, gymnastics meets, family trips, and just random weird things they all do every day, I take a lot of photos. Having spent a decade as a professional photographer, I have a nice Nikon DSLR, but honestly, almost all of the photos I take now are with my iPhone. 

There is a problem, however. While the camera on my iPhone is obviously the most convenient way to capture images, it does have limitations. The iPhone camera is notorious for producing images speckled with little green dots, especially at night when a bright light source is in the frame.

The good news is that Apple appears to be working on a fix in iOS 15. It's such good news that I'd argue it fixes what I think is the worst thing about the iPhone.

To understand the problem, and what Apple is doing to fix it, it helps to start with a quick primer on digital photography, smartphone edition.

Most cameras consist of three things: a sensor that captures light, a shutter that determines how long the sensor is exposed to light, and a lens that focuses the light onto the sensor. That's true whether you use a large DSLR like my Nikon, a new mirrorless camera, a smaller point-and-shoot, or the camera on your iPhone (though your iPhone doesn't have a physical shutter, but an electronic one).

Most smartphone camera sensors are pretty good at capturing light and detail, even as small as they are. The limiting factor, in most cases, is the tiny size of the lens.

That's because the lens is limited by the fact that an iPhone has to fit in your pocket. Apple, like every other smartphone maker, has had to balance the need to make the camera elements as small as possible while still squeezing out as much image quality as possible. That means every photo you take with a smartphone camera has defects. It's just simple physics.

The smaller the lens, the more opportunity for defects like chromatic aberration, lens flare, or distortion. It's part of the reason smartphones have increasingly large camera bumps on their backs: to make room for larger sensors and more lens elements. Still, there's only so much you can do, because, again, physics. Those green dots are a form of lens flare, and the iPhone has been notorious for producing them.

On the other hand, one thing smartphones have done well, especially in recent years, is compensate for these defects using the computational power of their processors. Google's Pixel smartphones are a great example. They've used roughly the same camera sensor since the Pixel 2, but have used software to drastically improve the photos you can capture, even in less-than-stellar conditions.

The iPhone has tried to do the same thing, using the power of the A-series chips to enable features like Night Mode, which is honestly some kind of magic. But those little green dots have found a way to stick around.

Now, in iOS 15, Apple appears to be putting that computational power toward solving the problem. There are a few great examples of the beta feature in action. The Verge has a particularly good before-and-after demonstration that's worth checking out if you want to see how it works. I'm honestly more interested in what it means for Apple and for the iPhone in general.

See, every product is a series of compromises. A company has to balance a variety of needs to come up with something that satisfies the greatest range of use cases, while still considering things like price, battery life, size, and functionality. It’s not an easy thing to figure out, but it does matter because it affects the way users experience your product.

In the case of the iPhone, Apple is putting its considerable engineering resources behind solving a particularly difficult problem: removing those green dots not with a better lens, but by processing the image signal to give you a more desirable final result.

You can certainly debate whether a processed image is a real photo, but for most people, all they need to know is that their images will look better. Ultimately, that means Apple is solving the worst thing about the iPhone, which is good news for everyone who uses one.

The opinions expressed here by Inc.com columnists are their own, not those of Inc.com.
