The Galaxy S20 is here and with the iPhone 12 on its way this fall, here are a few things I’d like to see from Samsung’s latest implemented on Apple’s next greatest.
Back in the stone ages of smartphones, I used to joke that Nokia didn’t have any silicon or machine learning, so they just hung big glass on the back of everything and captured it all optically. Apple didn’t have glass or servers so they used big silicon on everything, doing it all in the image signal processor. And Google, well, Google had no idea what phone you were using so it just sucked everything up into the cloud and auto awesomed everything on big servers.
We’ve come a long way since then and most phones now use a combination of better optics, better image signal processors, and better machine learning, but some companies still do some of them way better than others.
Google still just crushes the algorithms, everything from segmentation masking to Super Res Zoom, which works similarly to Google’s HDR+ pipeline and Apple’s Smart HDR pipeline.
Basically, it takes a stack of photos all within an incredibly short period of time, in this case, differentiated by the shaking of your hand or a simulation thereof, and then grinds through them and produces a single image with all the best color and detail information.
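Here’s a toy sketch of that burst-merge idea, assuming NumPy, grayscale frames, and a simple whole-pixel alignment (real pipelines like HDR+ align at sub-pixel precision and merge per-tile, so treat this as illustration only):

```python
import numpy as np

def align_and_merge(frames, shifts):
    """Merge a burst of slightly shifted frames into one cleaner image.

    frames: list of 2D arrays (grayscale for simplicity)
    shifts: list of (dy, dx) integer offsets of each frame vs. the first
    """
    base = frames[0].astype(float)
    acc = np.zeros_like(base)
    for frame, (dy, dx) in zip(frames, shifts):
        # Undo the hand-shake offset so every frame lines up with the first
        acc += np.roll(frame.astype(float), (-dy, -dx), axis=(0, 1))
    # Averaging N aligned frames cuts random noise by roughly sqrt(N)
    return acc / len(frames)

# Toy demo: the same scene captured 4 times with noise and 1-pixel jitter
rng = np.random.default_rng(0)
scene = rng.uniform(0, 255, (32, 32))
shifts = [(0, 0), (1, 0), (0, 1), (1, 1)]
frames = [np.roll(scene, s, axis=(0, 1)) + rng.normal(0, 10, scene.shape)
          for s in shifts]
merged = align_and_merge(frames, shifts)
```

The jitter from your hand is what makes this work for zoom, too: each shifted frame samples the scene at slightly different positions, giving the merge step more real detail to work with than any single frame contains.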
At the same time, Google seems to have an irrational hatred for good optics. Every year they put out a new Pixel saying who needs a camera bump or optical stabilization or a telephoto or an ultra-wide angle, only to figure out that yes, they need it the next year. So, as amazing as super res zoom is, it’s limited by commodity cameras.
Huawei and now Samsung have taken the opposite approach. Because their machine learning and algorithms are nowhere near as good as Google’s, they’ve gone all-in on big optics. Even, and now especially, for zoom.
Now, as anyone with a zoom lens knows, they’re long, and there’s just no way to stick one on the back of a phone without it jutting out like a cannon… Unless you fold it sideways. Yeah, sideways, and then use prisms to bend the light straight again, front and back.
That’s what Huawei did with the periscope zoom and it’s what Samsung is doing with the new Ultra zoom.
Since zoom is one of the few dedicated camera features Apple hasn’t really addressed, not since the 2x telephoto of the iPhone 7 Plus, it’s something I’d love to see them tackle in the next iPhone.
Doing it in a similar way to Google makes the most sense. Again, Apple already has a Smart HDR pipeline that could be adapted to computational zoom the same way Google adapted HDR+ to Super Res Zoom.
It provides a lot of the benefits of optical zoom without taking up a ton of space inside the device, space which is always at a huge premium.
But… Apple also goes several extra miles beyond Google when it comes to camera hardware. Their strength, recently, has been mixing really good optics with really good silicon and topping it all off with really good machine learning.
So, I would love to see some periscope-like hardware in the next iPhone along with some super zoom style software.
That way, I could scratch taking photos and videos of sports, keynotes, and trips right off my dedicated camera list.
12 megapixels is how many the iPhone 11 Pro’s wide-angle, telephoto, and ultra wide-angle cameras have. Google Pixel 4 uses 12 megapixels for the main wide-angle as well but goes to 16 megapixels for the telephoto.
That’s just the best compromise, the best balance between quantity and quality, they’ve come to in the post-megapixel-wars world.
Samsung’s S20 Ultra stays with 12 megapixels for the ultra-wide but escalates rapidly from there with a 48-megapixel telephoto and a 108-megapixel wide-angle.
Now, it’s a complicated subject. Oftentimes, companies go higher-megapixel simply by chopping the sensor up into smaller and smaller pixels, which means each pixel takes in less light than it would have otherwise, and all you get is more of less: worse picture quality, not better.
To compensate, they’ll pixel bin, re-combining many of those smaller pixels into fewer, bigger ones again, hoping to get something more than the sum of their tiny parts.
In the S20’s case, it can use 9 times pixel binning to squeeze those 108 megapixels down to 12-megapixel photos.
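The arithmetic checks out: 9x binning means every 3-by-3 block of pixels becomes one output pixel, and 108 divided by 9 is 12. A minimal sketch of block binning, assuming NumPy and simple averaging (real sensors bin charge or voltage on-chip, and handle the color filter layout, which this ignores):

```python
import numpy as np

def pixel_bin(sensor, factor):
    """Average each (factor x factor) block of pixels into one output pixel."""
    h, w = sensor.shape
    assert h % factor == 0 and w % factor == 0, "dimensions must divide evenly"
    # Reshape into blocks, then average within each block
    return sensor.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Tiny demo grid standing in for the sensor; 9x binning = 3x3 blocks
sensor = np.arange(36, dtype=float).reshape(6, 6)
binned = pixel_bin(sensor, 3)  # 6x6 input becomes 2x2 output
```

Each output pixel effectively gathers the light of nine physical pixels, which is the whole point: small pixels for resolution when there’s light to spare, binned pixels for sensitivity when there isn’t.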
They’re also using a 1/1.33-inch sensor, which as camera phones go is enormous, with a Bayer-style color filter where each color — red, green, green, blue — covers a whole 3-by-3 grid of pixels. That means the full number of pixels can be used in brighter light, for example, or each grid can be binned down for better HDR or low light.
All that to say, we’re getting to a point where the technology is there to start moving beyond 12 megapixels, especially when Apple can combine those better optics with their already very good machine learning to produce even better photos in an even wider range of conditions.
Ever since Apple introduced 120Hz ProMotion displays for the iPad Pros in June of 2017, pretty much everyone has wanted them on iPhones.
But, in September of 2017, what would become the Pro line of iPhones switched to OLED and, back then, OLED just wasn’t ProMotion ready.
In 2019, that started to change, with OnePlus, ASUS, even Google, pushing out 90 and 120Hz OLED panels. And now, Samsung, which thus far has been fabricating the OLED panels Apple designs for the iPhone, has gone to 120Hz as well.
Of course, like with pixel counts, refresh rate is only one quantitative aspect of a panel, and that’s not even getting into the qualitative aspects yet.
You know, the ones that got the not-even-1080p iPhone XR and 11 hot takers’ reputations so embarrassingly burned once people got their hands and eyes on the actual quality of those panels.
So, you have to power those panels, which probably means dual-processor interface lanes and, unless you can pay that off with an equal and opposite increase in efficiency, bigger batteries to maintain battery life as well.
Currently, Samsung is shipping the S20 at 2K and 60Hz. If you flip on 120Hz, Samsung reduces the resolution to 1080p to maintain battery life.
And that’s fine. Historically, Samsung has been far more willing to offload display management to customers, down to things as arcane as color management.
Apple not so much. So, if the iPhone Pro is to go ProMotion, I expect it will go there fully. No such switch. And that means Apple will have to figure out how to pay for it while maintaining battery life in an iPhone-sized package.
And, with that kind of stuff, stuff that involves power and experience, Apple tends to be intractably conservative at times. Regardless of how much rage it produces in the sizzling spec community, or, of course, of how the vast majority of people who buy one won’t even notice. Like they don’t notice less-than-1080p when it’s really good quality less-than-1080p.
If Apple can’t do that, make it work transparently, I expect we won’t see ProMotion on the iPhone until they can.
But if they can, I’d love to see it, and soon. Because 120Hz has the big advantage of being divisible by 60, 30, and 24. And as great as high frame rate interfaces and gaming are, 24-frames-per-second video is just what it’s meant to be.
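The divisibility point is simple arithmetic worth spelling out: on a 120Hz panel, every common content frame rate maps to a whole number of refreshes, so no frame has to be held for an uneven amount of time.

```python
# On a 120Hz panel, each video frame is shown for a whole number of refreshes:
for fps in (24, 30, 60):
    refreshes_per_frame = 120 / fps
    print(f"{fps} fps -> each frame held for {refreshes_per_frame:g} refreshes")

# A 60Hz panel can't do that for film: 60 / 24 = 2.5, so frames must
# alternate between 2 and 3 refreshes (3:2 pulldown), which causes judder.
```

That’s why 24fps film looks subtly wrong on 60Hz displays and exactly right on 120Hz ones.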
One of the big rumors for last year was reverse inductive charging. In other words, the ability to put something like AirPods on the back of your iPhone to transfer power from the phone to the pod.
Now, I’ve heard that this was never actually in the production plans for last year. But, that doesn’t mean it won’t ever be.
Of course, taking power from a device as important as our phone and inefficiently transferring it to a device as dependent as our AirPods isn’t something most people should do most of the time.
But, if your phone is full and your AirPods near empty, or your friend or family member’s phone is near empty, and you can give them a little power in a pinch, then that’s a good feature to have if and when you need it.
I’d say the same about the Apple Watch, but Apple didn’t go with vanilla Qi charging when they designed it. It’s based on Qi, but Apple tweaked it to charge better and faster than what was available back then.
Apple would need more coils to make the inductive charging both outie and innie, and I’m already overstuffing it with periscope cameras and bigger sensors and extra processor lanes, and bigger batteries, but, hey, rumor has it Apple is moving to larger Pro models this year anyway, so who knows?