I'm Rene Ritchie. Welcome back to Vector. So great to have you here. iPhone XR has the same front-facing TrueDepth camera system as the iPhone X, XS, and XS Max. That means it can do the same portrait selfies, including with Portrait Lighting and the new Depth Control that lets you change the bokeh from f/1.4 to f/16. Same Animoji, Memoji, and augmented reality effects as all of those higher-priced phones.

And by the way, Apple says it's heard our complaints. No, not the wacky internet hashtag-smoothgate conspiracy theories, but the legit complaints I talked about in my previous video. Apple, for its part, really, truly, deeply believes the new imaging pipeline is better than the previous one, and better than what anyone else is doing today. If you disagree, and when it comes to the selfie results I personally disagree, hard or soft, smooth or whatever, it's important to let Apple know, a lot, because pipelines can and will be tweaked, updated, and improved over time. And like I said: if they can detect and preserve fabric, rope, cloud, and other textures, why not skin texture as well?

The good news is that Apple says it has identified a bug, not just with faces but with everything, and it will be fixing it in iOS 12.1 by making sure the system picks the sharpest frame possible to compute from. That way we'll all be getting better, crisper selfies, and photos in general, going forward.
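Apple hasn't published how that sharpest-frame selection works, but the general technique, scoring each buffered frame with a sharpness metric and keeping the winner, is easy to sketch. Here's a toy version using gradient energy (sum of squared differences between neighboring pixels) as the score; the frame data, function names, and metric are all illustrative assumptions, not Apple's actual pipeline:

```python
def sharpness(frame):
    # Gradient-energy proxy for sharpness: sum of squared
    # differences between horizontally adjacent pixels.
    # Crisp edges produce large differences and score high;
    # blur smears edges into gentle ramps and scores low.
    return sum(
        (row[x + 1] - row[x]) ** 2
        for row in frame
        for x in range(len(row) - 1)
    )

def pick_sharpest(frames):
    # Keep the frame with the highest sharpness score.
    return max(frames, key=sharpness)

# Two toy 2x4 grayscale frames: a blurry gentle ramp and a
# crisp hard edge.
blurry = [[10, 12, 14, 16], [10, 12, 14, 16]]
crisp = [[10, 10, 200, 200], [10, 10, 200, 200]]
best = pick_sharpest([blurry, crisp])
```

Real pipelines use fancier focus measures (Laplacian variance is a common one), but the select-the-best-scoring-frame structure is the same idea.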
What's different is the camera on the back. Where iPhone X, XS, and XS Max have dual systems, with wide-angle and telephoto lenses fused together for 2x optical zoom and rear-facing portrait mode, iPhone XR has just the wide-angle. It's the same improved wide-angle as on the XS and XS Max, complete with an over 30% bigger sensor with bigger, deeper pixels to drink in more light and preserve it more accurately.

And it has the same new Smart HDR feature that ties the image signal processor to the eight-core Neural Engine: it buffers up to four frames ahead, shoots a series of exposures, interleaves them with a series of underexposures to get details from the highlights, and tops it all off with a long exposure to pull similar details from the shadows. You can turn all of that off in Settings, or keep both the Smart HDR and the original version if you like. But it adds up to a very similar experience for a camera phone that costs only three-quarters as much.
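Apple hasn't documented Smart HDR's internals, but the bracketing it describes (regular frames, underexposures for highlight detail, a long exposure for shadow detail) maps onto classic exposure fusion: scale each bracket back to a common brightness, then average per pixel with weights that favor well-exposed samples. A toy sketch with made-up pixel values and a simple well-exposedness weight; none of this is Apple's actual code:

```python
def well_exposedness(v):
    # Weight a 0..255 pixel by its distance from mid-gray:
    # crushed shadows and blown highlights get low weight.
    return max(1e-6, 1.0 - abs(v - 128) / 128)

def fuse(exposures):
    # exposures: list of (frame, relative_exposure) pairs, where
    # frame is a flat list of 0..255 pixel values.  Each sample
    # is divided by its relative exposure to land on a common
    # radiance scale, then averaged with well-exposedness weights.
    n = len(exposures[0][0])
    out = []
    for i in range(n):
        num = den = 0.0
        for frame, rel in exposures:
            w = well_exposedness(frame[i])
            num += w * (frame[i] / rel)  # radiance estimate
            den += w
        out.append(num / den)
    return out

# Toy 2-pixel scene: pixel 0 is a bright highlight, pixel 1 a
# deep shadow.  The underexposure (rel 0.5) keeps the highlight
# from clipping; the long exposure (rel 2.0) lifts the shadow.
normal = [255, 10]   # highlight clipped, shadow near black
under = [140, 5]     # half the light: highlight preserved
longer = [255, 40]   # double the light: shadow readable
hdr = fuse([(normal, 1.0), (under, 0.5), (longer, 2.0)])
```

The fused result trusts the underexposure for the highlight and the long exposure for the shadow, which is exactly why interleaving brackets recovers detail at both ends.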
Where iPhone XR really differs from XS and XS Max is the rear portrait mode. Absent a second, telephoto camera to leverage for more real depth data, Apple is doing with XR something similar to what Google did with Pixel 2 last year: using the parallax pulled off the phase-detect autofocus system, what Apple calls focus pixels, to get some depth data, and then applying a machine-learning-powered segmentation mask to separate the subject from the background.
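In principle, that single-camera pipeline reduces to: estimate which pixels belong to the subject, then blur only the rest. A minimal 1-D sketch, with a hand-written mask standing in for the neural network's output and a plain box blur standing in for the real lens model:

```python
def box_blur(pixels, radius=1):
    # Simple 1-D box blur: average each pixel with its neighbors.
    n = len(pixels)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(pixels[lo:hi]) / (hi - lo))
    return out

def portrait(pixels, mask, radius=1):
    # Keep masked (subject) pixels untouched, blur everything else.
    blurred = box_blur(pixels, radius)
    return [p if m else b for p, m, b in zip(pixels, mask, blurred)]

row = [200, 200, 50, 60, 55]   # subject pixels, then background
mask = [1, 1, 0, 0, 0]         # 1 = subject, per the segmentation mask
out = portrait(row, mask)
```

The hard part in practice is the mask itself, which is why edge cases like hair and glasses are where single-camera portrait modes show their seams.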
Now, as Apple was announcing this on stage, I was worried. I know a lot of people love, love, love Google's portrait mode, but as someone who's owned a Pixel 2 XL for a year now, I've had some issues with it. Some were minor, like the slightly cooler color cast it looks like Google fixed with the Pixel 3. Others were bigger, like the cardboard-cutout effect segmentation masks can cause, and frankly, I see a little bit of that on the XR as well. One was a deal-breaker, though, and it kept me from using the Pixel for anything other than regular photography, not for portrait mode: the inability to show the effect live in the preview.

I explained why in my full review, but I'll quickly repeat it: the Pixel's portrait mode isn't really portrait mode to me, because it doesn't show the actual effect in the live preview. It only applies it afterwards, a few long seconds afterwards, as a post-production filter, something you could do with any app. Heck, it's something Google could release as an app for every other camera phone out there.

Many would argue that none of that matters, just the end result. I would say both, at least for me, because I'm used to shooting with a DSLR. I'm used to framing for the actual shot I'm getting. If I don't like the depth of field in the preview, I can move a little bit and get it to look just the way I want before taking the shot. With Pixel, I had to take the shot, go check it, wait for the effect to apply, and then, if I didn't like it, go back and shoot it all over again. And all of that was a nasty surprise when I got the phone, because, everything is terrible, almost no one covered it in their reviews. And it doesn't sound like Pixel 3 solves this.

What I think it comes down to is that Google seems to be using separate pipelines for the Pixel's live preview and actual camera capture, where Apple has gone to great engineering, and silicon, pains to make sure that, like a DSLR, what you see is what you shoot. Even more so, because the iPhone display is so much better and more accurate than any DSLR's. So while I'll be getting one, I'll likely be sticking to non-portrait-mode photos with it as well.
Yell at me in the comments all you want, Google nerds. I heart you anyway; I just skew far more towards optical nerdery. Long tangent short: I was worried Apple would end up doing the same on iPhone XR, but turns out, not so much. Whether it's the power of the A12 Bionic or just the result of different design trade-offs, Apple has managed to push the depth effect into the live view on iPhone XR as well. So what you see is what you shoot.

Apple showed off the results on stage and in its demo pics. I know you can't always trust demo pics; they tend to be cherry-picked, idealized, best-of-the-best-case examples. But Apple has a good reputation here. They don't cheat and claim DSLR photos were shot with their phone, or bring special lighting rigs around with them everywhere they go, things an average customer just wouldn't have access to. And they also don't hire professional photographers to go on big publicity tours, or make huge publicity buys with massive magazine media companies, including a bunch of cover shots. A lot of famous photographers do use iPhones, and plenty of magazines have shot covers and features on iPhones, but as far as I can tell, Apple hasn't ever paid for carry or for placement.

What I also like is the way Apple backs up the demo shots, and quickly, with Shot on iPhone shots, which is something Apple latched onto early. People started hashtagging their photos on Flickr and Instagram, and Apple noticed, became enamored with them, and quickly got behind them and started amplifying them. Which was smart; the best campaigns are often the ones your customers come up with. But in this case it's extra smart, because you don't have to trust Apple's demos: you're literally flooded with other examples almost immediately after launch.

Second long tangent short: I thought I had an idea what the XR could do with its new portrait mode, but no. Shooting with it for the last week has been one surprise after another. Some good, some not so good, all of it educational.
It educational the major difference is this with all previous portraits mode and yeah it’s like surgeons generals don’t worry about it from iphone 7 plus to iPhone 10s and 10’s max you were shooting with the effective 52 millimeter telephoto lens with iPhone 10 R you’re shooting with the effective 26.
Millimeter wide angle lens switching from one to the other is like swapping glass on a traditional camera that’s especially true because instead of.
Just slapping on a custom Gaussian or disk blur over the background and calling it a day which is what Apple used to do and I think pretty much everyone else still does this year Apple examined a bunch of high-end cameras and lenses and engineered separate virtual models for both the iPhone 10s and the iPhone 10 R that means when you switch to portrait mode it ingests the scene with computer vision tries to.
Make sense of everything at quote unquote sees and then renders the bouquet including lights overlapping lights and the kind of distortions real glass physics produces in the real world and when you slide the new depth control back and forth between F 1.4 and F 16 it recalculates and re-renders.
That virtual lens model the result is the same kind of character and yeah personality you get with real world lenses and that.
Means shooting with iPhone 10s vs. iPhone 10 R gives your photos a different character and yeah personality as well there.
Are also some huge pros and cons to get used to for starters the wide-angle lens is of course much wider so if you want a face to fill the frame you’ll have to sneaker zoom in instead of out that you can move in and out so much is terrific though you’re not bound by the same sweet spot that you are with the dual camera portrait mode system that often seems to be telling.
You to move closer move further away move closer move further away just take the shot already and that.
Means you can get a lot closer or a lot further away with the 10r then you can with 10s and still trigger the depth effect and because the 10r is using the F 1.8 aperture wide-angle for portrait modes and not the F 2.4 telephoto like dual camera iPhones it can pull in more light and compute its version of the depth effect in much darker conditions than the 10s or previous can but only.
For human faces which is where 10r might experience its own deal-breaker at least for some people unlike iPhone 7 plus when it first shipped where portrait mode was optimized for human faces but would do its best to sort out everything else and now with iOS 12 does an amazing job on an amazing array of different subjects and objects iPhone 10 are literally will not engage portrait mode if it can’t detect a.
Human face now like I said in.
My review it’s pretty great at engaging when it does see a face it uses face ID like neural networks to not only identify human faces but identify.
Them even if they’re partially obscured by glasses scarves and other forms of clothing Apple trained and tested it on an incredibly diverse and varied pool of people and things that people usually have on their faces and on their heads but that does mean no blurry photos of your foods or drinks no pets or droids in depth effects RER – and it can even lose track of human faces if they turn too much.
Like I also said in my review, the f/1.8 camera has gotten good enough that you get a lot of real optical depth of field by picking your shots. But if you want the computational stuff, and you want it for everything, you'll have to move up to an iPhone XS or XS Max, or stick with an iPhone 8 Plus or 7 Plus, to get it.

Now, I've shot with SLRs and DSLRs all my life, but I've never considered myself a real photographer, more of a hobbyist at best. So to get a better sense of the trade-offs, I asked a real photographer, a real professional photographer, what he thought. Here's Tyler Stallman.

Thanks for having me be a part of your video, Rene. My name is Tyler Stallman, and I've been working as a photographer for over a decade now.
I think this is the first time that Apple actually undersold the cameras in their new phones. With a whole new larger sensor, lens, and camera compartment, everything about it has been performing a lot better in the iPhone XS that I've been using so far. But the best news is that we can expect most of those same improvements to come to the iPhone XR.

So first, let's talk about what is the same, and it's the most important stuff to me, and that's everything on the wide-angle lens. It'll allow you to have the same increased dynamic range, where it's using Smart HDR to take multiple exposures every single moment the camera is on and combine them into image quality we just weren't seeing in cell phone cameras before. Now, previous iPhones were doing HDR as well, but they had to take the photo and then process it, and you'd see the results afterwards. On the iPhone XS and XR, it is live-processing all the time. That means your Live Photos also have that effect, as well as all your video, even on the selfie camera. It's got that extra dynamic range, and it makes a really big difference.
But the new camera isn't just what you see inside the little bump on the back of your phone. It's also things like the Neural Engine, which has been improved, and in both the XR and the XS it's exactly the same, as well as the image signal processor. Together, those are doing a lot of the intelligent processing that gives you those incredible results out of camera.

Now, the big difference is that we don't have the telephoto lens on the iPhone XR. That means you can't zoom in with quite as much detail, and when you're taking portrait mode photos there's a little less refinement around the edges of your subject. So those times you see ears or noses or glasses get cut off in portrait mode, that might be a little bit worse on the XR, because it doesn't have quite as much depth information to work with. However, you do get something for this different lens. Since it's only using one lens, the wider lens, you're able to take a wider portrait mode photo than you can on the XS. So strange that, for so much lower a price, the XR actually has one feature the XS doesn't. Anyway, thanks again for having me, Rene. I can't wait to talk to you more about the releases of 2018.

Thanks, Tyler. If you haven't already, check out and subscribe to his channel.
Not only can you see his iPhone XR hands-on, but his ongoing experiences shooting with the XS, as well as the fancy photography workstation he set up and put together with Jonathan Morrison. I'll link all of that below.

I'm going to say this again, because it bears repeating again and again: as good as DSLRs and Micro Four Thirds cameras have become, and I shoot the sit-down scenes and some of the B-roll for this very show on Panasonic and Canon cameras, we now live in an age of computational photography, of bits that can go far beyond the atoms. Theoretically, these bits, those computational cameras, have no limits. They can reproduce the world in a way no physics-bound glass ever could. What you shoot could end up looking more real than real, scientific and sterile, or just uncanny and unnatural. By imposing some of the constraints, and yes, distortions, of real-world physics and lenses onto computational models, not only does the wrong we've gotten used to look right again, the limits add character and drive creativity. And, physical or computational, that's what you want from a great camera.

Now, maybe you're interested in this brave but strange new world and aren't sure where to start.
If you keep hearing terms like algorithms and neural networks and are interested to learn more, check out Brilliant. They have a bunch of courses teaching you the logic and theory behind all of this. Each course is interactive and breaks up complicated concepts into bite-sized chunks to make sure you actually absorb the information. It's a strategy that I wish, I really wish, my teachers had used back when I was in college, because it might have caused me to stay in college longer. Check out brilliant.org/vector and start learning today. Thanks, Brilliant, and thanks to all of you for watching the show.

I've been shooting with Apple's dual-camera system for a few years now, and the new single-camera system for just around a week, so obviously I want to shoot a lot more to get a better handle on it.
But I think one thing is already abundantly clear: if you don't absolutely have to have the dual-camera system, you can get an amazing single-camera system, and frankly industry-leading video (video that compares better to standalone video cameras than to other camera phones), and save yourself $250 or more, depending on size and configuration, while doing it. But enough of what I think. Now I want to hear what you think: are you dual cameras for life, or at least until three or four...