Curly-haired, bespectacled and Queensland-raised, Apple’s camera software chief Jon McCormack has a touch of the mad scientist about him, equal parts brilliant and inventive, yet he’s eloquent and soft-spoken on camera. A veteran of Amazon and HP, he joined Apple in April 2018 and now leads Camera & Photo Software Engineering as the company shifts from pure optics to human-centred computation; off-hours he shoots wildlife and supports education via The Kilgoris Project. Megan Nash, an iPhone product manager who debuted the Center Stage front camera on stage during Apple’s September 2025 iPhone 17 launch, steers the product story from hardware realities to human behaviour. In an exclusive conversation with BW Businessworld, McCormack and Nash trace how Apple rebuilt the selfie camera, dubbed the Center Stage camera, from first principles for the iPhone 17 era.
Apple has turned the front camera into a systems story. With the iPhone 17 family, Center Stage moves from an iPad-era idea to the heart of the iPhone, backed by a larger square-format front sensor, new stabilisation logic that borrows from Action mode, and silicon that treats selfies like first-class cinematography. It is part behavioural study, part optics, part chip design—producing a front-facing camera that reframes itself, rotates when it should, and steadies your stride without you thinking about it. McCormack and Nash walk through how—and why—they built it this way.
Why Now
Apple’s camera leadership frames the change not as a feature race but as a behavioural correction: design the default around how people actually shoot, not how cameras have always worked. The cultural backdrop matters—short video as lingua franca, video calling as routine, and creators wanting pro steadiness without pro friction.
“The new iPhone lineup this year really does completely reimagine the front camera experience with the Center Stage camera. It fundamentally changes the way that we capture our memories and communicate with our loved ones,” Jon McCormack, Vice President of Camera and Photo Software Engineering, said. “We like to say that the iPhone is a really social camera, and this is especially true for the front camera. … Globally, we took around 500 billion selfies on the iPhone last year.”
Apple put that figure on the record at the September event. For scale, that’s roughly 1.37 billion selfies per day, 57 million per hour, and about 16,000 per second. The company did not detail its methodology on stage, but the message lands: the selfie is not a niche—it’s the mainstream.
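For the curious, the breakdown is simple division from Apple’s stated figure; a few lines of Swift make the scale concrete:

```swift
import Foundation

// Apple's stated annual figure; everything below is plain division.
let selfiesPerYear = 500_000_000_000.0
let perDay = selfiesPerYear / 365      // ≈ 1.37 billion per day
let perHour = perDay / 24              // ≈ 57 million per hour
let perSecond = perHour / 3_600        // ≈ 16,000 per second
print(perDay, perHour, perSecond)
```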
“We see selfie sticks; we see people switching to the 0.5 times ultra-wide camera; we see folks rotating the iPhone to horizontal; and we even see people handing the iPhone over to the tallest person in the group to get that maximum arm extension before they take a selfie,” McCormack explained. “What’s going on here is that our users are trying to make the camera work for them, but we knew that we could do better… what if the camera could just understand what you’re trying to capture and then make those adjustments for you?”
What Center Stage Means On iPhone 17
The leap from call-only reframing to camera-level intelligence is the real pivot. Think less “effect” and more “baseline behaviour”: composition and stabilisation now travel with you into Photos and the Camera app itself.
On iPad and Mac, Center Stage used an ultra-wide front camera and machine learning to crop and track faces during calls. On iPhone 17, it goes further: the camera can auto-rotate and auto-zoom for photos, expanding the field of view when friends enter the frame and keeping callers centred during movement. Crucially, these behaviours are woven into the native camera, not fenced off inside chat apps. In practice, you hold the phone vertically, but the system composes in landscape when the scene demands it—no wrist contortions, no mode-hunting.
Hardware, Reimagined

Apple’s imaging team began by rewriting constraints: if people won’t rotate the device, the sensor must carry that burden. That meant enlarging the capture area, re-balancing optics to avoid wide-angle distortion, and—critically—changing the geometry itself.
“So, the front camera and the sensor were developed together with very clear ideas about the customer experiences we wanted to enable,” Megan Nash, iPhone Product Manager, explained. “Like Jon said, we knew we wanted to help users fit more friends in their group selfies, so we needed a wider field of view… With the new Center Stage camera, we grew the sensor to almost double the size of the previous sensor to match pixel-for-pixel sharpness… The result is a wider field of view that fits more people or background in the frame, along with excellent image quality—really the best of both worlds.”
“Normally, smartphones use a 4:3 ratio sensor,” Nash said. “So we decided to make the sensor square to enable these industry-first experiences… The square sensor means you can hold your iPhone vertically and still capture landscape photos and videos. For the first time, we’ve uncoupled the orientation of your phone from the aspect ratio of your capture.” Output lands at up to 18 megapixels, giving headroom for cropping without the mushy edges that often plague wide selfie lenses.
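The geometry is worth sketching. Assuming an illustrative square mosaic of roughly 24 megapixels (Apple has said only “almost double” the previous sensor, with up to 18-megapixel output, so the side length here is an assumption), both orientations fall out of the same square:

```swift
import Foundation

// A minimal sketch of why a square mosaic decouples device orientation from
// capture aspect ratio. The ~4,900-pixel side (a ~24 MP square) is assumed
// for illustration; Apple has not published the exact dimensions.
let side = 4_900  // hypothetical square sensor side, in pixels

// A 4:3 crop that spans the full sensor on its long edge.
func crop4x3(fromSquare side: Int) -> (long: Int, short: Int, megapixels: Double) {
    let short = side * 3 / 4
    return (side, short, Double(side * short) / 1_000_000)
}

let c = crop4x3(fromSquare: side)
// Landscape uses the crop as-is; portrait is the same rectangle rotated 90
// degrees inside the square, so the phone itself never has to turn.
print("landscape: \(c.long)x\(c.short), portrait: \(c.short)x\(c.long), ≈\(c.megapixels) MP")
// ≈ 4900 x 3675 ≈ 18 MP in either orientation
```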
Silicon, In Service Of Framing

Megan Nash debuted the groundbreaking new Center Stage camera in Apple’s September 9 keynote video.
Under the glass, this is a data-rate story: more pixels per degree and more decisions per second. That forces a marriage of sensor I/O, memory bandwidth and thermal headroom so the phone can crop, stabilise and recognise faces simultaneously—and keep doing so in long takes.
“Years in advance, we were thinking about how this new front camera would need the high-speed Apple Camera Interface,” Nash explained. “So the A19 and A19 Pro use ACI to efficiently transfer data between the image sensor and the chip. That was especially important for experiences like Dual Capture, which uses both the front and rear cameras. So, really, the magic of these experiences is the result of hardware, software, and Apple silicon.”
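To gauge the scale ACI has to cope with, a back-of-envelope estimate helps. Every number below is an assumption chosen for illustration (full-sensor readout, frame rate, bit depth); Apple has not published ACI’s actual throughput:

```swift
import Foundation

// Back-of-envelope readout bandwidth, with every figure an assumption:
// a ~24 MP mosaic read at 30 fps with 10-bit raw samples.
let pixels = 24_000_000.0
let fps = 30.0
let bytesPerPixel = 10.0 / 8.0     // 10-bit raw
let perCamera = pixels * fps * bytesPerPixel / 1_000_000_000  // GB/s
print(String(format: "≈%.1f GB/s per camera, ≈%.1f GB/s for Dual Capture",
             perCamera, perCamera * 2))
// ≈0.9 GB/s per camera — the scale of data the interface must move before
// any cropping, stabilisation or face detection even begins.
```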
Apple couples this with a thermally sturdier design on the Pro models. Paired with a new vapour chamber, A19 Pro is billed as delivering up to 40 percent better sustained performance than the previous generation—exactly the headroom needed for long, stabilised takes and multi-camera recording without the heat-related throttling that can sink a creator’s day.
Stabilisation You Don’t Think About

Action mode gave Apple a template on the rear cameras; the front camera now inherits that composure by default. The promise is psychological as much as optical: remove the burden of “am I framed?” and people behave more naturally on camera.
“Our goal with the iPhone’s camera is always to make it invisible,” McCormack said. That invisibility depends on stabilisation and smart framing, both now default behaviours with the front camera. “We achieved this by using the large overscan region on the sensor to enable this amazing stability… The larger field of view and high-resolution sensor allow us to use Action mode automatically every time you capture a selfie video. You never even have to turn it on, so you can walk, bike, or run and know that your video is going to be great.”
In practice, the system blends motion correction with face-aware framing that’s intentionally relaxed. Your head remains centred enough to read as composed, but the background doesn’t lurch with every micro-movement. The feel is cinematic rather than twitchy—a choice that matters even more when the front camera appears as a picture-in-picture over the rear camera during Dual Capture.
When A Phone Becomes A Two-Camera Crew
Creators often have to choose: show the scene or show the reaction. Apple’s answer is to do both, natively, with a picture-in-picture that behaves like a tiny live edit bay.
“With Dual Capture, we achieve all of this while recording great video on the rear camera as well,” McCormack explained. “Stabilisation is especially important here… With face framing and Dual Camera, we actually create a more relaxed version of face framing because we want to balance keeping the background stable while keeping your head centered.”
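Apple has not published its algorithm, but the “relaxed” behaviour McCormack describes maps naturally onto a deadband plus slow easing. A minimal sketch, with made-up constants:

```swift
import Foundation

// An illustrative take on relaxed face framing: ignore micro-movements
// inside a deadband, and only glide the crop toward the face once it
// drifts meaningfully off-centre. Constants are invented for the sketch.
struct RelaxedFramer {
    var cropCenter = 0.5          // normalised 0...1 across the frame
    let deadband = 0.06           // ignore drift smaller than this
    let easing = 0.08             // fraction of the error closed per frame

    mutating func update(faceCenter: Double) -> Double {
        let error = faceCenter - cropCenter
        if abs(error) > deadband {
            cropCenter += easing * error  // glide, don't snap
        }
        return cropCenter
    }
}

var framer = RelaxedFramer()
for face in [0.5, 0.52, 0.53, 0.7, 0.7, 0.7] {   // head drifts right
    print(framer.update(faceCenter: face))       // crop follows only the real move
}
```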
Audio gets similarly intentional treatment. “We’ve got a whole array of microphones on the iPhone,” McCormack said. “We are able to determine where the audio is coming from predominantly and then use machine learning to pull it apart into separate tracks, which we can then remix for you.” The upshot is cleaner in-frame mixes for interviews, walk-and-talks and reaction videos—without third-party stitching.
There’s also a storyteller’s flourish: you can drag the picture-in-picture live while recording. The system adds a subtle motion blur so the reposition feels editorial rather than abrupt, and it all saves to a single file when you stop—ready to share.
How It Anticipates Intention

Ask cinematographers and they’ll tell you: composition is a chain of micro decisions. Apple wants those decisions to happen for you when you are not thinking like a cinematographer—at a party, on a street corner, in a moving cab.
“We extended this face framing idea to photography,” McCormack explained. “Center Stage for photos is not just focused on faces; it’s really about recognising intention patterns… If there are two people, it’ll auto-zoom to frame you perfectly. With three or more people, it can actually auto-rotate the capture to landscape while you’re still holding the phone vertically.”
Making this feel natural required months of tuning, including a hysteresis delay that prevents the system from chasing transient faces in the background while remaining responsive when a friend leans in. The aim: responsiveness without fussiness.
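That hysteresis idea is easy to sketch. The arbiter below is illustrative, with assumed thresholds rather than Apple’s tuning: a framing change commits only after the face count holds steady for a dwell period.

```swift
import Foundation

// A sketch of the hysteresis described above, with made-up thresholds:
// a stranger crossing the background never re-frames, a friend who stays does.
enum Framing { case single, zoomedPair, rotatedGroup }

struct FramingArbiter {
    var candidate = 1
    var stableFrames = 0
    var committed = Framing.single
    let dwell = 15  // assumed ~0.5 s at 30 fps before a change commits

    mutating func observe(faceCount: Int) -> Framing {
        if faceCount == candidate {
            stableFrames += 1
        } else {
            candidate = faceCount   // a new situation restarts its dwell timer
            stableFrames = 0
        }
        if stableFrames >= dwell {  // only a persistent change re-frames
            switch candidate {
            case ...1: committed = .single
            case 2:    committed = .zoomedPair
            default:   committed = .rotatedGroup
            }
        }
        return committed
    }
}

var arbiter = FramingArbiter()
// Two faces for 5 frames (a passer-by) never commits; three faces that
// persist eventually rotate the capture to landscape.
for faces in Array(repeating: 2, count: 5) + Array(repeating: 3, count: 20) {
    _ = arbiter.observe(faceCount: faces)
}
print(arbiter.committed)  // rotatedGroup
```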
Eye Gaze And The Vertical Grip
The ergonomics here are deliberate: the more you can keep your wrists neutral and your eyes near the lens, the more natural the image feels. It is a small correction with outsized human effect.
“The square sensor means you can hold your iPhone vertically and still capture landscape photos and videos,” Nash said. “It’s generally more comfortable and allows for a more secure grip… You’ll also notice everyone in the photo has better eye gaze because the camera preview is centred with the front camera rather than being off to the side.” That dovetails with Apple’s evolving Camera Control ergonomics: adjustments stay within thumb reach while you keep a stable vertical hold, reducing the classic “selfie-arm” tell.
India In The Frame
For Indian users, the stakes are practical: haze softens edges, mixed light confuses white balance, and crowds create unpredictable motion at the edges of the frame. Apple’s approach is to anchor on perceptual truth, then let people personalise.
India’s imaging realities—haze, mixed indoor lighting, crowded frames, a kaleidoscope of skin tones—have informed Apple’s tuning. “We tune the camera to produce what we see as a perceptually accurate image, meaning we want to capture exactly what you saw,” McCormack explained. “Our goal is always to achieve perfect skin tones in our standard rendering. We introduced undertones last year because we realised that people perceive themselves differently and may want a different undertone applied to their skin… We sent research teams all over the world to understand how people see themselves based on geographic location and age.”
“Additionally, I want to mention that there’s a new Bright style this year as well, which has become very popular,” Nash said.
What’s Different From Android—And Why It Matters

An example of a Center Stage selfie that doesn’t require the user to orient the phone horizontally
Competitors have strong ideas about selfies. Google’s Real Tone prioritises inclusive skin-tone rendering, and Guided Frame uses audio and haptics so blind and low-vision users can compose without looking; Samsung’s Auto Framing keeps faces centred in calls; vivo’s ZEISS-branded portrait tools and Aura Light lean into stylised lighting. Apple’s twist is architectural: instead of a mode you choose, it is a camera that behaves. A square-format sensor designed for reframing, orientation-agnostic capture, on-by-default stabilisation, machine-learned framing and native Dual Capture that saves to a single file are all baked into the stock app. The wager is that a default beats a mode: good composition and steady footage should require no thought, and the creative dividend is friction saved.
Inside The Device, Outside The Compromises

The packaging challenge is non-trivial: a larger, square-format sensor must coexist with the TrueDepth array, even as devices get thinner. Apple’s answer, per usual, is cross-disciplinary design (industrial, thermal, optical) iterated until nothing gives. This is particularly important for the iPhone Air, which is just 5.6mm thin and uses essentially the same front-facing camera stack.
“That’s just part of what goes into making an iPhone,” McCormack said when asked about packaging a larger, square-format sensor alongside the TrueDepth array, particularly in the ultra-thin iPhone Air. “We wouldn’t compromise on image quality at all. It still needed to maintain that same great image quality, which meant we needed to have the same size pixels—we just needed a lot more of them in the X and Y directions… The camera team, industrial designers, and thermal engineers all work side by side to create a device that seems unfathomable to put everything together, but we do it.”
Numbers, Without The Numbers

Selfies gain a wider field of view and rich detail, and often look like photos taken with a rear camera
Resolution choices can feel like a spec sheet game; Apple treats them like defaults that lower cognitive load. Eighteen megapixels is the chosen landing spot not because the sensor cannot do more, but because the experience becomes better when the “extra” becomes stabilisation and crop freedom.
In practice, Apple settled on 18 megapixels as the sweet spot for stills from the front camera, even though the sensor area is larger and the raw pixel matrix is higher—the philosophy is to pick defaults that reduce decision-fatigue. “Right now, we are just offering the 18-megapixel option, both horizontally and vertically,” McCormack said. “We specifically optimised the hardware, software, and processing around that 18-megapixel horizontal and vertical design. That’s what we are providing to users.”
Action Mode, Explained
Overscan is the quiet hero here. By capturing more image than you see and then cropping dynamically, the phone earns a stabilisation margin that feels like a gimbal without the kit—or the learning curve.
“Yes, exactly,” McCormack explained. “What we’re doing is using what we call the overscan area, which is basically all of the area that isn’t being used for video, and we’re utilising that for stabilisation… With the power of Apple silicon, we can do that stabilisation on the fly—not just for the front camera but if you turn on Action mode, you can do it on the rear camera at the same time when you’re using Dual Capture. This results in magically stable video.”
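Conceptually, overscan stabilisation reduces to sliding the output crop inside a larger capture to cancel measured motion. A minimal sketch with illustrative dimensions, not Apple’s pipeline:

```swift
import CoreGraphics

// The sensor captures more than the output frame; the crop slides inside
// that margin to cancel hand shake. All sizes here are assumptions.
struct OverscanStabilizer {
    let sensor = CGSize(width: 4900, height: 4900)   // hypothetical square mosaic
    let output = CGSize(width: 3840, height: 2160)   // 4K video crop

    /// Offset the crop opposite the measured motion, clamped so it never
    /// leaves the sensor: the overscan is the stabilisation budget.
    func cropOrigin(countering motion: CGVector) -> CGPoint {
        let maxX = sensor.width - output.width
        let maxY = sensor.height - output.height
        let x = min(max(maxX / 2 - motion.dx, 0), maxX)
        let y = min(max(maxY / 2 - motion.dy, 0), maxY)
        return CGPoint(x: x, y: y)
    }
}

let stab = OverscanStabilizer()
// A 200-pixel jolt to the right is absorbed by sliding the crop left.
print(stab.cropOrigin(countering: CGVector(dx: 200, dy: -80)))
```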
The result is a familiar YouTube-ready look—stabilised walk-and-talks and reaction shots that feel composed—without rigs, cages or post-production fixes.
Camera Control, Designed For The Vertical Grip
Apple’s new Center Stage front camera doesn’t just change what the lens sees; it changes how you hold and operate the phone. By aligning framing and stabilisation with a naturally vertical grip, the feature dovetails with Camera Control—introduced last year—to reduce fumbles, fix eye-gaze, and keep the most-used controls under your thumb. As Jon McCormack put it, “I think we look at the whole phone holistically. The interesting aspect of camera control is that, yes, when you hold iPhone horizontally or vertically, it impacts how your finger positions relative to the screen. We conducted many user studies and experiments. To your point, now, with the new Center Stage front camera for selfies, you never have to make the phone horizontal again. You can always hold it in that stable position, which is great for the camera control. As Megan pointed out earlier, this also greatly benefits your eye gaze because you don’t end up with the awkward gaze issue, where when I hold a camera horizontally for selfie, people end up looking at themselves instead of the camera, resulting in that off-axis look. However, when the camera's vertical, you never have that issue.”
In practice, that means fewer missed taps, steadier framing, and faces that look straight down the lens—small ergonomic tweaks that add up to a more confident, documentary-style front camera.
How Apple’s Executives Use The New Cameras
Usage is philosophy made visible. Watch how the people who built it reach for the features and you see the values: first, delight; second, memory; always, less fuss.
“I love showing people Center Stage for photos for the first time,” Nash said. “I always make sure to capture their reaction the moment they realise that it’s rotating because then I can play it back on Live and show them their reaction… I also have a toddler at home, so showing her Dual Capture or having her face appear in Dual Capture with the Halloween decorations in the States right now is fantastic for creating memories.”
“I absolutely love the new four-times telephoto lens as a portrait lens… And with the new front-facing camera, I don’t have to think about selfies anymore,” McCormack explained. “In the first few instances of using the new front camera, I would instinctively think, ‘Oh, there are more people in the frame,’ and then I’d twist my arm, only to see my arm in the shot and realise, ‘Oh wait, I don’t have to do that anymore.’ … I’ve actually started finding it hard to tell whether an image was taken as a selfie or by someone else because it creates a much more natural pose.”
A19, Pricing And Practicalities
For buyers weighing models, the silicon split is straightforward and the implications are concrete: A19 handles the reframing and crop math with ease; A19 Pro, paired with vapour-chamber cooling, keeps that performance aloft during longer sessions—useful for vloggers, travellers and anyone leaning on Dual Capture.
As of publication, Apple lists India pricing at Rs 79,990 for iPhone 17, Rs 119,990 for iPhone Air, Rs 134,990 for iPhone 17 Pro and Rs 149,990 for iPhone 17 Pro Max.
The Craft In The Details
What makes Center Stage feel human is how small choices accrue: eye gaze improves because you no longer rotate the phone sideways and look off-axis at yourself—the camera remains centred to the screen you are watching; grip is more secure and natural when you can stay vertical, reducing accidental drops and blurred frames in candid moments; hysteresis delays keep crops from jumping when strangers pass behind you yet remain responsive when a friend leans into the edge of the frame; and on-the-fly motion blur sweetens picture-in-picture moves in Dual Capture, so dragging the inset window mid-record looks intentional rather than abrupt. These are not headline features but background refinements—the kind that quietly turn a specification into something you can feel.
“We wanted to design front camera experiences that make it easier and more delightful to take selfies, especially in groups,” Nash said. “We also wanted you to be able to be more present when communicating with loved ones over video calls.”
Apple’s Vision For The iPhone Camera
“The camera on an iPhone is all about letting you capture the moment while you stay in it,” McCormack said. “No settings, no distractions—just pull out your iPhone and let the new Center Stage front camera do all the work for you.”
If the rear cameras made the iPhone a pro tool, the new front camera argues the everyday shot deserves pro-level intent too. That, more than any spec sheet, is the innovation.