Apple Studied Paintings and Shined Light on People to Perfect New Portrait Lighting Feature

iPhone 8 Plus and iPhone X feature advanced cameras with a new Portrait Lighting feature that uses sophisticated algorithms to calculate how your facial features interact with light. That data is used to create lighting effects, such as Natural Light, Studio Light, Contour Light, and Stage Light.
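
Apple has not said how Portrait Lighting's algorithms work under the hood, but the broad idea of pairing a per-pixel depth estimate with a relighting adjustment can be sketched with public Core Image and AVFoundation APIs. The Swift snippet below is only an illustration of that idea, not Apple's implementation: it crudely mimics a Stage Light-style look by using a portrait photo's embedded disparity map to keep the nearby subject lit while darkening everything behind it. The function name, the input file, and the fixed exposure drop are assumptions made for the sketch.

```swift
import AVFoundation
import CoreGraphics
import CoreImage
import ImageIO

// Illustration only: a crude "Stage Light"-style effect built from public APIs.
// Real Portrait Lighting models how light interacts with facial contours; this
// sketch just uses the photo's embedded disparity map (bright = near = subject)
// to keep the person lit while darkening the background.
func stageLightApproximation(photoURL: URL) -> CIImage? {
    guard let photo = CIImage(contentsOf: photoURL),
          let source = CGImageSourceCreateWithURL(photoURL as CFURL, nil),
          // Portrait photos from the dual camera embed a disparity map as
          // auxiliary image data alongside the color image.
          let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any],
          let depthData = try? AVDepthData(fromDictionaryRepresentation: auxInfo)
    else { return nil }

    // Scale the low-resolution disparity map up to the photo's size so it can
    // serve as a per-pixel mask.
    var mask = CIImage(cvPixelBuffer: depthData.depthDataMap)
    let scaleX = photo.extent.width / mask.extent.width
    let scaleY = photo.extent.height / mask.extent.height
    mask = mask.transformed(by: CGAffineTransform(scaleX: scaleX, y: scaleY))

    // Darken the whole frame, then blend the original back in wherever the
    // mask says the subject is close to the camera.
    let darkened = photo.applyingFilter("CIExposureAdjust",
                                        parameters: [kCIInputEVKey: -3.0])
    return photo.applyingFilter("CIBlendWithMask",
                                parameters: [kCIInputBackgroundImageKey: darkened,
                                             kCIInputMaskImageKey: mask])
}
```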


In a new interview with BuzzFeed News reporter John Paczkowski, Apple says it studied the work of portrait photographers such as Richard Avedon and Annie Leibovitz, as well as painters like Johannes Vermeer, the seventeenth-century Dutch master, to learn how artists have rendered light throughout history.

"We didn't just study portrait photography. We went all the way back to paint," said Apple's marketing chief Phil Schiller.

"If you look at the Dutch Masters and compare them to the paintings that were being done in Asia, stylistically they're different," said Johnnie Manzari, a designer on Apple's Human Interface Team. "So we asked why are they different? And what elements of those styles can we recreate with software?"

Apple said it took what it learned, went into its studio, and spent countless hours shining light on people from different angles.

"We spent a lot of time shining light on people and moving them around — a lot of time," Manzari added. "We had some engineers trying to understand the contours of a face and how we could apply lighting to them through software, and we had other silicon engineers just working to make the process super-fast. We really did a lot of work."

Schiller acknowledged that Apple aims to make a professional-quality camera, one recently ranked the best among smartphones, but added that the company also cares about what it can contribute to photography as a whole.

"We're in a time where the greatest advances in camera technology are happening as much in the software as in the hardware," Schiller said. "And that obviously plays to Apple's strengths over traditional camera companies."

Apple's software advancements allow anyone to simply pick up an iPhone and capture a high-quality photo, eliminating the learning curve that can come with a high-end DSLR camera from the likes of Canon or Nikon.

"It's all seamless; the camera just does what it needs to," said Schiller. "The software knows how to take care of it for you. There are no settings."

Both the iPhone 8 Plus and iPhone X rear cameras have been upgraded with larger, faster sensors, new color filters, and deeper pixels. The iPhone X also adds optical image stabilization for both the wide-angle and telephoto lenses, the latter of which has a larger ƒ/2.4 aperture that lets in more light.
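
Apps can't change the aperture or lens hardware, but AVFoundation does let them inspect it. As a small, hedged illustration, this Swift sketch enumerates the rear cameras and prints each lens's maximum aperture; which cameras appear depends entirely on the device it runs on.

```swift
import AVFoundation

// Enumerate the rear cameras and report each lens's maximum aperture.
// On a dual-camera iPhone this lists both the wide-angle and telephoto
// modules; on other devices only the wide-angle camera appears.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .builtInTelephotoCamera],
    mediaType: .video,
    position: .back)

for camera in discovery.devices {
    // lensAperture is the f-number, e.g. 2.4 for an f/2.4 telephoto lens.
    print(camera.localizedName, "f/\(camera.lensAperture)")
}
```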

Read More: How Apple Built An iPhone Camera That Makes Everyone A Professional Photographer



Apple Started Developing A11 Bionic Chip When A8 Chip Was Released Three Years Ago

Shortly after Apple's iPhone X event this week, silicon chief Johny Srouji and marketing chief Phil Schiller sat down with Mashable editor-at-large Lance Ulanoff for an interview about the company's new A11 Bionic chip.


One interesting tidbit: Apple began exploring and developing the core technologies in the A11 chip at least three years ago, when the iPhone 6 and iPhone 6 Plus launched with the A8 chip.
According to the report, when Apple architects silicon, it starts by looking three years out, which means the A11 Bionic was under development while Apple was still shipping the iPhone 6 and its A8 chip. At the time, AI and machine learning were barely part of the conversation at the mobile level, and yet, Srouji said, "The neural engine embed, it’s a bet we made three years ahead."
Apple's three-year roadmap can change if new features are planned, like the Super Retina HD Display in iPhone X.
"The process is flexible to changes," said Srouji, who’s been with Apple since the first iPhone. If a team comes in with a request that wasn't part of the original plan, "We need to make that happen. We don't say, 'No, let me get back to my road map and, five years later, I'll give you something."
Apple senior executives Phil Schiller, left, and Johny Srouji

In fact, Schiller praised Srouji's team for its ability to "move heaven and earth" when the roadmap suddenly changes.
"There have been some critical things in the past few years, where we've asked Johny's team to do something on a different schedule, on a different plan than they had in place for years, and they moved heaven and earth and done it, and it's remarkable to see."
The A11 Bionic is a six-core chip with two performance cores that are up to 25 percent faster, and four high-efficiency cores that are up to 70 percent faster, than the A10 Fusion chip in the iPhone 7 and iPhone 7 Plus. Early benchmarks suggest the A11 Bionic is even on par with the performance of Apple's latest 13-inch MacBook Pro models.

The A11 chip is more efficient at multi-threaded tasks thanks to a second-generation performance controller that is able to access all six of the cores simultaneously if a particular task demands it.
Gaming might use more cores, said Srouji, but something as simple as predictive texting, where the system suggests the next word to type, can tap into the high-performance CPUs, as well.
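
Apps never address the A11's performance and efficiency cores directly; they describe parallel work and a quality-of-service level, and the operating system's scheduler, together with the chip's performance controller, decides which cores actually run it. A minimal Grand Central Dispatch sketch of that division of labor (the summation workload and the predictive-text placeholder are stand-ins, not Apple code):

```swift
import Dispatch
import Foundation

// Hypothetical heavy workload: split a large buffer into chunks and sum them
// in parallel. concurrentPerform fans the chunks out across however many
// cores the system decides to give it; on an A11 that can be all six.
let samples = [Float](repeating: 0.5, count: 1_000_000)
let chunkCount = 8
let chunkSize = samples.count / chunkCount
var partialSums = [Float](repeating: 0, count: chunkCount)

partialSums.withUnsafeMutableBufferPointer { sums in
    DispatchQueue.concurrentPerform(iterations: chunkCount) { chunk in
        let range = (chunk * chunkSize)..<((chunk + 1) * chunkSize)
        sums[chunk] = samples[range].reduce(0, +)   // each chunk writes its own slot
    }
}
print("sum of samples:", partialSums.reduce(0, +))

// Lightweight, latency-sensitive work (Srouji's predictive-text example) is
// tagged with a quality-of-service class instead; QoS is the hint the system
// uses when deciding whether the high-performance cores are warranted.
DispatchQueue.global(qos: .userInteractive).async {
    // ... compute next-word suggestions here ...
}
```
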
The A11 chip also has an Apple-designed neural engine that handles facial recognition for Face ID and Animoji, and other machine learning algorithms. The dual-core engine recognizes people, places, and objects, and processes machine learning tasks at up to 600 billion operations per second, according to Apple.
“When you look at applications and software, there are certain algorithms that are better off using a functional programming model,” said Srouji.

This includes the iPhone X’s new face tracking and Face ID as well as the augmented-reality-related object detection. All of them use neural networks, machine learning or deep learning (which is part of machine learning). This kind of neural processing could run on a CPU or, preferably, a GPU. “But for these neural networking kinds of programming models, implementing custom silicon that’s targeted for that application, that will perform the exact same tasks, is much more energy efficient than a graphics engine,” said Srouji.
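
Third-party apps don't program the neural engine itself; Srouji's point is that tasks like face detection are routed to whichever silicon handles them most efficiently. As a hedged illustration, the Swift sketch below requests the same kind of on-device face detection through the public Vision framework; the image path is a placeholder, and whether the work lands on the CPU, the GPU, or the neural engine is entirely up to the system.

```swift
import Foundation
import Vision

// Detect faces in a still image entirely on device using the Vision framework.
// The framework, not the app, decides where the work runs (CPU, GPU, or, on
// A11-class chips, potentially the neural engine).
func detectFaces(in imageURL: URL) throws -> [VNFaceObservation] {
    let request = VNDetectFaceRectanglesRequest()
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request])   // synchronous; run off the main thread in a real app
    return (request.results as? [VNFaceObservation]) ?? []
}

// Usage: print a normalized bounding box for every detected face.
// "group-photo.jpg" is a placeholder path.
do {
    let faces = try detectFaces(in: URL(fileURLWithPath: "group-photo.jpg"))
    for face in faces {
        print("face at", face.boundingBox)
    }
} catch {
    print("face detection failed:", error)
}
```
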
Apple's new iPhone 8, iPhone 8 Plus, and iPhone X are all equipped with an A11 chip.

In related news, Carnegie Mellon University's School of Computer Science has announced that Srouji will take part in a distinguished industry lecture on Monday, September 18 from 5:00 p.m. to 6:30 p.m. local time.

Full Interview: The Inside Story of the iPhone X 'Brain,' the A11 Bionic Chip



Phil Schiller and Kim Gassett-Schiller Donate $10M to Bowdoin College’s Coastal Studies Center in Maine

Apple's senior vice president of worldwide marketing Phil Schiller and his wife Kim have donated $10 million to Bowdoin College's Coastal Studies Center on Harpswell Sound in Harpswell, Maine, as a user on Reddit pointed out today. In a press release, Bowdoin explained that the gift will allow it to "substantially expand" the facilities in which students conduct ocean research and environmental studies.

Image via Bowdoin.edu

Specifically, Bowdoin said that thanks to the gift from the Schillers, the college will be able to add a state-of-the-art dry laboratory, a convening center, modern classrooms, and housing and dining facilities.
Bowdoin's announcement frames the gift as transformative: “This extraordinary act of generosity and vision by Phil and Kim Schiller will transform the Coastal Studies Center into a facility where students and faculty from Bowdoin and from other institutions can gather together for concentrated periods to learn from each other and to advance knowledge and understanding about the ocean, marine science and the impact of climate change on marine life.”
Phil Schiller explained the family's donation in a video shared recently by Bowdoin, citing the college's efforts to develop new methods to research and combat pollution, climate change, and other environmental issues. Schiller himself was born on the East Coast, in Natick, Massachusetts, and graduated with a B.S. in Biology from Boston College. One of his sons, Mark, graduated from Bowdoin earlier in 2017.


In recognition of the $10 million donation, Bowdoin has renamed the facility the Schiller Coastal Studies Center (SCSC). The SCSC sits on 118 acres on Orr's Island, about 12 miles from Bowdoin's main campus, and includes 2.5 miles of shoreline along the Maine coast. More information about the center and the Schillers' donation is available on the college's website.



Watch ‘The Talk Show’ Live From WWDC 2017 With Craig Federighi and Phil Schiller

Daring Fireball has shared the full video of "The Talk Show Live" from Apple's Worldwide Developers Conference this week.


Before a live audience at The California Theatre in San Jose, Apple senior executives Craig Federighi and Phil Schiller joined host John Gruber to reflect on the company's announcements at its WWDC opening keynote on Monday, including several new Macs, macOS High Sierra, iOS 11, and HomePod.

The video, produced by Amy Jane Gruber and Paul Kafasis, is available on Vimeo and embedded below.


MacRumors has put together a WWDC 2017 roundup with the latest news and announcements from the conference.



Apple VP Phil Schiller Implies Voice-Activated Smart Speakers Could Benefit From a Screen

Gadgets 360 published an interview with Apple SVP of Worldwide Marketing Phil Schiller this week that could shed some light on Apple's plans for a dedicated Siri-based voice assistant for the home. Rumors have swirled in recent weeks that Apple will unveil an Amazon Echo-like connected smart speaker, possibly as early as WWDC in June, so Schiller's thoughts on the topic may hint at how Apple is approaching the design of its Echo rival.

During the interview, Schiller demurred when asked what he thought about Amazon's Echo and Google Home, though his comments clearly imply that the two speakers leave a lot to be desired: "My mother used to have a saying that if you don't have something nice to say, say nothing at all." More revealingly, perhaps, Schiller took pains to distinguish between usage scenarios for voice assistants: hands-free moments, such as driving, when voice activation alone is convenient but limited, and most other occasions, when having a screen is preferable.

"We think it's important that there are times when it's convenient to simply use your voice when you are not able to use the screen," said Schiller. "For example, if you're driving [and] you want Siri to work for you without having to look at the screen, that's the best thing. Or maybe you're across the room, and you want to ask Siri to change the song you're listening to."

"So there's many moments where a voice assistant is really beneficial, but that doesn't mean you'd never want a screen," he went on. "So the idea of not having a screen, I don't think suits many situations. For example if I'm looking for directions and I'm using Maps, Siri can tell me those directions by voice and that's really convenient but it's even better if I can see that map, and I can see what turns are coming up, and I can see where there is congestion, I understand better my route, and what I'm going to do."
Schiller continued his argument for voice assistants with screens using the example of photography and photo sharing. "With all the social networking apps that are now embracing photos more and more, well, it doesn't work really so great in voice-only assistants," said Schiller. The same goes for games, he said, calling them the "biggest category of all".
"I have yet to see any voice-only games that, for me, are nearly as fun as the one that I play on my screen," he said. "And so I think voice assistants are incredibly powerful, their intelligence is going to grow, they're gonna do more for us, but the role of the screen is gonna remain very important to all of this."
Schiller ended his comments on the topic by calling the balance between voice and screens "an interesting discussion", especially with respect to "when each is appropriate, and what they can do in our lives".

It's unclear how Schiller's comments fit in with the recent uptick in rumors that Apple is working on a Siri-based smart speaker for the home. Often-reliable analyst Ming-Chi Kuo of KGI Securities has said the product will double as an AirPlay speaker and feature a custom W1 Bluetooth chip for easy pairing, while Sonny Dickson has suggested the device will run a variation of iOS and have a Mac Pro-like concave top with built-in controls. However, neither has claimed that Apple is working to integrate a screen into the device.

By contrast, recent alleged leaks have suggested Amazon's next-generation Echo could have a built-in touchscreen and camera with the potential to support phone and video calls.

In the Gadgets 360 interview quoted above, Schiller also spoke about other topics, including Apple's Swift programming language and the company's app subscription model as it relates to developers and users of the App Store. The full interview is available on the Gadgets 360 website.



Phil Schiller Says iPhone Was ‘Earth-Shattering’ Ten Years Ago and Remains ‘Unmatched’ Today

To commemorate the tenth anniversary of the iPhone, Apple marketing chief Phil Schiller sat down with tech journalist Steven Levy for a wide-ranging interview about the smartphone's past, present, and future.

The report first reflects on the iPhone's lack of support for third-party apps in its first year. The argument inside Apple was split over whether the iPhone should be a closed device like the iPod or an open platform like the Mac, a discussion that Schiller said was ultimately "shut down" by then-CEO Steve Jobs.
Steve Jobs shut down the discussion, Schiller recalls. “He said ‘We don’t have to keep debating this because we can’t have [an open system] right now. Maybe we’ll change our mind afterwards, or maybe we won’t, but for now there isn’t one so let’s envision this world where we solve the problem with great built-in apps and a way for developers to make web apps.’”
Levy suggested that the iPhone's great moment was when the App Store launched a year later, creating a world where for "every imaginable activity" there was "an app for that." Schiller, perhaps unsurprisingly as Apple's marketing chief, said that framing undervalues how truly "earth-shattering" the iPhone was at the time.
“That undervalues how earth-shattering the iPhone was when it first came to market, and we all first got them and fell in love with them,” he says. “iPhone made the idea of a smartphone real. It really was a computer in your pocket. The idea of real internet, real web browser, Multi-Touch. There were so many things that are core to what is the smartphone today, that created a product that customers fell in love with, that then also demanded more stuff on them, more apps.”
Some critics now wonder whether Apple has been playing it safe, arguing that recent iPhone models offer only incremental improvements rather than revolutionary new features. But, again, Schiller downplayed this notion, saying the changes in more recent iPhones are "sometimes even bigger now."
“I actually think the leaps in the later versions are as big and sometimes even bigger now,” he says. “I think our expectations are changing more, not the leaps in the products. If you look through every version—from the original iPhone to the iPhone 3G to the 4 to the 4S, you see great changes all throughout. You see screen size change from three and a half inch to four inch to four point seven and five point five. You see cameras going through incredible change, from the first camera that couldn’t shoot video, to then having both a front and a backside camera, to now three cameras with the stuff we’re doing, and with live photos and 4K video.”
Schiller also defended the iPhone's position at the top of the smartphone market: "The quality is unmatched. The ease of use is still unmatched. The integration of hardware and software is unmatched. We’re not about the cheapest, we’re not about the most, we’re about the best."

In a press release yesterday, Schiller said Apple is "just getting started" with the iPhone, while CEO Tim Cook promised "the best is yet to come." Building upon those comments, Schiller told Levy that he hopes in 50 years, people will indeed look back and realize how much was yet to come.
Schiller hopes that in 50 years people will look back at this point and say, “Wow, they didn’t realize how much was to come — in fact, others missed it because they were busy running around looking for other things. Everyone has their opinions at this point, but it could be that we’re only in the first minutes of the first quarter of the game,” he says. “I believe this product is so great that it has many years of innovation ahead.”
Levy, however, went on to question "whether a pocket-sized device like the iPhone will still be as relevant decades hence," particularly as "a lot of observers have been saying we are at the start of the era of the conversational interface."

At CES last week, for example, a number of reputable publications said Amazon's Alexa platform "stole the show" or offered similar accolades, after companies showed off everything from new cars and robots to fridges and laundry machines integrated with the voice-controlled assistant, which launched in late 2014.

Apple itself had an early lead in this artificial intelligence space when it debuted Siri on the iPhone 4s in 2011.

Schiller opined that "the best intelligent assistant is the one that's with you all the time," such as the iPhone. Schiller added that "people are forgetting the value and importance of the display," which he said is "not going to go away."
“I'm so glad the team years ago set out to create Siri — I think we do more with that conversational interface that anyone else. Personally, I still think the best intelligent assistant is the one that’s with you all the time. Having my iPhone with me as the thing I speak to is better than something stuck in my kitchen or on a wall somewhere.”

“People are forgetting the value and importance of the display,” he says. “Some of the greatest innovations on iPhone over the last ten years have been in display. Displays are not going to go away. We still like to take pictures and we need to look at them, and a disembodied voice is not going to show me what the picture is.”
Full-Length Article: Phil Schiller on iPhone’s Launch, How It Changed Apple, and Why It Will Keep Going for 50 Years

