The Most Important Apple Executive You’ve Never Heard Of
A visit with Cupertino’s chief chipmaker, Johny Srouji.
A little over a year ago, Apple had a problem: The iPad Pro was behind schedule. Elements of the hardware, software, and accompanying stylus weren’t going to be ready for a release in the spring. Chief Executive Officer Tim Cook and his top lieutenants had to delay the unveiling until the fall. That gave most of Apple’s engineers more time. It gave a little-known executive named Johny Srouji much less.
Srouji is the senior vice president for hardware technologies at Apple. He runs the division that makes processor chips, the silicon brains inside the iPhone, iPad, Apple Watch, and Apple TV. The original plan was to introduce the iPad Pro with Apple’s tablet chip, the A8X, the same processor that powered the iPad Air 2, introduced in 2014. But delaying until fall meant that the Pro would make its debut alongside the iPhone 6s, which was going to use a newer, faster phone chip called the A9.
This is the stuff that keeps technology executives up at night. The iPad Pro was important: It was Apple’s attempt to sell tablets to business customers, and with a year-old chip it would look feeble next to the iPhone 6s. So Srouji put his engineers on a crash program to move up the rollout of a new tablet processor, the A9X, by half a year. The engineers finished in time, and the Pro hit the market with the faster chip and a 12.9-inch display packed with 5.6 million pixels.
Srouji was nicely rewarded for his efforts. In December he became the newest member of Cook’s management team and received about 90,000 additional shares of Apple stock, which vest over a four-year period.
He also stepped into the kind of spotlight he’s avoided since joining Apple in 2008. Srouji runs what is probably the most important and least understood division inside the world’s most profitable company. Since 2010, when his team produced the A4 chip for the original iPad, Apple has immersed itself in the costly and complex science of silicon. It develops specialized microprocessors as a way to distinguish its products from the competition. The Apple-designed circuits allow the company to customize products to perfectly match the features of its software, while tightly controlling the critical trade-off between speed and battery consumption. Among the components on its chip (technically a “system on a chip,” or SoC) are an image signal processor and a storage controller, which let Apple tailor useful functions for taking and storing photos, such as the rapid-fire “burst mode” introduced with the iPhone 5s. Engineers and designers can work on features like that years in advance without prematurely notifying vendors, especially Samsung, which manufactures many of Apple’s chips.
At the center of all this is Srouji, 51, an Israeli who joined Apple after jobs at Intel and IBM. He’s compact, he’s intense, and he speaks Arabic, Hebrew, and French. His English is lightly accented and, when the subject has anything to do with Apple, nonspecific bordering on koanlike. “Hard is good. Easy is a waste of time,” he says when asked about increasingly thin iPhone designs. “The chip architects at Apple are artists, the engineers are wizards,” he answers another question. He’ll elaborate a bit when the topic is general. “When designers say, ‘This is hard,’ ” he says, “my rule of thumb is if it’s not gated by physics, that means it’s hard but doable.”
Srouji recently spent several hours with Bloomberg Businessweek over several days and guided a tour of Apple chip facilities in Cupertino, Calif., and Herzliya, Israel. This was, no doubt, strategic. Investors have battered Apple stock over the past year, sending it down more than 25 percent. Most people are already pretty satisfied with their phones, the criticism goes, and aren’t compelled to spend an additional few hundred bucks on an upgrade. (In March, Apple intends to announce an updated iPad and smaller-screen iPhone featuring the latest A9X and A9 chips, according to a person familiar with the plans, who wasn’t authorized to comment publicly.)
Apple’s usual response is to point to Jony Ive and his team of fastidiously cool, Wallabee-shod industrial designers, or to highlight elegantly tooled aluminum or an app or some new feature or gadget. There’s always something new to show off. But none of that has ever explained anything about a crucial part of Apple’s profit machine: its chips.
“I think it’s too good of a story not to be told at this stage,” Srouji says. “Hopefully, we won’t reveal too much.”
When the original iPhone came out in 2007, Steve Jobs was well aware of its flaws. It had no front camera, measly battery life, and a slow 2G connection from AT&T. It was also underpowered. A former Apple engineer who worked on the device said that while the handset was a breakthrough technology, it was limited because it pieced together components from different vendors, including elements from a Samsung chip used in DVD players. “Steve came to the conclusion that the only way for Apple to really differentiate and deliver something truly unique and truly great, you have to own your own silicon,” Srouji says. “You have to control and own it.”
One of Jobs’s trusted advisers, Bob Mansfield, Apple’s top hardware executive at the time, recruited Srouji to lead that effort. Srouji, then at IBM, was a rising star in the arcane world of semiconductor engineering. Mansfield promised him an opportunity to build something from scratch.
The decision to design semiconductors was risky. About the size of a small postage stamp, the microprocessor is the most important component of any computing device. It does the work that makes playing games, posting to Facebook, sending texts, and taking pictures seem easy. Small currents of energy move from the battery through hundreds of millions of tiny transistors, triggering commands and responses in nanoseconds. It’s like an intricate city design that fits on the tip of your finger. When the chip isn’t doing its job efficiently, the device feels sluggish, crashes, or makes users want to throw it against a wall.
If there’s a bug in software, you simply release a corrected version. It’s different with hardware. “You get one transistor wrong, it’s done, game over,” Srouji says. “Each one of those transistors has to work. Silicon is very unforgiving.” Among computer and smartphone makers, industry practice is to leave the processors to specialists such as Intel, Qualcomm, or Samsung, which sink billions into getting the chips right and making them inexpensively. (Apple used to co-design processors for the Macintosh, but Jobs abandoned the work in 2005 in favor of more powerful models from Intel, whose chips still power all Macs.)
When Srouji joined Apple, the company had a group of about 40 engineers working on integrating chips from various vendors into the iPhone. That grew by about 150 people in April 2008, after Apple acquired a Silicon Valley chip startup called P.A. Semi, which had a power-efficient semiconductor design. Srouji’s team found itself interacting regularly with other departments, from software programmers, who wanted chip support for new features, to Ive’s industrial designers, who wanted help making the phones flatter and sleeker. An engineer who sat in on Srouji’s meetings remembers senior managers preparing extensively for presentations, because his support was critical for getting new features approved. He was known for peppering engineers with technically sophisticated questions, particularly about contingency options if something didn’t work out as planned. He’d ask, for example, if a different form of plastic could be used that wouldn’t interfere with another component.
The first public signs of Srouji’s work came in 2010 with the debut of the iPad and iPhone 4. The processor, the A4, was a modified version of a design from ARM Holdings, a British company that licenses mobile technology. The A4 was designed to power the handset’s new high-definition “retina display.” Srouji says it was a race to get that first system-on-a-chip produced. “The airplane was taking off, and I was building the runway just in time,” he says.
Over the next few years, Apple kept making improvements to its designs, introducing chips to accommodate fingerprint identification, video calling, and Siri, the iPhone’s voice-activated assistant, among other enhancements. When the companies using Google’s Android operating system started making tablets, they mostly used conventional mobile phone processors. Starting with the third-generation iPad in 2012, Srouji’s team designed specific chips (the A5X and A6X) to give the tablet the same pixel-packing high-definition screens as the iPhone.
These mysterious semiconductors coming from Apple were a curiosity in the tech industry, but it wasn’t until the release of the iPhone 5s in 2013 that rivals really started to pay attention. The phone featured the A7, the first 64-bit smartphone processor, double the 32-bit standard at the time. The new technology allowed for entirely new features, such as Apple Pay and the Touch ID fingerprint scanner. Developers had to rewrite applications to account for the new standard, but the payoff was smoother maps, cooler video games, and generally more responsive apps that don’t hog as much memory. (Apple’s control over hardware and software is also useful for encrypting everything on the device, a capability that has landed the company in a controversy: On Feb. 16, a judge ruled that Apple must help the FBI unlock an iPhone owned by one of the San Bernardino shooters. Apple is fighting the order, saying it would set a precedent that would undermine the privacy of all its customers.)
Qualcomm, then as now the biggest designer of phone chips, made the expensive decision to scrap development of its 32-bit chips and put all its resources into catching up. Handset companies all “wanted the shiny new thing,” says Ryan Smith, the editor-in-chief of AnandTech, a website that publishes exhaustive reviews of semiconductor designs. “The A7 really turned the world upside down.”
Srouji can’t restrain a smile when recalling competitors’ reactions to Apple’s 64-bit surprise. “When we pick something,” he says, “it’s because we think there’s a problem that nobody can do, or there is some idea that’s so unique and differentiating that the best way to do it is you have to do it yourself.”
Srouji was born in Haifa, a port city in northern Israel. He was the third child of four. His family was Christian Arab, a minority within a minority in the Jewish state. “Haifa is one of the most integrated cities in Israel,” he says. “You have Christians, you have Muslims, Jews, Bahá’ís, you have any religion you want, and everyone lives together in peaceful harmony. Integration worked for me.”
Srouji’s father owned a metal pattern-making business outside the city, and from age 10, Srouji spent weekends and summers helping him pattern wooden moldings that were used to make engine parts, medical equipment, and other machinery. His father had an unusual philosophy: He would undercharge customers for complicated work while overcharging for easier jobs. “If there was a very complex thing that he’d never done, he wanted to do it,” Srouji says.
His father, who died in 2000, constantly reminded him not to get comfortable in the family business. Education was more important. In high school, Srouji got perfect grades in math, physics, chemistry, and science. He was introduced to computers by an instructor who also taught at the nearby Technion Israel Institute of Technology, one of the world’s top engineering schools. “I fell in love,” Srouji says.
He enrolled at the Technion, spent late nights in the computer lab drafting out code in pencil, and earned undergraduate and master’s degrees in computer science. His master’s thesis was on new techniques for testing software and hardware systems. “At the time it was very progressive,” says Orna Berry, general manager of the EMC Center of Excellence in Israel and corporate vice president of innovation, who met Srouji at the Technion. “I’m not surprised he is where he is.”
After graduating, Srouji got a job with IBM, which had placed its largest non-U.S. research facility in Haifa, the better to attract the big brains coming out of the Technion and other Israeli universities. He researched distributed systems, an emerging field in which computers in different locations are networked together to complete computationally intensive assignments. Ensuring the machines communicated correctly required skill at both building hardware and writing software algorithms.
“Sometimes I wondered—when he got an assignment and within a day it was complete and perfect—if he was brilliant or just didn’t sleep at night,” says Srouji’s first boss, Oded Cohn, vice president and lab director for IBM Haifa Research. “In some cases, the conclusion was both.”
Although Israel grapples with Jewish-Arab tensions all the time, none of it mattered in Srouji’s world. Cohn, who remains friends with him, says their different backgrounds never came up. “Technical people treat technical people based on personality and technical ability,” he says. “You don’t think about it. You just work together. The rest goes away.”
In 1993, Srouji left IBM for Intel, where he created techniques for running simulations that test the strength of semiconductor designs. During a visit to the U.S. in 1999, he used a 20-minute car ride with a manager, fellow Israeli Uri Weiser, to lobby for a three-year stint at Intel’s research hub in Austin. Assuming Srouji was also Jewish, Weiser invited him to an Israeli Memorial Day celebration at a synagogue in Texas.
“He looked at me and said, ‘I’m a Christian Arab,’ ” recalls Weiser, who gave Srouji the Texas assignment. “I said, ‘Well, come and join and learn about your environment,’ and he said OK. He was there sitting with a kippah in the synagogue and following everything.”
Srouji lives a few miles from Apple’s headquarters at One Infinite Loop, Cupertino. He drives a black Mercedes-Benz and relaxes by lifting weights and riding his bike on weekends. He smiles easily, warmly touches a reporter’s shoulder when sharing a laugh, blushes at compliments, and absolutely clams up when he’s asked about anything that could remotely be considered a corporate secret. “I don’t want to go into too much detail on that” is a common refrain.
Friends have noticed the heightened discretion. Srouji once invited his former Intel colleague, Weiser, to give a speech about chip development at Apple headquarters in Cupertino. After the presentation, an assistant escorted Weiser to Srouji’s empty office, where he noticed that the papers on the desk were all turned upside down. Then Srouji entered the room and told Weiser he had to move. “He said, ‘We are at Apple, you can’t sit here,’ ” Weiser recalls. “He offered me to sit with his secretary and said, ‘If you want to go to the bathroom, she will escort you.’ ”
One morning in February, Srouji conducts a brief tour of his domain, which is scattered in unmarked locations around Silicon Valley. A shuttle bus leaves One Infinite Loop and drives 10 minutes through a series of residential neighborhoods to a low-rise office building near the Santa Clara city limits.
One of his deputies greets Srouji at the bus and badges through several locked doors into a room where future chip designs are being tested. The building is eerily quiet and still, save for the hum of air conditioners and the blinking red and green lights of large dark boxes that are stacked together and resemble Zambonis. The room is Apple-white and clean, but not tidy; thick wires and large plugs lie around. Old, unused Macs are lined up on a shelf like books that have already been read. All the equipment is operated remotely. The boxes are running software that scans for possible flaws in the chip architecture. Testing proceeds for several days on one element of the chip, then moves on to the next, and then the next, until the process is done, which can take months. “We beat the silicon as much as we can,” Srouji says. “If you’re lucky and rigorous, you find the mistakes before you ship.”
In an adjacent room, circuit boards are wired together in milk-carton-size stacks to simulate the capabilities of a future iPhone or iPad. Apple’s software programmers, sitting anywhere in the world, can remotely test how their code holds up against a future chip design.
Then the shuttle takes Srouji a few more miles away, to another unmarked building, where rows of customized Mac Minis are testing prototype chips under various temperature and pressure conditions. Standing in an aisle, surrounded by exposed circuit boards and digital innards, is like being inside the Matrix. “No one has seen this before,” Srouji says.
Everything looks exceedingly complicated. Srouji won’t discuss costs, but Apple’s research and development expenses hit $8.1 billion last year, up from $6 billion in 2014 and $4.5 billion in 2013. Many analysts attribute the rise in large part to chip development. All Srouji will say about his budget is that Cook doesn’t scrutinize it. “I run it very tight,” he says. “I truly believe that engineers will do their best when they are constrained by either money, tools, or resources. If you become sloppy because you have too much money, that’s the wrong mindset.”
Apple isn’t completely in charge of its own destiny. It remains in many ways a prisoner of its supply chain. Displays come from Samsung, and cellular modems from Qualcomm. Samsung and TSMC, based in Taiwan, still manufacture the processors. Apple’s ability to keep up with demand is in part dependent on the production capacity of those companies. It also lags behind Samsung in some areas of chip development, such as adding a modem to the central processor to conserve space and power and transitioning from a 20-nanometer chip design to a more compact 16-nanometer format, which means even more transistors can be crammed into a smaller space. “If I was just arguing hardware and not Apple’s marketing, I would say Samsung has the best processor,” says Mike Demler, a senior mobile chips analyst at the Linley Group, a technology consulting firm in Silicon Valley.
Or Apple could just be getting started. It relies on suppliers for Wi-Fi modems now, but will it forever? “I don’t want to go into Wi-Fi specifically,” Srouji says.
Apple could also take a page from Tesla’s playbook and start developing its own batteries. “I don’t want to get into batteries too deeply,” he says.
And since Apple is doing a fine job with mobile processors, it could conceivably decide to get into conventional chips and bump Intel out of its Mac laptops and desktops. Srouji, of course, won’t go there, though he does allow that his team’s mission is finite. “If we attempt to do everything on the planet,” he says, “I don’t think that would be very smart.”
—With reporting by Ian King