
Who Invented the iPhone?

It all depends on what you mean by “invented”

This article was published in Scientific American’s former blog network and reflects the views of the author, not necessarily those of Scientific American.


The great man theory has crept back into popular culture in recent years, repurposed for the world of entrepreneurs, tech start-ups and digital conglomerates. Elon Musk revolutionized the electric car. Mark Zuckerberg pioneered the social network. Steve Jobs and his team at Apple invented the iPhone.  

These heroic narratives are both factually incorrect and unhelpful. In educational terms, a whole generation is growing up on inspirational YouTube videos revering individualism and some troubling leadership traits (see here for the darker side of Jobs and Apple). Yet the challenges the world faces—energy crises, food shortages, climate change, overpopulation—require collaboration and cooperation from all of us, both as global citizens and nations. These challenges are too complex, interconnected and fast-moving to be solved by any one person, idea, organization or nation. We will need to harness the fundamental principle underpinning all research—to stand on the shoulders of giants, with each new breakthrough building on the work of others before it. The hidden story of the iPhone is a testament to this.

The relentless drive and ingenuity of the many teams at Apple cannot be doubted. But there were hundreds of research breakthroughs and innovations without which the iPhone would not even be possible. Each was the result of countless researchers, universities, funders, governments and private companies layering one innovation on top of another.


To demonstrate this, here’s a closer look at just three of the research breakthroughs that underpin the iPhone. 

THE TOUCH SCREEN

The iPhone wouldn’t be the iPhone without its iconic touch-screen technology.

The first touch screen was actually invented way back in the 1960s by Eric Arthur Johnson, a radar engineer working at a government research center in the U.K. While the Righteous Brothers were losing that lovin’ feeling, Johnson was publishing his findings in an Electronics Letters article published by the Institution of Engineering and Technology. His 1965 article, “Touch display—a novel input/output device for computers,” continues to be cited by researchers to this day. The 1969 patent that followed has since been cited by a whole host of famous inventions, including Apple’s 1997 patent for “a portable computer handheld cellular telephone.”

Since Johnson’s first leap forward, billions of dollars have been awarded to touch-screen research by public bodies and private investors alike, with one often leading to the other. The University of Cambridge, for example, recently spun out a limited company to secure further investment for its own research on touch-screen technology, successfully closing a $5.5m investment round backed by venture capitalists from the U.K. and China.

One Apple patent on touch-screen technology cites over 200 peer-reviewed scientific articles, published by a range of academic societies, commercial publishers and university presses. These authors did not work alone. Most were part of a research group. Many were awarded a grant for their research. Each had their article independently evaluated by at least one external academic through the peer-review process that sits at the core of academic research. Consider one article on touch-screen technology recently published in Elsevier’s journal Information Sciences: it acknowledges six authors and two blind peer reviewers. Conservatively extrapolating such figures (roughly eight contributors per article) across the 200 articles cited by Apple tallies to well over a thousand researchers, each making an important contribution to this area of touch-screen technology.

Johnson may have taken the first step, and Apple harnessed its potential, but we owe touch-screen technology to the collective efforts of numerous researchers all over the world.

THE LITHIUM BATTERY

Battery Low. Blink, blink. We all know iPhones soak up a lot of power, yet they’d be nowhere without the rechargeable lithium battery.

British scientist Stanley Whittingham created the very first example of the lithium battery while working in a lab for Exxon in the ‘70s, carrying forward research he’d initially conducted with colleagues at Stanford University. Previous research had already indicated that lithium could be used to store energy, but it was Whittingham and his team who figured out how to do this at room temperature, without the risk of explosion (Samsung take note).

A professor at the University of Oxford, John Goodenough, then improved on Whittingham’s original work by using metal oxides to enhance performance. This, in turn, piqued the interest of Sony, which became the first company to commercialize lithium batteries in the 1990s, launching a lithium-powered cell phone in Japan in 1991. All of this provided the basis for mass use, with Apple duly obliging when it first launched the iPhone to over a million users in 2007.

Lithium’s story doesn’t stop there. Lithium is one of the building blocks of a world without fossil fuels, and its production is zealously guarded. So who do you think bought Sony’s battery business in 2016? Why, none other than Murata Manufacturing, one of Apple’s leading suppliers. Meanwhile, John Goodenough, now 95, continues his groundbreaking research. Only a few months ago he published a landmark study in the Journal of the American Chemical Society. Among its claims? That Goodenough had created a lithium battery for electric cars that can be used 23 times more than the current average.

THE INTERNET AND THE WORLD WIDE WEB

When Apple engineer Andy Grignon first added internet functionality to an iPod in 2004, Steve Jobs was far from enthusiastic: “This is bullshit. I don’t want this. I know it works, I got it, great, thanks, but this is a shitty experience.” 

The painstaking work of multiple Apple teams took that “shitty experience” and made something revolutionary: all collective human experience and knowledge right there, in your back pocket, at your fingertips. But who do we have to thank for this?

Sir Tim Berners-Lee is widely credited with the invention of the World Wide Web. His work began in the 1980s while he was at the European Organization for Nuclear Research. Better known by its French acronym, CERN was established by 12 European governments in 1952 and continues to be funded by its member states. Berners-Lee’s ideas began as a proposed solution to a very specific problem at CERN: how best to facilitate the sharing and updating of the vast amounts of information and data used by CERN researchers. His proposal was based on the concept of hypertext, a term coined by the theoretical pioneer Ted Nelson in a 1965 paper published by the Association for Computing Machinery. Often compared to an electronic version of the footnoting system used by researchers the world over, hypertext underpins the web, enabling you to jump from one source of information to another. Anywhere on the internet. In whatever form it may be.

But even Berners-Lee cannot be given solo credit. If the World Wide Web is the map, the internet is the landscape we navigate: a networking infrastructure connecting millions of computers globally, enabling them to communicate with one another and transfer vast quantities of information.

To trace the origins of the internet we have to return to 1965. While Nelson was coining hypertext and Johnson inventing the touch screen, two researchers at MIT, Thomas Merrill and Lawrence Roberts, connected their computer to another 3,000 miles away in California using a simple low-speed dial-up telephone line. Shortly after that came Arpanet, not a dystopian AI system, but the Advanced Research Projects Agency Network. Arpanet was established and funded by ARPA, the U.S. Advanced Research Projects Agency (later renamed DARPA), and initially conceived as a means of interconnecting the American military’s computers across its various regional hubs.

It was Arpanet that really gave birth to the internet, in a moment described below by Leonard Kleinrock. It’s October 1969, three months after man has walked on the moon, and Kleinrock and his colleagues at UCLA have just connected their computer to another at the Stanford Research Institute, hundreds of miles away:

We typed the L and we asked on the phone,

Do you see the L?

Yes, we see the L.

We typed the O, and we asked, Do you see the O?

Yes, we see the O.

Then we typed the G, and the system crashed…

The course of true innovation never did run smoothly. But these early breakthroughs of the space age were the basis for all that was to follow. While the modern iPhone is now 120 million times more powerful than the computers that took Apollo 11 to the moon, its real power lies in its ability to leverage the billions of websites and terabytes of data that make up the internet.

A brief analysis of these three research breakthroughs reveals a research web of over 400,000 publications since Apple first published its phone patent in 1997. Factor in the supporting researchers, funders, universities and companies behind them, and the contributing network is simply awe-inspiring. And we’ve barely scratched the surface. There are countless other research breakthroughs without which the iPhone would not be possible, some well known, others less so. Both GPS and Siri had their origins with the U.S. military, while the complex algorithms that enable digitization were initially conceived to detect nuclear testing. All had research at their core.

The iPhone is an era-defining technology. Era-defining technologies do not come from the rare brilliance of one person or organization, but from layer upon layer of innovation and decade upon decade of research, with thousands of individuals and organizations standing on each other’s shoulders and peering that bit further into the future. In our age of seemingly insurmountable global challenges, we must not only remember this but be inspired by it.

We must encourage openness and transparency at the heart of research, ensuring it is disseminated as widely, quickly and clearly as possible. We must remember that every delay and distortion matters. Research integrity and reproducibility, transparent peer review, open access, diversity—these are more than just buzzwords. They are exciting steps toward reforming the infrastructure of a global research ecosystem that has always been our best hope for the future.

Matthew Hayes is the author of "Robert Kennedy and the Cuban Missile Crisis," published in the journal History. He is director of publisher and funder growth at Publons, the world's largest peer review platform and a part of Clarivate Analytics. He studied history at Oxford University, has a master's in international relations from SOAS, University of London, and is currently researching a PhD on global citizenship education at the Institute of Education, University College London.
