Friday 17 April 2015

Moore's Law is the reason your iPhone is so thin and cheap



Intel co-founder Gordon Moore's observation 50 years ago laid the groundwork for the self-driving cars on the road and the computers in our pockets today.

To get a sense of what society owes to Moore's Law, just ask what the world would look like if Intel co-founder Gordon Moore had never made his famous 1965 observation that the processing power of computers would increase exponentially.


"It is almost unimaginable," said Genevieve Bell, a cultural anthropologist for Intel.

"The implications would be so dramatic, I struggle to put it in words," said Adrian Valenzuela, marketing director for processors for Texas Instruments.

Jeff Bokor, a professor of electrical engineering and computer science at the University of California, Berkeley, found at least one word: "Cataclysmic."

The comments aren't wild hyperbole; they underscore just how significant an impact one little observation has had on the world. Moore's Law is more than a guideline for manufacturing computer processors, or chips. It has become shorthand for innovation at regular intervals -- and a self-fulfilling prophecy driving the tech industry.

Are you happy about your sleeker iPhone 6 or cheaper Chromebook? You can thank Moore's Law.

With Sunday marking the 50th anniversary of Moore's observation, we decided to take stock of Moore's Law. CNET staff reporter Ben Fox Rubin offers an in-depth look at the work semiconductor manufacturers are putting in to make sure the rate of improvement is sustainable. Tomorrow, CNET senior reporter Stephen Shankland explores alternative technologies and the future of Moore's Law, while senior reporter Shara Tibken looks at Samsung's lesser-known presence in the field.

But first, let's explore the effect of Moore's Law throughout history -- and start by dispelling some misconceptions. Most importantly, Moore's Law is not actually a law like Isaac Newton's three laws of motion. In a 1965 paper titled "Cramming More Components onto Integrated Circuits," published in the trade journal Electronics, Moore, who studied chemistry and physics, predicted that the number of components in an integrated circuit -- the brains of a computer -- would double every year, boosting performance.

A decade later, he revised his prediction to a doubling of components every two years.
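Even that revised cadence compounds into staggering growth: 22 two-year doublings separate 1971 from 2015. Here's a minimal back-of-the-envelope sketch in Python; the starting point (roughly the 2,300 transistors of the 1971 Intel 4004) is an illustrative assumption, not a figure from Moore's paper:

    # Project a chip's transistor count under a strict two-year doubling rule.
    def projected_transistors(start_year, start_count, target_year, years_per_doubling=2):
        doublings = (target_year - start_year) / years_per_doubling
        return start_count * 2 ** doublings

    # Assumed baseline: ~2,300 transistors on the 1971 Intel 4004.
    print(f"{projected_transistors(1971, 2300, 2015):,.0f}")  # ~9.6 billion

That lands in the same order of magnitude as the billions of transistors on the biggest chips shipping today, which is part of why the two-year figure stuck.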

It wasn't until Carver Mead, a professor at the California Institute of Technology who worked with Moore at the Institute of Electrical and Electronics Engineers, coined the term "Moore's Law" in 1975 that it gained widespread recognition in the tech world. It became a goal for an entire industry to aspire to -- and hit -- for five decades.

"[It's] a name that has stuck beyond anything that I think could have been anticipated," Moore, now 86, said in an interview with Intel earlier this year.

A self-fulfilling prophecy

Moore's Law specifically refers to transistors, which switch electrical signals on and off so that devices can process information and perform tasks. They serve as the building blocks for the brains inside all our smartphones, tablets and digital gadgets.

The more transistors on a chip, the faster that chip processes information.

To keep Moore's Law going, chip manufacturers have to keep shrinking their transistors so that more can be packed together with each subsequent generation of the technology. The first transistors measured about half an inch across; today's newest chips contain transistors smaller than a virus, an almost unimaginably small scale. Chipmakers including Intel and Samsung are pushing to shrink them even more.
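To put a rough number on that shrink, here's a quick sketch in the same back-of-the-envelope spirit; the 14-nanometer node is an assumption standing in for a 2015-era manufacturing process, which the article doesn't name:

    # Rough scale comparison: the first transistors vs. an assumed modern node.
    HALF_INCH_NM = 0.5 * 25.4e6   # half an inch, expressed in nanometers
    ASSUMED_NODE_NM = 14          # assumed 14 nm feature size, circa 2015
    print(f"{HALF_INCH_NM / ASSUMED_NODE_NM:,.0f}x smaller")  # ~907,143x

Features that small sit comfortably below the roughly 20-to-300-nanometer size range of typical viruses.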

But size doesn't really matter when it comes to appreciating Moore's Law. More important is the broader idea that things get better -- smarter -- over time.

The law has resulted in dramatic increases in performance in smaller packages. The Texas Instruments processor that powers the navigation system in a modern Ford vehicle is nearly 1.8 million times more powerful than the Launch Vehicle Digital Computer that helped astronauts navigate their way to the moon in 1969.

And Apple's iPhone 6 is roughly 1 million times more powerful than an IBM computer from 1975 -- which took up an entire room -- according to a rough estimate by UC Berkeley's Bokor. The iPhone, priced starting at $650, is also a lot cheaper than the full-fledged desktop computers that sold for anywhere between $1,000 and $4,000 a decade ago -- and it can do so much more.

Just as critical is the time element of Moore's Law: the doubling of transistors every two years meant the entire tech industry -- from consumer electronics manufacturers to the companies that make chip-fabrication equipment and everyone in between -- had a consistent cadence that everyone could plan around.


"It created a metronome," Bell said. "It's given us this incredible notion of constant progress that is constantly changing."

It also set a pace that companies must keep up with or risk being left behind, according to Moore. "Rather than become something that chronicled the progress of the industry, Moore's Law became something that drove it," Moore said in an online interview with semiconductor industry supplier ASML in December.

While he didn't think his observation would hold true forever, chipmakers don't seem to be slowing down their efforts. "It's a self-fulfilling prophecy, so to the industry it seems like a law," said Tsu-Jae King Liu, a professor of microelectronics at UC Berkeley.

Life without Moore's Law

Nowadays, everyone assumes technology will just get better, faster and cheaper. If we don't have a sophisticated enough processor to power a self-driving car now, a faster one will emerge in a year or two.

Remove Moore's Law, and that assumption no longer holds true. Without a unifying observation to propel the industry forward, the state of integrated circuits and components might be decades behind where it is today.

"It's an exponential curve, and we would be much earlier on that curve," Valenzuela said. "I'm happy to say I don't have to carry my 1980s Zack Morris phone."

Intel's Bell imagines a more "horrifying" world without integrated circuits, one in which everything is mechanized, and common tropes of technology such as smartphones and even modern telephone service wouldn't exist. "The Internet would have been impossible," she said.
