Moore’s Law doomed?

[Photo: Gordon Moore]

Prognosticators are a dime a dozen...Michio Kaku

According to Moore's Law, the number of transistors on a chip roughly doubles every two years. As a result, feature sizes get smaller and smaller.
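To make the doubling rule concrete, here is a minimal back-of-envelope sketch, assuming a strict two-year doubling period and taking the Intel 4004's roughly 2,300 transistors (1971) as the baseline; real chips only track these figures approximately:

    # Minimal sketch of Moore's Law as a compounding rule.
    # Assumptions: two-year doubling, 1971 baseline of ~2,300 transistors (Intel 4004).
    def projected_transistors(year, base_year=1971, base_count=2300, doubling_years=2):
        return base_count * 2 ** ((year - base_year) / doubling_years)

    for year in (1971, 1991, 2011):
        print(year, round(projected_transistors(year)))
    # 1971 -> 2,300; 1991 -> ~2.4 million; 2011 -> ~2.4 billion,
    # the same order of magnitude as actual high-end chips in those years.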

"Death of Moore's Law Will Cause Economic Crisis"

by

John E. Dunn

March 21st, 2011

techworld.com

Gordon Moore's famous law about the doubling of transistor density and power every two years will not only end, it could bring economic disaster in its wake, respected scientist Michio Kaku has predicted in a new book, Physics of the Future.

Kaku sets out the crunch moment as being the point at which ultraviolet light can no longer be tuned to etch ever smaller circuits onto silicon wafers, which on current trends will arrive less than a decade from now. From that moment on, Moore's Law will gradually break down, and the effects will be not only technological but economic.
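A hedged arithmetic sketch of that timeline, assuming the roughly 32 nm process node of 2011 as the starting point: if density doubles every two years, linear feature sizes shrink by about a factor of √2 over the same period.

    # Rough sketch: density doubling every two years implies linear features shrink
    # by ~sqrt(2) per period. The 32 nm starting point is an assumed 2011 baseline.
    feature_nm = 32.0
    for year in range(2011, 2024, 2):
        print(year, round(feature_nm, 1), "nm")
        feature_nm /= 2 ** 0.5
    # 2011: 32 nm ... 2019: ~8 nm ... 2023: ~4 nm -- features only a few dozen
    # atoms across, roughly the crunch point Kaku places within a decade of 2011.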

He argues that the computing industries depend on a conveyor belt of new products, each roughly double the power of its equivalent from a year or two earlier. With no Moore's Law to propel this rise in computing power, the upgrade culture will grind to a halt, causing consumer interest to wane.

"Around 2020 or soon afterward, Moore's law will gradually cease to hold true and Silicon Valley may slowly turn into a rust belt unless a replacement technology is found," says Kaku in an extract published on Salon.com website.

"Transistors will be so small that quantum theory or atomic physics takes over and electrons leak out of the wires. At that point, according to the laws of physics, the quantum theory takes over," says Kaku, invoking one of science's most feared laws, The Heisenberg uncertainty principle.

His point is stark. Once the most basic unit of computing work - the electron with a measurable behaviour inside a wire - becomes uncertain, as it surely will at these scales, the silicon age is over. Any smaller and science has no way of knowing where an electron is in order to put it to work in a transistor.
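As a rough, hedged illustration of the scale involved (the 2 nm confinement length below is an assumed figure for a future gate, not one quoted in the article), the uncertainty principle already puts a hard floor under how precisely a confined electron can behave:

    # Back-of-envelope Heisenberg estimate: confining an electron to ~2 nm
    # (assumed gate length) forces a minimum momentum spread dp >= hbar / (2 * dx).
    hbar = 1.055e-34           # reduced Planck constant, J*s
    m_e = 9.11e-31             # electron mass, kg
    dx = 2e-9                  # assumed confinement length, metres
    dp = hbar / (2 * dx)
    dv = dp / m_e              # corresponding velocity uncertainty
    print(f"velocity uncertainty ~ {dv:.1e} m/s")   # roughly 3e4 m/s
    # A spread of tens of kilometres per second in a few-nanometre channel is the
    # regime where, as Kaku puts it, "the quantum theory takes over".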

Kaku's pronouncements on the limits of Moore's Law are nothing new and have been argued over almost since the law was first mooted by Moore himself in the 1960s. In 2005, even Moore saw problems in applying the exponential to today's computing environment, although Intel executives continue to make optimistic pronouncements in public.

Kaku's thesis is interesting, however, in focusing on the economic consequences of the law's demise and the extent to which high-tech companies and whole economies are vulnerable.

He reminds us of how much the world has come to depend on computing power that is now taken utterly for granted. For example, the primitive chip inside a birthday greetings card has more processing power than the Allied armies had at their disposal in 1945.

"Hitler, Churchill, or Roosevelt might have killed to get that chip. But what do we do with it? After the birthday, we throw the card and chip away," he says.

Arguments can be raised against his pessimism if not his physics. The first is that while the basic unit of computing power might stop advancing due to physical barriers, these units could be deployed in parallel to do more useful work. The world will need to think of ways to deploy this basic unit of power more efficiently, which it tends not to need to do today because of Moore's Law itself. That will buy some time.
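A minimal sketch of that parallel-deployment argument, using Amdahl's law (the 95% parallelisable fraction is an assumed workload, not a figure from the article): with per-unit speed frozen, adding units still raises throughput, but only up to the limit set by the serial portion of the work.

    # Amdahl's law: speedup from n parallel units when a fraction p of the work
    # can be parallelised. Per-unit speed is held fixed, as in the post-Moore scenario.
    def amdahl_speedup(p, n):
        return 1.0 / ((1 - p) + p / n)

    for n in (1, 4, 16, 64):
        print(n, "units:", round(amdahl_speedup(0.95, n), 1), "x")
    # 1.0x, ~3.5x, ~9.1x, ~15.4x -- real headroom, but the serial 5% caps the gain,
    # which is why parallelism only buys time rather than replacing Moore's Law.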

Further out, there's also the wildcard of quantum computing, a design for performing calculations that seeks to harness the principles that for Kaku are disturbingly close to sinking the computing age for good. If such a vision is to leave the science lab where it has been stuck for some years, it will of course still have to overcome Heisenberg's tricky measurement paradox first.

If commercial quantum computing does come to pass, some believe a much bigger problem than fundamental physics will afflict the human obsession with building ever more complex computers: which problems will such powerful devices actually solve? Quantum computers might be perfectly suited to solving the deepest conundrums of the universe, but perhaps not to driving the 2050s equivalent of an iPod.

Moore's law [Wikipedia]






