Tuesday, 28 April 2015

Moore's Law: Beyond the First Law of Computing


Moore’s Law – A Reflection of the History of Computing Hardware


Moore’s Law is an observation drawn from the history of computing hardware: the number of transistors in a dense integrated circuit has doubled roughly every two years. It is named after Gordon E. Moore, co-founder of Intel Corporation and Fairchild Semiconductor, who in 1965 observed that the quantity of components per integrated circuit was doubling every year.
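As a back-of-the-envelope illustration (the figures and code here are my own, not from the article), the law amounts to a simple doubling rule: starting from some base count, the number of transistors multiplies by two every doubling period. A minimal Python sketch, assuming the Intel 4004’s roughly 2,300 transistors in 1971 as the starting point:

    # Minimal sketch of Moore's Law as an idealised doubling rule.
    # Assumptions: ~2,300 transistors in 1971 (Intel 4004) and a
    # two-year doubling period, per Moore's 1975 revision.
    def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
        return base_count * 2 ** ((year - base_year) / doubling_years)

    for year in (1971, 1981, 1991, 2001, 2011):
        print(year, f"{transistors(year):,.0f}")

Run over four decades, the rule climbs from thousands of transistors to billions, which is roughly the trajectory real chips have followed.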

There is no guarantee that computer chips will keep getting smaller and more powerful; the technology appears to be approaching its limits. Meanwhile, the most powerful computing machine in the world may also be the most universal – the human brain, which can perform computation on a scale that even the most advanced supercomputers cannot match.

Neurons act on millisecond timescales – slow compared with the fastest processors – and sometimes fail to activate at all. What enables the brain to perform so well despite this is parallel computing: the ability to work on a problem simultaneously with many different parts of the brain. The effort to imitate the brain’s capacity for parallel computation promises not only further improvement in computing but also a way to recover from the imminent death of one of the most famous laws in modern history.
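As a loose software analogy (my example, not the article’s), parallel computing simply means splitting one problem across many workers that run at the same time. A minimal sketch using Python’s standard concurrent.futures module, with arbitrary chunk sizes:

    # Toy parallel computation: four workers each sum one slice of a range,
    # and the partial results are combined at the end.
    from concurrent.futures import ProcessPoolExecutor

    def partial_sum(bounds):
        start, stop = bounds
        return sum(range(start, stop))

    if __name__ == "__main__":
        chunks = [(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]
        with ProcessPoolExecutor(max_workers=4) as pool:
            total = sum(pool.map(partial_sum, chunks))
        print(total)  # matches sum(range(1_000_000)), computed in parallel

Each worker is slow on its own, but together they finish the job – the same trade the brain makes with its billions of slow neurons.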

The Chips’ First Outing – Apple Newton (1993)

There are an estimated 1.4 billion smartphones on the planet, and though most of the best-known devices are designed in California or South Korea, the chips that power them are designed in Cambridge, England.

ARM, the company behind it all, may be small compared with Intel, but its chips beat those of its U.S. rival on energy efficiency and size, both crucial for smartphones. Underpinning ARM’s chips is a simplified approach to computing, known as reduced instruction set computing (RISC), first developed at Stanford University and the University of California, Berkeley.

Stephen Furber, professor of computer engineering at the University of Manchester in England, says ‘this is one case of U.S. academic research being taken up by a U.K. company very successfully’. In the 1980s he designed the ARM chip at the now-defunct ARM predecessor company, Acorn. The chips’ first outing was in the Apple Newton in 1993, and the rest is history. ‘Then and now, the Apple brand was magic and opened doors,’ Furber says.

Inevitable Technological Progress

ARM, like every other chipmaker, is confronting the end of a trend that has delivered more powerful computers every two years for almost half a century. The term Moore’s Law has come to stand for inevitable technological progress, though it began as a very specific observation about computer chips: in his seminal 1965 paper, Gordon Moore predicted that the number of transistors on a chip would double every year – a pace he later revised to every two years – while the cost per component fell at a comparable rate.

Why transistor density has followed this exponential path is a matter of speculation, but its effect is not in doubt. ‘It has been a windfall,’ says Doyne Farmer, professor of complexity economics at the University of Oxford. Now the laws of physics threaten to end that windfall.

In other words, transistors have to be made from atoms, and as transistors shrink so that more can be packed onto a chip, a hard limit eventually looms: there are only so many atoms to work with. Well before that limit is reached, the reliability of the tiniest transistors falls while their cost rises, owing to the increasing complexity and difficulty of manufacturing them.
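To make the scale concrete (these figures are mine, not the article’s): crystalline silicon’s lattice constant is about 0.543 nm, so a feature only a few nanometres wide spans just a handful of atomic cells. A quick sketch:

    # Back-of-the-envelope: silicon lattice cells across a given feature size.
    # 0.543 nm is the standard lattice constant of crystalline silicon;
    # the feature sizes are illustrative process-node labels.
    SILICON_LATTICE_NM = 0.543

    for feature_nm in (90, 45, 22, 14, 7, 5):
        cells = feature_nm / SILICON_LATTICE_NM
        print(f"{feature_nm:>3} nm ≈ {cells:5.1f} lattice cells across")

At 5 nm that is fewer than ten lattice cells – little room left to keep halving.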
