Moore’s Law and the Incredible Shrinking Transistor

Moore’s Law:

The law is named after Intel co-founder Gordon E. Moore, who described the trend in his 1965 paper.[6][7][8] The paper noted that the number of components in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965 and predicted that the trend would continue “for at least ten years”.[9] His prediction has proved to be uncannily accurate, in part because the law is now used in the semiconductor industry to guide long-term planning and to set targets for research and development.[10]

This trend has continued for more than half a century. 2005 sources expected it to continue until at least 2015 or 2020.[note 1][12] However, the 2010 update to the International Technology Roadmap for Semiconductors has growth slowing at the end of 2013,[13] after which time transistor counts and densities are to double only every 3 years.

(Wikipedia, 2012)
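The doubling trend quoted above is easy to sketch numerically. Here's a minimal Python illustration of exponential scaling under the two doubling periods mentioned (the baseline count of 2,300 is the early Intel 4004 figure, used here purely as a hypothetical starting point):

```python
# Illustrative sketch of Moore's Law scaling (hypothetical baseline figure).
def transistor_count(years_elapsed, base_count=2300, doubling_period=2.0):
    """Project a transistor count after `years_elapsed` years,
    assuming the count doubles every `doubling_period` years."""
    return base_count * 2 ** (years_elapsed / doubling_period)

# Doubling every 2 years vs. the slower 3-year pace mentioned above,
# projected over a decade:
print(round(transistor_count(10, doubling_period=2)))  # 73600 (32x baseline)
print(round(transistor_count(10, doubling_period=3)))  # ~10x baseline
```

The gap between the two curves is the whole story of the roadmap slowdown: over ten years, a 3-year doubling period yields roughly a third of the density a 2-year period would.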

As noted above, Moore’s Law has been the moving force in the computer community for 47 years. For a while, the Law looked like it was coming up against the proverbial brick wall, with the advent of quantum computing waiting in the wings. But quantum computing is going to have to wait, or is going to be slightly different from what was originally prognosticated:

Moore’s Law could be safe for another decade or so. An international team of scientists has demonstrated a working transistor composed of a single atom–nearly 100 times smaller than the 22-nanometer cutting-edge transistors fabricated by Intel.

More importantly, the research team led by Michelle Simmons of the University of New South Wales in Sydney was able to show a method for repeating the process with great accuracy and in a fashion that is compatible with the CMOS technology used in transistor fabrication today.

“This is the first time anyone has shown control of a single atom in a substrate with this level of precise accuracy,” said Simmons, who worked with colleagues from the Korea Institute of Science and Technology Information, Purdue University, the University of Sydney, the University of Melbourne, and the University of New South Wales on the project.

The “law” associated with Intel co-founder Gordon Moore predicts a steady rate at which the density of transistors on silicon-based semiconductors increases over time. That steady procession of ever-smaller computer circuitry has held up for decades, but as the size of transistors approaches atomic scales, there have been serious questions as to whether Moore’s Law can last much longer than another five years or so.

The work of Simmons and her colleagues could show a way to keep making microprocessor circuitry smaller and smaller through 2020 and beyond.

As they run up against atomic scales with ever-smaller circuitry, semiconductor manufacturers today are running up against problems affecting transistor performance that stem from quantum effects (basically, the fact that materials interact very differently at very small sizes) and a need for precision that may not be possible with the lithographic methods currently in use.

In recent years, advances in quantum computing have offered a viable path to smaller and smaller transistors, to be sure. But the new research might be the first strong sign that atomic-level transistor fabrication can be done in keeping with the part of Moore’s Law that’s often forgotten amidst the wonderment over tinier and tinier computer chips–that it be done cheaply.

Using a “combination of scanning tunneling microscopy and hydrogen-resist lithography,” the team was able to “deterministically” place an individual phosphorus dopant atom “within an epitaxial silicon device architecture with a spatial accuracy of one lattice site,” according to a paper published Sunday in the journal Nature Nanotechnology.

In layman’s terms, that means the researchers are able to stick the phosphorus atom (used to “dope,” or add an electron charge to a silicon substrate) precisely where they want to, whenever they want to.

That’s important, because as transistors approach the size of atoms, it becomes hugely important to place each of those atoms very precisely. On larger scales, silicon can be doped with less accuracy and still produce the electrical current needed to switch between “on” and “off,” the essence of what a transistor does and how it works.

Hmm… this is the crux of the standard technology: the ability to turn the electrical current “on” and “off”, the “ones” and “zeros” of simple binary code itself. There’s no worrying about “qubits” existing in both states at the same time, or about how the act of “observation” is going to affect calculations.
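Those on/off switch states map directly onto binary encoding. A toy Python sketch (my own illustration, not from the article) of how a number is just a row of such switches:

```python
# Toy illustration: a transistor's on/off states are the 1s and 0s of binary.
def to_bits(n, width=8):
    """Return the binary digits of n as a list of 0/1 'switch states',
    most significant bit first."""
    return [(n >> i) & 1 for i in range(width - 1, -1, -1)]

print(to_bits(42))  # [0, 0, 1, 0, 1, 0, 1, 0]
```

Each element of that list is, physically, one transistor either passing current or blocking it; that determinism is exactly what a qubit in superposition doesn't give you.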

As noted above, the quantum effects are going to become noticeable anyway, simply because of the atomic scale size of the processors.

But I surmise the theme here isn’t just perfecting the size of the technology; it’s how cheaply the technology can be done now, and how cost-effectively the processors can be manufactured.

So not to worry, Singularitarians: this will only enhance the availability of cybernetic enhancements!

Researchers Develop Single-Atom Transistor

Thanks to the Daily Grail