
The Future of Computer Chips: Quantum Computing as a Possibility

The best things come in small packages, and nowhere is this truer than in technology. The saying does not just apply to compact gadgets that do a lot, like tablets and smartphones that perform as well as full-sized personal computers. It also applies to the small components inside them that let us do so much, such as computer chips. These chips keep improving through constant innovation, but even innovation has limits. To overcome those limits, the industry is turning its attention to quantum computing.

Transistors Then and Now

What makes computer chips powerful are the minuscule transistors found in them. Transistors are tiny switches that control electrical signals. Intel raised the bar with its first commercial microprocessor, the 4004 chip, released in 1971. The 4004 had 2,300 transistors, each measuring about ten micrometres across. Since then, chip makers have sped up their processors by packing as many transistors as possible onto the surface of a chip, which is about the size of a fingernail.

At present, advanced computer chips contain billions of nanoscale switches. The shift to the nanoscale has become necessary given modern society’s appetite for information. Today’s society creates and shares data in enormous quantities; one need only look at social media to see how much information is created and consumed every day. According to Peter Bentley, a professor at University College London, all this information has to be stored and processed somewhere. Bentley told CNN that the choice is either to ration data usage per individual or to build smaller devices that are faster and use less energy. For computer chips, small is essential, especially since ever more transistors will be needed.

The Birth of ‘3-D’ Computer Chips

Gordon Moore, co-founder of Intel, predicted that the number of transistors on a chip would double roughly every two years. This prediction, which became known as Moore’s Law, became the guiding principle Intel used to design its chips over the decades. The pursuit of that pace eventually produced the ‘3-D’ or ‘Tri-Gate’ transistor, a design quite distinct from the planar transistors Intel and other chip manufacturers had built in the past.
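To see what that doubling actually implies, here is a minimal sketch in Python. The starting figures are the 4004’s from above; the smooth doubling every two years is an idealisation of the law, not a record of how any particular product line progressed:

```python
# A rough illustration of Moore's Law: start from the 4004's
# 2,300 transistors in 1971 and double the count every two years.
def moores_law_estimate(year, base_year=1971, base_count=2_300):
    """Projected transistor count under an idealised two-year doubling."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011):
    print(f"{year}: ~{moores_law_estimate(year):,.0f} transistors")
```

Running this lands in the low billions by 2011, which is the same order of magnitude as the real chips of that era mentioned below.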

Intel vice president Kaizad Mistry noted that for the past 50 years, transistors were designed to conduct electricity along the flat, planar surface of a small silicon wafer. The Tri-Gate transistor is built differently: pillars rise from the surface of the wafer, and the current passes along three sides of each pillar, giving better conduction. Its design simply makes it a better switch. The Ivy Bridge chip is a prime example of where chips need to go: with 1.4 billion transistors, it conducts electricity better, runs faster and uses less energy than Intel’s original 4004.

The redesign of the chip was made necessary by the aforementioned information overload. Traditional chips run too hot, so newer chips have to work at lower voltages, and keeping voltages low means designing much smaller transistors. Intel’s 14-nanometre transistors are already in the works, but chip makers will need to prepare even smaller ones for the future. Further shrinking, however, poses a problem for manufacturers, and not simply because of size. At this point, quantum effects are the hurdle.

On the Brink of Quantum Computing

Quantum tunneling, in which electrons pass through barriers that would normally block them, presents problems for computer chip design. Going smaller will not make transistors work better; instead, it may cause them to stop working altogether, as electrons leak through parts of the switch that are meant to be closed. Professor Bentley noted that the best way past the problem is to work on quantum computing.
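To get a feel for why shrinking fails so abruptly, here is a back-of-the-envelope sketch in Python using the standard textbook estimate for an electron tunneling through a rectangular barrier. It is a deliberately simplified model; real transistor leakage is far messier, and the 1 eV barrier height is an illustrative assumption, not a device measurement:

```python
import math

HBAR = 1.054_571_817e-34   # reduced Planck constant, J*s
M_E = 9.109_383_7e-31      # electron mass, kg
EV = 1.602_176_634e-19     # one electronvolt in joules

def tunneling_probability(barrier_nm, barrier_ev=1.0):
    """Crude estimate of an electron tunneling through a rectangular
    barrier of the given width (nm) and height (eV)."""
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # decay rate, 1/m
    return math.exp(-2 * kappa * barrier_nm * 1e-9)

for width in (5, 3, 2, 1):  # barrier widths in nanometres
    print(f"{width} nm: {tunneling_probability(width):.2e}")
```

The probability grows exponentially as the barrier thins, which is why a switch that leaks negligibly at one size can leak intolerably only a few nanometres smaller.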

Unlike traditional computers, quantum computers manipulate individual particles such as electrons and photons to process information. They work with qubits (quantum bits) rather than bits (binary digits). Because a register of qubits can exist in a superposition of many states at once, quantum computers promise to solve certain problems far faster, in effect weighing many candidate solutions simultaneously before settling on an answer.
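To make the bit-versus-qubit distinction concrete, here is a minimal sketch in plain Python with NumPy (a hand-rolled toy, not a real quantum SDK): a qubit is a pair of complex amplitudes, and a register of n qubits needs 2**n of them, which hints at where the extra power comes from.

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit is a unit vector of two
# complex amplitudes; measuring it yields 0 or 1 with probabilities
# equal to the squared magnitudes of those amplitudes.
zero = np.array([1, 0], dtype=complex)            # the |0> state

# The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
superposed = H @ zero                              # amplitudes ~(0.707, 0.707)

print(np.abs(superposed) ** 2)                     # [0.5 0.5]

# The state of n qubits needs 2**n amplitudes, which is why even a
# powerful classical machine struggles to simulate a few dozen qubits.
n = 3
register = np.zeros(2 ** n, dtype=complex)
register[0] = 1                                    # all three qubits in |0>
print(len(register))                               # 8
```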

Professor Bentley says there is still a long way to go before quantum computing becomes a reality, but given the progress in computer chip development, it is certainly in our future.

This article was penned by Betty Fulton, a full-time writer. With her keen interest in computers and everything related to them, she is a natural contributor for PC Doc.
