In Computers, what is Moore's Law?
Moore's Law is a concept first proposed in 1965 by Gordon E. Moore, who went on to co-found Intel, a major American technology company. Simply put, it states that the number of transistors on a microchip increases exponentially, typically doubling every two years. Since microchips are the powerhouses of the electronics industry, this exponential progression has a huge impact on computer hardware.
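The doubling described above is ordinary compound growth, which makes it easy to sketch in a few lines of code. This is a minimal illustration, not a model of any real product line; the starting figure of 2,300 transistors in 1971 (often cited for Intel's 4004 chip) is used here purely as an example baseline.

```python
# Project transistor counts under Moore's Law, assuming a clean
# doubling every two years. Purely illustrative.

def projected_transistors(start_count: int, start_year: int, year: int,
                          doubling_period: float = 2.0) -> int:
    """Project a transistor count forward from a baseline year."""
    doublings = (year - start_year) / doubling_period
    return round(start_count * 2 ** doublings)

if __name__ == "__main__":
    # Example baseline: ~2,300 transistors in 1971.
    for y in (1971, 1981, 1991, 2001):
        print(y, projected_transistors(2300, 1971, y))
```

Even from this toy projection, the striking feature of the law is visible: each decade multiplies the count by 32, so small differences in the assumed doubling period compound into enormous differences over time.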
Moore's observation was based on his experience in the integrated circuit manufacturing industry. He noticed that manufacturers were able to double the number of transistors on an individual chip approximately every 18 to 24 months, and that this trend held steady through multiple generations of chips. The phenomenon came to be known as “Moore's Law” thanks to Carver Mead, a professor at the California Institute of Technology, who coined the phrase in the 1970s.
A glance at a graph tracking microchip production suggests that the law has held, although people argue over its limit; several studies indicate that this exponential growth rate may end between 2017 and 2025, as manufacturers reach the physical limits of miniaturization. Moore's Law isn't just about the raw number of transistors on a chip; it also influences prices for microchips, and for electronics in general as a result.
By applying this law, people can predict price points for a wide range of consumer electronics, including computers, digital cameras, and phones. A larger number of transistors increases the power and capability of electronics, meaning that companies are constantly releasing new and improved versions of their products. This can be frustrating for consumers who buy a top-of-the-line product, only to discover that its price falls rapidly within a year or so. An awareness of this trend leads some consumers to reach for midrange electronics rather than aiming for the best.
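The price decline described above can be sketched the same way, as exponential decay rather than growth. The 30% annual drop used here is an assumed rate chosen purely for illustration, not an empirical figure for any product category.

```python
# A rough sketch of how a gadget's price might fall after launch,
# assuming (for illustration only) a fixed 30% drop per year.

def projected_price(launch_price: float, years_since_launch: int,
                    annual_drop: float = 0.30) -> float:
    """Estimate a gadget's price a given number of years after launch."""
    return round(launch_price * (1 - annual_drop) ** years_since_launch, 2)

if __name__ == "__main__":
    for year in range(4):
        print(f"year {year}: ${projected_price(1000.0, year):,.2f}")
```

Under that assumed rate, a $1,000 gadget would sell for $700 after one year and $490 after two, which is why waiting even a single product cycle can change the value calculation so much.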
Technology companies sometimes feel intense pressure as a result of Moore's Law. Although Moore's original proposal was merely an observation of industry trends, some companies treat it as a literal law, trying to double the capacity of their computer components every year. Major chip manufacturers, including Intel, tend to release new chips on a roughly two-year cycle, reflecting scientific development, consumer demand, and the popular expectations the law has created. As Gordon Moore himself pointed out in 2005, chip development has to stop somewhere: ultimately, technology companies will run up against limits at the atomic level, unable to go any smaller.
I think computer networks have added a new wrinkle to the concept of Moore’s law. A network can multiply the power of its computers. You see this in things like cloud computing, where the entire network performs calculations instead of just one processor. I don’t think Moore foresaw the application of his law to networks.
Moore’s law has a direct impact on the prices of popular electronic consumer gadgets that come to market. Prices start high and gradually drop as the technology improves. There are even terms for consumers who buy in at different points along the price curve.
For example, the first people to buy new gadgets are called the “innovators.” These are the people who are willing to stand in long lines and pay top dollar just to use the new product.
Next come the “early adopters,” who wait until the price drops a little, then get in on the act. After them come the “early majority,” the “late majority,” and the “laggards.” I’m a laggard myself. I’ll wait until a product reaches critical mass before I hop on the bandwagon, knowing the newer version of the product will deliver more bang for the buck.
I think we’ve reached the peak with Moore’s Law. It’s difficult to keep doubling computer speed, because every doubling increases the amount of heat generated. As a result, you need to add more and more thermal protection to your chip, plus a whole bunch of fans, just to keep your computer from overheating.
I think the next leap will be nanotechnology in computers. When little “nano bots” replace the microprocessor, computer technology will take a huge leap forward. Until that day, the best bet is to upgrade your RAM if you want to boost your speed.