What Is Memory Bandwidth?
Computers need memory to store and move data, whether for graphics processing or simply loading a document. A random access memory (RAM) module is sold by its capacity, such as 10 gigabytes (GB), which is the amount of data it can hold at once. Just as important, though, is the memory bandwidth: the rate at which data can be read from or written to that memory, measured in gigabytes per second (GB/s). Capacity and bandwidth are separate specifications, and a computer can feel slow even with plenty of RAM if bandwidth is the bottleneck. An aging computer can also deliver less of its rated bandwidth in practice, for example when dust buildup causes the processor to overheat and throttle its clock speed.
When someone buys a RAM module, the packaging states its capacity, such as 10 GB. That number says nothing about bandwidth. A module's rated bandwidth comes from its data rate and the width of the memory bus, and real-world throughput is generally lower than the rated peak.
Processor speed refers to the central processing unit (CPU) and how fast it runs. CPU speed, also known as clock speed, is measured in hertz, typically megahertz (MHz) or gigahertz (GHz). A faster clock lets the processor issue memory requests more quickly, but the memory's own clock rate and bus width set the ceiling on bandwidth. Clock speed does not steadily fall as a computer ages, though overheating can force a CPU to throttle down to a lower speed temporarily.
To estimate peak memory bandwidth, a simple formula can be used: multiply the memory's transfer rate, in transfers per second, by the number of bytes moved per transfer, and then by the number of memory channels. Adding a second RAM module to a second channel doubles the channel count in the formula. This calculation, not the capacity printed on the module, is how hardware companies arrive at the bandwidth figures they advertise.
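The bandwidth formula described above can be sketched in a few lines. The figures used here (a DDR4-3200 module with a 64-bit bus in a dual-channel configuration) are illustrative assumptions, not values taken from the article.

```python
# Sketch: computing theoretical peak memory bandwidth.
# Assumed example figures: DDR4-3200, 64-bit bus, two channels.

def peak_bandwidth_gbs(transfers_per_sec, bus_width_bits, channels):
    """Peak bandwidth in GB/s = transfer rate x bytes per transfer x channels."""
    bytes_per_transfer = bus_width_bits / 8
    return transfers_per_sec * bytes_per_transfer * channels / 1e9

# DDR4-3200: 3.2 billion transfers per second on a 64-bit bus, dual channel.
print(peak_bandwidth_gbs(3.2e9, 64, 2))  # 51.2 GB/s
```

Real workloads rarely reach this peak; it is an upper bound set by the memory hardware itself.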
Memory bandwidth is essential to accessing and using data. When bandwidth is the bottleneck, the processor spends time waiting for data to arrive, so programs load and respond more slowly. Many consumers buy larger RAM modules to fix this problem, but extra capacity does not help if the limit is bandwidth; a faster memory configuration, or a platform with more memory channels, is what actually raises it.
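One rough way to see the effective bandwidth a system actually delivers, as opposed to its theoretical peak, is to time a large in-memory copy. This is only a sketch; the buffer size is an arbitrary choice, and a careful measurement would repeat the copy several times and account for caching.

```python
# Sketch: estimating effective memory bandwidth by timing a large copy.
# Achievable throughput is normally well below the theoretical peak.
import time

def measured_bandwidth_gbs(size_mb=256):
    data = bytearray(size_mb * 1024 * 1024)
    start = time.perf_counter()
    copy = bytes(data)          # forces the whole buffer to be read and written
    elapsed = time.perf_counter() - start
    # Count one full read plus one full write of the buffer.
    return (2 * len(copy)) / elapsed / 1e9

print(f"~{measured_bandwidth_gbs():.1f} GB/s")
```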
Several things can make memory performance seem to degrade over time. Dust buildup can cause the CPU to run hot and throttle its clock speed, reducing sustained performance. Newer programs also tend to demand more memory and processing power, so the same hardware feels slower by comparison. Background processes, including malware running behind the scenes, consume memory and CPU time, leaving less bandwidth for the work the user actually cares about.
Yes -- transistors do degrade over time, and that means CPUs do too. But keep a couple of things in mind. Computer manufacturers are very conservative in setting clock rates so that CPUs last a long time. Also, those older computers don't run as "hot" as newer ones, because they do far less processing than modern computers operating at clock speeds that were inconceivable just a couple of decades ago. So you might not notice any performance hit in an older machine even after 20 or 30 years.
Here's a question -- has an effective way to measure transistor degradation been developed? For people with multi-core, data-crunching monsters, that is an important question.
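One low-tech answer to that question is to run the same fixed workload on the same machine at intervals over the years and log the timings; any sustained drift would show up in the numbers. The workload and iteration count below are arbitrary choices for illustration.

```python
# Sketch: a tiny repeatable benchmark for tracking performance drift.
import time

def benchmark(iterations=2_000_000):
    start = time.perf_counter()
    total = 0
    for i in range(iterations):
        total += i * i          # fixed integer workload
    elapsed = time.perf_counter() - start
    return elapsed, total

elapsed, _ = benchmark()
print(f"fixed workload took {elapsed:.3f} s")
```

Note that on modern machines, run-to-run variation from thermal throttling and background activity can easily swamp genuine hardware aging, so many runs would be needed for a meaningful trend.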
Anyway, one of the great things about older computers is that they use very inexpensive CPUs, and a lot of those are still available. If worst comes to worst, you can find replacement parts easily. Heck, a lot of them are still in use in "embedded" designs and are still manufactured.
Should people who collect and still use older hardware be concerned about this issue? Take a fan of the Apple II line, for example. The old 8-bit 6502 CPU that powers even the "youngest" Apple //e Platinum is now more than 20 years old. If the CPUs in those machines are degrading, the people who love these vintage machines may want to take some steps to preserve them.