Moore’s Law isn’t a law in the scientific sense. It is a prediction that has become a golden rule in electronics.
Gordon E. Moore predicted in a 1965 article in Electronics Magazine that the number of components that could fit on an integrated circuit would double every year. Although the terms of Moore’s Law changed over time as technology changed, developers and designers took it as a challenge and turned it into a self-fulfilling prophecy.
In its current form, Moore’s Law states that the number of transistors per semiconductor chip doubles roughly every two years at no added cost, allowing the computer industry to offer more processing power in smaller, lighter computing devices for the same price every two years. Even the revised estimate was a little wide of the mark: in the five decades following the 1965 prediction, transistor counts doubled approximately every 18 months.
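The doubling arithmetic above can be sketched in a few lines of Python. The starting count and ten-year horizon below are illustrative round numbers chosen for this sketch, not figures from the article; the point is simply how much the doubling period matters over time.

```python
def transistor_count(initial: float, years: float, doubling_period_years: float) -> float:
    """Project a transistor count, assuming one doubling per period."""
    return initial * 2 ** (years / doubling_period_years)

# Compare the two paces mentioned above over a single decade,
# starting from a hypothetical 2,300-transistor chip:
every_two_years = transistor_count(2_300, 10, 2.0)  # doubling every 2 years
every_18_months = transistor_count(2_300, 10, 1.5)  # doubling every 18 months

print(round(every_two_years))  # 2,300 * 2**5 = 73,600
print(round(every_18_months))  # roughly three times larger again
```

Five doublings versus six-and-two-thirds doublings in the same decade: the seemingly small gap between a two-year and an 18-month cadence compounds into a threefold difference in transistor count.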
The industry has followed this prediction for decades, but no one can guarantee that it will continue. It has been suggested that the trend will run its course in the 2020s; however, as technology evolves and incorporates methods such as microfabrication, it is unknown how long Moore’s Law will remain relevant to the electronics industry.
Moore’s Law and manufacturing
What exactly does Moore’s Law mean for a manufacturing business? Have you ever noticed that every two to four years you need to upgrade your electronics? As technology advances, so does the speed at which older technology becomes obsolete: it grows less efficient, and replacement parts are no longer made. For the manufacturing industry, this means older equipment must be updated frequently. Thanks to the digitization of technology, manufacturers can now buy better, smaller microfabricated components and machinery that achieve more for less money.
Moore’s Law and microfabrication
The vision of an infinitely empowered and interconnected future presents both challenges and opportunities. For more than half a century, shrinking transistors through microfabrication have powered advances in computing. Still, engineers and scientists will soon need to find new ways to make computers more capable. Rather than relying on physical processes alone, applications and software may help increase computer speed and efficiency. Cloud computing, wireless communication, the Internet of Things (IoT), and quantum computing may all play a role in future computer technology innovation.