The History of Intel
As it currently stands, Intel is one of the most powerful and influential technology companies ever to exist. Not only does the company design and manufacture the semiconductor chips at the heart of most major computers, but it was co-founded by Gordon Moore (of "Moore's Law" fame) himself. Yet despite how much of modern life owes itself to the innovation of this one particular company, few know the history behind one of the world's largest and most highly valued makers of microprocessors, motherboard chipsets, network interface controllers, integrated circuits, flash memory, graphics chips, and embedded processors.
Intel was co-founded in Mountain View, California, in 1968 by chemist Gordon E. Moore and physicist Robert Noyce, who had left Fairchild Semiconductor to start a risky venture of their own. The company was almost called "Moore Noyce," but the two men quickly rejected the name upon realizing it is a homophone for "more noise," and noise in electronics is generally associated with poorly made products. They operated briefly as NM Electronics before settling on Integrated Electronics, or "Intel" for short.
Intel first distinguished itself by its ability to make semiconductor memory. Its first product, the 3101 Schottky TTL bipolar 64-bit static random-access memory (SRAM), was nearly twice as fast as competing Schottky diode implementations from Fairchild and the Electrotechnical Laboratory. That same year (1969), Intel released the 3301 Schottky bipolar 1024-bit read-only memory (ROM), as well as the first commercial metal-oxide-semiconductor field-effect transistor (MOSFET) silicon-gate SRAM chip to reach the market, the 256-bit 1101. From its founding, in other words, it was clear that Intel had the know-how to compete in the electronics market.
What took a little more time was building the company's manufacturing infrastructure and expanding its reach across the U.S. and globally. It took most of the 1970s for Intel to catch up with more established companies, but all the while it was releasing major innovations that revolutionized the field of electronics.
In 1971, for example, Intel created the first commercially available microprocessor, the Intel 4004, and in 1972 it released one of the world's first microcomputers. Over the decades that followed, Intel's business model changed many times.
In the 1980s, its business was driven mainly by dynamic random-access memory (DRAM) chips. Eventually, however, Japanese semiconductor manufacturers competed so effectively that the profitability of that line of business fell substantially. Gordon Moore looked to IBM's success with personal computers (which were built around an Intel microprocessor) and decided to shift the company's focus to microprocessors and to restructure its business model accordingly. In hindsight, many business analysts have attributed Intel's success to Moore's foresight on this issue, as well as to his decision to sole-source Intel's 386 chip.
As the 1980s drew to a close, Intel entered a decade of unprecedented growth, occupying a highly profitable position as the primary hardware supplier to the PC industry: it supplied microprocessors to both IBM and IBM's competitors as the personal computer market exploded. Intel also launched a very successful advertising campaign in 1991, known as "Intel Inside," which wedded brand loyalty to consumer selection and made Intel a household name throughout the 1990s.
Since then, Intel has gone through its ups and downs, but it remains one of the most recognized and esteemed electronics brands on the market.