The Fiftieth Anniversary of Intel's Microprocessor

If Intel did not build the first microprocessor, who did?

David A. Laws
8 min read · Nov 16, 2021


Advertisement in Electronic News, November 15, 1971. Image: Courtesy of the Computer History Museum

“This is what made computing really take off!” — Pat Gelsinger, Intel CEO

On November 15, 1971, Intel Corporation ran a two-page advertisement in Electronic News, the industry weekly newspaper, proclaiming "A New Era of Integrated Electronics." With Intel barely two years old, the announcement of the MCS-4 Micro Computer Set, including a "micro programmable computer on a chip," appeared to many industry analysts to be a distraction from the company's focus on developing a new market for DRAM semiconductor memories.

Intel 4004 CPU chip package. Photo: Courtesy Intel Corporation

In hindsight, after memory chips became a profitless business in the mid-1980s, this “distraction” proved to be the company’s savior.

Descendants of the 4004, the central processing unit of the MCS-4 family and the type of device now popularly known as a microprocessor, have profoundly impacted the course of electronics and changed the way the world lives, plays, and works.

Fifty years later, chip complexity has increased from the 2,300 transistors of the 4004 to billions of transistors on a sliver of silicon, and Intel remains the predominant manufacturer of microprocessors for computers.

Many writers over the years have described the 4004 as the world's first microprocessor. Intel does not make that claim. A news release celebrating the anniversary of the product's unveiling describes it as "the first commercially available microprocessor" and notes that, by offering a complete CPU element on a single chip, it paved the way for modern microprocessor-based computing. [1]

Who Invented the Microprocessor?

The MCS-4 Micro Computer Set datasheet, November 1971

So if Intel did not build the first microprocessor, who did? My blog article for the Computer History Museum, "Who Invented the Microprocessor?" published in 2018 [2], notes that the term "microprocessor" had been in use for decades.

That word is nowhere to be seen on the first Intel datasheet, published in November 1971, which calls the 4004 a 4-bit parallel CPU. It was several years before "microprocessor" was widely used to refer to a single-chip CPU element.

The following, taken mainly from the blog text, describes a chronology of early approaches to integrating the primary building blocks of a computer onto fewer and fewer microelectronic chips, eventually culminating in the concept of the microprocessor.

A general-purpose computer is made up of three basic functional blocks: a central processing unit (CPU), which combines arithmetic and control logic functions; a storage unit that holds programs and data; and I/O units (UART, DMA controller, timer/counters, device controllers, etc.) that interface to external devices such as monitors, printers, and modems.
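To make that division of labor concrete, the minimal C sketch below wires the three blocks together: a fetch-decode-execute loop (the CPU) reads a program from a flat memory array (the storage unit) and writes a character to the console (the I/O unit). The opcodes and names are invented purely for illustration; this models no particular historical machine.

/* toy_machine.c: the three functional blocks in miniature.
   CPU = registers + decode loop, storage = mem[], I/O = putchar. */
#include <stdio.h>
#include <stdint.h>

enum { HALT, LOAD, ADD, OUT };            /* hypothetical opcodes */

int main(void) {
    /* Storage unit: program and data share one address space. */
    uint8_t mem[16] = { LOAD, 'A', ADD, 1, OUT, HALT };

    uint8_t acc = 0;                      /* CPU: accumulator     */
    uint8_t pc  = 0;                      /* CPU: program counter */

    for (;;) {                            /* CPU: fetch-decode-execute */
        uint8_t op = mem[pc++];           /* fetch                 */
        switch (op) {                     /* decode and execute    */
        case LOAD: acc  = mem[pc++]; break;
        case ADD:  acc += mem[pc++]; break;
        case OUT:  putchar(acc);     break;    /* I/O unit */
        case HALT: putchar('\n');    return 0;
        }
    }
}

Trivial as it is (it prints "B"), the loop mirrors the partitioning that, in the hardware of the 1960s, spanned one or more boards of chips per block.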

In a 1964 IEEE paper, Ed Sack and other Westinghouse engineers described an early attempt to integrate large numbers of transistors on a silicon chip to fulfill the principal functions of a CPU. They stated that "Techniques have been developed for the interconnection of a large number of gates on an integrated-circuit wafer to achieve multiple-bit logic functions on a single slice of silicon. . . . Ultimately, the achievement of a significant portion of a computer arithmetic function on a single wafer appears entirely feasible." [3]

At that time, most small computer systems were built from multiple standard integrated circuit (IC) logic chips, such as the Texas Instruments SN7400 family, mounted on several printed circuit boards (PCBs). As these ICs increased in density, as forecast by Moore's Law, from a few logic gates per chip (SSI, small-scale integration) to tens of gates per chip (MSI, medium-scale integration), fewer boards were required to build a computer. By the end of the decade, minicomputer manufacturer Data General had reduced the CPU of its 16-bit Nova minicomputer to a single PCB.

MOS-LSI (Large Scale Integration)

A significant step toward the Westinghouse engineers' goal of putting a CPU on a single chip occurred as IC manufacturing processes transitioned from bipolar to metal oxide semiconductor (MOS) technology. By the mid-1960s, MOS enabled large-scale integration (LSI) chips to accommodate hundreds of logic gates. Designers of consumer digital products where small size was an advantage, such as calculators and watches, developed custom LSI chips. In 1965, calculator manufacturer Victor Comptometer contracted with General Microelectronics Inc. (GMe) to design 23 custom ICs for its first MOS-based electronic calculator. By 1969, Rockwell Microelectronics had reduced the chip count to four devices for Sharp's first portable machine. Mostek and TI introduced single-chip solutions in 1971.

By the early 1970s, small LSI-based computer systems emerged and were called microcomputers. Developers of these machines pursued the same techniques used by calculator designers, reducing the number of chips required to make up a CPU by creating more highly integrated LSI ICs, known as microcomputer chipsets. Each new step along the curve of Moore's Law reduced the number of chips required to implement the CPU, eventually leading to the single-chip product we know today as the microprocessor.

The Contenders for “First”

Fairchild Semiconductor began the development of standardized MOS computer system building blocks in 1966. The 3804, its first complete CPU bit slice, featured instruction decoding, a parallel four-bit ALU, registers, full condition-code generation, and the first use of an I/O bus. Designer Lee Boysel noted that it would not be considered a microprocessor because it lacked internal multi-state sequencing capability. Still, it was an important milestone in establishing the architectural characteristics of future microprocessors. [4] Boysel began work on the AL1, an 8-bit CPU bit slice for use in low-cost computer terminals, at Fairchild in 1968. After founding Four Phase Systems Inc., he completed the design and demonstrated working chips in April 1969. A single-terminal configuration employed one AL1 device; a multi-terminal server used three.
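A "bit slice" implements a narrow section of a processor's datapath and brings its carry signal out to the pins, so identical chips can be cascaded into wider word lengths. The short C sketch below illustrates the idea with a hypothetical 4-bit adder slice chained into an 8-bit adder; it is a conceptual model only, not the AL1's actual logic.

/* bit_slice.c: why identical "slices" compose into wider datapaths. */
#include <stdio.h>
#include <stdint.h>

/* One hypothetical 4-bit slice: adds two nibbles plus a carry-in
   and reports a carry-out for the next slice in the chain. */
static uint8_t slice_add(uint8_t a, uint8_t b, int cin, int *cout) {
    unsigned sum = (a & 0xF) + (b & 0xF) + (unsigned)cin;
    *cout = sum > 0xF;                    /* carry ripples onward */
    return (uint8_t)(sum & 0xF);
}

/* Two cascaded slices form an 8-bit adder; more slices, wider words. */
static uint8_t add8(uint8_t a, uint8_t b) {
    int carry = 0;
    uint8_t lo = slice_add(a & 0xF, b & 0xF, 0, &carry);
    uint8_t hi = slice_add(a >> 4, b >> 4, carry, &carry);
    return (uint8_t)((hi << 4) | lo);
}

int main(void) {
    printf("0x%02X\n", add8(0x3C, 0x0F));   /* prints 0x4B */
    return 0;
}

Presumably it is this cascading property that let the multi-terminal configuration combine three AL1 devices into a wider machine.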

Lee Boysel and the AL1 chip. Collection of the Computer History Museum, 102716365

The Computer History Museum collection holds a courtroom demonstration system using the AL1 processor chip. Links to a talk by Boysel and a commentary by Gordon Bell offer more insight into his work.

After Boysel left the company, Fairchild Semiconductor continued to invest in this area with the PPS-25 (Programmed Processor System), a set of 4-bit programmable chips introduced in 1971. Considered a "microprocessor" by some users [5], it found favor in scientific applications.

Beginning in 1968, Steve Geller and Ray Holt, working for Garrett AiResearch Corp. under contract from Grumman Aircraft, designed a highly integrated microcomputer chipset, designated MP944, for the Central Air Data Computer (CADC) in the US Navy F-14A "Tomcat" fighter. The processor unit comprised several arithmetic and control chips: the CPU, the PMD (parallel divider), the PMU (parallel multiplier), and the SLU (steering logic unit). American Microsystems Inc. manufactured the first complete chipset in 1970. [6]

Ray Holt and the MP944 chipset

Enter Intel

In 1969, Nippon Calculating Machine Corp. approached Intel about designing a set of integrated circuits for its engineering prototype calculator, the Busicom 141-PF. Intel’s Marcian (Ted) Hoff and Stanley Mazor conceived the MCS-4 Micro Computer Set as the most efficient solution for the application. Federico Faggin, aided by engineer Masatoshi Shima, designed and built the first chips in January 1971. Intel acquired rights to the products and introduced them to the commercial market in November of that year.

Intel MCS-4 and MCS-8 design teams (Faggin, Shima, Mazor, Feeny, Hoff) and their CPU chips

By enhancing the MOS silicon-gate process he had pioneered with Tom Klein at Fairchild, Faggin squeezed 2,300 transistors onto the 4004 CPU device to create one of the densest chips yet fabricated. This level of complexity allowed the 4004 to incorporate more of the essential logical elements of a processor onto a single chip than prior solutions: a program counter, instruction decode and control logic, the ALU, data registers, and the data path between those elements. In addition to calculators, the programmable features of the 4004 enabled applications in peripherals, terminals, process controllers, and test and measurement systems. While sales of the chipset were modest, the project launched Intel into a profitable new business opportunity.

Vic Poor, vice president of R&D at Computer Terminal Corporation (CTC), approached Intel in 1969 with a request to develop a special-purpose memory stack for the Datapoint 2200 Programmable Terminal. Drawing on its approach to the Busicom project, Intel responded with a proposal to integrate the entire CPU as an 8-bit processor and signed a contract with CTC in early 1970. Begun by Hal Feeney and completed under Federico Faggin, the 8008 design yielded working units that were delivered to the customer in late 1971, followed by a public announcement in February 1972.

Presented with an early copy of the Intel specification by CTC, Texas Instruments assigned engineer Gary Boone to design a competing single-chip 8-bit processor. The first units of his design, designated the TMX1795, were delivered to CTC in mid-1971, several months before Intel's 8008. [8] But because Intel's original specification contained an error in one of the instructions, the TI devices did not function in the application.

Gary Boone and Texas Instruments TMX1795 chip

Although CTC ultimately chose not to use either version of the processor, the experience paid off handsomely for both companies. For Intel, the 8008 led to the highly successful 8080 processor and ultimately to the entire x86 family of microprocessors, the most successful in history. Boone's design concepts, applied to single-chip calculators and microcontrollers, made TI a leader in those markets for many years.

So, who deserves the “inventor” credit?

Most historians who follow the tangled threads underlying the origins of the microprocessor conclude that it was simply an idea whose time had come. Throughout the 1960s, numerous semiconductor manufacturers steadily increased the number of transistors that could be integrated onto a single silicon chip. Correspondingly, many government-funded and commercial computer designers sought to reduce the number of chips in a system. The eventual implementation of the primary functions of a CPU on a chip was both inevitable and obvious.

References

[1] https://www.intel.com/content/www/us/en/newsroom/news/intel-marks-50th-anniversary-4004.html#gs.g3sm5b

[2] https://computerhistory.org/blog/who-invented-the-microprocessor/

[3] E. A. Sack, R. C. Lyman, and G. Y. Chang, "Evolution of the Concept of a Computer on a Slice," Proceedings of the IEEE, vol. 52, no. 12 (Dec. 1964), pp. 1713–20.

[4] Lee Boysel, "History of the Microprocessor: Fact, Fiction, and Patents," draft of unpublished paper (1998).

[5] R. Kitai, I. Renyi, and F. Vajda, "Microprocessor Application in a Walsh Fourier Spectral Analyzer," IEEE Computer, vol. 9, no. 4 (April 1976), p. 27.

[6] Ray Holt, "Architecture of a Microprocessor," accepted for publication in Computer Design in 1971, withdrawn for security reasons, and released in 1998.

[8] "The Texas Instruments TMX1795," Ken Shirriff blog, May 10, 2015.

