Colossus was the first fully electronic computer. Instead of relays, it used a better form of switch known as a vacuum tube (also known, especially in Britain, as a valve). The vacuum tube, each one about as big as a person's thumb and glowing red hot like a tiny electric light bulb, had been invented in 1906 by the American engineer Lee de Forest (1873–1961), who named it the Audion. This breakthrough earned de Forest his nickname as "the father of radio," because the tubes' first major use was in radio receivers, where they amplified weak incoming signals so people could hear them more clearly.
Just like the codes it was trying to crack, Colossus was top-secret, and its existence wasn't confirmed until after the war ended. ENIAC, completed in the United States in 1946, contained nearly 18,000 vacuum tubes (nine times more than Colossus), was around 24 m (80 ft) long, and weighed almost 30 tons. ENIAC is generally recognized as the world's first fully electronic, general-purpose, digital computer.
Colossus might have qualified for this title too, but it was designed purely for one job (code-breaking); since it couldn't store a program, it couldn't easily be reprogrammed to do other things.
ENIAC was just the beginning. Its two inventors, J. Presper Eckert and John Mauchly, formed the Eckert–Mauchly Computer Corporation in the late 1940s. In a key piece of work, the Hungarian-American mathematician John von Neumann (1903–1957) helped to define how a machine could store and process its programs, laying the foundations for how all modern computers operate. Eckert and Mauchly were helped in their work by a young, then largely unknown American mathematician and Naval reservist named Grace Murray Hopper (1906–1992), who had originally been employed by Howard Aiken on the Harvard Mark I. Their new machine, the UNIVAC, was then manufactured for other users and became the world's first large-scale commercial computer.
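The stored-program idea that von Neumann helped formalize can be sketched in a few lines of Python: a toy machine whose instructions live in the same memory as its data, so "reprogramming" just means loading different values into memory. The instruction set below is invented purely for illustration and corresponds to no real computer.

```python
def run(memory):
    """A minimal fetch-decode-execute loop over one shared program/data memory."""
    acc = 0   # a single accumulator register
    pc = 0    # program counter: address of the next instruction
    while True:
        op, arg = memory[pc]      # fetch the instruction at the program counter
        pc += 1
        if op == "LOAD":          # decode and execute it
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program and data share one memory: cells 0-3 hold instructions,
# cells 4-6 hold data. Swapping the contents of cells 0-3 would give
# the same hardware an entirely different behavior.
memory = {
    0: ("LOAD", 4),
    1: ("ADD", 5),
    2: ("STORE", 6),
    3: ("HALT", None),
    4: 2, 5: 3, 6: 0,
}
run(memory)
print(memory[6])   # -> 5
```

The point of the sketch is the architecture, not the arithmetic: because the program is just data in memory, a stored-program machine can be repurposed without rewiring, which is exactly what Colossus could not do.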
Which one was truly the first great modern computer? All of them and none: these machines, and several other important ones, evolved our idea of the modern electronic computer during the key period between the late 1930s and the early 1950s. Vacuum tubes were a considerable advance on relay switches, but machines like ENIAC were notoriously unreliable. The modern term for a problem that holds up a computer program is a "bug": according to computer folklore, the word caught on after Grace Hopper's team found a moth jammed inside a relay of the Harvard Mark II and taped it into the machine's logbook.
But there were other problems with vacuum tubes too. They consumed enormous amounts of power: ENIAC drew about 150 kilowatts of electricity, thousands of times more than a modern laptop. And they took up huge amounts of space. Military needs were driving the development of machines like ENIAC, but the sheer size of vacuum tubes had now become a real problem. ENIAC's designers had boasted that its calculating speed was many hundreds of times greater than that of any other existing computing machine. So a new technology was urgently required. The solution appeared in 1947, thanks to three physicists working at Bell Telephone Laboratories (Bell Labs).
John Bardeen (1908–1991), Walter Brattain (1902–1987), and William Shockley (1910–1989) were then helping Bell to develop new technology for the American public telephone system, so that the electrical signals carrying phone calls could be amplified more easily and carried further. Shockley, who was leading the team, believed he could use semiconductors (materials such as germanium and silicon that allow electricity to flow through them only when they've been treated in special ways) to make a better form of amplifier than the vacuum tube.
When his early experiments failed, he set Bardeen and Brattain to work on the task for him. Eventually, in December 1947, they created a new form of amplifier that became known as the point-contact transistor. Bell Labs credited Bardeen and Brattain with the transistor and awarded them a patent. This enraged Shockley and prompted him to invent an even better design, the junction transistor, which has formed the basis of most transistors ever since. Like vacuum tubes, transistors could be used as amplifiers or as switches.
But they had several major advantages. They were a fraction of the size of vacuum tubes (typically about as big as a pea), used no power at all unless they were in operation, and were virtually 100 percent reliable. The transistor was one of the most important breakthroughs in the history of computing, and it earned its inventors the world's greatest science prize, the 1956 Nobel Prize in Physics. By that time, however, the three men had already gone their separate ways.
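The reason switches matter so much to computing is that switches compose into logic gates, and gates compose into arithmetic. A minimal software sketch of that chain, treating each function as an idealized transistor switch (the composition, not the electronics, is the point here):

```python
def nand(a, b):
    # Idealized pair of transistor switches in series: the output is
    # pulled low (0) only when both inputs are conducting (1).
    return 0 if (a and b) else 1

# Every other gate can be built from NAND alone.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    """Add two one-bit numbers; returns (sum_bit, carry_bit)."""
    return xor_(a, b), and_(a, b)

print(half_adder(1, 1))   # -> (0, 1): one plus one is binary 10
```

Chain enough of these one-bit adders together and you have the arithmetic unit of a computer, which is why a smaller, cooler, more reliable switch transformed the whole field.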
John Bardeen had begun pioneering research into superconductivity, which would earn him a second Nobel Prize in 1972. Walter Brattain moved to another part of Bell Labs. William Shockley decided to stick with the transistor, eventually forming his own corporation to develop it further. His decision would have extraordinary consequences for the computer industry. With a small amount of capital, Shockley set about hiring the best brains he could find in American universities, including young electrical engineer Robert Noyce (1927–1990) and research chemist Gordon Moore (1929–2023).
It wasn't long before Shockley's idiosyncratic and bullying management style upset his workers. In 1957, eight of them, including Noyce and Moore, left Shockley Transistor to found a company of their own, Fairchild Semiconductor, just down the road. Thus began the growth of "Silicon Valley," the part of California centered on Palo Alto, where many of the world's leading computer and electronics companies have been based ever since.
It was in Fairchild's California building that the next breakthrough occurred, although, somewhat curiously, it also happened at exactly the same time in the Dallas laboratories of Texas Instruments. In Dallas, a young engineer from Kansas named Jack Kilby (1923–2005) was considering how to improve the transistor. Although transistors were a great advance on vacuum tubes, one key problem remained. Machines that used thousands of transistors still had to be hand-wired to connect all these components together.
That process was laborious, costly, and error-prone. Wouldn't it be better, Kilby reflected, if many transistors could be made in a single package? This prompted him to invent the "monolithic" integrated circuit (IC), a collection of transistors and other components that could be manufactured all at once, in a block, on the surface of a semiconductor.
Kilby's invention was another step forward, but it also had a drawback: the components in his integrated circuit still had to be connected by hand. While Kilby was making his breakthrough in Dallas, unknown to him, Robert Noyce was perfecting almost exactly the same idea at Fairchild in California. Noyce went one better, however: he found a way to include the connections between components in an integrated circuit, thus automating the entire process.
Photo: An integrated circuit seen from the inside. Integrated circuits, as much as transistors, helped to shrink computers during the 1960s. As the 1960s wore on, integrated circuits became increasingly sophisticated and compact. Soon, engineers were speaking of large-scale integration (LSI), in which hundreds of components could be crammed onto a single chip, and then very large-scale integration (VLSI), when the same chip could contain thousands of components.
The logical conclusion of all this miniaturization was that, someday, someone would be able to squeeze an entire computer onto a chip. In 1968, Noyce and Moore left Fairchild to set up a company of their own. With integration very much in their minds, they called it Integrated Electronics, or Intel for short. Originally they had planned to make memory chips, but when the company landed an order to make chips for a range of pocket calculators, history headed in a different direction.
A couple of their engineers, Federico Faggin and Marcian Edward "Ted" Hoff, realized that instead of making a range of specialist chips for a range of calculators, they could make a universal chip that could be programmed to work in them all.
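The Intel engineers' insight can be expressed in software terms: rather than designing one fixed circuit per calculator model, build a single general-purpose core and give each model its own program. The tiny interpreter and "programs" below are invented for illustration; they mimic no real calculator chip.

```python
def execute(program, x, y):
    """A general-purpose core: its behavior comes from the program, not the wiring."""
    stack = []
    for instr in program:
        if instr == "PUSH_X":
            stack.append(x)
        elif instr == "PUSH_Y":
            stack.append(y)
        elif instr == "ADD":
            stack.append(stack.pop() + stack.pop())
        elif instr == "MUL":
            stack.append(stack.pop() * stack.pop())
    return stack.pop()

# Two different "calculator models" running on the same core,
# differing only in the program they are given:
adding_model      = ["PUSH_X", "PUSH_Y", "ADD"]
multiplying_model = ["PUSH_X", "PUSH_Y", "MUL"]

print(execute(adding_model, 6, 7))        # -> 13
print(execute(multiplying_model, 6, 7))   # -> 42
```

One die design, many products: that economic logic is what made the programmable chip, and ultimately the microprocessor, such an attractive idea.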
Thus was born the general-purpose, single-chip computer, or microprocessor, and that brought about the next phase of the computer revolution. By 1974, Intel had launched a popular microprocessor known as the 8080, and computer hobbyists were soon building home computers around it. The first was the Altair 8800, a kit machine sold by Ed Roberts' company MITS. With its front panel covered in red LED lights and toggle switches, it was a far cry from modern PCs and laptops. Even so, it sold by the thousand and earned Roberts a fortune. The Altair inspired a Californian electronics wizard named Steve Wozniak (born 1950) to develop a computer of his own. In the mid-1970s, he was working at the Hewlett-Packard computer company in California, and spending his free time tinkering away as a member of the Homebrew Computer Club in the Bay Area.
After seeing the Altair, Woz used a microprocessor made by an Intel rival, MOS Technology, to build a better home computer of his own: the Apple I. When he showed off his machine to his colleagues at the club, they all wanted one too. One of his friends, Steve Jobs (1955–2011), persuaded Woz that they should go into business making the machine. Woz agreed so, famously, they set up Apple Computer Corporation in a garage belonging to Jobs' parents.
While the Altair looked like something out of a science lab, and the Apple I was little more than a bare circuit board, the Apple ][ took its inspiration from such things as Sony televisions and stereos: it had a neat and friendly looking cream plastic case. Launched in April 1977, it was the world's first easy-to-use home "microcomputer." Two things turned the Apple ][ into a really credible machine for small firms: a disk drive unit, launched in 1978, which made it easy to store data; and a spreadsheet program called VisiCalc, which gave Apple users the ability to analyze that data. In just two and a half years, Apple sold around 50,000 of the machines, quickly accelerating out of Jobs' garage to become one of the world's biggest companies.
Photos: Microcomputers—the first PCs.