Microsoft Windows Year 2000 and Beyond





On May 22, 1990, the critically acclaimed Windows 3.0 was released. Windows 3.0 had an improved program manager and icon system, a new file manager, support for sixteen colors, and improved speed and reliability. Most important, Windows 3.0 gained widespread third-party support. Programmers started writing Windows-compatible software, giving end users a reason to buy Windows 3.0. Three million copies were sold the first year, and Windows finally came of age.

On April 6, 1992, Windows 3.1 was released. Three million copies were sold in the first two months. TrueType scalable font support was added, along with multimedia capability, object linking and embedding (OLE), application reboot capability, and more. Windows 3.x became the number one operating system installed in PCs until 1997, when Windows 95 took over.

Windows 95

On August 24, 1995, Windows 95 was released in a buying fever so great that even consumers without home computers bought copies of the program. Code-named Chicago, Windows 95 was considered very user-friendly. It included an integrated TCP/IP stack, dial-up networking, and long filename support. It was also the first version of Windows that did not require MS-DOS to be installed beforehand.

Windows 98

On June 25, 1998, Microsoft released Windows 98. It was one of the last versions of Windows based on the MS-DOS kernel. Windows 98 had Microsoft's Internet Explorer 4 web browser built in and supported new input devices such as USB peripherals.

Windows 2000

Windows 2000, released in February 2000, was based on Microsoft's NT technology. Starting with this release, Microsoft offered automatic software updates over the Internet.

Windows XP

According to Microsoft, "the XP in Windows XP stands for experience, symbolizing the innovative experiences that Windows can offer to personal computer users." Windows XP was released in October 2001 and offered better multimedia support and increased performance.

Windows Vista

Codenamed Longhorn in its development phase, Windows Vista is the latest edition of Windows.



1985 Microsoft Windows




"Microsoft become the top software vendor in 1988 and never looked back" - Microsoft Corporation

Apple Bytes Back

Microsoft Windows version 1.0 was considered buggy, crude, and slow. This rough start was made worse by a threatened lawsuit from Apple Computers. In September 1985, Apple lawyers warned Bill Gates that Windows 1.0 infringed on Apple copyrights and patents, and that his corporation had stolen Apple's trade secrets. Microsoft Windows had similar drop-down menus, tiled windows and mouse support.

Deal of the Century

Bill Gates and his head counsel, Bill Neukom, decided to make an offer to license features of Apple's operating system. Apple agreed and a contract was drawn up. Here's the clincher: Microsoft wrote the licensing agreement to include use of Apple features in Microsoft Windows version 1.0 and all future Microsoft software programs. As it turned out, this move by Bill Gates was as brilliant as his decision to buy QDOS from Seattle Computer Products and to convince IBM to let Microsoft keep the licensing rights to MS-DOS. (You can read all about those smooth moves in our feature on MS-DOS.)

Windows 1.0 floundered on the market until January 1987, when a Windows-compatible program called Aldus PageMaker 1.0 was released. PageMaker was the first WYSIWYG desktop-publishing program for the PC. Later that year, Microsoft released a Windows-compatible spreadsheet called Excel. Other popular and useful software like Microsoft Word and Corel Draw helped promote Windows; however, Microsoft realized that Windows needed further development.

Microsoft Windows Version 2.0

On December 9, 1987, Microsoft released a much-improved Windows version 2.0 that made Windows-based computers look more like a Mac. Windows 2.0 had icons to represent programs and files, improved support for expanded-memory hardware, and windows that could overlap. Apple Computer saw a resemblance and filed a 1988 lawsuit against Microsoft, alleging that Microsoft had broken the 1985 licensing agreement.

Copy This Will You

In its defense, Microsoft claimed that the licensing agreement actually gave it the rights to use Apple features. After a four-year court case, Microsoft won. Apple had claimed that Microsoft infringed on 170 of its copyrights. The courts ruled that the licensing agreement gave Microsoft the rights to use all but nine of the disputed elements, and Microsoft later convinced the courts that those remaining nine elements could not be covered by copyright. Bill Gates also claimed that Apple had taken ideas from the graphical user interface developed by Xerox for Xerox's Alto and Star computers.

On June 1, 1993, Judge Vaughn R. Walker of the U.S. District Court of Northern California ruled in Microsoft's favor in the Apple vs. Microsoft & Hewlett-Packard copyright suit. The judge granted Microsoft's and Hewlett-Packard's motions to dismiss the last remaining copyright infringement claims against Microsoft Windows versions 2.03 and 3.0, as well as HP NewWave.

What would have happened if Microsoft had lost the lawsuit? Microsoft Windows might never have become the dominant operating system that it is today.

1985 Microsoft Windows


On November 10, 1983, at the Plaza Hotel in New York City, Microsoft Corporation formally announced Microsoft Windows, a next-generation operating system that would provide a graphical user interface (GUI) and a multitasking environment for IBM computers.

Introducing Interface Manager

Microsoft promised that the new product would be on the shelf by April 1984. Windows might have been released under its original name, Interface Manager, if marketing whiz Rowland Hanson had not convinced Microsoft's founder Bill Gates that Windows was the far better name.

Did Windows Get TopView?

That same November in 1983, Bill Gates showed a beta version of Windows to IBM's head honchos. Their response was lackluster, probably because they were working on their own operating system called TopView. IBM did not give Microsoft the same encouragement for Windows that it had given the other operating system Microsoft brokered to IBM: MS-DOS, which in 1981 became the highly successful operating system bundled with the IBM PC.

TopView was released in February of 1985 as a DOS-based multitasking program manager without any GUI features. IBM promised that future versions of TopView would have a GUI. That promise was never kept, and the program was discontinued barely two years later.

A Byte Out of Apple

No doubt, Bill Gates realized how profitable a successful GUI for IBM computers would be. He had seen Apple's Lisa computer and later the more successful Macintosh or Mac computer. Both Apple computers came with a stunning graphical user interface.

Wimps

Side Note: Early MS-DOS diehards liked to refer to MacOS (the Macintosh operating system) as "WIMP", an acronym for the Windows, Icons, Mice and Pointers interface.

Competition

As a new product, Microsoft Windows faced potential competition from IBM's own TopView and others. VisiCorp's short-lived VisiOn, released in October 1983, was officially the first PC-based GUI. The second was GEM (Graphics Environment Manager), released by Digital Research in early 1985. Both GEM and VisiOn lacked support from the all-important third-party developers: if nobody wanted to write software programs for an operating system, there would be no programs to use, and nobody would want to buy it.

Microsoft finally shipped Windows 1.0 on November 20, 1985, almost two years past the initially promised release date.

1984 Apple Macintosh Computer

Image: Apple Macintosh

"Hello, I am Macintosh. Never trust a computer you cannot lift... I'm glad to be out of that bag" - talking Macintosh Computer.

In December 1983, Apple Computers ran its famous "1984" Macintosh television commercial on a small unknown station solely to make the commercial eligible for awards during 1984. The commercial cost $1.5 million and ran only once in 1983, but news and talk shows everywhere replayed it, making TV history. The next month, Apple Computer ran the same ad during the NFL Super Bowl, and millions of viewers saw their first glimpse of the Macintosh computer. The commercial was directed by Ridley Scott, and the Orwellian scene depicted the IBM world being destroyed by a new machine, the "Macintosh".

Could we expect anything less from a company that was now being run by the former president of Pepsi-Cola? Steve Jobs, co-founder of Apple Computers, had been trying to hire Pepsi's John Sculley since early 1983. In April of that year he succeeded. But Steve and John discovered that they did not get along, and one of John Sculley's first actions as CEO of Apple was to boot Steve Jobs off the Apple "Lisa" project. The "Lisa" was the first consumer computer with a graphical user interface, or GUI. Jobs then switched over to managing the Apple "Macintosh" project begun by Jef Raskin. Jobs was determined that the new "Macintosh" was going to have a graphical user interface like the "Lisa", but at a considerably lower cost.

Note: The early Mac team members (1979) consisted of Jef Raskin, Brian Howard, Marc LeBrun, Burrell Smith, Joanna Hoffman and Bud Tribble. Others began working on the Mac at later dates.

Specifications Macintosh 128K
CPU: MC68000
CPU speed: 8 MHz
FPU: None
RAM: 128 KB DRAM, not expandable
ROM: 64 KB
Serial Ports: 2
Floppy: one 3.5-inch 400 KB drive
Monitor: built-in 9-inch B/W, 512 x 342 square pixels
Power: 60 Watts
Weight: 16.5 lbs.
Dimensions: 13.6" H x 9.6" W x 10.9" D
System Software: Mac OS 1.0
Production: January 1984 to October 1985
Cost: $2,495

Seventy-four days after the introduction of the "Macintosh", 50,000 units had been sold, not a strong showing. Apple refused to license the OS or the hardware, the 128 KB of memory was not enough, and a single floppy drive was difficult to use. The "Macintosh" had the "Lisa's" user-friendly GUI but initially lacked some of the "Lisa's" more powerful features, like multitasking and 1 MB of memory. Jobs compensated by making sure developers created software for the new "Macintosh"; he figured that software was the way to win the consumer over.

In 1985, the "Macintosh" computer line received a big sales boost with the introduction of the LaserWriter printer and Aldus PageMaker, home desktop publishing was now possible. But 1985 was also the year when the original founders of Apple left the company.

Steve Wozniak returned to college, and Steve Jobs was fired, his difficulties with John Sculley having come to a head. Jobs had decided to wrest control of the company away from Sculley, so he scheduled a business meeting in China for Sculley and planned a corporate takeover for while Sculley would be absent. Information about Jobs' true motives reached Sculley before the China trip; he confronted Jobs and asked Apple's Board of Directors to vote on the issue. Everyone voted for Sculley, and Jobs quit rather than be fired. Jobs later rejoined Apple in 1996 and has happily worked there ever since. Sculley was eventually replaced as CEO of Apple.

1983 Apple Lisa Computer




The first home computer with a GUI (graphical user interface).

Image: Apple Lisa

"No, Steve, I think it's more like we both have a rich neighbor named Xerox, and you broke in to steal the TV set, and you found out I'd been there first, and you said, 'Hey, that's no fair! I wanted to steal the TV set!'" - Bill Gates' response after Steve Jobs accused Microsoft of borrowing the GUI (graphical user interface) from Apple for Windows 1.0

The Lisa - The Personal Computer That Works The Way You Do - Apple promotional material

A GUI (pronounced GOO-ee) is a graphical user interface to a computer. Most of you are using one right now. Take a look at your computer screen: the GUI provides you with windows, pull-down menus, clickable buttons, scroll bars, icons, images and the mouse or pointer. The first user interfaces to computers were not graphical or visually oriented; they were all text and keyboard commands. MS-DOS is an example of a text-and-keyboard method of computer control that you can still find on many PCs today.
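To make those elements concrete, here is a minimal sketch in Python using the standard tkinter toolkit (a modern stand-in chosen purely for illustration, not anything Xerox or Apple shipped). It puts a window on screen with a pull-down menu, a clickable button, a text area and a scroll bar, all operated with the mouse pointer:

  import tkinter as tk

  root = tk.Tk()                        # the top-level window
  root.title("GUI elements")

  menubar = tk.Menu(root)               # a pull-down menu bar
  filemenu = tk.Menu(menubar, tearoff=False)
  filemenu.add_command(label="Quit", command=root.destroy)
  menubar.add_cascade(label="File", menu=filemenu)
  root.config(menu=menubar)

  text = tk.Text(root, height=8)        # a text area...
  scroll = tk.Scrollbar(root, command=text.yview)   # ...with a scroll bar
  text.config(yscrollcommand=scroll.set)

  button = tk.Button(root, text="Click me",          # a clickable button
                     command=lambda: text.insert("end", "clicked\n"))

  scroll.pack(side="right", fill="y")
  button.pack(side="bottom")
  text.pack(side="left", fill="both", expand=True)

  root.mainloop()                       # hand control to the mouse-driven event loop

Every one of those widgets traces back to the ideas described in the rest of this chapter.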

The very first graphical user interface was developed by the Xerox Corporation at their Palo Alto Research Center (PARC) in the 1970s, but it was not until the 1980s that GUIs became widespread and popular. By that time, the CPU power and monitors necessary for an effective GUI had become cheap enough to use in home computers.

Steve Jobs, co-founder of Apple Computers, visited PARC in 1979 (after buying Xerox stock) and was impressed by the "Alto", the first computer ever with a graphical user interface. Several PARC engineers were later hired by Apple and worked on the Apple Lisa and Macintosh. The Apple research team contributed much in the way of originality in their first GUI computers, and work had already begun on the Lisa before Jobs visited PARC. Still, Jobs was definitely inspired and influenced by the technology he saw at PARC, enough for Bill Gates to later defend Microsoft against Apple's lawsuit over Windows 1.0 having too much of the "look and feel" of an Apple Macintosh. Gates' claim, in essence: "hey, we both got it from Xerox." The lawsuit ended when Gates agreed that Microsoft would not use Macintosh technology in Windows 1.0, but the use of that technology in future versions of Windows was left open. With that agreement, Apple lost its exclusive rights to certain key design elements.

In 1978, Apple Computers started on a business system to complement their successful Apple II/III line of home computers. The new project was code-named Lisa, unofficially after the daughter of one of its designers and officially standing for Local Integrated Software Architecture. Steve Jobs was completely dedicated to the new project, implementing feature after feature and delaying the release of the Lisa, until he was finally removed as project manager by then Apple president Mike Markkula. The Lisa was finally released in January 1983.

Side Note: Don't worry about Jobs. He then turned his attention to the Macintosh.

The Lisa was the first personal computer to use a GUI. Other innovative features for the personal market included a drop-down menu bar, windows, multitasking, a hierarchical file system, the ability to copy and paste, icons, folders and a mouse. It cost Apple $50 million to develop the Lisa and $100 million to write the software, and only 10,000 units were ever sold. One year later the Lisa 2 was released with a single 3.5" drive instead of the two 5.25" drives, and with a price tag slashed in half from the original $9,995. In 1985, the Lisa 2 was renamed the Macintosh XL and bundled with MacWorks system software. Finally in 1986, the Lisa, Lisa 2 and Macintosh XL line was scrapped altogether, literally ending up as landfill, despite Steve Jobs saying, "We're prepared to live with Lisa for the next ten years."

Specifications
The Lisa/Lisa 2/Mac XL
CPU: MC68000
CPU speed: 5 MHz
FPU: None
Motherboard RAM: 512 KB minimum, 2 MB maximum
ROM: 16 KB
Serial Ports: 2 RS-232
Parallel Ports: 1 (Lisa), 0 (Lisa 2/Mac XL)
Floppy Drive: two internal 871 KB 5.25" (Lisa);
one internal 400 KB Sony 3.5" (Lisa 2/Mac XL)
Hard Drive: 5 MB internal
Monitor: built-in 12" - 720 x 364 pixels
Power Supply: 150 Watts
Weight: 48 lbs.
Dimensions: 15.2" H x 18.7" W x 13.8" D
System Software: LisaOS/MacWorks
Production: January 1983 to August 1986
Initial Cost: $9,995

The high cost and the delays in its release date contributed to the Lisa's demise, but where the Lisa failed, the Macintosh succeeded. Continue reading about Apple's history with our next chapter on the Macintosh.

A month after the Lisa line was cut, Steve Jobs quit his job at Apple. However, do not worry about what happened to Jobs. He then turned his attention to the NeXT computer.

1981 Microsoft MS-DOS Computer Operating System

"I don't think it's that significant." - Tandy president John Roach on IBM's entry into the microcomputer field

On August 12, 1981, IBM introduced its new revolution in a box, the "Personal Computer", complete with a brand-new 16-bit operating system from Microsoft called MS-DOS 1.0.

Operating System : /n./ [techspeak] (Often abbreviated `OS') The foundation software of a machine, of course; that which schedules tasks, allocates storage, and presents a default interface to the user between applications. The facilities an operating system provides and its general design philosophy exert an extremely strong influence on programming style and on the technical cultures that grow up around its host machines. - The Jargon Dictionary*

In 1980, IBM first approached Bill Gates and Microsoft to discuss the state of home computers and Microsoft products. Gates gave IBM a few ideas on what would make a great home computer, among them having BASIC written into the ROM chip. Microsoft had already produced several versions of BASIC for different computer systems, beginning with the Altair, so Gates was more than happy to write a version for IBM.

As for an operating system (OS) for the new computer, since Microsoft had never written an operating system before, Gates suggested that IBM investigate an OS called CP/M (Control Program for Microcomputers), written by Gary Kildall of Digital Research. Kildall had his Ph.D. in computers and had written the most successful operating system of the time, selling over 600,000 copies of CP/M; his OS set the standard of that era.

IBM tried to contact Kildall for a meeting, but its executives instead met with Mrs. Kildall, who refused to sign a non-disclosure agreement. IBM soon returned to Bill Gates and gave Microsoft the contract to write the new operating system, one that would eventually wipe Kildall's CP/M out of common use.

The "Microsoft Disk Operating System" or MS-DOS was based on QDOS, the "Quick and Dirty Operating System" written by Tim Paterson of Seattle Computer Products, for their prototype Intel 8086 based computer.

QDOS was based on Gary Kildall's CP/M. Paterson had bought a CP/M manual and used it as the basis for his operating system, which he wrote in six weeks. QDOS was different enough from CP/M to be considered legal.

Microsoft bought the rights to QDOS for $50,000, keeping the IBM deal a secret from Seattle Computer Products.

Gates then talked IBM into letting Microsoft retain the rights to market MS-DOS separately from the IBM PC project, and Gates proceeded to make a fortune from the licensing of MS-DOS.

In 1981, Tim Paterson quit Seattle Computer Products and found employment at Microsoft.


1981 IBM The IBM PC - Home Computer

IBM PC
In July of 1980, IBM representatives met for the first time with Microsoft's Bill Gates to talk about writing an operating system for IBM's new hush-hush "personal" computer. IBM had been observing the growing personal computer market for some time. It had already made one dismal attempt to crack the market with the IBM 5100. At one point, IBM considered buying the fledgling game company Atari to commandeer Atari's early line of personal computers. However, IBM decided to stick with making its own personal computer line and developed a brand-new operating system to go with it. The secret plans were referred to as "Project Chess", and the code name for the new computer was "Acorn". Twelve engineers, led by William C. Lowe, assembled in Boca Raton, Florida, to design and build the "Acorn". On August 12, 1981, IBM released its new computer, re-named the IBM PC. The "PC" stood for "personal computer", making IBM responsible for popularizing the term "PC".

The first IBM PC ran on a 4.77 MHz Intel 8088 microprocessor. The PC came equipped with 16 kilobytes of memory, expandable to 256k. It came with one or two 160k floppy disk drives and an optional color monitor. The price tag started at $1,565, which would be nearly $4,000 today. What really made the IBM PC different from previous IBM computers was that it was the first one built from off-the-shelf parts (called open architecture) and marketed by outside distributors (Sears & Roebuck and ComputerLand). The Intel chip was chosen because IBM had already obtained the rights to manufacture Intel chips: IBM had used the Intel 8086 in its Displaywriter Intelligent Typewriter in exchange for giving Intel the rights to IBM's bubble memory technology.

A little over a year after IBM introduced the PC, Time Magazine named the computer its "Machine of the Year" in place of the traditional "Man of the Year".

1979 Seymour Rubenstein & Rob Barnaby WordStar Software


"I am happy to greet the geniuses who made me a born-again writer, having announced my retirement in 1978, I now have six books in the works and two [probables], all through WordStar." Quote from Arthur C. Clarke on meeting Seymour Rubenstein and Rob Barnaby, the inventors of Wordstar.

WordStar - The First Word Processor

Released in 1979 by MicroPro International, WordStar was the first commercially successful word processing software program produced for microcomputers and the best-selling software program of the early eighties.

What is Word Processing?

Word processing can be defined as the manipulation of computer generated text data including creating, editing, storing, retrieving and printing a document.

The Electric Pencil

The first computer word processors were line editors, software-writing aids that allowed a programmer to make changes in a line of program code. Altair programmer Michael Shrayer decided to write the manuals for computer programs on the same computers the programs ran on. In 1976, he wrote the Electric Pencil, the first actual PC word processing program, which became somewhat popular.

Other early word processor programs worth noting were: Apple Write I, Samna III, Word, WordPerfect and Scripsit.

Seymour Rubenstein and Rob Barnaby

Seymour Rubenstein first started developing an early version of a word processor for the IMSAI 8080 computer when he was director of marketing for IMSAI. He left to start MicroPro International Inc. in 1978 with only $8,500 in cash.

Software programmer Rob Barnaby was convinced to leave IMSAI and tag along with Rubenstein to join MicroPro. Rob Barnaby wrote the 1979 version of WordStar for CP/M. Jim Fox, Barnaby's assistant, ported (meaning re-wrote for a different operating system) WordStar from the CP/M operating system* to MS/PC DOS.

The 3.0 version of WordStar for DOS was released in 1982. Within three years, WordStar was the most popular word processing software in the world. However, by the late 1980s, programs like WordPerfect had knocked WordStar out of the word processing market after the poor performance of WordStar 2000.

"In the early days, the size of the market was more promise than reality... WordStar was a tremendous learning experience. I didn't know all that much about the world of big business. I thought I knew it" Quote from Seymour Rubenstein the inventor of WordStar

*The CP/M operating system was developed by Gary Kildall, founder of Digital Research, copyrighted in 1976 and released in 1977. MS/PC DOS is the famous operating system introduced by Microsoft and Bill Gates in 1981.


1978 Dan Bricklin & Bob Frankston VisiCalc Spreadsheet Software

""Any product that pays for itself in two weeks is a surefire winner." - Dan Bricklin on VisiCalc

VisiCalc was the first computer spreadsheet program. It was released to the public in 1979, running on an Apple II computer. While most early microprocessor computers had been quickly supported by BASIC and a few games, VisiCalc introduced a new level in application software. It was considered a fourth generation software program. Companies invested time and money in doing financial projections with manually calculated spreadsheets, where changing a single number meant recalculating every single cell in the sheet. With VisiCalc, you could change any cell, and the entire sheet would be automatically recalculated.
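To see why that mattered, here is a tiny Python sketch of the automatic-recalculation idea (a toy model for illustration, not VisiCalc's actual design): cells hold either numbers or formulas written in terms of other cells, and recalculating the sheet after any change updates every dependent cell.

  class Sheet:
      def __init__(self):
          self.cells = {}            # cell name -> number or formula string

      def set(self, name, value):
          self.cells[name] = value   # changing a cell just overwrites it...

      def value(self, name):
          v = self.cells[name]
          if isinstance(v, str):     # a formula: evaluate it using the other cells
              others = {n: self.value(n) for n in self.cells if n != name}
              return eval(v, {}, others)
          return v

      def recalc(self):              # ...and the whole sheet is recomputed on demand
          return {name: self.value(name) for name in self.cells}

  sheet = Sheet()
  sheet.set("A1", 100)
  sheet.set("A2", 250)
  sheet.set("A3", "A1 + A2")         # a formula cell
  print(sheet.recalc())              # {'A1': 100, 'A2': 250, 'A3': 350}

  sheet.set("A1", 500)               # change one number...
  print(sheet.recalc())              # ...and A3 becomes 850 automatically

Changing A1 updates A3 with no manual arithmetic, which is exactly the drudgery of the paper spreadsheets described above.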

"VisiCalc took 20 hours of work per week for some people and turned it out in 15 minutes and let them become much more creative." - Dan Bricklin

Dan Bricklin and Bob Frankston invented VisiCalc. While a master's student in business administration at Harvard Business School, Dan Bricklin joined up with Bob Frankston, who helped him write the programming for his new electronic spreadsheet. The two started their own company, Software Arts Inc., to develop their product.

"Early Apple machines -- don't know how to answer what it was like since there were so few tools. Just had to keep debugging by isolating a problem, looking at memory in the limited debugging (weaker than the DOS DEBUG and no symbols) patch and retry and then re-program, download and try again. And again..." - Bob Frankston on programming VisiCalc for the Apple II

By the fall of 1979, an Apple II version of VisiCalc was ready, and the team started writing versions for the Tandy TRS-80, Commodore PET and the Atari 800. By October, VisiCalc was a fast seller on the shelves of computer stores at US $100.

In November of 1981, Bricklin received the Grace Murray Hopper Award from the Association for Computing Machinery in honor of his innovation. VisiCalc was soon sold to Lotus Development Corporation, where it developed into the Lotus 1-2-3 spreadsheet for the PC by 1983. Bricklin never received a patent for VisiCalc. It was not until after 1981 that software programs were made eligible for patents by the Supreme Court.

"I'm not rich because I invented VisiCalc, but I feel that I've made a change in the world. That's a satisfaction money can't buy." - Dan Bricklin

"Patents? Disappointed? Don't think of it that way. Software patents weren't feasible then so we chose not to risk $10,000." - Bob Frankston on not patenting VisiCalc.

More On Spreadsheets

  • In 1980, the DIF format was developed, allowing spreadsheet data to be shared with or imported into other programs, such as word processors, making spreadsheet data more portable.
  • Also in 1980, SuperCalc was introduced, the first spreadsheet for the popular micro OS CP/M.
  • In 1983, the popular Lotus 1-2-3 spreadsheet was introduced. Mitch Kapor, the founder of Lotus, used his previous programming experience with VisiCalc to create 1-2-3, which was based on VisiCalc.
  • Starting in 1987, Excel and Quattro Pro spreadsheets were introduced with a more graphical interface.

1976/77 Apple I, II & TRS-80 & Commodore Pet Computers

Apple I Board

"The first Apple was just a culmination of my whole life." - Steve Wozniak, Co-Founder Apple Computers

Following the introduction of the Altair, a boom in personal computers occurred, and luckily for the consumer, the next round of home computers were considered useful and a joy to use.

In 1975, Steve Wozniak was working for Hewlett Packard (calculator manufacturers) by day and playing computer hobbyist by night, tinkering with the early computer kits like the Altair. "All the little computer kits that were being touted to hobbyists in 1975 were square or rectangular boxes with non understandable switches on them..." claimed Wozniak. Wozniak realized that the prices of some computer parts (e.g. microprocessors and memory chips) had gotten so low that he could buy them with maybe a month's salary. Wozniak decided that, with some help from fellow hobbyist Steve Jobs, they could build their own computer.

On April Fool's Day, 1976, Steve Wozniak and Steve Jobs released the Apple I computer and started Apple Computers. The Apple I was the first single-circuit-board computer. It came with a video interface, 8k of RAM and a keyboard. The system incorporated some economical components, including the 6502 processor (only $25; designed by MOS Technology and later second-sourced by Rockwell) and dynamic RAM.

The pair showed the prototype Apple I, mounted on plywood with all the components visible, at a meeting of a local computer hobbyist group called "The Homebrew Computer Club" (based in Palo Alto, California). A local computer dealer (The Byte Shop) saw it and ordered 100 units, provided that Wozniak and Jobs agreed to assemble the kits for the customers. About two hundred Apple Is were built and sold over a ten-month period, for the superstitious price of $666.66.

In 1977, Apple Computers was incorporated and the Apple II computer model was released. The first West Coast Computer Faire was held in San Francisco the same year, and attendees saw the public debut of the Apple II (available for $1,298). The Apple II was also based on the 6502 processor, but it had color graphics (a first for a personal computer) and used an audio cassette drive for storage. Its original configuration came with 4 KB of RAM, but a year later this was increased to 48 KB of RAM, and the cassette drive was replaced by a floppy disk drive.

The Commodore PET

The Commodore PET (Personal Electronic Transactor, though rumored to be named after the "pet rock" fad) was designed by Chuck Peddle. It was first presented at the January 1977 Winter Consumer Electronics Show and later at the West Coast Computer Faire. The PET also ran on the 6502 chip, but it cost only $795, half the price of the Apple II. It included 4 KB of RAM, monochrome graphics and an audio cassette drive for data storage. Also included was a version of BASIC in 14 KB of ROM. Microsoft developed its first 6502-based BASIC for the PET and then sold the source code to Apple for Applesoft BASIC. The keyboard, cassette drive and small monochrome display all fit within the same self-contained unit.

Note: Steve Jobs and Steve Wozniak at one point in time showed the Apple I prototype to Commodore, who agreed to buy Apple. Steve Jobs then decided not to sell to Commodore, who bought MOS Technology instead and then designed the PET. The Commodore PET was seen at the time to be a chief rival of the Apple.

In 1977, Radio Shack introduced its TRS-80 microcomputer, also nicknamed the "Trash-80". It was based on the Zilog Z80 processor (an 8-bit microprocessor whose instruction set is a superset of the Intel 8080) and came with 4 KB of RAM and 4 KB of ROM containing BASIC. An optional expansion box enabled memory expansion, and audio cassettes were used for data storage, similar to the PET and the first Apples. Over 10,000 TRS-80s were sold during the first month of production. The later TRS-80 Model II came complete with a disk drive for program and data storage. At that time, only Apple and Radio Shack had machines with disk drives. With the introduction of the disk drive, applications for the personal computer proliferated as distribution of software became easier.

A last note on 1977: It was the year that the tradename "Microsoft" was registered.


1974/75 Scelbi & Mark-8 Altair & IBM 5100 Computers

In the early 1970s, anyone wanting to use a computer had to wait in a long line, as computers were few and far between. The desire and the market were increasing for a computer that could be used at home or in the office: the "personal computer". Several different manufacturers marketed "personal computers" between 1974 and 1977 in response to that desire. These were mainly kits (major assembly required) advertised in the back pages of magazines like Popular Science.

In the March 1974 issue of QST magazine there appeared the first advertisement for a "personal computer." It was called the Scelbi (SCientific, ELectronic and BIological) and was designed by the Scelbi Computer Consulting Company of Milford, Connecticut. Based on Intel's 8008 microprocessor, the Scelbi sold for $565 and came with 1K of programmable memory, with an additional 15K of memory available for $2,760. The second "personal computer kit" was the Mark-8 (also Intel 8008 based), designed by Jonathan Titus. The July issue of Radio-Electronics magazine published an article on building a Mark-8 microcomputer, information the general public was hungry for. At the same time, Intel introduced the new 8080 microprocessor chip, made for controlling traffic lights. It was to become the microprocessor inside the very successful Altair computer.

Altair Computer

An Albuquerque, New Mexico, company called MITS (Micro Instrumentation Telemetry Systems) was in the calculator business until Texas Instruments swept the market in 1972 with their low-cost calculators. MITS owner Ed Roberts, a former Air Force electronics specialist, then decided to try designing a computer kit. He was aided by his friend Les Solomon, the technical editor of Popular Electronics magazine, who had been flooded with letters from readers describing ideas for home computers. Roberts worked together with hardware engineers William Yates and Jim Bybee during '73 and '74 developing the MITS Altair 8800. The Altair was named by Solomon's 12-year-old daughter after an episode of the original Star Trek television series.

The Altair was the cover story of the January 1975 issue of Popular Electronics, which described the Altair as the "World's First Minicomputer Kit to Rival Commercial Models". Orders for the Altair were huge in response to the article. The computer kit was shipped with an 8080 CPU, a 256-byte RAM card, and the new Altair Bus design (the S100 Bus - the connector had 100 pins) for the price of $400. It was left to the consumer to put it together, make it work and write any needed software. This was no easy task, but the computer was definitely expandable, cheap and available.

Two young programmers realized that a software program already written for microcomputers could work on the Altair. Ed Roberts was soon contacted by Harvard freshman Bill Gates (of Microsoft fame) and programmer Paul Allen. Within six weeks, Gates and Allen compiled a version of BASIC to run on the Altair. Allen was offered a position by Roberts as Director of Software, becoming the only member of the software department. Gates, who was still a student at the time, later left school and worked for MITS part-time.

BASIC required 4,096 bytes of memory to run, sixteen times the amount of memory the Altair then came with. MITS created a 4K (4,096-byte) memory board that allowed the Altair to run BASIC. The boards were poorly designed and created problems, and a computer hobbyist named Bob Marsh designed a better 4K board and started a company called Processor Technology to sell his Altair-compatible boards. Roberts tried to protect his sales by bundling the BASIC software only with his own boards. He succeeded mainly in promoting the first widespread case of software piracy: hobbyists everywhere bought a Processor Technology memory board and somehow found a free copy of BASIC.

Roberts' tendency to ship some poorly designed products might have caused MITS' downfall after a few short years, but no one can deny that it was the Altair which really kick-started the home computer revolution. Gates and Allen went on to start Microsoft, becoming the world's leading software developers. Ed Roberts became a doctor and went on to practice medicine.

One more computer worthy of note during this period was the IBM 5100. The 5100 was released in 1975 after two years of development. It was referred to as "Project Mercury" by the IBM scientists. The 5100 was IBM's first portable computer and was considered an entry-level system, but its $10,000 price tag put it beyond the range of the hobbyists who bought the Altair. Sales of the 5100 went to small businesses and educational institutions, which bought the desktop-sized minicomputer that came with BASIC, 16 KB of RAM, tape storage and a built-in 5-inch screen.

1973 Robert Metcalfe & Xerox The Ethernet Computer Networking

Ethernet Patent Drawing

"I came to work one day at MIT and the computer had been stolen, so I called DEC to break the news to them that this $30,000 computer that they'd lent me was gone. They thought this was the greatest thing that ever happened, because it turns out that I had in my possession the first computer small enough to be stolen!" - Robert Metcalfe on the trials and tribulations of inventing the Ethernet.

The ethernet is a system for connecting computers within a building using hardware running from machine to machine. It differs from the Internet, which connects remotely located computers by telephone line, software protocol and some hardware. Ethernet uses some software (borrowed from Internet Protocol), but the connecting hardware was the basis of the patent (#4,063,220) involving newly designed chips and wiring. The patent* describes ethernet as a "multipoint data communication system with collision detection".
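The "collision detection" in that patent title is the part that is easy to sketch in code. The toy Python model below (an illustration with made-up timing values, not Metcalfe's implementation) shows the basic discipline: a station waits for a quiet wire, listens while it transmits, and if another station talks at the same time it backs off for a random interval before retrying.

  import random

  def send_frame(station, channel_busy, collision_detected, max_attempts=16):
      """Simplified CSMA/CD: carrier sense, transmit, detect collisions, back off."""
      for attempt in range(max_attempts):
          while channel_busy():                  # 1. carrier sense: wait for an idle wire
              pass
          print(f"{station}: transmitting (attempt {attempt + 1})")
          if not collision_detected():           # 2. listen while sending
              return True                        #    no collision: frame delivered
          # 3. collision: back off a random number of slot times before retrying
          #    (binary exponential backoff)
          slots = random.randint(0, 2 ** min(attempt + 1, 10) - 1)
          print(f"{station}: collision detected, backing off {slots} slot times")
      return False                               # give up after too many collisions

  # Toy run: the wire is always idle, and only the first transmission collides.
  outcomes = iter([True, False])
  send_frame("Station A",
             channel_busy=lambda: False,
             collision_detected=lambda: next(outcomes))

The random backoff is what lets many machines share one cable without a central controller, which is what made it practical to connect hundreds of computers in a single building.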

Robert Metcalfe was a member of the research staff at Xerox's Palo Alto Research Center (PARC), where some of the first personal computers were being made. Metcalfe was asked to build a networking system for PARC's computers. Xerox's motivation for the computer network was that it was also building the world's first laser printer and wanted all of PARC's computers to be able to print with this printer.

Robert Metcalfe had two challenges: the network had to be fast enough to drive the very fast new laser printer; and it had to connect hundreds of computers within the same building. Never before had hundreds of computers been in the same building -- at that time no one had more than one, two or maybe three computers in operation on any one premise.

The press has often stated that ethernet was invented on May 22, 1973, when Robert Metcalfe wrote a memo to his bosses stating the possibilities of ethernet's potential, but Metcalfe claims ethernet was actually invented very gradually over a period of several years. In 1976, Robert Metcalfe and David Boggs (Metcalfe's assistant) published a paper titled, "Ethernet: Distributed Packet-Switching For Local Computer Networks."

Robert Metcalfe left Xerox in 1979 to promote the use of personal computers and local area networks (LANs). He successfully convinced Digital Equipment, Intel, and Xerox Corporations to work together to promote ethernet as a standard. Now an international computer industry standard, ethernet is the most widely installed LAN protocol.

1971 Alan Shugart & IBM The "Floppy" Disk

3 1/2 Inch Diskette



In 1971, IBM introduced the first "memory disk", as it was called then, or the "floppy disk" as it is known today.

8-inch Floppy Disk
The first floppy was an 8-inch flexible plastic disk coated with magnetic iron oxide; computer data was written to and read from the disk's surface.

The nickname "floppy" came from the disk's flexibility. The floppy disk was considered a revolutionary device in the "History of Computers" for its portability which provided a new and easy physical means of transporting data from computer to computer.

Inventor Alan Shugart
The "floppy" was invented by IBM engineers led by Alan Shugart. The first disks were designed for loading microcodes into the controller of the Merlin (IBM 3330) disk pack file (a 100 MB storage device). So, in effect, the first floppies were used to fill another type of data storage device. Overnight, additional uses for the floppy were discovered, making it the hot new program and file storage medium.

How Does a Floppy Work?
A floppy is a circle of magnetic material similar to other kinds of recording media such as cassette tape; one or two sides of the disk are used for recording. The disk drive grabs the floppy by its center and spins it like a record inside its housing. The read/write head, much like the head on a tape deck, contacts the surface through an opening in the plastic shell, or envelope. The first Shugart floppy held 100 KB of data.

* First-hand account of how the operating system for the 8-inch disk was written.

5 1/4-inch Floppy Disk
In 1976, the 5 1/4" flexible disk drive and diskette was developed by Alan Shugart for Wang Laboratories. Wang wanted a smaller floppy disk and drive to use with their desktop computers. By 1978, more than 10 manufacturers were producing 5 1/4" floppy drives that stored up to 1.2MB (megabytes) of data.

One interesting story about the 5 1/4-inch floppy disk is how the size was decided. Engineers Jim Adkisson and Don Massaro were discussing the size with An Wang of Wang Laboratories. The trio just happened to be doing their discussing at a bar. An Wang motioned to a drink napkin and stated "about that size", which happened to be 5 1/4 inches wide.

3 1/2-inch Floppy Disk
In 1981, Sony introduced the first 3 1/2" floppy drives and diskettes. These floppies were encased in hard plastic; however, the name stayed the same. They stored 400 KB of data, and later 720 KB (double density) and 1.44 MB (high density).

Post Floppy Disk
For the most part, recordable CDs and DVDs, and flash drives have replaced floppies as the means of transporting files from one computer to another computer.

1971 Faggin, Hoff & Mazor Intel 4004 Computer Microprocessor

Intel 4004 cpu

Intel 4004 cpu - Interior


In November, 1971, a company called Intel publicly introduced the world's first single chip microprocessor, the Intel 4004 (U.S. Patent #3,821,715), invented by Intel engineers Federico Faggin, Ted Hoff, and Stan Mazor. After the invention of integrated circuits revolutionized computer design, the only place to go was down -- in size that is. The Intel 4004 chip took the integrated circuit down one step further by placing all the parts that made a computer think (i.e. central processing unit, memory, input and output controls) on one small chip. Programming intelligence into inanimate objects had now become possible.

The History of Intel

In 1968, Bob Noyce and Gordon Moore were two unhappy engineers working for the Fairchild Semiconductor Company who decided to quit and create their own company at a time when many Fairchild employees were leaving to create start-ups. People like Noyce and Moore were nicknamed the "Fairchildren".

Bob Noyce typed himself a one-page idea of what he wanted to do with his new company, and that was enough to convince San Francisco venture capitalist Art Rock to back Noyce's and Moore's new venture. Rock raised $2.5 million in less than two days.

Intel Trademark

The name "Moore Noyce" was already trademarked by a hotel chain, so the two founders decided upon the name "Intel" for their new company, a shortened version of "Integrated Electronics".

Intel's first money making product was the 3101 Schottky bipolar 64-bit static random access memory (SRAM) chip.

One Chip Does the Work of Twelve

In late 1969, a potential client from Japan called Busicom asked to have twelve custom chips designed: separate chips for keyboard scanning, display control, printer control and other functions for a Busicom-manufactured calculator.

Intel did not have the manpower for the job, but it did have the brainpower to come up with a solution. Intel engineer Ted Hoff decided that Intel could build one chip to do the work of twelve. Intel and Busicom agreed and funded the new programmable, general-purpose logic chip.

Federico Faggin headed the design team along with Ted Hoff and Stan Mazor, who wrote the software for the new chip. Nine months later, a revolution was born. At 1/8th inch wide by 1/6th inch long and consisting of 2,300 MOS (metal oxide semiconductor) transistors, the baby chip had as much power as the ENIAC, which had filled 3,000 cubic feet with 18,000 vacuum tubes.

Cleverly, Intel decided to buy back the design and marketing rights to the 4004 from Busicom for $60,000. The next year Busicom went bankrupt; it never produced a product using the 4004. Intel followed a clever marketing plan to encourage the development of applications for the 4004 chip, leading to its widespread use within months.

The Intel 4004 Microprocessor

The 4004 was the world's first universal microprocessor. In the late 1960s, many scientists had discussed the possibility of a computer on a chip, but nearly everyone felt that integrated circuit technology was not yet ready to support such a chip. Intel's Ted Hoff felt differently; he was the first person to recognize that the new silicon-gated MOS technology might make a single-chip CPU (central processing unit) possible.

Hoff and the Intel team developed such an architecture with just over 2,300 transistors in an area of only 3 by 4 millimetres. With its 4-bit CPU, command register, decoder, decoding control, control monitoring of machine commands and interim register, the 4004 was one heck of a little invention. Today's 64-bit microprocessors are still based on similar designs, and the microprocessor is still the most complex mass-produced product ever with more than 5.5 million transistors performing hundreds of millions of calculations each second - numbers that are sure to be outdated fast.

1970 Intel 1103 Computer Memory



The Invention of the Intel 1103 - The World's First Available DRAM Chip


In 1970, the newly formed Intel company publicly released the 1103, the first DRAM (Dynamic Random Access Memory) chip (a 1K-bit PMOS dynamic RAM IC), and by 1972 it was the best-selling semiconductor memory chip in the world, defeating magnetic core memory. The first commercially available computer using the 1103 was the HP 9800 series.



Intel 1103 Chip



Dr. Robert H. Dennard, a Fellow at the IBM Thomas J. Watson Research Center, created the one-transistor DRAM cell in 1966. Dennard and his team were working on early field-effect transistors and integrated circuits, and his attention turned to memory chips after seeing another team's research with thin-film magnetic memory. Dennard claims he went home and within a few hours had gotten the basic ideas for the creation of DRAM. He worked on his ideas for a simpler memory cell that used only a single transistor and a small capacitor. IBM and Dennard were granted a patent for DRAM in 1968.

RAM stands for random access memory: memory that can be accessed or written to randomly, so any byte or piece of memory can be used without touching the other bytes or pieces of memory. There are two basic types of RAM, dynamic RAM (DRAM) and static RAM (SRAM). DRAM needs to be refreshed thousands of times per second; SRAM does not need to be refreshed, which makes it faster. Both types of RAM are volatile -- they lose their contents when the power is turned off. In 1970, Fairchild Corporation introduced a 256-bit SRAM chip. Since then, several new types of RAM chips have been designed.
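The refresh requirement is easier to see in a toy model than in prose. The Python sketch below (not real circuit behaviour; the leak rate is arbitrary) treats a DRAM bit as charge on a leaky capacitor: left alone, a stored 1 decays into a 0, while periodically reading the bit and writing it back at full strength, which is all a refresh cycle does, keeps it alive.

  class DramCell:
      """One DRAM bit modelled as charge on a tiny, leaky capacitor."""
      def __init__(self, bit):
          self.charge = 1.0 if bit else 0.0

      def leak(self):
          self.charge *= 0.7                       # charge bleeds away over time (arbitrary rate)

      def read(self):
          return 1 if self.charge > 0.5 else 0     # sense amplifier: compare to a threshold

      def refresh(self):
          self.charge = 1.0 if self.read() else 0.0    # read, then rewrite at full strength

  cell = DramCell(1)
  for _ in range(5):
      cell.leak()
  print("without refresh:", cell.read())           # the stored 1 has leaked away to 0

  cell = DramCell(1)
  for _ in range(5):
      cell.leak()
      cell.refresh()                               # periodic refresh restores the charge
  print("with refresh:   ", cell.read())           # still reads 1

SRAM holds each bit in a transistor latch instead of a capacitor, which is why it needs no refreshing but costs more per bit of storage.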

John Reed, now head of The Reed Company, was once part of the Intel 1103 team. Reed offered the following memories of the development of the Intel 1103.

The "invention?" In those days, Intel, (nor few others for that matter), was not focusing on getting patents or achieving "inventions" so much as they were desperate to get new products to market and begin reaping the profits. But let me tell you how the i1103 was born and raised:

In approximately 1969, William Regitz of Honeywell canvassed the semiconductor companies of the U.S. looking for someone to share in the development of a dynamic memory circuit based on a novel 3-transistor cell which he (or one of his co-workers) had invented. I won't elaborate, but this cell was a "1X, 2Y" type cell laid out with a "butted" contact for connecting the pass transistor drain to the gate of the cell's current switch.

Regitz talked to many companies, but Intel got really excited about the possibilities here and decided to go ahead with a development program. Moreover, whereas Regitz had originally been proposing a 512-bit chip, Intel decided that 1,024 bits would be feasible, and so the program began. Joel Karp of Intel was the circuit designer, and he worked closely with Regitz throughout the program. It culminated in actual working units, and a paper was given on this device, the i1102, at the 1970 ISSCC conference in Philadelphia.

Intel learned several lessons from the i1102, namely:

1. DRAM cells needed substrate bias. This spawned the 18 pin DIP package.
2. The "butting" contact was a tough technological problem to solve, and yields were low.
3. The "IVG" multi-level cell strobe signal made necessary by the "1X, 2Y" cell circuitry caused the devices to have very small operating margins.

Though they continued to develop the i1102, there was a need to look at other cell techniques. Ted Hoff had proposed all possible ways of wiring up 3 transistors in a DRAM cell earlier, and at this time somebody took a closer look at the "2X, 2Y" cell, I think it may have been Karp and/or Leslie Vadasz. (I hadn't come to Intel yet) The idea of using a "buried contact" was applied (probably by Tom Rowe, process guru), and this cell became more and more attractive, since it could potentially do away with both the butting contact issue and the aforementioned multi-level signal requirement and yield a smaller cell to boot!

So Vadasz and Karp sketched out a schematic of an i1102 alternative (on the sly, since this wasn't exactly a popular decision with Honeywell), and assigned the job of designing the chip to Bob Abbott sometime before I came on the scene in June 1970. He initiated the design and had it laid out. I took over the project after initial "200X" masks had been shot from the original mylar layouts, and it was my job to evolve the product from there which was no small task in itself.

Well, it's hard to make a long story short, but the first silicon chips of the i1103 were practically non-functional, until it was discovered that the overlap between the "PRECH" clock and the "CENABLE" clock, the famous "Tov" parameter, was VERY critical due to our lack of understanding of internal cell dynamics. This was a discovery made by test engineer George Staudacher. Nevertheless, understanding this weakness, I characterized the devices on hand, and we drew up a data sheet. Because of the low yields we were seeing, due to the "Tov" problem, Vadasz and I recommended to Intel management that the product wasn't ready for market, but Bob Graham, then Intel Marketing V.P., thought otherwise and pushed for an early introduction, over our dead bodies so to speak. The Intel i1103 "came to market" in October of 1970.

After the product introduction, demand was strong, and it was my job to evolve the design for better yield. I did this in stages, making improvements at every new mask generation until the "E" revision of the masks, at which point, the i1103 was yielding well and performing well. This early work of mine established a couple of things:

1. Based on my analysis of 4 runs of devices, the refresh time was set at 2 milliseconds. Binary multiples of that initial characterization are still the standard to this day.
2. I was probably the first designer to use Si-gate transistors as bootstrap capacitors; my evolving mask sets had several of these to improve performance and margins.

And that's about all I can say about the Intel 1103's "invention." I will say that "getting inventions" was just not a value amongst us circuit designers of those days. I am personally named on 14 memory-related patents, but in those days, I'm sure I invented many more techniques in the course of getting a circuit developed and out to market without stopping to make any disclosures. That Intel itself wasn't concerned about patents until "too late" is evidence, in my own case, by the 4 or 5 patents I was awarded, applied for and assigned to 2 years after I left the company at the end of 1971! (Look at one of them, and you'll see me listed as an Intel employee!) - John Reed

1969 ARPAnet The original Internet.

ARPAnet - The First Internet


"The Internet may fairly be regarded as a never-ending worldwide conversation." - supreme judge statement on considering first amendment rights for Internet users.

On a cold war kind of day, in swinging 1969, work began on the ARPAnet, grandfather to the Internet. Designed as a computer version of the nuclear bomb shelter, ARPAnet protected the flow of information between military installations by creating a network of geographically separated computers that could exchange information via a newly developed protocol (rule for how computers interact) called NCP (Network Control Protocol).

One opposing view to ARPAnet's origins comes from Charles M. Herzfeld, the former director of ARPA. He claimed that ARPAnet was not created as a result of a military need, stating "it came out of our frustration that there were only a limited number of large, powerful research computers in the country and that many research investigators who should have access were geographically separated from them." ARPA stands for the Advanced Research Projects Agency, a branch of the military that developed top secret systems and weapons during the Cold War.

The first data exchange over this new network occurred between computers at UCLA and Stanford Research Institute. On their first attempt to log into Stanford's computer by typing "login", UCLA researchers crashed their computer when they typed the letter 'g'.

Four computers were the first connected in the original ARPAnet. They were located in the respective computer research labs of UCLA (Honeywell DDP 516 computer), Stanford Research Institute (SDS-940 computer), UC Santa Barbara (IBM 360/75) and the University of Utah (DEC PDP-10). As the network expanded, different models of computers were connected, creating compatibility problems. The solution rested in a better set of protocols called TCP/IP (Transmission Control Protocol/Internet Protocol), which the network adopted in 1983.

To send a message on the network, a computer breaks its data into IP (Internet Protocol) packets, like individually addressed digital envelopes. TCP (Transmission Control Protocol) makes sure the packets are delivered from client to server and reassembled in the right order.
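As a rough illustration of that envelope analogy, the short Python sketch below (a toy, not a real protocol stack) chops a message into numbered packets, delivers them in scrambled order, and reassembles them by sequence number, which is the part of TCP's job described above:

  import random

  MESSAGE = b"ARPAnet was the grandfather of the Internet."
  PACKET_SIZE = 8

  # Break the data into numbered packets, like individually addressed envelopes.
  packets = [(seq, MESSAGE[i:i + PACKET_SIZE])
             for seq, i in enumerate(range(0, len(MESSAGE), PACKET_SIZE))]

  random.shuffle(packets)            # the network may deliver packets in any order

  # The receiver sorts by sequence number and rebuilds the original message.
  reassembled = b"".join(data for seq, data in sorted(packets))
  assert reassembled == MESSAGE
  print(reassembled.decode())

Real TCP also retransmits packets that never arrive and paces how fast they are sent, but putting the pieces back in order is the core of the reassembly step described here.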

Under ARPAnet several major innovations occurred: email (or electronic mail), the ability to send simple messages to another person across the network (1971); telnet, a remote connection service for controlling a computer (1972); and file transfer protocol (FTP), which allows information to be sent from one computer to another in bulk (1973).

As non-military uses for the network increased, more and more people had access, and it was no longer safe for military purposes. As a result, MILnet, a military-only network, was started in 1983. Internet Protocol software was soon being placed on every type of computer, and universities and research groups also began using in-house networks known as Local Area Networks, or LANs. These in-house networks then started using Internet Protocol software so one LAN could connect with other LANs.

In 1986, one LAN branched out to form a new competing network called NSFnet (National Science Foundation Network). NSFnet first linked together the five national supercomputer centers, then every major university, and it started to replace the slower ARPAnet (which was finally shut down in 1990). NSFnet formed the backbone of what we call the Internet today.

"The Internet's pace of adoption eclipses all other technologies that preceded it. Radio was in existence 38 years before 50 million people tuned in; TV took 13 years to reach that benchmark. Sixteen years after the first PC kit came out, 50 million people were using one. Once it was opened to the general public, the Internet crossed that line in four years." - quote from the U.S. Department report "The Emerging Digital Economy".

1964 Douglas Engelbart Computer Mouse & Windows

The History of the Computer Mouse and the Prototype for Windows - Douglas Engelbart


"It would be wonderful if I can inspire others, who are struggling to realize their dreams, to say 'if this country kid could do it, let me keep slogging away'." - Douglas Engelbart

Douglas Engelbart changed the way computers worked, from specialized machinery that only a trained scientist could use, to a user-friendly tool that almost anyone can use. He invented or contributed to several interactive, user-friendly devices: the computer mouse, windows, computer video teleconferencing, hypermedia, groupware, email, the Internet and more.


Modern Computer Mouse


In 1964, the first prototype computer mouse was made to use with a graphical user interface (GUI), 'windows'. Engelbart received a patent for the wooden shell with two metal wheels (computer mouse U.S. Patent # 3,541,541) in 1970, describing it in the patent application as an "X-Y position indicator for a display system." "It was nicknamed the mouse because the tail came out the end," Engelbart revealed about his invention. His version of windows was not considered patentable (no software patents were issued at that time), but Douglas Engelbart has over 45 other patents to his name.

Throughout the '60s and '70s, while working at his own lab (Augmentation Research Center, Stanford Research Institute), Engelbart dedicated himself to creating a hypermedia groupware system called NLS (for oNLine System). Most of his accomplishments, including the computer mouse and windows, were part of NLS.

In 1968, a 90-minute, staged public demonstration of a networked computer system was held at the Augmentation Research Center -- the first public appearance of the mouse, windows, hypermedia with object linking and addressing, and video teleconferencing.

Douglas Engelbart was awarded the 1997 Lemelson-MIT Prize of $500,000, the world's largest single prize for invention and innovation. In 1998, he was inducted into the National Inventors Hall of Fame.

Currently, Douglas Engelbart is the director of his company, the Bootstrap Institute in Fremont, California, which promotes the concept of Collective IQ. Fittingly, Bootstrap is housed rent-free courtesy of Logitech Corp., a leading manufacturer of computer mice.

1962 Steve Russell & MIT Spacewar Computer Game




Spacewar!: The first computer game, invented by Steve Russell
Spacewar screenshot


"If I hadn't done it, someone would've done something equally exciting, if not better, in the next six months. I just happened to get there first." - Steve Russell, nicknamed "Slug"

In 1962, Steve Russell, a young computer programmer at MIT inspired by the writings of E. E. "Doc" Smith*, led the team that created the first computer game. It took the team about 200 man-hours to write the first version of Spacewar. Russell wrote Spacewar on a PDP-1, an early DEC (Digital Equipment Corporation) interactive minicomputer that used a cathode-ray tube display and keyboard input. The computer was a donation to MIT from DEC, which hoped MIT's think tank would do something remarkable with its product. A computer game was the last thing DEC expected; the company later provided Spacewar as a diagnostic program for its customers. Russell never profited from Spacewar.

The PDP-1's operating system was the first to allow multiple users to share the computer simultaneously. This was perfect for playing Spacewar, a two-player game of warring spaceships firing photon torpedoes. Each player could maneuver a spaceship and score by firing missiles at his opponent while avoiding the gravitational pull of the sun. Try playing a replica** of the game for yourself; it still holds up today as a great way to waste a few hours. By the mid-sixties, when computer time was still very expensive, Spacewar could be found on nearly every research computer in the country. Steve Russell later transferred to Stanford University, where he introduced computer game programming and Spacewar to an engineering student named Nolan Bushnell. Bushnell went on to write the first coin-operated computer arcade game and co-found Atari.
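The heart of that gameplay, a ship continuously pulled toward the sun by inverse-square gravity, can be sketched in a few lines. The original was written in PDP-1 assembly; the toy version below is only a modern Python illustration, with made-up constants, of the kind of per-frame update the game performs.

    import math

    # Toy sketch of Spacewar's gravity mechanic (not the original PDP-1 code):
    # each frame, the sun at the origin accelerates the ship toward itself.
    GRAVITY = 500.0    # made-up strength constant
    DT = 0.05          # made-up time step per frame

    def gravity_step(x, y, vx, vy):
        """Advance the ship at (x, y) with velocity (vx, vy) by one frame."""
        dist = math.hypot(x, y)
        accel = GRAVITY / (dist * dist)      # inverse-square pull of the sun
        vx -= accel * (x / dist) * DT        # accelerate toward the origin
        vy -= accel * (y / dist) * DT
        return x + vx * DT, y + vy * DT, vx, vy

    # A ship drifting past the sun curves toward it frame by frame.
    state = (100.0, 0.0, 0.0, 10.0)
    for _ in range(5):
        state = gravity_step(*state)
        print("ship position: (%.1f, %.1f)" % state[:2])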

*An interesting sidenote is that "Doc" Smith, besides being a great science fiction writer, held a Ph.D. in chemical engineering and was the researcher who figured out how to get powdered sugar to stick to doughnuts.

**Spacewar! was conceived in 1961 by Martin Graetz, Steve Russell, and Wayne Wiitanen. It was first realized on the PDP-1 in 1962 by Steve Russell, Peter Samson, Dan Edwards and Martin Graetz, together with Alan Kotok, Steve Piner and Robert A. Saunders.

1958 Jack Kilby & Robert Noyce The Integrated Circuit

Illustration from Jack Kilby's inventor's journal


"What we didn't realize then was that the integrated circuit would reduce the cost of electronic functions by a factor of a million to one, nothing had ever done that for anything before" - Jack Kilby

It seems that the integrated circuit was destined to be invented. Two separate inventors, unaware of each other's activities, invented almost identical integrated circuits or ICs at nearly the same time.

Jack Kilby, an engineer with a background in ceramic-based silk-screen circuit boards and transistor-based hearing aids, started working for Texas Instruments in 1958. A year earlier, research engineer Robert Noyce had co-founded the Fairchild Semiconductor Corporation. From 1958 to 1959, both electrical engineers were working on an answer to the same dilemma: how to pack more electronic components into less space.

In designing a complex electronic machine like a computer, it was always necessary to increase the number of components involved in order to make technical advances. The monolithic (formed from a single crystal) integrated circuit placed the previously separate transistors, resistors, capacitors and all the connecting wiring onto a single crystal (or 'chip') made of semiconductor material. Kilby used germanium and Noyce used silicon for the semiconductor material.

In 1959 both parties applied for patents. Jack Kilby and Texas Instruments received U.S. patent #3,138,743 for miniaturized electronic circuits. Robert Noyce and the Fairchild Semiconductor Corporation received U.S. patent #2,981,877 for a silicon-based integrated circuit. After several years of legal battles, the two companies wisely decided to cross-license their technologies, creating a global market now worth about $1 trillion a year.

In 1961 the first commercially available integrated circuits came from the Fairchild Semiconductor Corporation. All computers then started to be made using chips instead of the individual transistors and their accompanying parts. Texas Instruments first used the chips in Air Force computers and the Minuteman Missile in 1962. They later used the chips to produce the first electronic portable calculators. The original IC had only one transistor, three resistors and one capacitor and was the size of an adult's pinkie finger. Today an IC smaller than a penny can hold 125 million transistors.

Jack Kilby now holds patents on over sixty inventions and is also well known as the inventor of the portable calculator (1967). In 1970 he was awarded the National Medal of Science. Robert Noyce, with sixteen patents to his name, co-founded Intel, the company responsible for the invention of the microprocessor, in 1968. For both men, the invention of the integrated circuit stands historically as one of the most important innovations of mankind. Almost all modern products use chip technology.

1955 (in use 1959) Stanford Research Institute, Bank of America & General Electric ERMA and MICR

ERMA - the Electronic Recording Method of Accounting computer processing system invented at Stanford Research Institute.


During the 1950s, researchers at the Stanford Research Institute invented ERMA, the Electronic Recording Method of Accounting computer processing system. ERMA began as a project for the Bank of America in an effort to computerize the banking industry. ERMA computerized the manual processing of checks and account management, automatically updating and posting checking accounts. Stanford Research Institute also invented MICR (magnetic ink character recognition) as part of ERMA. MICR enabled computers to read the special numbers printed at the bottom of checks, allowing computerized tracking and accounting of check transactions.

ERMA was first demonstrated to the public in September 1955 and first tested on real banking accounts in the fall of 1956. Production models (ERMA Mark II) of the ERMA computer were built by General Electric. Thirty-two units were delivered to the Bank of America in 1959 for full-time use as the bank's accounting computer and check-handling system. ERMA computers were used into the 1970s.

According to Stanford Research Institute's website:

The forty-year-old project [ERMA], provided a vision of what business could expect from the application of data-processing machines, and illustrates how and why some of the key capabilities were invented, including bookkeeping, checks with pre-printed account numbers, optical character recognition (OCR or scanning), and robotic document sorting (ten checks per second). The automated teller machine (ATM) is the natural descendant of this work, and illustrates the progression away from paper checks toward all electronic banking.

ERMA Mark II was designed around solid-state logic elements (i.e. transistors) and magnetic core memory. Numeric data input was read automatically from the original documents using the MICR method. SRI contributed to General Electric's development effort with consultation on character reading and paper-handling techniques and assistance with the detailed programming of the operational steps to be followed by the new equipment.

The Stanford Research Institute researchers behind ERMA and/or MICR were: Jerre Noe, Byron Bennett, C. Bruce Clark, Bonnar "Bart" Cox, Jack Goldberg, Fred Kamphoefner, Philip E. Merritt and Oliver W. Whitby, and others.

1954 John Backus & IBM FORTRAN Computer Programming Language

"I really didn't know what the hell I wanted to do with my life...I said no, I couldn't. I looked sloppy and disheveled. But she insisted and so I did. I took a test and did OK." - John Backus on interviewing for IBM.

FORTRAN, or formula translation, the first high-level programming language, was invented by John Backus for IBM in 1954 and released commercially in 1957. It is still used today for programming scientific and mathematical applications. Fortran grew out of Backus's earlier work on Speedcoding, a digital code interpreter for the IBM 701. John Backus wanted a programming language closer to human language, which is the definition of a high-level language; other high-level languages include Ada, Algol, BASIC, COBOL, C, C++, LISP, Pascal, and Prolog.

The first generation of code used to program a computer was machine language, or machine code: the only language a computer directly understands, a sequence of 0s and 1s that the computer's control circuitry interprets, electrically, as instructions. The second generation was assembly language, which replaces sequences of 0s and 1s with human-readable words like 'add'. Assembly language is translated into machine code by programs called assemblers.

The third generation of code is high-level language, or HLL, which uses human-sounding words and sentence-like syntax. For the computer to understand any HLL, a compiler translates the high-level language into either assembly language or machine code. All programming languages must eventually be translated into machine code for a computer to carry out their instructions.
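The same statement looks very different at each level. As a rough modern analogy (using Python rather than Fortran), the standard dis module can show the lower-level instructions that one high-level formula is translated into, which is the compiler's job in miniature:

    import dis

    # A high-level, "formula translation" style statement: readable words and syntax.
    def fahrenheit(celsius):
        return celsius * 9 / 5 + 32

    # dis lists the lower-level instructions this line becomes inside Python --
    # an analogy for a compiler turning an HLL statement into assembly or machine code.
    dis.dis(fahrenheit)

    print(fahrenheit(100))   # 212.0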

John Backus headed the IBM team of researchers at the Watson Scientific Laboratory that invented Fortran. The team included notable scientists such as Sheldon F. Best, Harlan Herrick (who ran the first successful Fortran program), Peter Sheridan, Roy Nutt, Robert Nelson, Irving Ziller, Richard Goldberg, Lois Haibt, and David Sayre. The IBM team didn't invent the HLL or the idea of compiling a programming language into machine code, but Fortran was the first successful HLL, and the Fortran I compiler held the record for the efficiency of the code it produced for over 20 years. The first computer to run the first compiler was the IBM 704, which John Backus helped design.

Fortran is now over forty years old and, though constantly updated, remains a leading language in scientific and industrial programming. The invention of Fortran launched a $24 million computer software industry and spurred the development of other high-level programming languages. Fortran has been used to program video games, air traffic control systems, payroll calculations, numerous scientific and military applications, and parallel computer research. John Backus won the 1993 National Academy of Engineering Charles Stark Draper Prize, the highest national prize awarded in engineering, for the invention of Fortran.