Last modified: 28 October 2021

Notes on books



Ceruzzi has written two books on the history of the computer. The first deals with the 'prehistory' from 1935 to 1945; see Reckoners. This is his second book, and it covers the period from 1945 to 2001.

The American Paul Ceruzzi is curator of Aerospace Electronics and Computing at the Smithsonian's National Air and Space Museum in Washington, D.C. He also serves in the group of advisers to the Computer History Museum.

This is a wonderfully clear and detailed history of the subject, and it ranks among the best I know.

Paul E. CERUZZI
A History of Modern Computing - Second edition
Cambridge-London: The MIT Press, 1998/1; 2003/2; 445 pp.
ISBN 0-262-53203-4

(1) Introduction: Defining 'Computer'

"The narrative that follows is chronological, beginning with the first attempts to commercialize the electronic computer in the late 1940s and ending in the mid–1990s, as networked personal workstations became common. I have identified several major turning points, and these get the closest scrutiny. They include the computer’s transformation in the late 1940s from a specialized instrument for science to a commercial product, the emergence of small systems in the late 1960s, the advent of personal computing in the 1970s, and the spread of networking after 1985. I have also identified several common threads that have persisted throughout these changes."(5-6)

The themes that keep returning are: the computer architecture that remains fundamentally the same (the von Neumann architecture); the influence of the military bureaus on the development of computers; the role of IBM from 1952 into the 1980s; the role of software (only later given that name; it is not about AI here, by the way); and the place of information technology in a democratic society: does it bring people under control, or does it set them free?

"The military’s role in the advancement of solid state electronics is well known, but a closer look shows that role to be complex and not always beneficial. The term 'military' is misleading: there is no single military entity but rather a group of services and bureaus that are often at odds with one another over roles, missions, and funding. Because the military bureaucracy is large and cumbersome, individual 'product champions' who can cut through red tape are crucial."(7)

"It was not until 1990 that commercial software came to the fore of computing, as hardware prices dropped and computer systems became more reliable, compact, and standardized."(9)

For practical reasons, this book confines itself to the history of the computer in the United States. There were of course developments in Europe, and Europe was even in the lead for a while, but ultimately the American computer industry became dominant. Europe is mentioned occasionally, as are Japan and the USSR, but barely.

"This book focuses on the history of computing as it unfolded in the United States. (...) By the late 1950s, though, whatever lead the Europeans had was lost to American companies. (...) The following narrative will occasionally address European contributions, but for reasons of space will not chronicle the unfolding of the computer industry there.

This narrative will also touch only lightly on the history of computing in Japan. That story is different: Japan had a late start in computing, never producing vacuum tube computers at all. Japanese firms made remarkable advances in integrated circuit production, however, and had established a solid place in portions of the industry by the 1980s. The announcement in the early 1980s of a Japanese 'Fifth Generation' program, intended to leapfrog over U.S. software expertise, created a lot of anxiety in the United States, but the United States retained its leadership in software into the 1990s. How Japanese firms gained a foothold is discussed briefly in chapter 5.

The end of the Cold War, and with it the opening of Soviet archives, may help us better understand the development of computing in the U.S.S.R. Throughout this era the Soviets remained well behind the United States in computing. So far the reasons that have been given tend to be post hoc: because it was so, therefore it had to be so. But what of Soviet achievements in pure mathematics and physics, as well as in developing ballistic missiles, nuclear weapons, space exploration, and supersonic aircraft? One might expect that the Soviet military would have supported computing for the same reasons the U.S. Air Force supported Whirlwind and SAGE. We know that Soviet scientists began work on advanced digital computers as soon as the ENIAC was publicized. Yet when they needed advanced machines, the Soviets turned to their East European satellites (especially Hungary and Czechoslovakia), or else they reverse-engineered U.S. computers such as the IBM System/360 and the VAX. Building copies of these computers gave them access to vast quantities of software, which they could acquire by a purchase on the open market, or by espionage, but it also meant that they remained one or two hardware generations behind the United States.

Perhaps it was the perception that computers, being instruments that facilitate the free exchange of information, are antithetical to a totalitarian state. But U.S. computing from 1945 through the 1970s was dominated by large, centralized systems under tight controls, and these were not at odds with the Soviet political system. Such computers would have been perfect tools to model the command economy of Marxism-Leninism. Soviet planners would not have been alone. Throughout this era some Americans embraced computers for their potential to perform centralized economic modeling for the United States—with constitutional rights guaranteed, of course. Perhaps the reason was the other side of the Western European coin: plenty of military support, but no transfer to a market-driven computer industry. Americans may have found that military support was 'just right': enough to support innovation but not so focused on specific weapons systems as to choke off creativity. More research on the history of Soviet computing needs to be done."(10-12)

(14) 1 - The Advent of Commercial Computing, 1945–1956

Commercial production and sale of computers began when the makers of the ENIAC, Mauchly and Eckert, founded a company and started building the UNIVAC (Universal Automatic Computer), first delivered in 1951. There was, however, a gradual transition from punched-card and tabulating machines to such an all-purpose computer: the new functions were, as it were, already being tried out in that environment.

"IBM called these machines the 'Aberdeen Relay Calculators'; they were later known as the PSRC, for 'Pluggable Sequence Relay Calculator.' In late 1945, three more were built for other military labs, and these were even more complex. During the time one of these machines read a card, it could execute a sequence of up to forty-eight steps. More complex sequences-within-sequences were also possible. One computer scientist later noted that this method of programming demanded "the kind of detailed design of parallel subsequencing that one sees nowadays at the microprogramming level of some computers." When properly programmed, the machines were faster than any other nonelectronic calculator. Even after the ENIAC was completed and installed and moved from Philadelphia to Aberdeen, the Ballistic Research Lab had additional Relay Calculators built. They were still in use in 1952, by which time the BRL not only had the ENIAC but also the EDVAC, the ORDVAC (both electronic computers), an IBM Card Programmed Calculator (described next), and the Bell Labs Model V, a very large programmable relay calculator.

The Aberdeen Relay Calculators never became a commercial product, but they reveal an attempt to adapt existing equipment to post–World War II needs, rather than take a revolutionary approach, such as the UNIVAC. There were also other punched-card devices that represented genuine commercial alternatives to Eckert and Mauchly’s proposed invention. In 1935 IBM introduced a multiplying punch (the Model 601); these soon became popular for scientific or statistical work. In 1946 IBM introduced an improved model, the 603, the first commercial IBM product to use vacuum tubes for calculating. Two years later IBM replaced it with the 604, which not only used tubes but also incorporated the sequencing capability pioneered by the Aberdeen machines. Besides the usual plugboard control common to other punched-card equipment, it could execute up to 60 steps for each reading of a card and setting of the plugboard. The 604 and its successor, the IBM 605, became the mainstays of scientific computing at many installations until reliable commercial computers became available in the mid 1950s. It was one of IBM’s most successful products during that era: over 5,000 were built between 1948 and 1958."(18-19)

The UNIVAC differed so much from these advanced calculators because of its 'stored program' approach.

"When completed in late 1945, the ENIAC operated much faster than any other machine before it. But while it could solve a complex mathematical problem in seconds, it might take days to set up the machine properly to do that.

It was in the midst of building this machine that its creators conceived of an alternative. It was too late to incorporate that insight into the ENIAC, but it did form the basis for a proposed follow-on machine called the 'EDVAC' (Electronic Discrete Variable Computer). In a description written in September of 1945, Eckert and Mauchly stated the concept succinctly: "An important feature of this device was that operating instructions and function tables would be stored exactly in the same sort of memory device as that used for numbers." Six months later, Eckert and Mauchly left the Moore School, and work on the EDVAC was turned over to others (which was mainly why it took five more years to finish building it). The concept of storing both instructions and data in a common storage unit would become basic features of the UNIVAC and nearly every computer that followed."(21)

It is called the von Neumann architecture, but Mauchly and Eckert had thus formulated the idea before von Neumann did. Von Neumann, however, was a famous mathematician, and so the 'stored program' idea quickly became attached to his name. With this approach, instructions could be read in just as fast as data, because they were stored in the same memory.
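The stored-program idea described here can be sketched in a few lines: a toy machine (entirely invented for illustration, not from the book) whose instructions sit in the same flat memory as its data, so the processor fetches both at the same speed and a program can be loaded or modified just like data.

```python
# A minimal, hypothetical stored-program machine: the opcodes, the
# one-address format, and the sample program are all invented here.
# The key point is that code and data share one memory.

def run(memory):
    """Execute (opcode, address) pairs fetched from the same list that holds the data."""
    acc, pc = 0, 0                      # accumulator and program counter
    while True:
        op, addr = memory[pc]           # instruction fetch: reads ordinary memory
        pc += 1
        if op == "LOAD":
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            return memory

# Code occupies cells 0-3, data cells 4-6, all in one store.
mem = [
    ("LOAD", 4),   # 0: acc <- mem[4]
    ("ADD", 5),    # 1: acc <- acc + mem[5]
    ("STORE", 6),  # 2: mem[6] <- acc
    ("HALT", 0),   # 3: stop
    2, 3, 0,       # 4, 5, 6: the data
]
run(mem)
print(mem[6])      # -> 5
```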

The first UNIVAC was delivered on 31 March 1951 to the U.S. Census Bureau. By around 1954 twenty had been built and sold. Speed was one of the essential advantages of the 'stored program' idea. And storing results on tape (instead of on punched cards), together with the ability to search, find, and retrieve them, saved a great deal of human labor.

"Customers regarded the UNIVAC as an information processing system, not a calculator. As such, it replaced not only existing calculating machines, but also the people who tended them."(30)

"A final example of the UNIVAC in use comes from the experience at General Electric’s Appliance Park, outside Louisville, Kentucky. This installation, in 1954, has become famous as the first of a stored-program electronic computer for a nongovernment customer (although the LEO, built for the J. Lyons Catering Company in London, predated it by three years)."(32)

Once the UNIVAC's success became clear, IBM decided to enter the market for large computers as well.

"At the time of the UNIVAC’s announcement, IBM was not fully committed to electronic computation and was vigorously marketing its line of punched card calculators and tabulators. But after seeing the competitive threat, it responded with several machines: two were on a par with the UNIVAC; another was more modest. In May 1952, IBM announced the 701, a stored-program computer in the same class as the UNIVAC."(34)

"The nineteen installations were enough to prevent UNIVAC from completely taking over the market and to begin IBM’s transition to a company that designed and built large-scale electronic digital computers."(36)

Other companies did the same for other customers, for example in the more scientific corner. One example is Engineering Research Associates (ERA). The company invented memory techniques ('magnetic drum storage') that greatly stimulated the development of computers.

"Given the speed penalty, drum-based computers would never be able to compete with the others, regardless of price. The many benefits promised in the 1940s by the stored-program electronic computer architecture required high-capacity, high-speed memory to match electronic processing. With the advent of ferrite cores — and techniques for manufacturing them in large quantities — the memory problem that characterized the first generation was effectively solved."(45)

(47) 2 - Computing Comes of Age, 1956–1964

The first generation of mainframes still needed a few innovations before it could really save costs.

"Part of this transformation of computers came from advances in circuit technology. By 1959 the transistor had become reliable and cheap enough to serve as the basic circuit element for processors. The result was increased reliability, lower maintenance, and lower operating costs. Before that, however, an even more radical innovation occurred — the development of reliable, high capacity memory units built out of magnetic cores. These two innovations were able to boost performance to a point where many commercial applications became cost-effective."(49)

An explanation of core memory follows. An Wang, a student of Howard Aiken, built a core memory that was used in the Harvard Mark IV. Core memories were also developed for the ENIAC and the Whirlwind in 1953, which made them perform far better. Core memory made possible the SAGE computer, produced by IBM from 1956 on and used until 1983:

"A contract with the U.S. Air Force to build production versions of the Whirlwind was a crucial event because it gave engineers the experience needed for core to become viable in commercial systems. The Air Force’s SAGE (Semi-Automatic Ground Environment), a system that combined computers, radar, aircraft, telephone lines, radio links, and ships, was intended to detect, identify, and assist the interception of enemy aircraft attempting to penetrate the skies over the United States. At its center was a computer that would coordinate the information gathered from far-flung sources, process it, and present it in a combination of textual and graphical form. All in all, it was an ambitious design; the Air Force’s desire to have multiple copies of this computer in operation round the clock made it even more so. A primary requirement for the system was high reliability, which ruled out mercury delay lines or electrostatic memory."(51)

With this contract IBM gained a great deal of experience and earned well from it. It outstripped UNIVAC and other companies such as Honeywell (active since the early 1960s), RCA, and General Electric, and became the dominant force in the market for large computer systems. Through the competition, and all the research needed in order to be able to compete, computer architecture developed step by step.

"By the end of 1960 there were about 6,000 general-purpose electronic computers installed in the United States. Nearly all of them were descendants of the EDVAC and IAS computer projects of the 1940s, where the concept of the stored program first appeared. One often hears that nothing has really changed in computer design since von Neumann. That is true only in a restricted sense—computers still store their programs internally and separate storage from arithmetic functions in their circuits. In most other ways there have been significant advances. By 1960 some of these innovations became selling points as different vendors sought to establish their products in the marketplace. The most important architectural features are summarized here."(58)

The hardware developed as well. From 1954 on, the transistor was reliable enough to be used in the large mainframes. Philco built the first transistorized mainframe.

"The result, called 'SOLO,' was completed sometime between 1956 and 1958, and was probably the first general-purpose transistorized computer to operate in the United States. Philco marketed a commercial version called the TRANSAC S-1000, followed quickly by an upgraded S-2000, in 1958. First deliveries of the S-2000 were in January 1960. These, along with deliveries by UNIVAC of a smaller computer called the Solid State 80, mark the beginning of the transistor era, or 'Second Generation.'"()

At IBM a disk drive, using magnetic disks as the storage medium, was also invented; it came on the market in 1956. And in 1960 IBM built its first transistorized mainframes, the IBM 7090 and 7094. This model is regarded as the classic mainframe, and many of them were sold. Finally, a usable high-speed printer was invented and sold in large numbers.

"By 1960 a pattern of commercial computing had established itself, a pattern that would persist through the next two decades. Customers with the largest needs installed large mainframes in special climate-controlled rooms, presided over by a priesthood of technicians. These mainframes utilized core memories, augmented by sets of disks or drums. Backing that up were banks of magnetic tape drives, as well as a library where reels of magnetic tape were archived. Although disks and drums allowed random access to data, most access conformed to the sequential nature of data storage on tapes and decks of cards. For most users in a university environment, a typical transaction began by submitting a deck of cards to an operator through a window (to preserve the climate control of the computer room). Sometime later the user went to a place where printer output was delivered and retrieved the chunk of fan-fold paper that contained the results of his or her job. The first few pages of the printout were devoted to explaining how long the job took, how much memory it used, which disk or tape drives it accessed, and so on — information useful to the computer center’s operators, and written cryptically enough to intimidate any user not initiated into the priesthood. For commercial and industrial computer centers, this procedure was more routine but essentially the same.(...)

Thus the early era of computing was characterized by batch processing. The cost of the hardware made it impractical for users to interact with computers as is done today. Direct interactive access to a computer’s data was not unknown but was confined to applications where cost was not a factor, such as the SAGE air defense system. For business customers, batch processing was not a serious hindrance. Reliance on printed reports that were a few days out of date was not out of line with the speeds of transportation and communication found elsewhere in society. The drawbacks of batch processing, especially how it made writing and debugging programs difficult, were more noticed in the universities, where the discipline of computer programming was being taught. University faculty and students thus recognized a need to bring interactive computing to the mainstream. In the following years that need would be met, although it would be a long and difficult process."(77-78)

(79) 3 - The Early History of Software, 1952–1968

How did software come into being, and what was its relation to the hardware? In the United States, the programming of computers began with Grace Murray Hopper, whose job was to make Aiken's Harvard Mark I 'do things'.

"It did not take long for her to realize that if a way could be found to reuse the pieces of tape already coded for another problem, a lot of effort would be saved."(82)

But for that, the program had to be storable in internal memory, instead of having to be fed in from outside (with switches, punched cards, paper tape, magnetic tape, or disks).

"With a stored-program computer, a sequence of instructions that would be needed more than once could be stored on a tape. When a particular problem required that sequence, the computer could read that tape, store the sequence in memory, and insert the sequence into the proper place(s) in the program. By building up a library of sequences covering the most frequently used operations of a computer, a programmer could write a sophisticated and complex program without constant recourse to the binary codes that directed the machine. Of the early stored-program computers, the EDSAC in Cambridge, England, carried this scheme to the farthest extent, with a library of sequences already written, developed, and tested, and punched onto paper tapes that a user could gather and incorporate into his own program. D. J. Wheeler of the EDSAC team devised a way of storing the (different) addresses of the main program that these sequences would have to jump to and from each time they were executed. This so-called Wheeler Jump was the predecessor of the modern subroutine call."(84)
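The Wheeler Jump in the quotation can be made concrete with a toy machine (the opcodes and memory layout below are invented, not EDSAC's actual order code): the caller puts its own address in the accumulator and jumps to the subroutine, whose first order writes the return jump into its own last cell. The return, in other words, is arranged by self-modifying code.

```python
# Hypothetical sketch of the Wheeler Jump. LOADPC, JUMP, SETRET, INC,
# and HALT are invented opcodes; the mechanism (planting the return
# jump into the subroutine's final cell) is the historical idea.

def run(mem):
    acc, pc = 0, 0
    while True:
        op, addr = mem[pc]
        if op == "LOADPC":                  # caller: acc <- its own address
            acc, pc = pc, pc + 1
        elif op == "JUMP":
            pc = addr
        elif op == "SETRET":                # subroutine entry: plant the return
            mem[addr] = ("JUMP", acc + 2)   # jump back past the call site
            pc += 1
        elif op == "INC":                   # the subroutine's one-line 'body'
            mem[addr] += 1
            pc += 1
        elif op == "HALT":
            return mem

mem = [
    ("LOADPC", 0),  # 0: first call: remember where we are
    ("JUMP", 5),    # 1: enter the subroutine
    ("LOADPC", 0),  # 2: second call
    ("JUMP", 5),    # 3
    ("HALT", 0),    # 4
    ("SETRET", 7),  # 5: the Wheeler Jump: overwrite cell 7 with the return
    ("INC", 8),     # 6: body: count the calls
    ("JUMP", 0),    # 7: placeholder, rewritten on every call
    0,              # 8: call counter
]
run(mem)
print(mem[8])       # -> 2: the subroutine ran once per call and returned both times
```

Because the return address lives inside the subroutine itself, such a routine cannot call itself; the modern subroutine call with a stack removed that limitation.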

Different meanings of the word 'compiler' arose: for Grace Hopper it was the program that arranges for a subroutine to be called at the right place; later it came to mean a program that translates users' commands into machine language. In that latter context the term 'assembler' also turns up. Only later did higher-level programming languages appear. FORTRAN (Formula Translation) was created by IBM in 1957. In 1959, through the efforts of the U.S. Department of Defense, the programming language COBOL (Common Business Oriented Language) came into being.
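The later sense of 'assembler', as distinct from a compiler, can be illustrated with a sketch (the mnemonics and opcode numbers below are invented): an assembler translates each mnemonic line one-for-one into a machine word, whereas a compiler expands one high-level statement into many such words.

```python
# Hypothetical one-pass assembler for a made-up one-address machine.
# Each source line maps to exactly one (opcode, address) machine word.

OPCODES = {"LOAD": 1, "ADD": 2, "STORE": 3, "HALT": 0}

def assemble(source):
    """Translate mnemonic lines such as 'LOAD 4' into numeric machine words."""
    words = []
    for line in source.strip().splitlines():
        mnemonic, _, operand = line.strip().partition(" ")
        words.append((OPCODES[mnemonic], int(operand or 0)))
    return words

program = assemble("""
    LOAD 4
    ADD 5
    STORE 6
    HALT
""")
print(program)   # -> [(1, 4), (2, 5), (3, 6), (0, 0)]
```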

So software was written to tackle particular problems for users. But the computer system itself also had matters that needed to be managed. For that purpose 'operating systems' were written: system programs.

"In 1968 and 1969 a cluster of similar events further established the place of software, its relationship to computer science, and its relationship to industrial, commercial, and military computer users."(103)

Among them: Donald E. Knuth's book series The Art of Computer Programming, whose first volume, Fundamental Algorithms, appeared in 1968. Pleas for structured programming. Discussions about the ownership of, and patents on, software. The sale of ready-made software packages for particular hardware.

"In 1969 Ken Thompson and Dennis Ritchie at the Bell Telephone Laboratories in New Jersey began work on what would become the UNIX operating system. The computer they used was a Digital Equipment Corporation PDP-7, a machine with extremely limited memory even by the standards of that day. Thompson wrote the earliest version of this system in assembler, but soon he and his colleagues developed a 'system programming language' called 'B,' which by 1973 had evolved into a language called C. C is a language that is 'close to the machine,' in Ritchie’s words, a characteristic that reflected its creators’ desire to retain the power of assembly language. To that extent it went against the tenets of structured programming that were then being espoused. That quality also made it one of the most popular languages of the personal computer era, as practiced at the giant software houses of the 1990s such as Microsoft."(106)

"The activity known as computer programming was not foreseen by the pioneers of computing. During the 1950s they and their customers slowly realized: first, that it existed; second, that it was important; and third, that it was worth the effort to build tools to help do it. These tools, combined with the applications programs, became collectively known as 'software,' a term that first came into use around 1959."(108)

(109) 4 - From Mainframe to Minicomputer, 1959–1969

In the early 1960s computers came to be used ever more often and in ever more areas of society: through new technical inventions such as the transistor and programming languages, but also through economic growth and the plans of the U.S. government, for example to put a man on the moon. This chapter gives many examples of this.

In no other industrial field did the replacement of products go as fast as in the computer industry. IBM, which from the late 1950s on had a market share of 70%, did an enormous amount of fundamental and applied research. But it also held innovation back when innovation was not in the company's interest. Thus punched cards remained in use for a very long time, and the centralized approach of 'batch processing' at large computer centers persisted.

"One place where IBM did succeed was in keeping viable the basic input medium of the punched card, and with that the basic flow of data through a customer’s installation. The same card, encoded the same way and using a keypunch little changed since the 1930s, served IBM’s computers through the 1960s and beyond. The sequential processing and file structure, implicit in punched card operations, also survived in the form of batch processing common to most mainframe computer centers in the 1960s. That eased the shock of adopting the new technology for many customers, as well as ensuring IBM’s continued influence on computing at those sites."(111)

But that was about to change, because the U.S. Department of Defense (DoD) began pumping enormous amounts of money into fundamental research at the universities.

"What changed was the nature of research done under defense support, especially after the onset of the war in Korea in 1950. Military support for basic research in physics, electrical engineering, and mathematics increased dramatically after 1950. The nature of that research also changed, from one where the military specified its needs in detail, to one where the researchers themselves — professors and their graduate students at major universities — took an active role in defining the nature and goals of the work. Military funding, channeled into research departments at prestigious universities, provided an alternative source of knowledge to that generated in large industrial laboratories. This knowledge, in turn, allowed individuals outside established corporations to enter the computer industry. Its effect on computing was dramatic."(112)

All kinds of fundamental research in solid-state physics, electronics, and computer architecture led to the development of the minicomputer. Minicomputers were smaller, faster, and cheaper to produce, and very well suited to scientific computation. Moreover, they could be operated interactively.

"It was not a direct competitor to mainframes or to the culture of using mainframes. Instead the minicomputer opened up entirely new areas of application. Its growth was a cultural, economic, and technological phenomenon. It introduced large groups of people — at first engineers and scientists, later others — to direct interaction with computing machines. Minicomputers, in particular those operated by a Teletype, introduced the notion of the computer as a personal interactive device. Ultimately that notion would change our culture and dominate our expectations, as the minicomputer yielded to its offspring, the personal computer."(124-125)

Seymour Cray at CDC (Control Data Corporation) played a role in its development. So, in particular, did Ken Olsen of DEC (Digital Equipment Corporation), who had been involved in much innovation in computing and had been responsible for the development of the TX-0, in 1957 one of the most advanced computers.

"When completed in 1957, the TX-0 was one of the most advanced computers in the world, and in 1959 when Digital Equipment Corporation offered its PDP-1 designed by Gurley, it incorporated many of the TX-0’s architectural and circuit innovations."(127)

"The PDP-1 was not an exact copy of the TX-0, but it did imitate one of its most innovative architectural features: foregoing the use of channels, which mainframes used, and allowing I/O to proceed directly from an I/O device to the core memory itself. By careful design and skillful programming, this allowed fast I/O with only a minimal impact on the operation of the central processor, at a fraction of the cost and complexity of a machine using channels. In one form or another this 'direct memory access' (DMA) was incorporated into nearly all subsequent DEC products and defined the architecture of the minicomputer. It is built into the microprocessors used in modern personal computers as well. To allow such access to take place, the processor allowed interrupts to occur at multiple levels (up to sixteen), with circuits dedicated to handling them in the right order. The cost savings were dramatic: as DEC engineers later described it, "A single IBM channel was more expensive than a PDP-1." The initial selling price was $120,000."(128)

IBM rented out its computers and, as owner of the equipment, was the only party allowed to make modifications. DEC sold its PDPs and actually encouraged customers to adapt them for their own use. It published the PDPs' specifications, handed out manuals, and got back all kinds of suggestions from customers for improving the machines. The atmosphere at DEC (the company was later called Digital) was the same as at MIT: an academic atmosphere of free inquiry and sharing of knowledge. In 1965 the PDP-8 was brought to market for $18,000 (compared with the many hundreds of thousands of dollars one paid for mainframes). Fifty thousand PDP-8s were sold. With that, the era of the minicomputer truly began.

"A cult fascination with Digital arose, and many customers, especially scientists or fellow engineers, were encouraged to buy by the Spartan image. DEC represented everything that was liberating about computers, while IBM, with its dress code and above all its punched card, represented everything that had gone wrong. Wall Street analysts, accustomed to the trappings of corporate wealth and power, took the Mill culture as a sign that the company was not a serious computer company, like IBM or UNIVAC. More to the point, DEC’s marketing strategy (including paying their salesmen a salary instead of commissions) was minimal. Some argued it was worse than that: that DEC had 'contempt' for marketing, and thus was missing chances to grow even bigger than it did. DEC did not grow as fast as Control Data or Scientific Data Systems, another company that started up at the same time, but it was selling PDP-8s as fast as it could make them, and it was opening up new markets for computers that neither CDC nor SDS had penetrated."(138)

"That new culture of technical entrepreneurship, considered by many to be the main force behind the United States’s economic prosperity of the 1990s, lasted longer than the ambience of the Mill. It was successfully transplanted to Silicon Valley on the West Coast (although for reasons yet to be understood, Route 128 around Boston, later dubbed the Technology Highway, faded). In Silicon Valley, Stanford and Berkeley took the place of MIT, and the Defense Advanced Research Projects Agency (DARPA) took over from the U.S. Navy and the Air Force. A host of venture capital firms emerged in San Francisco that were patterned after Doriot’s American Research and Development Corporation. Many of the popular books that analyze this phenomenon miss its university roots; others fail to understand the role of military funding. Some concentrate on the wealth and extravagant lifestyles adopted by the millionaires of Silicon Valley — hardly applicable to Ken Olsen, whose plain living was legendary."(140)

(143) 5 - The 'Go-Go' Years and the System /360, 1961–1975

Mainframes also continued to do well in the market. IBM introduced the System/360 and a whole range of related equipment - a system that was scalable and could emulate earlier IBM machines - and had great success with it.

"There was, however, one very important sector that System /360 did not cover — using a large computer interactively or 'conversationally.' For economic reasons one could not dedicate a mainframe to a single user, so in practical terms the only way to use a large machine interactively was for several users to share its computational cycles, or 'time,' simultaneously."(154)

This brought the idea of 'time sharing' into the world. IBM computers were poor at it until 1965. To work out the idea, machines from General Electric, and later Honeywell, were initially used - for example in the famous Project MAC, where in the late 1960s the operating system MULTICS, which supported time sharing, was also developed.

"GE sold its computer business to Honeywell in 1970, a sale that allowed Honeywell to make solid profits and gain customer loyalty for a few years. Bell Laboratories found the GE time-sharing system wanting, and dropped out of the MULTICS project in 1969. Two researchers there, Ken Thompson and Dennis Ritchie, ended up developing UNIX, in part because they needed an environment in which to do their work after their employer removed the GE system. They began work on what eventually became UNIX on a Digital Equipment Corporation PDP-7, a computer with far less capability than the GE mainframe and already obsolete in 1969. They later moved to a PDP-11. For the next decade and a half, UNIX’s development would be associated with DEC computers. The name implies that 'UNIX' is a simplified form of 'MULTICS,' and it did borrow some of MULTICS’s features."(157)

Ultimately, time sharing flourished mainly thanks to DEC's PDP-10.

"The revolutionary breakthroughs in interactivity, networking, and system software that characterized computing in the late 1970s and early 1980s would not be centered on IBM equipment."(157)

"Spurred on by Defense Department spending for the Vietnam War, and by NASA’s insatiable appetite for computing power to get a man on the Moon, the late 1960s was a time of growth and prosperity for the computer industry in the United States. For those who remember the personal computer explosion of the 1980s, it is easy to overlook earlier events. From about 1966 to 1968, almost any stock that had '-ex,' '-tronics,' or simply '-tron' in its name rode an upward trajectory that rivalled the Moon rockets. John Brooks, a well-known business journalist and astute observer of Wall Street, labeled them the 'go-go years,' referring to the rabid chants of brokers watching their fortunes ascend with the daily stock ticker."(159)

[There follows a highly detailed account of the development of this branch of industry, which I will not reproduce here.]

(177) 6 - The Chip and Its Impact, 1965–1975

For a time, mainframes and minicomputers each occupied a separate segment of the market. They complemented each other and did not get in each other's way. In the 1960s that began to change.

"The force that drove the minicomputer was an improvement in its basic circuits, which began with the integrated circuit (IC) in 1959. The IC, or chip, replaced transistors, resistors, and other discrete circuits in the processing units of computers; it also replaced cores for the memory units. The chip’s impact on society has been the subject of endless discussion and analysis. This chapter, too, will offer an analysis, recognizing that the chip was an evolutionary development whose origins go back to the circuit designs of the first electronic digital computers, and perhaps before that."(178)

Already during the Second World War, people were looking for reliable ways to build electronics. Printed circuits existed long before the IC, and clean rooms for manufacturing also appeared earlier. Jack Kilby (of Texas Instruments) and Robert Noyce (of Fairchild Semiconductor) are regarded as the inventors of the IC.

"A substantial push for something new had come from the U.S. Air Force, which needed ever more sophisticated electronic equipment on-board ballistic missiles and airplanes, both of which had stringent weight, power consumption, and space requirements. (A closer look at the Air Force’s needs reveals that reliability, more than size, was foremost on its mind.) The civilian electronics market, which wanted something as well, was primarily concerned with the costs and errors that accompanied the wiring of computer circuits by hand."(179)

The IC was used above all in minicomputers, which could now be produced even more cheaply than before. And DEC did not dominate that market the way IBM dominated the mainframe market. Between 1968 and 1972, hundreds of companies, new and old, began to operate in the minicomputer market. In that period Robert Noyce and Gordon Moore left Fairchild to start a company of their own: Intel was founded in 1968.

"By 1970 a way of connecting the chips to one another had also standardized. The printed circuit board, pioneered by Globe-Union, had evolved to handle integrated circuits as well. A minicomputer designer could now lay out a single large printed circuit board, with places for all the ICs necessary for the circuits of a small computer. On an assembly line (possibly located in Asia and staffed by women to save labor costs), a person (or machine) would 'stuff' chips into holes on one side of the board. She would then place the board in a chamber, where a wave of molten solder would slide across the pins protruding through to the other side, attaching them securely to the printed connections. The process was fast, reliable, and yielded a rugged product."(193)

Other inventions too made minicomputers ever easier to produce, expand, and configure or program. One example: the bus structure ('Unibus') that DEC built into the PDP-11 - a central channel over which virtually all signals from all ICs/chips travelled, and to which other ICs, and thus other equipment, could easily be attached. 170,000 PDP-11s were sold.
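The bus idea described above - one shared channel to which processor, memory, and peripherals all attach, so that adding a device means registering it at an address rather than rewiring the machine - can be sketched loosely. This is an illustrative analogy only; the class and device names are invented, not DEC's.

```python
# Loose illustrative sketch (invented names, not DEC's design): a single
# shared bus; devices are attached at addresses, and all traffic passes
# over the one channel.
class Bus:
    def __init__(self):
        self.devices = {}  # maps an address to the device mapped there

    def attach(self, address, device):
        # Adding hardware = registering it on the bus, no rewiring.
        self.devices[address] = device

    def write(self, address, value):
        self.devices[address].write(value)

class Printer:
    def __init__(self):
        self.output = []
    def write(self, value):
        self.output.append(value)

bus = Bus()
printer = Printer()
bus.attach(0o177514, printer)   # octal device addresses, PDP-11 style
bus.write(0o177514, ord("A"))
print(printer.output)  # [65]
```

The point of the sketch is the economics the text describes: any new peripheral only needs to speak the bus protocol, not know about the rest of the machine.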

On mainframes, time sharing became the norm. An important precondition for it was the ability to read in and write out data quickly ('swapping').

"A key factor was the development of disk storage that offered rapid and direct access to large amounts of data. IBM had pioneered the use of disk storage with RAMAC in the late 1950s, but for the next ten years sequentially accessed tape, not disks, remained the mainstay of mass storage on mainframes. With the System/370, IBM introduced new models of disk storage that offered dramatically increased performance. During the late 1960s and early 1970s, the cost of storing data on disks dropped twentyfold, while the capacity of a typical disk storage system increased fortyfold."(200)

The possibility of time sharing was also essential for teaching computer science at the universities. At Dartmouth a programming language was developed to make it easier to teach students to program: BASIC, which would later play such a large role in the rise of the personal computer.

Time sharing also came quickly to minicomputers (from 1971 on the PDP-11). For that purpose DEC thoroughly adapted the language BASIC. And with minicomputers students also learned to work interactively with the operating system UNIX.

"This combination of features of DEC’s BASIC — its ability to do low-level system calls or byte transfers, and its ability to fit on machines with limited memory — would be adopted by the Microsoft Corporation a few years later for its version of BASIC for the first personal computers."(205)

"By the mid-1970s, the minicomputer had established strong positions in several markets and had moved out of its niche as an embedded processor for the OEM market. What held it back from the business data-processing market was the mainframe’s ability to move enormous quantities of data through its channels, back and forth to rows of tape drives and 'disk farms.' But the minicomputer took better advantage than mainframes of advances in integrated circuits, packaging, and processor architecture. Its future seemed bright indeed. What happened next, however, was not what its creators intended. The mini generated the seeds of its own destruction, by preparing the way for personal computers that came from an entirely different source."(206)

(207) 7 - The Personal Computer, 1972–1977

The experience of being able to work with computers interactively and for fun, without having to worry about costs, laid the foundation for the success of the personal computer.

"Of all the early time-sharing systems, the PDP-10 best created an illusion that each user was being given the full attention and resources of the computer. That illusion, in turn, created a mental model of what computing could be—a mental model that would later be realized in genuine personal computers."(208)

"The feeling that a PDP-10 was one’s own personal computer came from its operating system — especially from the way it managed the flow of information to and from the disks or tapes. With MIT’s help, DEC supplied a system called 'TOPS-10,' beginning in 1972.(...)

Users could easily create, modify, store, and recall blocks of data from a terminal. The system called these blocks by the already-familiar term, 'files.' Files were named by one to six characters, followed by a period, then a three-character extension (which typically told what type of file it was, e.g.: xxxxxx.BAS for a program written in BASIC). By typing DIR at a terminal users could obtain a directory of all the files residing on a disk. They could easily send the contents of a file to a desired output device, which typically consisted of a three-letter code, for example, LPT for line printer, or TTY for Teletype.(...)

For PDP-10 users, TOPS-10 was a marvel of simplicity and elegance and gave them the illusion that they were in personal control. TOPS-10 was like a Volkswagen Beetle: basic, simple, and easy to understand and work with.13 Using a PDP-10 was not only fun but addictive. It was no accident that Brand saw people playing Spacewar on one, or that it was also the computer on which Adventure—perhaps the most long-lasting of all computer games—was written."(208-210)
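The filename convention the quote describes (a name of up to six characters, a period, then a three-character extension, as in xxxxxx.BAS) can be captured in a few lines. A minimal sketch, not from the book; the function name is my own.

```python
import re

# Illustrative sketch of the TOPS-10 "six dot three" filename
# convention described in the quote: 1-6 characters, a period,
# then a 3-character type extension (e.g. GAME.BAS).
TOPS10_NAME = re.compile(r"^[A-Z0-9]{1,6}\.[A-Z0-9]{3}$")

def is_tops10_name(name: str) -> bool:
    """Return True if `name` fits the TOPS-10-style pattern."""
    return TOPS10_NAME.match(name.upper()) is not None

print(is_tops10_name("GAME.BAS"))     # True
print(is_tops10_name("toolong.BAS"))  # False: name part has 7 characters
```

The same rigid name-plus-extension shape resurfaced later in CP/M and MS-DOS (there as 8.3), which is why the convention in this quote looks so familiar.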

The fact that simple pocket calculators kept getting cheaper and more advanced also fed into the sense that owning one's own computer was a natural thing. The HP-65 - a programmable pocket calculator - was marketed as early as 1974 with the phrase 'personal computer'.

"The calculator offered the first consumer market for logic chips that allowed companies to amortize the high costs of designing complex integrated circuits. The dramatic drop in prices of calculators between 1971 and 1976 showed just how potent this force was. The second effect was just as important. Pocket calculators, especially those that were programmable, unleashed the force of personal creativity and energy of masses of individuals. This force had already created the hacker culture at MIT and Stanford (observed with trepidation by at least one MIT professor)."(214-215)

"The assertion that hackers created modern interactive computing is about half-right. In sheer numbers there may never have been more than a few hundred people fortunate enough to be allowed to 'hack' (that is, not do a programming job specified by one’s employer) on a computer like the PDP-10. By 1975, there were over 25,000 HP-65 programmable calculators in use, each one owned by an individual who could do whatever he or she wished to with it.35 Who were these people? HP-65 users were not 'strange'. Nearly all were adult professional men, including civil and electrical engineers, lawyers, financial people, pilots, and so on. Only a few were students (or professors), because an HP-65 cost $795. Most purchased the HP-65 because they had a practical need for calculation in their jobs. But this was a personal machine — one could take it home at night. These users — perhaps 5 or 10 percent of those who owned machines — did not fit the popular notion of hackers as kids with "[t]heir rumpled clothes, their unwashed and unshaven faces, and their uncombed hair." But their passion for programming made them the intellectual cousins of the students in the Tech Model Railroad Club. And their numbers — only to increase as the prices of calculators dropped — were the first indication that personal computing was truly a mass phenomenon."(215)

Because there was no official support for these calculators, hobby clubs, newsletters, and other publications sprang up.

"This supporting infrastructure was critical to the success of personal computing; in the following decade it would become an industry all its own."(216)

"Calculators showed what integrated circuits could do, but they did not open up a direct avenue to personal interactive computing. The chips used in them were too specialized for numerical calculation to form a basis for a general-purpose computer. Their architecture was ad-hoc and closely guarded by each manufacturer. What was needed was a set of integrated circuits — or even a single integrated circuit — that incorporated the basic architecture of a general-purpose, stored-program computer. Such a chip, called a 'microprocessor,' did appear."(217)

In 1971 the microprocessor - the idea of a 'computer on a chip' - was realized by Ted Hoff, Stan Mazor, and Federico Faggin, who worked at Intel (others had worked out similar ideas as well).

"The result was a set of four chips, first advertised in a trade journal in late 1971, which included "a microprogrammable computer on a chip!" That was the 4004, on which one found all the basic registers and control functions of a tiny, general-purpose stored-program computer. The other chips contained a read-only memory (ROM), random-access memory (RAM), and a chip to handle output functions. The 4004 became the historical milestone, but the other chips were important as well, especially the ROM chip that supplied the code that turned a general-purpose processor into something that could meet a customer’s needs."(220)

Then followed other microprocessors, such as the Intel 8008 in April 1972 and the Intel 8080 in April 1974, as well as processors from various emerging companies such as Zilog. It took a while before the perception sank in that these processors could be used to build small computers that people could use at home. The picture was strongly shaped by the existing computer industry with its mainframes and minicomputers - the new processors were seen rather as an improvement on the existing. That can be seen in the story of the MICRAL, the first PC, which was built in France and sold with the 8008 as its processor.

"A general-purpose computer based on a microprocessor did appear in 1973. In May of that year Thi T. Truong, an immigrant to France from Viet Nam, had his electronics company design and build a computer based on the Intel 8008 microprocessor. The MICRAL was a rugged and well-designed computer, with a bus architecture and internal slots on its circuit board for expansion. A base model cost under $2,000, and it found a market replacing minicomputers for simple control operations. Around two thousand were sold in the next two years, none of them beyond an industrial market.58 It is regarded as the first microprocessor-based computer to be sold in the commercial marketplace. Because of the limitations of the 8008, its location in France, and above all, the failure by its creators to see what it 'really' was, it never broke out of its niche as a replacement for minicomputers in limited industrial locations.

The perception of the MICRAL as something to replace the mini was echoed at Intel as well. Intel’s mental model of its product was this: an industrial customer bought an 8080 and wrote specialized software for it, which was then burned into a read-only-memory to give a system with the desired functions. The resulting inexpensive product (no longer programmable) was then put on the market as an embedded controller in an industrial system. A major reason for that mental model was the understanding of how hard it was to program a microprocessor. It seemed absurd to ask untrained consumers to program when Intel’s traditional customers, hardware designers, were themselves uncomfortable with programming."(222)

Intel released 'kits' for educational purposes, so that people could learn to program. Intel even hired Gary Kildall to create a programming language - PL/M - suitable for working with those kits. In other words: Intel had in fact built a PC, but did not realize it and did not put it on sale as such for the general public.

"Here is where the electronics hobbyists and enthusiasts come in. Were it not for them, the two forces in personal computing might have crossed without converging. Hobbyists, at that moment, were willing to do the work needed to make microprocessor-based systems practical."(224)

In the magazines for those hobbyists - the electronics magazines in particular - machines such as the Kenbak-1 (1971), the Scelbi-8H (March 1974), and the Mark-8 (July 1974) were already being advertised.

"The Mark-8’s appearance in Radio-Electronics was a strong factor in the decision by its rival Popular Electronics to introduce the Altair kit six months later."(225)

"H. Edward Roberts, the Altair’s designer, deserves credit as the inventor of the personal computer. The Altair was a capable, inexpensive computer designed around the Intel 8080 microprocessor. Although calling Roberts the inventor makes sense only in the context of all that came before him, including the crucial steps described above, he does deserve the credit."(226)

"Following the tradition established by Digital Equipment Corporation, Roberts did not hold specifications of the bus as a company secret. That allowed others to design and market cards for the Altair. That decision was as important to the Altair’s success as its choice of an 8080 processor."(229)

"The limited capabilities of the basic Altair, plus the loss of the only existing Altair by the time the Popular Electronics article appeared, led to the notion that it was a sham, a 'humbug,' not a serious product at all. The creators of the Altair fully intended to deliver a serious computer whose capabilities were on a par with minicomputers then on the market. Making those deliveries proved to be a lot harder than they anticipated. Fortunately, hobbyists understood that. But there should be no mistake about it: the Altair was real.

MITS and the editors of Popular Electronics had found a way to bring the dramatic advances in integrated circuits to individuals. The first customers were hobbyists, and the first thing they did with these machines, once they got them running, was play games."(230)

"But the people at MITS and their hangers-on created more than just a computer. This $400 computer inspired the extensive support of user groups, informal newsletters, commercial magazines, local clubs, conventions, and even retail stores. This social activity went far beyond traditional computer user groups, like SHARE for IBM or DECUS for Digital. Like the calculator users groups, these were open and informal, and offered more to the neophyte. All of this sprang up with the Altair, and many of the publications and groups lived long after the last Altair computer itself was sold."(231)

Roberts had to choose a programming language in which applications could be written for the Altair. It became the BASIC that Bill Gates and Paul Allen wrote. The language had to be ready in July 1975.

"In a burst of energy, Gates and Allen, with the help of Monte Davidoff, wrote not only a BASIC that fit into very little memory; they wrote a BASIC with a lot of features and impressive performance. The language was true to its Dartmouth roots in that it was easy to learn."(233)

"Bill Gates had recognized what Roberts and all the others had not: that with the advent of cheap, personal computers, software could and should come to the fore as the principal driving agent in computing. And only by charging money for it — even though it had originally been free — could that happen."(236)

In addition, Gary Kildall made it possible to use a floppy disk (originally an IBM invention) with a PC, by programming a Disk Operating System (DOS) and incorporating it into CP/M, a derivative of PL/M. This operating system for PCs appeared in 1976. When there was clearly interest in it, he started the company Digital Research. To separate what had to be adapted for each specific PC from what could stay the same, he designed the BIOS - Basic Input/Output System. The BIOS had to be adapted for each PC, but precisely because of that the rest of the operating system could stay the same, which effectively created a standard.
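Kildall's split - a machine-specific BIOS underneath a portable remainder - is in essence a hardware-abstraction layer. A minimal sketch of the idea; all class and function names here are invented for illustration, not CP/M's actual entry points.

```python
# Illustrative sketch (invented names): the portable part of the system
# calls only an abstract interface; each machine ships its own BIOS
# implementing that interface, so only the BIOS changes per machine.
from abc import ABC, abstractmethod

class BIOS(ABC):
    @abstractmethod
    def console_out(self, ch: str) -> None:
        """Write one character to this machine's console hardware."""

class MachineABIOS(BIOS):
    """The only machine-specific piece: Machine A's console driver."""
    def __init__(self):
        self.screen = []
    def console_out(self, ch):
        self.screen.append(ch)

def portable_print(bios: BIOS, text: str):
    # This code is identical on every machine; it knows nothing about
    # the hardware beyond the BIOS interface.
    for ch in text:
        bios.console_out(ch)

bios = MachineABIOS()
portable_print(bios, "OK")
print("".join(bios.screen))  # OK
```

Porting the system to a new machine then means writing one new BIOS class, while everything above the interface ships unchanged - which is exactly how CP/M became a de facto standard across incompatible hardware.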

"CP/M was the final piece of the puzzle that, when made available, made personal computers a practical reality."(238)

"By 1977 the pieces were all in place. The Altair’s design shortcomings were corrected, if not by MITS then by other companies. Microsoft BASIC allowed programmers to write interesting and, for the first time, serious software for these machines. The ethic of charging money for this software gave an incentive to such programmers, although software piracy also became established. Computers were also being offered with BASIC supplied on a read-only-memory (ROM), the manufacturer paying Microsoft a simple royalty fee. (With the start-up codes also in ROM, there was no longer a need for the front panel, with its array of lights and switches.) Eight-inch floppy disk drives, controlled by CP/M, provided a way to develop and exchange software that was independent of particular models. Machines came with standardized serial and parallel ports, and connections for printers, keyboards, and video monitors. Finally, by 1977 there was a strong and healthy industry of publications, software companies, and support groups to bring the novice on board. The personal computer had arrived."(240-241)

(243) 8 - Augmenting Human Intellect, 1975–1985

What did the PC mean for the rest of the computer industry?

DEC (later Digital) was mainly occupied with developing the successful VAX, with its virtual memory techniques - 100,000 of them were sold. As system software it ran VMS, but also UNIX (notably the Berkeley variant).

"The VAX was a general-purpose computer that came with the standard languages and software. It sold to a wide market, but its biggest impact was on engineering and science. Prices started at $120,000, which was too expensive for a single engineer, but just cheap enough to serve a division at an aerospace, automotive, or chemical firm. For them the standard practice had been either to get in line to use the company’s mainframe, or to sign up for time on a commercial time-sharing service. The VAX gave them computing power at hand. It had a solid, engineering-oriented operating system (VMS), and sophisticated I/O facilities for data collection. Finally, the VAX came with a powerful and easy-to-use terminal, the VT-100. It had an impressive number of features, yet one felt that none was superfluous. It somehow managed to retain the comfortable feel of the old Teletype."(246-247)

IBM too simply carried on with its normal business: mainframes, primarily the System/370. In 1975 it built a first PC, the Model 5100, but it was barely noticed by the hobbyists and so remained unknown. In this period IBM also had to deal with lawsuits over alleged violations of the antitrust laws.

"IBM continued to develop new products. In addition to the 4300 and 3030 mainframes, IBM went after the minicomputer companies with its System/38 in 1978, following that with its AS/400 in 1988. The AS/400 was aimed more at business than engineering customers, but otherwise it was a strong competitor to the VAX. It used advanced architectural features that IBM had planned for its follow-on to the System/370 but had not implemented. As such, the AS/400 represented IBM’s most advanced technology, and it generated strong revenues for IBM into the 1990s, when its mainframe sales suffered from technological obsolescence. IBM failed to bring other products to market at this time, however, a failure that ultimately hurt the company. It is not clear how much the antitrust suit had to do with that."(250)

Other companies included Viatron, Wang (with its word-processing systems), and the Xerox Corporation with its Xerox Palo Alto Research Center (Xerox PARC).

"One of the ironies of the story of Wang is that despite its innovations, few stories written about the 1970s talk about Wang. To read the literature on these subjects, one would conclude that the Xerox Corporation was the true pioneer in distributed, user-friendly computing; that the Xerox Palo Alto Research Center, which Stewart Brand so glowingly described in his 1972 Rolling Stone article, was the place where the future of computing was invented. Why was that so?

The Xerox Corporation set up a research laboratory in the Palo Alto foothills in 1970. Its goal was to anticipate the profound changes that technology would bring to the handling of information in the business world. As a company famous for its copiers, Xerox was understandably nervous about talk of a 'paperless office.' Xerox did not know if that would in fact happen, but it hoped that its Palo Alto Research Center (PARC) would help the company prosper through the storms."(257-258)

Palo Alto was favourably located, in the middle of Silicon Valley, where all the IT developments of the moment were taking place. And because the DoD was cutting back its funding for fundamental research at just that time, director George Pake could bring in top talent for his laboratory. Ideas from Licklider, Engelbart, and others led to all kinds of pioneering applications, such as the mouse, graphical displays, Ethernet, the laser printer, and the ALTO (as early as 1973) and STAR (1981) personal computers in which all this was applied. That none of it was exploited commercially had much to do with Xerox being far ahead of its time.

"Once again, these top-down innovations from large, established firms were matched by an equally brisk pace of innovation from the bottom up — from personal computer makers."(265)

On the TRS-80, the Commodore PET, and the Apple II (with the 5¼-inch disk drive and the spreadsheet program VisiCalc). And of course the IBM PC (in 1981, with the operating system PC-DOS from Microsoft), the Apple Macintosh (in 1984), and the IBM compatibles.

"Although after the Apple II and its floppy drive were available, one could say that hardware advances no longer drove the history of computing, there were a few exceptions, and among them was the IBM Personal Computer. Its announcement in August 1981 did matter, even though it represented an incremental advance over existing technology."(268)

"In the end, Microsoft offered IBM a 16-bit operating system of its own. IBM called it PC-DOS, and Microsoft was free to market it elsewhere as MS-DOS. PC-DOS was based on 86-DOS, an operating system that Tim Paterson of Seattle Computer Products had written for the 8086 chip. Microsoft initially paid about $15,000 for the rights to use Seattle Computer Products’s work. (Microsoft later paid a larger sum of money for the complete rights.) Seattle Computer Products referred to it internally by the code name QDOS for 'Quick and Dirty Operating System'; it ended up as MS-DOS, one of the longest-lived and most-influential pieces of software ever written."(270)

"IBM found itself with an enormously successful product made up of parts designed by others, using ASCII instead of EBCDIC, and with an operating system it did not have complete rights to. It was said that if IBM’s Personal Computer division were a separate company, it would have been ranked #3 in the industry in 1984, after the rest of IBM and Digital Equipment Corporation. Within ten years there were over fifty million computers installed that were variants of the original PC architecture and ran advanced versions of MS-DOS."(272)

"The Mac’s elegant system software was its greatest accomplishment. It displayed a combination of aesthetic beauty and practical engineering that is extremely rare. One can point to specific details. When a file was opened or closed, its symbol expanded or contracted on the screen in little steps — somehow it just felt right. Ultimately this feeling is subjective, but it was one that few would disagree with. The Macintosh software was something rarely found among engineering artifacts. The system evolved as the Mac grew, and it was paid the highest compliment from Microsoft, who tried to copy it with its Windows program. One can hope that some future system will have that combination as well, but the odds are not in favor of it."(275)

"Among sophisticated customers that created a split: one group favored the elegance and sophistication of the Mac, while others preferred the raw horsepower and access to individual bits that MS-DOS allowed. For those who were not members of the computer priesthood, the Macintosh was a godsend; whatever time was lost by its relative slowness was more than compensated for by the time the user did not have to spend reading an indecipherable users manual."(276)

(281) 9 - Workstations, UNIX, and the Net, 1981–1995

From the 1980s on, many companies began to use workstations: a kind of PC, but running UNIX, with networking capabilities, and considerably more powerful. SUN (from Stanford University Network) was founded in 1982 for that purpose. Bill Joy went to work there and further developed the Berkeley variant of UNIX. That variant had grown out of the AT&T original by Thompson and Ritchie, whose source code was made available almost free of charge to non-commercial institutions such as universities.

"Bill Joy was one of many students who had tinkered with AT&T’s version of UNIX hoping to make it better. The University of California at Berkeley obtained a UNIX tape in 1974, following a visit by Ken Thompson. The system was soon running on several PDP-11s on the campus. Bill Joy also arrived on the campus that year."(283)

"Bill Joy and his fellow students at Berkeley set out to make UNIX more accessible. (...) By 1978 Joy was offering tapes of the first Berkeley Software Distribution (BSD) at a nominal cost to his friends and colleagues around the country. (...) In 1980 ARPA threw its support behind Berkeley UNIX as a common system the agency could recommend for all its clients. That UNIX was, in theory, portable to computers from manufacturers other than DEC was a main reason. Among the many enhancements added to Berkeley UNIX (in version 4.2 BSD) was support for networking by a protocol known as TCP/IP, which ARPA promoted as a way to interconnect networks. This protocol, and its bundling with Berkeley UNIX, forever linked UNIX and the Internet."(284)

Of course, many developments were happening at once in that decade. DEC's (= Digital's) VAX strategy turned out not to work: the company was not innovative enough in developing its computer architecture and failed to connect with new possibilities in hardware. The RISC processor was developed.

"A RISC architecture, UNIX, and scientific or engineering applications differentiated workstations from personal computers. Another distinction was that workstations were designed from the start to be networked, especially at the local level, for example, within a building or a division of an engineering company. That was done using Ethernet, one of the most significant of all the inventions that came from the Xerox Palo Alto Research Center. If the Internet of the 1990s became the 'Information Superhighway,' then Ethernet became the equally important network of local roads to feed it. As a descendent of ARPA research, the global networks we now call the Internet came into existence before the local Ethernet was invented at Xerox. But Ethernet transformed the nature of office and personal computing before the Internet had a significant effect. How Ethernet did that will therefore be examined first."(291)

"Metcalfe connected Xerox’s MAXC to ARPANET, but the focus at Xerox was on local networking: to connect a single-user computer (later to become the Alto) to others like it, and to a shared, high-quality printer, all within the same building. The ARPANET model, with its expensive, dedicated Interface Message Processors was not appropriate."(291)

"Metcalfe recalled that its speed, around three million bits per second, was unheard of at the time, when 'the 50-kilobit-per-second (Kbps) telephone circuits of the ARPANET were considered fast.' Those speeds fundamentally altered the relationship between small and large computers. Clusters of small computers now, finally, provided an alternative to the classic model of a large central system that was time-shared and accessed through dumb terminals."(292)

PCs were in fact not made for use in networks: they were, after all, personal computers. Neither the hardware nor the operating system was suited to it. On the other hand, the PC's application programs were so much better than much of the software on minicomputers and the like that it was almost inevitable that PCs would come into use at work as well.

"By the mid-1980s it was clear that no amount of corporate policy directives could keep the PC out of the office, especially among those employees who already had a PC at home. The solution was a technical fix: network the PCs to one another, in a local-area network (LAN). (...) Networking of PCs lagged behind the networking that UNIX workstations enjoyed from the start, but the personal computer’s lower cost and better office software drove this market."(293-294)

"Local area networks made it possible for large numbers of people to gain access to the Internet. Ethernet’s speeds were fast enough to match the high speeds of the dedicated lines that formed the Internet’s backbone. High-speed networking had always been among the features workstation companies wanted to supply — recall SUN’s marketing slogan: 'The Network is the Computer.' What had not been anticipated was how advances in personal computers, driven by ever more powerful processors from Intel, brought that capability to offices and other places outside the academic and research worlds. By the late 1980s those with UNIX workstations, and by 1995 those with personal computers on a LAN, all had access to the Internet, without each machine requiring a direct connection to the Internet’s high-speed lines. Ethernet’s high data rates thus provided a way of getting around the fact that communication speeds and data capacity had not kept up with the advances in computer processing speeds and storage."(297)

[There follows a short overview of the development of the Internet's capabilities: telnet, ftp, e-mail, newsgroups, gopher, WAIS, WWW. The history of the WWW, which did not yet exist at the time, has since been written. On Mosaic, Ceruzzi writes:]

"With the help of others at NCSA, Mosaic was rewritten to run on Windows-based machines and Macintoshes as well as workstations. As a product of a government-funded laboratory, Mosaic was made available free or for a nominal charge. As with the UNIX, history was repeating itself. But not entirely: unlike the developers of UNIX, Andreessen managed to commercialize his invention quickly. In early 1994 he was approached by Jim Clark, the founder of Silicon Graphics, who suggested that they commercialize the invention. Andreessen agreed, but apparently the University of Illinois objected to this idea. Like the University of Pennsylvania a half-century before it, Illinois saw the value of the work done on its campus, but it failed to see the much greater value of the people who did that work. Clark left Silicon Graphics, and with Andreessen founded Mosaic Communications that spring. The University of Illinois asserted its claim to the name Mosaic, so the company changed its name to Netscape Communications Corporation. Clark and Andreessen visited Champaign-Urbana and quickly hired many of the programmers who had worked on the software. Netscape introduced its version of the browser in September 1994. The University of Illinois continued to offer Mosaic, in a licensing agreement with another company, but Netscape’s software quickly supplanted Mosaic as the most popular version of the program."(303)

(307) 10 - 'Internet Time,' 1995-2001

[This chapter was added in this second edition of the book. The first edition appeared in 1998, when the WWW was just becoming popular, but its writing had already been concluded in August 1995. Three topics here: the dot-com developments / the commercialization of the Internet, the antitrust lawsuit against Microsoft, and the rise of the Open Source movement.]

[The developments around Microsoft are described first, but I do not find those legal skirmishes and the 'browser war' interesting. The situation has since changed to a large extent. But of course Ceruzzi could not have foreseen that either.]

[Nor could he foresee how quickly ADSL, cable, and fiber would speed up Internet access, and how easy it would become for people to be 'always online'. That, among other things, allowed the Internet to become a fixed and normal part of people's lives and of society, giving it a significance no pioneer could have predicted. It opened up possibilities such as buying books and watching videos over the Internet, which had previously been virtually impossible for lack of bandwidth. The massive use of the Internet naturally also created problems: information overload, information pollution, superficiality.]

"The Internet passed through challenges like the 1988 worm, viruses, the Y2K crisis, the dot.com collapse, and the terrorists' attacks of September 11, 2001, with hardly a hiccup. It is based on a robust design. As for the content and quality of information that the Internet conveys, however, it has indeed been tragic. The simple delivery of high-quality text files that the now-obsolete Gopher interface delivered has evolved into a stream of information polluted with pop-up ads, spam, and pornography. Web surfing has gotten so frustrating that one has a hard time remembering how exhilarating it once was. The collapse of the dot.coms may have been a signal that something was wrong, like the collapse of fisheries in the North Atlantic Ocean."(330)

[The third aspect is by now familiar as well: how Torvalds developed the Linux operating system and how that led to an approach now known as 'open source'.]

[Ceruzzi pointed it out himself: it is difficult to write the history of something still so close by, indeed, of something we ourselves are part of every day. The situation has by now changed on many points, and that inevitably shifts the perspective from which one judges all the earlier events.]

(345) Conclusion: The Digitization of the World Picture

"In 1948 a book appeared with the bold title The Mechanization of the World Picture. The author, a Dutch physicist named E. J. Dijksterhuis, argued that much of history was best understood as an unfolding of the 'mechanistic' way of looking at the world that actually began with the Greeks and culminated in the work of Isaac Newton.2 Dijksterhuis’s work found a willing audience of readers who had experienced the power and the horrors of a mechanized world view after six years of world war. It took a millennium and a half for a mechanistic view to take hold, but it has taken less time — about fifty years — for a view equally as revolutionary to take hold. The 'digitization of the world picture' began in the mid-1930s, with the work of a few mathematicians and engineers. By 1985 this world view had triumphed."(346)

"Each transformation of digital computing was propelled by individuals with an idealistic notion that computing, in its new form, would be a liberating force that could redress many of the imbalances brought on by the smokestack of the 'second wave,' in Alvin Toffler’s phrase. UNIVAC installations were accompanied by glowing predictions that the 'automation' they produced would lead to a reduced workweek. In the mid-1960s enthusiasts and hackers saw the PDP-10 and PDP-8 as machines that would liberate computing from the tentacles of the IBM octopus. The Apple II reflected the Utopian visions of the San Francisco Bay area in the early 1970s. And so it will be with universal access to the Internet. In each case the future has turned out to be more complex, and less revolutionary, than its proponents imagined."(347-349)

"We created the computer to serve us. The notion that it might become our master has been the stuff of science fiction for decades, but it was always hard to take those stories seriously when it took heroic efforts just to get a computer to do basic chores. As we start to accept the World Wide Web as a natural part of our daily existence, perhaps it is time to revisit the question of control. My hope is that, with an understanding of history and a dash of Thoreauvian skepticism, we can learn to use the computer rather than allowing it to use us."(350)