Chapter 10 of a university course in media history by Prof. Bill Kovarik, based on the book Revolutions in Communication: Media History from Gutenberg to the Digital Age (Bloomsbury, 2nd ed., 2015).
1. Media History from Gutenberg to the Digital Age
Slides based on the Bloomsbury book by Bill Kovarik,
Revolutions in Communication
Chapter 10 – Computers and the Digital Dawn
2. Web site & textbook
Textbook: 1st edition – 2011; 2nd edition – 2016
http://www.revolutionsincommunication.com
4. Longitude required precise math
• Charles Babbage (1791–1871), a mathematician working at the Greenwich Observatory in 1821
• Found shocking errors in astronomical tables being used to fix the position of ships at sea
• Wondered if a mechanical device could be developed to avoid errors
5. Babbage difference engine
Doron Swade, a curator at the Science Museum of London, built the difference engine based on Charles Babbage’s original plans.
6. Ada Lovelace (1815–1852)
“Enchantress of numbers”
• Mathematician who helped develop the first algorithm for one of Babbage’s designs
• Daughter of the poet Lord Byron
• Remembered as the first computer programmer
• The programming language Ada was named in her honor
7. Herman Hollerith (1860–1929)
• Mathematician working for the US Census
• Used punch cards and mechanical sorters
• Created the Tabulating Machine Co., 1896
• Became International Business Machines (IBM) in 1924
8. Tabulating machines, 1910s–30s
Ann Oliver of the US Census demonstrates an early Hollerith tabulating machine in this 1940 picture. The machine was fed cards at the rate of 400 a minute, and 12 bits of statistical information were extracted from each card.
IBM tabulating machinery was sold around the world by the 1930s. The most controversial use involved IBM help for Nazi census work that identified Jews and organized the Holocaust in the 1930s and 40s.
9. Z3, Germany, 1941
• Programmable digital computer
• 2,000 relays, 5–10 Hz
• Used to calculate aircraft designs
• Inventor: Konrad Zuse
• Destroyed in bombing raids, 1943
Zuse Z3 replica on display at the Deutsches Museum in Munich, Germany
12. Alan Turing (1912–1954)
Mathematician, computer scientist, code breaker, philosopher
The ‘Turing Test’ involves a computer’s ability to simulate human intelligence.
Personal life: A half-eaten apple found near his body may have been the method of suicide, and the Apple Computers logo is rumored to be an homage. Turing’s conviction under Britain’s anti-homosexuality laws was pardoned posthumously in 2013.
14. ENIAC, USA, 1946
Electronic Numerical Integrator and Computer (ENIAC)
• Built at U. Pennsylvania
• 17,468 radio tubes
• 30 tons in weight
• 100 feet long, 8 feet high
• 100,000 calculations per second
• Processor with 20 register positions
• 2 hours = the work of 100 human mathematicians working one year
15. “Instead of you holding a computer, this computer held you…” – Harry Reed
17. Grace Hopper (1906–1992)
• US Navy admiral
• Worked on the Harvard Mark I–IV
• Developed COBOL (common-business-oriented language)
• Popularized “debugging” for fixing computer glitches
18. Tube, transistor, chip
• Triode tube, radio telephony, 1907 – allows continuous wave / Reginald Fessenden
• Transistor: Bell Labs, 1947 – lighter, cheaper, far more durable / John Bardeen, Walter Brattain, Wm. Shockley
• Chip: 1958 – billions of transistors printed onto an “integrated circuit” / Jack Kilby, Robert Noyce
20. Vannevar Bush (1890–1974)
Top science advisor in the US during WWII
In a 1945 article, Bush predicted the development of a personal computer that he called a “Memex.”
Post-war hope for democratic influences on science and technology:
“In a free country, in a democracy, this [path taken] is the path that public opinion wishes to have pursued, whether it leads to new cures for man’s ills, or new sources of a raised standard of living . . . In a dictatorship the path is the one that is dictated, whether the dictator be an individual or part of a self-perpetuating group.”
21. Univac and the election of 1952
By 1952, the ENIAC team had spun off into a private company, and its Univac computer was being sold by Remington Rand (later Sperry Rand).
Its public debut was on election night, 1952, on CBS television. By following key early precincts, the Univac was able to project a landslide win for Dwight Eisenhower.
CBS and Univac did not report the projection on election night, instead claiming to have “computer problems.”
Walter Cronkite of CBS News demonstrates the “computer of the future”
22. The “chip”
• Early computers used vacuum tubes
• By the 1960s they were using transistors – much faster, much smaller
• The breakthrough was the integrated circuit transistor, aka the chip
• Developed by Robert Noyce & Jack Kilby (Fairchild Semiconductor and Texas Instruments)
23. IBM 360 (1965–1978)
• Used transistors, not vacuum tubes
• Scalable 8K–8M memory; valuable for banks, insurance, science
• Standard commercial unit: the Model T of computers
IBM business culture = strict conformity. Precise haircuts, dark suits, narrow black ties and a strong sense of company loyalty made the IBM world, for some, a harrowing glimpse into the totalitarian future of computing.
For others, the problem was that the old-fashioned IBM culture simply failed to glimpse the possibilities of the technology it dominated.
26. Artists didn’t trust computers
George Orwell’s 1984 (published 1949) described a totalitarian state where inconvenient history is erased and where people are constantly monitored.
Stanley Kubrick’s 2001: A Space Odyssey (1968 film) depicts a computer that takes over a spaceship and kills most of the crew.
William Gibson’s Neuromancer (1984 book) described a nightmarish world of totalitarian control over human–computer interfaces.
27. But why not?
Was it IBM culture, Cold War paranoia, government spying?
And how did “liberated” computing, open software and global networks replace Kubrick’s and Orwell’s nightmares?
How was global communication socially constructed? How were nightmares avoided and visions enabled?
28. Alternative visions
Early computer engineers had a far more positive vision. They saw an enabling technology that could help reorganize the way people used information.
Their vision spread slowly among:
◦ nerd engineers, 1960s
◦ hipster hardware developers, 1970s
◦ cyberpunk programmers, 1990s
◦ global techies, 21st century
29. A lesson for our times
So the story of computers and digital media is the story of creative cultures patching together an improbable system. It’s the story of innovation over hierarchy.
It’s also the story of how the world’s largest businesses missed their moments (the curve in the road)…
It’s a story of small companies, guided by a larger vision, creating a vehicle for a new kind of global media.
30. Doug Engelbart (1925–2013)
Led a team at SRI in Palo Alto, California, that envisioned the modern PC with:
• Mouse and Graphical User Interface (GUI)
• Hypertext (linking)
• Networked computers for file sharing & email
• “Mother of All Demos,” Dec. 9, 1968
31. Ted Nelson
His 1974 book demanded computer power for the people, urged computer literacy, and called for a fight against mainframes & centralized information systems.
32. Homebrew Computer Club
• 1975–1986, Palo Alto, CA
• Encouraged experiments
• Apple first demonstrated here
Photo shows Apple 1 (wooden frame), Sol-20 (blue), and Altair (bottom), from the Information Age exhibit at the Smithsonian’s National Museum of American History.
33. Xerox PARC ‘Alto’, 1974
• First working personal computer, built at the Xerox Palo Alto Research Center (PARC) in 1974
• Used ideas from SRI and added new ones, such as networking and object-oriented programming
• Thousands of prototypes manufactured
• Most of its ideas ended up in the Apple Mac – Xerox missed the “curve in the road”
34. Apple Computer – 1978
Steve Wozniak (left) and Steve Jobs. The “killer app” was a spreadsheet to help with accounting and small business finances.
35. IBM personal computer, 1981
• IBM was trying to catch up to the very successful Apple II
• The killer app was also a spreadsheet for business
• IBM asked Bill Gates and Microsoft to develop the operating system
• The contract allowed Gates to sell the software to other computer makers
• “Clone” computers were far cheaper than the IBM
36. Osborne, 1982 – 4 MHz CPU
Used by news operations and others who had to travel and write.
For comparison, the Apple iPhone has a 412 MHz CPU.
38. What Apple missed
Networked computers
Online exchanges
Object oriented programming
39. Killer Apps
(You’d buy a computer just to be able to use it for something…)
• WWII, Cold War: code-breakers, artillery tables, rocket scientists
• 1960s (IBM): insurance tables, bank reconciliations
• 1970s: small business spreadsheets
• 1980s (Mac): desktop publishing
• 1990s: email, presentations, web
• 2000s: media convergence
40. Missing the curves in the road
IBM was not prepared for the PC
◦ Allowed Microsoft to develop an operating system for “clone” computers
Xerox didn’t use its own research
◦ The Palo Alto Research Center lost staff and big ideas to Apple
AT&T engineers were also not prepared for data networks and turned down DARPA management offers (Ch. 11)
44. Scientists from RAND Corporation have created this
model to illustrate how a "home computer" could look
in the year 2004. However the needed technology will
not be economically feasible for the average home.
Also the scientists readily admit that the computer will
require not yet invented technology to actually work,
but 50 years from now scientific progress is expected
to solve these problems. With teletype interface and
the Fortran language, the computer will be easy to
use.
46. Review: people
Charles Babbage, Ada Lovelace, Herman Hollerith, Grace Hopper, Vannevar Bush, William Shockley, Gordon Moore, Robert Noyce, J.C.R. Licklider, Doug Engelbart, Steve Jobs, Steve Wozniak, Bill Gates
47. Review: Concepts
Tube, transistor, chip
Z3, Colossus, ENIAC, IBM 360, Apple Mac
Interactive computing versus number crunching, personal computers, the ‘mouse’, Moore’s law, Xerox PARC, killer apps, the ‘curve in the road’
The computer age traces back to the surprisingly early date of 1821, when Charles Babbage, a mathematician working at the Greenwich Observatory near London, found an error in a set of astronomical tables used by navigators to fix the position of ships at sea. Babbage was alarmed, but then he found another error. And then dozens more.
Naturally, the errors bothered Babbage because lives depended on the accuracy of these tables. Errors could take a ship miles off course and lead to a wreck on an unseen reef. In the process of correcting the tables, it occurred to Babbage that a mechanical device might avoid the errors entirely.
He began to envision an analytical device that could handle all kinds of problems, not just astronomical calculations. Soon the idea took shape as a vision of a “difference engine”: the world’s first mechanical computer that could accept instructions, process calculations, store information and print answers with unerring accuracy.
Babbage spent much of his life working on the project, which was blueprinted and partially built, but never entirely completed. He claimed that in its final design of 1849, the difference engine could solve complex equations with answers of up to 31 decimal places.
The historical mystery was whether it could have actually worked and whether it could have been built in his time. Some, like MIT’s Vannevar Bush, said yes, it would. “His idea was sound enough, but construction and maintenance costs were then too heavy,” Bush said. Others thought not.
Then, in the late 1980s, London Science Museum curator Doron Swade built the Babbage machine, taking care to use only machining techniques available in Babbage’s time. The finished engine worked, settling the question in Babbage’s favor.
Another visionary worked with Babbage on what is today called computer “software.” She was Augusta Ada King, Countess of Lovelace (1815–1852), the daughter of the famed British poet Lord George Byron and his wife Anne Byron. She worked on a translation of an Italian mathematician’s memoir on Babbage’s analytical engine, published in 1843, and added her own notes, longer than the translation itself, in which she proposed a method for calculating a sequence of Bernoulli numbers. At the time, the achievement was seen as remarkable but not terribly significant. Today it is considered the first computer program.
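Lovelace’s famous note laid out, step by step, how Babbage’s machine could generate Bernoulli numbers. As a rough modern illustration (this is not her notation or her register-based method), the same sequence can be computed from the standard recurrence, in which each Bernoulli number is determined by all the earlier ones:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n (using the B_1 = -1/2
    convention), via the recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0,
    i.e. B_m = -(1/(m+1)) * sum_{j=0}^{m-1} C(m+1, j) * B_j."""
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(Fraction(-s, m + 1))
    return B

# bernoulli(4) yields 1, -1/2, 1/6, 0, -1/30
```

Exact rational arithmetic (`Fraction`) is used because the Bernoulli numbers are fractions, not decimals; floating point would accumulate exactly the kind of table errors Babbage was trying to eliminate.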
The search for a solution to another kind of problem led to a different approach to computers. As the US Census Bureau approached the national census of 1890, a young mathematician named Herman Hollerith was trying to make it easier to search through massive amounts of data for specific facts, such as how fast manufacturing was growing, or how many Irish now lived in New York, or the birth rate in Chicago.
Hollerith tried punching holes in rolls of paper, like a player piano, but what seemed to work best were individual cards, patterned after cards used as textile guides. Census operators could read 8,000 cards a day.
Hollerith went on to improve mechanical computers and create the Tabulating Machine Company in 1896, which became International Business Machines (IBM) in 1924.
The company focused on large-scale projects for government and business, including census, Social Security, military, insurance and banking operations. IBM’s electro-mechanical tabulators from this period were extremely useful in keeping track of many numbers, but they were not programmable computers. In the 1940 US Census, for example, IBM tabulators – or Hollerith machines -- could process cards containing census information at the rate of 400 a minute and from these, 12 separate bits of statistical information could be extracted and summarized.
The world’s first programmable digital computer was probably the Z3, developed by an engineer named Konrad Zuse in Berlin in 1941 and built from telephone relays rather than vacuum tubes. It was used for aircraft engineering problems, like calculating the stresses on airplane wings. Although it was much faster than mechanical tabulating machines, it was never applied to the wider array of problems that mathematicians were just beginning to envision, since it was destroyed by Allied bombing in 1943.
Meanwhile, in a mansion 50 miles northwest of London, a family of computers with a very different mission was under construction by a group of social misfits and theoretical mathematicians. "I told you to leave no stone unturned to get staff,” said Winston Churchill as he inspected Bletchley Park with its director early in 1941, “but I had no idea you had taken me so literally."
The computer, in its various incarnations, was called the Colossus and the mission was the “Ultra” secret deciphering of coded German military messages. The mathematicians were recruited from Oxford, Cambridge and puzzle contests run in the Daily Telegraph. At its peak, 9,000 women and men worked on signals intelligence and computer development at Bletchley Park.
Among the Bletchley Park geniuses was Alan Turing (1912–1954), a British mathematician and cryptographer who led the most difficult German naval code-breaking effort. He became a key figure in designing computing systems on mathematical principles, formalizing the concepts of “algorithm” and “computation,” for example. He also proposed a way of evaluating the generality of a computing machine; a machine that can carry out any possible computation is now called “Turing complete.”
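To see what “formalizing computation” means in practice, here is a minimal sketch (in modern Python, obviously not Turing’s 1936 notation) of the kind of abstract machine Turing described: a table of rules that reads and writes symbols on a tape, one cell at a time. The function and rule names are illustrative:

```python
def run_turing_machine(rules, tape, state="start", pos=0, max_steps=1000):
    """Run a Turing machine. `rules` maps (state, symbol) ->
    (symbol_to_write, move 'L'/'R', next_state); '_' is the blank
    symbol, and the machine stops when it enters state 'halt'."""
    cells = dict(enumerate(tape))  # sparse tape; unwritten cells are blank
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, "_")
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Example rule table: flip every bit of a binary string, halt at the blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
# run_turing_machine(flip, "1011") returns "0100"
```

Despite its simplicity, a rule table of this form can express any algorithm; that universality is what the term “Turing complete” refers to.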
Developed by IBM and shipped to Harvard University in 1944, the Mark I electromechanical computer was used for calculations supporting the atom bomb project and other military activities. The computer went through several iterations, and by 1952, used all electronic components.
Harry Reed, who worked with the ballistics lab in the early 1950s, thought of the ENIAC as a very personal computer. “Now we think of a personal computer as one which you carry around with you. The ENIAC was one that you kind of lived inside . . . You could wander around inside it and watch the program being executed . . . So instead of you holding a computer, this computer held you.”
Before this time, mathematicians working on ballistics tables were themselves called “computers” because their job was to compute the tables, in the same way that a person might be a baker because their job was to bake bread. After the ENIAC, the role of a computer was mechanized in an industrial device—just as Charles Babbage intended a century beforehand.
Since radio tubes were acting as nothing more than electronic switches, considerable research went into the idea of creating smaller and more efficient switches. This could be done if the switches were built from a “semiconductor” material such as silicon or germanium. John Bardeen and Walter Brattain, working in William Shockley’s group at Bell Labs, invented the semiconductor transistor as a way to miniaturize the radio tube in 1947. This made computers, radios and other electronic devices much smaller and easier to manage.
Photo of (from left) John Bardeen, William Shockley and Walter Brattain, the inventors of the transistor, 1948. This is one of a series of publicity photos produced by Bell Labs around the time of the public announcement of the invention (June 30, 1948). Although Shockley was not directly involved in the invention and was never listed on the patent applications, Bell Labs decided that he must appear in all publicity photos along with Bardeen and Brattain.
In 1948, the point-contact transistor was independently invented by German physicists Herbert Mataré and Heinrich Welker while working at the Compagnie des Freins et Signaux, a Westinghouse subsidiary located in Paris.
One of the most important figures in the long-term development of the computer was President Franklin Roosevelt’s science advisor, Vannevar Bush. In the years after World War II, Bush outlined the relationship between science and the democratic system and suggested new goals for scientists who had been freed from wartime research needs. It was important to think about the problem, he said. While the discussions about politics and crops at the corner store might not explicitly involve democracy and its relationship to science and technology, those elements are constantly in the background. “They determine our destiny, and well we know it,” Bush said. He expressed hope that the need for democratic influences on science and technology could be recognized.
In 1952, CBS News worked with Univac computer manufacturer Remington Rand (later Sperry Rand) to help project the winner of the presidential election. Based on previous election returns and voting patterns, the Univac helped programmers project that Dwight Eisenhower would win 438 electoral votes and his opponent, Adlai Stevenson, would win 93. The official count would turn out to be 442 and 89 electoral votes. Although Eisenhower seemed likely to win at the time, the landslide was unexpected. In fact, the data seemed so out of line with other predictions that the CBS network news team and the computer programmers held the projection back, claiming that they had some kind of computer malfunction.
In the late 1950s, engineers realized that there was no need for transistors to be separately soldered onto circuit boards. The idea of the integrated circuit, or “chip,” occurred independently to Jack S. Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor at around the same time. In 1958, Kilby used a small germanium slab and interconnecting wires to create an integrated chip; he was awarded the Nobel Prize in physics in 2000. The following year, Noyce developed a set of circuits and transistors that could be printed onto each layer of a silicon chip. The ability to print circuits and transistors allowed mass production. Like Gutenberg’s original printing system, the printing of computer transistor systems reduced costs and helped spark a communications revolution.
https://www.youtube.com/watch?v=8c0_Lzb1CJw
Accelerating technology and accounting were not the first things that came to mind when people thought about computers in the 1960s and 70s. Instead, there was a sense of unease that accompanied the increasing power of computers. They became associated with totalitarian governments.
Ironically, Arthur C. Clarke — the author of the novel 2001: A Space Odyssey — did not personally think that computers were dangerous. In a 1974 interview Clarke predicted that within 25 years, people would be using small consoles in their homes to talk with their “friendly local computers” to take care of banking, theater tickets and other information needs. Although there might be problems, he said, “computers will enrich our society.”
Realistic or not, fear of computers led to a social reaction that influenced a generation of programmers and engineers to attempt to push the technology into directions that were less about control and more about individual empowerment. Computers had to be liberated from the IBM way of life, according to publisher Stewart Brand and programmer Ted Nelson.
On December 9, 1968, the previously unknown Engelbart launched into history with a 100-minute multimedia presentation of the mouse, the graphical user interface (GUI) and the possibilities for networking. Most impressive was Engelbart’s implementation of Bush’s Memex machine, showing how a user could select a word in one document and bring up a second relevant document instantly—a forerunner of the hypertext link. The 1,000 computer engineers assembled at the conference gave Engelbart a standing ovation. Today the presentation is known as the “Mother of All Demos.”
The idea that computers could liberate and not tyrannize was shared across several innovative centers that were coming up at the time. One was located in Cambridge, MA and associated with the Massachusetts Institute of Technology (MIT). Another culture emerged in the 1970s around Stanford University in Palo Alto, California—a region also known as “Silicon Valley.”
The excitement surrounding the work at MIT, SRI and PARC spilled over into Stanford University in Palo Alto, California. One of the informal groups that sprang up was the eclectic “Homebrew Computer Club,” a learning club that met weekly to hear lectures and share ideas about computing at Stanford. Steve Wozniak and Steve Jobs were among the computer enthusiasts who attended these early meetings.
Particularly instrumental in organizing the pathbreaking presentation was William K. English, who left SRI and moved, in 1971, to Xerox PARC, taking a dozen researchers with him, to develop other new advanced applications including the “what you see is what you get” (wysiwyg) interface, the desktop, Ethernet and object oriented programming. PARC put the concepts together into the first working personal computer—the Alto—developed in 1974. Thousands were manufactured and used in corporate offices and universities, but never sold commercially.
Founders—Steve Wozniak (left) and Steve Jobs, around 1978, examine an Apple I computer. The design was elegant and the product flew off the shelves, helping to start the personal computer revolution. (Photo by Joe Melina, courtesy of the Computer History Museum)
Gordon Moore of Intel observed in 1965 that the number of transistors on a chip was doubling at a regular pace, and the trend has held steady, albeit slowing a little by 2013. Essentially, computer power doubles every 18–24 months.
Ray Kurzweil’s backwards extrapolation of Moore’s law.
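The doubling rule quoted above can be turned into a back-of-envelope calculation. The function below is a simple sketch (the name, the starting count and the default interval are illustrative; the 18–24 month range is the figure given in the text):

```python
def moores_law(start_count, years, doubling_months=24):
    """Project a transistor count forward, assuming it doubles
    every `doubling_months` months (the text's 18-24 month range)."""
    doublings = years * 12 / doubling_months
    return start_count * 2 ** doublings

# Over a decade at a 24-month doubling, a chip's count grows 32-fold:
# e.g. a hypothetical 2,300-transistor chip -> 2,300 * 32 = 73,600.
```

Run with an 18-month interval instead, the same decade yields about a 100-fold increase, which shows how sensitive long-range extrapolations like Kurzweil’s are to the assumed doubling period.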
Many of the Apple – Microsoft incompatibilities were ironed out in the 1980s and 1990s. By the 21st century, both Apple and Microsoft used Intel chips and could shuffle application files back and forth seamlessly.
According to historian John Tosh, there are three components of analysis when examining an historical document: textual authentication; validity of factual information; and weighing alternative interpretations.
Textual authentication—The photo and the text are part of the same image, as would be typical of a wire service photo sent to news organizations. However, checking the Associated Press photo transmission guide, we see that many elements are missing, including: date, location, photographer’s name, photo agency, and meta data about the transmission itself and intended publication date range. There are also grammar issues—“how a home computer could look like . . . ”
Validity of factual information—It’s possible that scientists from RAND Corp. were working on computers of the future, since as we learn from a simple Google Search, the RAND Corp. was founded in 1945 as a policy and technology institute. But we don’t know that they anticipated, or even could have anticipated, a home computer. Most of the work on computers had been taking place at MIT and the University of Pennsylvania with the Univac system. Predictions about home computers were much more typical of the late 1960s and 1970s.
Alternative interpretations—Rather than a news agency photo of a home computer being demonstrated at the RAND Corp., could this photo and caption be something else? The visual information seems suspect. For instance, why would a computer have a steering wheel? This leads us to ask whether this could be a control panel for something other than a computer.