This section concerns contributions to the development of information science and technology at its logical (as opposed to its hardware) level: areas such as computation theory, artificial intelligence, the statistical theories of information, communication, and systems control, cryptography, operations research, computer and network architectures, and algorithm and software design. The general level of this contribution is
reflected in the current ~45% Jewish membership in the
Computer and Information Sciences division of the US
National Academy of Sciences and in the percentages of
Jewish recipients shown below for several of the most
prestigious awards in the field. Two of the four
individuals generally recognized to have been the principal
architects of the Information Age were the Jewish
mathematicians Norbert Wiener and John von Neumann.1
Some of the more notable Jewish
contributions are listed below. (The names of
non-Jewish scientists and engineers mentioned in the
accompanying discussion have been denoted with the
superscript "+" in order to avoid confusion.)
The interpretation of
thermodynamic entropy as an
information metric by Leo Szilard.
Szilard's 1929 analysis of the Maxwell's demon paradox "is
now considered to be the earliest known paper in what
became the field of 'information theory' in the 1950s and
1960s." 2 Other important information
metrics were formulated by John von Neumann, Alfréd
Rényi, Solomon Kullback, and Richard Leibler. The
von Neumann entropy, e.g., is the quantum generalization
of Szilard's classical information measure and is one of
the fundamental concepts in quantum information theory.
The
introduction of the diagonal argument proof method
by Georg Cantor*. This method is central to the
derivation of the incompleteness and noncomputability
results of Gödel+, Turing+, Church+,
and Post that lie at the foundation of theoretical
computer science. In a 1936 paper, Emil Post
described a mechanical definition of computation, known as
the Post machine, which is equivalent to the Turing
machine introduced by Alan Turing+ in a paper
that appeared several months later. Post had
understood the undecidability implications of such a
definition as early as 1921, but had hesitated to publish
and lost priority to Gödel+, who
approached the problem from a very different perspective
in his 1931 paper. Post was also one of the
four principal founders of the theory of recursive
functions, which is of immense importance in theoretical
computer science.3
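The diagonal method itself is simple enough to sketch in a few lines. The example below is purely illustrative: given any claimed enumeration of infinite binary sequences, it constructs a sequence that provably differs from the n-th listed sequence at position n, and so cannot appear anywhere in the list.

```python
# A minimal sketch of Cantor's diagonal argument over infinite binary
# sequences, each represented as a function from positions to bits.

def diagonal(enumeration):
    """enumeration(n, k) -> k-th bit of the n-th listed sequence.
    Returns a sequence that flips the diagonal bit, so it differs
    from sequence n at position n for every n."""
    return lambda k: 1 - enumeration(k, k)

# Example "enumeration" (hypothetical, for illustration): the n-th
# sequence is the binary expansion of n, padded with zeros.
def listed(n, k):
    return (n >> k) & 1

anti = diagonal(listed)
# The constructed sequence disagrees with every listed sequence.
for n in range(10):
    assert anti(n) != listed(n, n)
```

The same construction, applied to programs rather than sequences, drives the undecidability proofs of Gödel, Turing, Church, and Post mentioned above.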
The logical design of Colossus, the first all-electronic, digital, programmable computer, by Max Newman*. Although Colossus
was not a general-purpose computer and had only limited
programmability, it represented an important
milestone. Newman, a Cambridge University
professor of mathematics, headed the "Newmanry," a
special code-breaking unit at Bletchley Park in England
during World War II. In this capacity, he
formulated the logical design of a machine to mechanize
the cryptanalysis of the German Lorenz Cipher, which was
used by the Nazi high command to encrypt its highest
priority communications. The first instantiation
of this machine, called Heath Robinson, was
based on a combination of electromechanical,
electromagnetic-relay, and vacuum tube switches.
Tommy Flowers,+ who had worked on the
engineering design of Heath Robinson,
subsequently argued, and then brilliantly demonstrated,
that a much faster and more reliable version of the
machine could be realized all-electronically, using just
vacuum tubes. The resulting Colossus
machines played a critical role in securing Allied
victory in Europe and were influential in the post-war
development of computers in England.4
(Contrary to what is sometimes claimed, Alan Turing,+
who was Newman's protégé, had relatively little direct
involvement with Colossus, although his ideas
were extremely influential. Newman later declined
an OBE appointment in protest against the treatment
accorded Turing+ by the postwar British
government.)
The
design of the logical architecture employed in virtually
all modern computers by John von Neumann.
Von Neumann's 1946 paper "Preliminary Discussion of the
Logical Design of an Electronic Computing Instrument" has
been described as "the most influential paper in the
history of computer science ... the ideas it contains,
collectively known as the von Neumann machine, have provided the
foundation for essentially all computer system
developments since that date."5
The machine that von Neumann designed and had constructed
at the Institute for Advanced Study (IAS) in the late
1940s was widely replicated in the development of many
other early computer systems, including SEAC (US National
Bureau of Standards), ILLIAC (University of Illinois),
ORDVAC (Aberdeen Proving Ground), JOHNNIAC (RAND
Corporation), MANIAC (Los Alamos National Laboratory),
AVIDAC (Argonne National Laboratory), ORACLE (Oak Ridge
National Laboratory), WEIZAC (Weizmann Institute of
Science), and the IBM 701 (IBM's first mass-produced,
commercial mainframe computer).6 The IAS
computer project was intended by von Neumann primarily to
demonstrate the utility of computers in the solution of
scientific and engineering problems, an objective at which
it indeed succeeded, creating the field of modern
numerical weather prediction and solving some of the most
difficult problems in radiation hydrodynamics associated
with the design of thermonuclear weapons. Von
Neumann invented the computerized random number generator
and co-invented the Monte Carlo method. He also
invented the theory of system fault tolerance and the cellular
automata model of computation. The universal von
Neumann constructor, a generalization of the
universal Turing machine that emerged out of von Neumann's
theory of
self-reproducing automata, is one of the
foundational concepts in the theoretical study of the
biomolecular nanotechnology of living systems.
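Von Neumann's original computerized random number generator, the "middle-square" method, can be sketched in a few lines. The seed and digit count below are illustrative; the method is historically important but degenerates into short cycles and is not used in practice today.

```python
# A sketch of von Neumann's "middle-square" pseudorandom number
# generator: square the current value and take the middle digits as
# the next value.  Assumes an even n_digits.

def middle_square(seed, n_digits=4):
    """Yield a stream of n_digits-digit pseudorandom numbers."""
    x = seed
    while True:
        square = str(x * x).zfill(2 * n_digits)  # pad to 2n digits
        mid = len(square) // 2
        half = n_digits // 2
        x = int(square[mid - half: mid + half])  # keep the middle
        yield x

gen = middle_square(1234)
print([next(gen) for _ in range(5)])  # → [5227, 3215, 3362, 3030, 1809]
```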
The invention of parallel
supercomputing architectures by Stephen
Unger, Daniel Slotnick, David Schaefer, and
Włodzimierz Holsztyński. Unger, Slotnick,
Schaefer, and Holsztyński are four of the "eight men
[who] dominate the history of SIMD computer
architectures." 7 SIMD (single
instruction, multiple data) refers to the basic
parallel processing technique employed in the
earliest supercomputers.8 Unger was
the first to propose and explore such architectures
in the late 1950s. Slotnick designed SOLOMON
in the early 1960s and built the first parallel
processing prototypes. He was later the
architect of Illiac IV, the first important
parallel supercomputer, which had up to 256
processing elements. Built with 64 processing
elements in the early 1970s with ARPA (now DARPA)
funding and operated by NASA, Illiac IV remained the
world's fastest computer until its shutdown in
1981. In the late 1970s and early 1980s,
Schaefer initiated and managed the development of
NASA's Massively Parallel Processor (MPP),
the first truly massively parallel supercomputer,
with 16,384 processing elements. Holsztyński
designed the Geometric-Arithmetic Parallel
Processor (GAPP) in 1981. GAPPs with
hundreds of thousands of processing elements are
used today in real-time video image processing
applications such as image enhancement and noise
reduction, video data compression, and format and
frame rate conversion.
The
co-discovery of NP-completeness by Leonid
Levin. Levin and Stephen Cook+
independently discovered and proved what is now referred
to as the Cook-Levin theorem, the central result
concerning the P = NP?
question, which is the major open problem in theoretical
computer science. Richard Karp introduced the terms
"P" and "NP" and defined NP-completeness (although not the
term itself) in its present form. He also identified
the decision problem formulations of many well-known,
combinatorially intractable problems as being
NP-complete. Levin, Karp, and Manuel Blum are
considered to be three of the six founders of the field of
computational complexity theory.
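The defining property of NP, made precise by the Cook-Levin theorem, can be illustrated with CNF satisfiability: a proposed solution (a "certificate") can be checked in time linear in the formula's size, even though no polynomial-time method is known for finding one. A minimal sketch:

```python
# Checking a certificate for CNF-SAT, the problem proved NP-complete
# by the Cook-Levin theorem.  Each clause is a list of literals;
# literal k denotes variable |k|, negated if k < 0.

def satisfies(clauses, assignment):
    """Verify a truth assignment against a CNF formula in linear time."""
    return all(
        any((lit > 0) == assignment[abs(lit)] for lit in clause)
        for clause in clauses
    )

# (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
cnf = [[1, -2], [2, 3], [-1, -3]]
cert = {1: True, 2: True, 3: False}  # a claimed satisfying assignment
print(satisfies(cnf, cert))  # → True
```

Finding such an assignment, by contrast, is believed to require exponential time in the worst case; that gap is the P = NP? question.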
The
invention of context-free languages by Noam
Chomsky. This work was based on Emil Post's theory
of production systems in mathematical logic. It is
the basis of the BNF notation widely used to specify the
syntax rules of programming languages. Chomsky's
hierarchical classification of formal languages initiated
the field of formal language theory in computer science.
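As an illustration of how BNF specifies programming-language syntax, here is a toy grammar for arithmetic expressions together with a recursive-descent evaluator that follows it rule for rule. The grammar is illustrative, not drawn from any particular language standard.

```python
# Toy BNF grammar:
#   <expr>   ::= <term> { ("+" | "-") <term> }
#   <term>   ::= <factor> { ("*" | "/") <factor> }
#   <factor> ::= NUMBER | "(" <expr> ")"

import re

def evaluate(src):
    tokens = re.findall(r"\d+|[()+\-*/]", src)
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        return tok

    def expr():                       # <expr> ::= <term> {("+"|"-") <term>}
        value = term()
        while peek() in ("+", "-"):
            value = value + term() if eat() == "+" else value - term()
        return value

    def term():                       # <term> ::= <factor> {("*"|"/") <factor>}
        value = factor()
        while peek() in ("*", "/"):
            value = value * factor() if eat() == "*" else value / factor()
        return value

    def factor():                     # <factor> ::= NUMBER | "(" <expr> ")"
        if peek() == "(":
            eat()
            value = expr()
            eat()                     # consume the closing ")"
            return value
        return int(eat())

    return expr()

print(evaluate("2*(3+4)-5"))  # → 9
```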
The
co-invention of BASIC by John Kemeny.
Kemeny and Thomas Kurtz+ developed this popular
programming language. At least one-third of the
nine-person team that developed FORTRAN under
John Backus+ at IBM were Jewish. Also at
IBM, Adin Falkoff collaborated with Kenneth
Iverson+ on the design and
development of the array processing language APL (acronym
for "A Programming Language"). Four of the
six principal designers of COBOL, Howard Bromberg,
Norman Discount, Jean Sammet, and William Selden, were
Jewish. COBOL was used to run an estimated 80% of
the world's business systems at the turn of the
century. Although its usage has been in gradual
decline since then, it is still widely employed in
business, financial, and government systems. Ada, an
advanced programming language adopted by the US Department
of Defense as its standard high-level computer programming
language in the 1980s and 1990s, was designed by Jean
Ichbiah. LISP,
the second-oldest high-level programming language still in
use (primarily in artificial intelligence research), was
invented by John McCarthy* in 1958. Barbara Liskov
was awarded the 2008 ACM Turing Award for fundamental
advances in programming language design. The ACM
press release noted that her innovations "are now the
basis of every important programming language since 1975,
including Ada, C++, Java, and C#."
The
invention of the MINIX operating system by Andrew
Tanenbaum. MINIX was the precursor to, and
inspiration for, the widely used Linux
operating system.
The
invention of the computer spreadsheet by Dan
Bricklin and Robert Frankston. Bricklin and
Frankston's VisiCalc spreadsheet was the first "killer
app." The Lotus 1-2-3 spreadsheet program, the most
successful software product of its time, was developed by
Jonathan Sachs and Mitchell Kapor.
The invention of
the computerized word processor by Evelyn Berezin.
Berezin, who was inducted into the National Inventors
Hall of Fame in 2022, invented the Data Secretary, the
first computerized word processor, in 1971. She
is also credited with developing the first office
computer, the first computerized banking system, and
the United Airlines computerized reservation system,
the largest data processing system of its time.
Other important word processors were designed by
Charles "Nick" Corfield (Adobe FrameMaker) and Richard
Brodie* (Microsoft Word).
The
co-founding of the field of artificial intelligence (AI)
by Marvin Minsky, Herbert Simon*, and John
McCarthy*. (Allen Newell+ is also
considered to have been one of AI's four principal
founders.9) Six of the ten inductees into
the IEEE Computer Society's Intelligent Systems
Magazine AI Hall of Fame are, or were, Jewish or of
Jewish descent (Noam Chomsky, Edward Feigenbaum, John
McCarthy*, Marvin Minsky, Judea Pearl, and Lotfi
Zadeh*). Major approaches to
machine learning, the now dominant approach to
artificial intelligence, were pioneered by Jews, including
Bayesian Networks (Judea Pearl), Support Vector
Machines (Vladimir Vapnik and Alexey Chervonenkis),
Deep Learning (Frank Rosenblatt and Yoshua Bengio,
together with Geoffrey Hinton+ and Yann LeCun+),
Evolutionary Computing (Lawrence Fogel), and the Probably
Approximately Correct (PAC) Model (Leslie Valiant).10
Minsky, Simon, McCarthy, Feigenbaum, Pearl, Bengio, and
Valiant have received seven of the eleven Turing
Awards given thus far for work in artificial intelligence.
The creation of ChatGPT
by Ilya Sutskever. Sutskever, a co-founder and the chief
scientist of OpenAI, is the principal designer of the
large language model GPT-4, on which ChatGPT is based.
The extraordinary ability of this deep learning AI
language model to "understand" textual input and
generate textual output that is both syntactically and
semantically coherent has shocked even many AI
researchers.
The
development of computer algebra (symbol manipulation)
programs by Jean Sammet (FORMAC), Carl Engelman
(MATHLAB), Joel Moses (MACSYMA), and Stephen Wolfram (Mathematica).
The
invention of reversible computation theory by
Rolf Landauer. Reversible computation
circumvents the thermodynamic limits on irreversible
computation established by John von Neumann, and is one of
the foundations of quantum computing. The
ballistic architecture, or Fredkin gate, model of
reversible computation was introduced by Edward Fredkin.
The
invention of quantum computing by Paul Benioff,
Richard Feynman, Yuri Manin,* and David Deutsch.
The
invention of DNA computing by Leonard Adleman.
The
invention of fuzzy logic by Max Black and Lotfi
Zadeh* (independently).
The
invention of algorithmic complexity by Ray
Solomonoff. Also termed Kolmogorov complexity
or algorithmic information theory, Solomonoff's 1964 work
was later arrived at independently by Andrei Kolmogorov+
(1965) and Gregory Chaitin (1969).
The
invention of the Monte Carlo method by Stanislaw
Ulam and John von Neumann. This statistical
numerical method is one of the cornerstones of computer
simulation science. Von Neumann invented the
first computer-based random number generator for use in
Monte Carlo simulations. The so-called Metropolis
Monte Carlo algorithm, widely used in statistics and
computational physics, was largely devised by Marshall
Rosenbluth, based in part on ideas from Edward Teller and
John von Neumann.11
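The essence of the Monte Carlo method can be sketched with the textbook example of estimating pi from random samples; the sample count and seed below are arbitrary.

```python
# Monte Carlo estimate of pi: sample random points in the unit square
# and count those inside the quarter circle.  The error shrinks like
# 1/sqrt(N) as the number of samples N grows.

import random

def estimate_pi(n_samples, seed=0):
    rng = random.Random(seed)
    inside = sum(
        1
        for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / n_samples

print(estimate_pi(100_000))  # close to 3.1416 for large n_samples
```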
The
invention of nondeterministic algorithms by
Michael Rabin. Such algorithms employ Monte Carlo
methods to provide efficiently computable solutions that
are correct with high (but less than one hundred percent)
probability to many problems whose exact solution is
computationally intractable. Rabin's probabilistic
primality testing, e.g., is essential to the practical
implementation of RSA public-key cryptography.
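Rabin's probabilistic primality test (now usually called the Miller-Rabin test) is short enough to sketch. A "composite" verdict is always correct; a "probably prime" verdict errs with probability at most 4 to the power of minus the number of rounds.

```python
# A sketch of the Miller-Rabin probabilistic primality test.

import random

def is_probable_prime(n, rounds=20):
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # write n - 1 = 2^r * d with d odd
    r, d = 0, n - 1
    while d % 2 == 0:
        r += 1
        d //= 2
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False          # a is a witness that n is composite
    return True                   # probably prime

print(is_probable_prime(2**61 - 1))  # → True  (a Mersenne prime)
print(is_probable_prime(2**61 + 1))  # → False
```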
The
invention of the SIMPLEX linear programming algorithm
by George B. Dantzig. Linear programming (LP),
invented independently by Dantzig and Leonid
Kantorovich, is a powerful optimization technique
that is widely used in economics and
engineering. It has been estimated that,
aside from database operations such as sorting and
searching, LP consumes more computer time than any other
mathematical procedure.12 The
SIMPLEX algorithm remains LP's fundamental numerical
solution technique.
The
invention of the ellipsoid method of convex optimization
by Naum Shor and, independently, by Arkadi Nemirovski and
David Yudin. This technique, which was successfully
employed by Leonid Khachiyan+ to
prove the polynomial-time complexity of linear
programming, underlies most modern results concerning the
computational complexity of convex optimization
programs. The ellipsoid method provided the first
effective solver for semidefinite programs (which are
encountered in many engineering applications) and has led
to significant advances in combinatorial optimization.
The
invention or co-invention of five of CiSE's "Top Ten
Algorithms of the Century" by Stanislaw Ulam,
John von Neumann, Marshall Rosenbluth, Edward Teller,
George Dantzig, Leonid Kantorovich, Cornelius Lanczos, I.
J. Good, Leslie Greengard, and Vladimir Rokhlin,
Jr. The January/February 2000 issue of Computing in Science &
Engineering, a joint publication of the American
Institute of Physics and the IEEE Computer Society,
assembled a list of "the ten algorithms with the greatest
influence on the development and practice of science and
engineering in the 20th century." In addition to the
Monte Carlo method and the SIMPLEX algorithm discussed
above, the top ten algorithms included the Krylov subspace
iteration method for the solution of large systems of
linear equations (Lanczos, together with Magnus Hestenes+
and Eduard Stiefel+), the Fast Fourier
Transform (FFT) (Lanczos, together with G. C. Danielson+
in 1942, and independently by I. J. Good in 1958 and by
James Cooley+
and John Tukey+
in 1965),13 and the fast multipole algorithm
for the solution of many-body problems (Greengard and
Rokhlin).
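The idea behind the Cooley-Tukey FFT, splitting a transform of length n into two transforms of length n/2, can be sketched as follows and checked against the direct O(n^2) definition.

```python
# Radix-2 Cooley-Tukey FFT: recursively split the input into even- and
# odd-indexed halves, reducing the O(n^2) DFT to O(n log n).

import cmath

def fft(x):
    n = len(x)                    # n must be a power of two
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    tw = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + tw[k] for k in range(n // 2)]
            + [even[k] - tw[k] for k in range(n // 2)])

def dft(x):
    """Direct O(n^2) discrete Fourier transform, for comparison."""
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * j * k / n)
                for j in range(n))
            for k in range(n)]

signal = [0, 1, 2, 3, 4, 5, 6, 7]
assert all(abs(a - b) < 1e-9 for a, b in zip(fft(signal), dft(signal)))
```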
The
invention of the Wiener filter by Norbert
Wiener. The Wiener filter is an optimal filter for
extracting signals from noise in stationary stochastic
systems and is one of the central results in statistical
communication theory, a field pioneered by
Wiener. (A version of the Wiener filter was
also formulated independently by Andrei Kolmogorov+.)
The recursive extension of the Wiener filter to nonstationary systems, now known as the Kalman filter and used in tracking and guidance, was first formulated by Peter Swerling in 1959.14 Wiener and
Alexander Khinchine independently derived the
Wiener-Khinchine theorem, another central result in statistical
communication theory.
The
invention of statistical decision theory by
Abraham Wald. Among other applications,
statistical decision theory plays an important role in
radar, control, and communication. Its minimax
decision rules derive from John von Neumann's theory of
optimal strategies (theory of games).
The
invention of dynamic programming by Richard
Bellman. This procedure solves sequential, or
multi-stage, decision problems and is one of the
foundations of modern control theory. It also
constitutes the basis for many powerful algorithms,
including the backpropagation algorithm used to train
neural networks in machine learning and the Viterbi
algorithm, invented by Andrew Viterbi, that is used to
decode convolutional codes employed in error correction
and in CDMA and GSM digital cellular telephony.
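The dynamic programming principle behind the Viterbi algorithm — keep only the best-scoring path into each state at each step — can be sketched with a toy hidden Markov model. The weather/activity model below is a standard textbook illustration, not drawn from the source.

```python
# Viterbi decoding as dynamic programming: the search over
# exponentially many state sequences runs in O(T * S^2) time.

def viterbi(obs, states, start_p, trans_p, emit_p):
    # best[s] = (probability, path) of the best path ending in state s
    best = {s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}
    for o in obs[1:]:
        best = {
            s: max(
                ((p * trans_p[prev][s] * emit_p[s][o], path + [s])
                 for prev, (p, path) in best.items()),
                key=lambda t: t[0],
            )
            for s in states
        }
    return max(best.values(), key=lambda t: t[0])[1]

states = ("Rainy", "Sunny")
start = {"Rainy": 0.6, "Sunny": 0.4}
trans = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
         "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
        "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}

print(viterbi(["walk", "shop", "clean"], states, start, trans, emit))
# → ['Sunny', 'Rainy', 'Rainy']
```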
The
co-invention of public-key cryptography by Martin
Hellman. Hellman and Whitfield Diffie+
devised the Diffie-Hellman algorithm for secure key
distribution over nonsecure channels.
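Diffie-Hellman key agreement can be sketched in a few lines. The prime below is far too small for real security (deployed systems use 2048-bit or larger groups) and is purely illustrative.

```python
# Toy Diffie-Hellman: both parties derive the same shared secret,
# while only g^a and g^b cross the insecure channel.

import random

p = 0xFFFFFFFB   # a small public prime (illustrative, not secure)
g = 5            # public generator

rng = random.Random(42)
a = rng.randrange(2, p - 1)   # Alice's private exponent
b = rng.randrange(2, p - 1)   # Bob's private exponent

A = pow(g, a, p)   # Alice sends A over the channel
B = pow(g, b, p)   # Bob sends B over the channel

shared_alice = pow(B, a, p)   # Alice computes (g^b)^a
shared_bob = pow(A, b, p)     # Bob computes (g^a)^b
assert shared_alice == shared_bob   # both equal g^(a*b) mod p
```

An eavesdropper sees only p, g, A, and B; recovering the shared secret from these is the discrete logarithm (Diffie-Hellman) problem.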
The
co-invention of RSA by Adi Shamir and Leonard
Adleman. RSA (which is named for its three
co-inventors, Shamir, Adleman, and Ronald Rivest+)
is the most widely used public-key algorithm.
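A toy sketch of RSA with textbook-sized primes (real keys use primes of roughly a thousand bits or more):

```python
# Toy RSA: encryption is c = m^e mod n; decryption works because d is
# the inverse of e modulo lcm(p-1, q-1).  Requires Python 3.9+ for
# math.lcm and the three-argument pow with a negative exponent.

import math

p, q = 61, 53
n = p * q                      # public modulus (3233)
lam = math.lcm(p - 1, q - 1)   # Carmichael's lambda(n) = 780
e = 17                         # public exponent, coprime to lam
d = pow(e, -1, lam)            # private exponent (modular inverse)

m = 65                         # message, with 0 <= m < n
c = pow(m, e, n)               # encrypt with the public key (n, e)
assert pow(c, d, n) == m       # decrypt with the private key d
```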
The invention of
elliptic curve cryptography (ECC) by Neal Koblitz
and Victor S. Miller (independently). Based on
concepts rooted in algebraic geometry, ECC is widely
deployed as the leading successor to RSA. Relative
to RSA, it provides greater cryptographic strength with
much smaller cryptovariables. (It does, however,
share the same potential vulnerability to quantum
computational attack.)
The formulation of the
learning with errors (LWE) problem and encryption
system by Oded Regev. The application of
LWE, a computationally "hard" problem, has led to
revolutionary developments in cryptography. CRYSTALS-Kyber,
the winning algorithm in the US National Institute of
Standards and Technology’s six-year international
competition to find a quantum-resistant, or
"post-quantum," algorithm standard for general
encryption, is based on LWE. CRYSTALS-Dilithium,
one of the three algorithms selected to become
quantum-resistant digital signature standards, is also
based on LWE. (FALCON, one of the other two
digital signature finalists, is based on the NTRU
cryptographic system, which was invented by Joseph
Silverman, Jeffrey Hoffstein, and Jill Pipher.+)
LWE is also the basis for the recent breakthrough
in the construction of efficient, fully
homomorphic encryption (FHE) schemes by Zvika
Brakerski, Craig Gentry,+ and Vinod
Vaikuntanathan.+ FHE techniques permit
the processing of encrypted data without the need to
first decrypt it, thus permitting the secure processing
of sensitive data on non-secure computing
platforms.
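The LWE encryption idea can be sketched for a single bit: the public key is a set of noisy inner products with a secret vector, and the small noise is what makes the secret hard to recover. The parameters below are toy values, orders of magnitude too small for any real security.

```python
# A toy sketch of Regev-style LWE encryption of one bit.

import random

rng = random.Random(0)
q, n, m = 2053, 10, 40          # modulus, dimension, number of samples

s = [rng.randrange(q) for _ in range(n)]                  # secret key
A = [[rng.randrange(q) for _ in range(n)] for _ in range(m)]
e = [rng.randrange(-2, 3) for _ in range(m)]              # small errors
b = [(sum(ai * si for ai, si in zip(row, s)) + ei) % q
     for row, ei in zip(A, e)]                            # public key (A, b)

def encrypt(bit):
    """Sum a random subset of samples; shift by q/2 to encode a 1."""
    subset = [i for i in range(m) if rng.random() < 0.5]
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    """v - <u, s> is small if the bit was 0, near q/2 if it was 1."""
    d = (v - sum(uj * sj for uj, sj in zip(u, s))) % q
    return int(min(d, q - d) > q // 4)

for bit in (0, 1):
    u, v = encrypt(bit)
    assert decrypt(u, v) == bit
```

Decryption succeeds because the accumulated noise (at most 2 per sample here) stays well below q/4; choosing parameters so that this holds while recovering s remains hard is the heart of LWE-based designs such as Kyber.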
The
invention of quantum cryptography by Stephen
Wiesner. Although quantum key distribution was
invented in the mid-1980s by others, it was specifically
acknowledged to have been inspired by Wiesner's circa 1970 work that
established the basic principles underlying the use of
quantum mechanics to achieve information security.
The
development of mathematical and statistical
cryptanalysis by William Friedman.
Friedman's innovations are ranked amongst the greatest in
the history of cryptology; he supervised the
breaking of the Japanese diplomatic code PURPLE in 1940
and directed US cryptanalysis during World War
II. Other important World War II cryptologists
included Solomon Kullback, Leo Rosen, and Abraham Sinkov
in the US and Max Newman*, I.J. Good, and Leo Marks in
England. Newman and Good were instrumental in the
design of Colossus,
which was used to break the Lorenz cipher employed by the
German high command. Marks, the chief cryptologist of Britain's Special Operations Executive (SOE), revolutionized the one-time pad.
The
invention of cryptocurrency by David Chaum. In his seminal 1982 paper,
Chaum established the concept of secure digital cash,
the first cryptocurrency. Chaum's invention
anticipated by several decades the now widespread
recognition of the issue of electronic privacy in
financial and other online transactions.
The co-invention of blockchain by David Chaum and,
independently, by Stuart Haber (together with W. Scott
Stornetta+). Chaum gave the first
full technical description of a blockchain in his 1982
doctoral dissertation, but never published the concept
in a journal or conference proceeding. It therefore
went largely unnoticed. In a 1991
paper, Haber and Stornetta+ introduced
another embodiment of this concept for a shared,
tamper-proof, decentralized transaction ledger.
Their company, Surety Technologies, founded in 1994,
created the first and longest running commercial
blockchain. Blockchain technology is widely
claimed to have the potential to revolutionize
e-commerce (and more) by greatly increasing the
efficiency, confidentiality, and trust with which
online transactions are conducted and recorded.
The
invention of convolutional codes by Peter
Elias. Important decoding algorithms for these error
correction codes were invented by Barney Reiffen,
Robert Fano, and Andrew Viterbi.
The
co-invention of the Reed-Solomon error correction code
by Gustave Solomon. Reed-Solomon and
Viterbi- or Fano-decoded convolutional codes, or hybrid
concatenations of the two, are probably the most widely
used error correction techniques at present.
The
invention of the LZ data compression algorithm by
Jacob Ziv and Abraham Lempel. Although LZ coding was
not the first data compression technique (the first such
technique having been invented, independently, by Robert
Fano and Claude Shannon+), it is today the most
widely used in commercial systems. It underpins PDF,
GIF, TIFF, ZIP, and other widely used file formats.
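The Lempel-Ziv idea can be sketched with the LZ78 variant: parse the input into phrases, each a previously seen phrase extended by one character, and emit (phrase index, character) pairs. The LZW variant used in GIF refines this scheme.

```python
# A sketch of LZ78 compression and decompression.

def lz78_compress(text):
    dictionary, output, phrase = {"": 0}, [], ""
    for ch in text:
        if phrase + ch in dictionary:
            phrase += ch                       # keep extending the match
        else:
            output.append((dictionary[phrase], ch))
            dictionary[phrase + ch] = len(dictionary)
            phrase = ""
    if phrase:                                 # flush a trailing match
        output.append((dictionary[phrase[:-1]], phrase[-1]))
    return output

def lz78_decompress(pairs):
    phrases = [""]
    for index, ch in pairs:
        phrases.append(phrases[index] + ch)    # rebuild the dictionary
    return "".join(phrases[1:])

data = "abababababa"
packed = lz78_compress(data)
assert lz78_decompress(packed) == data
```

Repetitive input compresses well because long repeated runs collapse into a few dictionary references.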
The
development of automated, electronically switched
telephone networks by Amos Joel. Joel
received both the 1989 Kyoto Prize ("Japan's Nobel Prize")
and the 1993 US National Medal of Technology for
work that revolutionized telephone switching systems
worldwide. Joel's 1972 US Patent No. 3,663,762,
"Mobile Communication System," is the basis of the switching technology that made cellular telephone
networks possible.
The
co-invention of spread spectrum communications by
Hedy Lamarr. Lamarr (the Hollywood actress) and
George Antheil+ (a Hollywood
composer) received US Patent No. 2,292,387, "Secret
Communication System," in 1942 for the invention of
frequency-hopped spread spectrum. The digital form
of spread spectrum that is widely used in cellular
communications (CDMA) was developed by Qualcomm, a company
founded by the information theorists Irwin Jacobs and
Andrew Viterbi. Jacobs received the US National
Medal of Technology in 1994 and Viterbi received the US
National Medal of Science in 2007. Both were
recognized for their pioneering innovations in digital
wireless communications. Joel Engel also received
the Medal of Technology in 1994 as one of the two "fathers
of the cellular phone" for his work on the development of the
basic network architecture used worldwide in cellular
telephony. (The cell phone itself, as
opposed to cellular switching fabrics, etc., was invented
by Martin Cooper. Precursor technologies, such as
the "walkie-talkie," the pager, the cordless telephone,
and CB radio, were largely pioneered by Irving "Al"
Gross.)
The
co-invention of the Internet by Leonard
Kleinrock, Paul Baran, Vinton Cerf,* and Robert
Kahn. Together with Kleinrock, Baran, Cerf, and
Kahn, Donald Davies+ and Lawrence
Roberts+ are the six individuals
most frequently cited as principal inventors of the
Internet. Kleinrock, Cerf, Kahn, and Roberts+ were awarded the US National Academy of Engineering's
half-million dollar Draper Prize in 2001
"for the development of the Internet." Baran, Kleinrock,
Davies+, and Roberts+
received the first IEEE Internet Award in 2000 for "their
early, preeminent contributions in conceiving, analyzing
and demonstrating packet-switching networks, the
foundation technology of the Internet." Cerf, Kahn,
and Baran received US
National Medals of Technology, the former two in 1997 and
the latter in 2007. Kleinrock was awarded the US
National Medal of Science in 2007. Cerf and
Kahn co-invented the TCP/IP protocol for
integration of heterogeneous networks, which is the basis
of the Internet's "inter-networking" architecture.
They shared the 2004 ACM Turing Award for this
work, and in 2005 each received the US
Presidential Medal of Freedom.
The
invention of Alohanet (precursor to Ethernet) by
Norman Abramson. Alohanet was a packet-switched
research network that solved the major problem of packet
interference, or "packet collision." Alohanet was
further developed by Robert Metcalfe,+
working at the Xerox Palo Alto Research Center, into
Ethernet (which Metcalfe+
originally called the Alto Aloha network), the standard
method used in local area computer networking. Radia
Perlman's spanning tree protocol, which solved the problem
of broadcast storms due to network switching loops, was
the critical enabler that allowed Ethernet to realize high
levels of robust network complexity.
The
invention of Google by Sergey Brin and Larry
Page*. Google, the most powerful and widely used search engine on the Internet, ranks results with an adaptation of the citation frequency "impact factor" metric originally invented in the 1950s by Eugene Garfield to rank the relative influence of scientific researchers, articles, and journals. A
search algorithm very similar to Google PageRank, called
HITS (Hypertext Induced Topic Search), was devised almost
simultaneously by Jon Kleinberg at IBM. In his
papers, Kleinberg credited the 1976 mathematical work of
Gabriel Pinski and Francis Narin, as does the PageRank
patent. Pinski and Narin had shown how to formulate
and compute Garfield's relative influence in terms of a
graph theoretic matrix eigenvalue problem. Similar
mathematical techniques for calculating the relative
influence of individuals in social networks or of
production sectors in national economies can be found,
respectively, in the 1953 work of statistician Leo Katz
and in the 1941 work on input-output analysis by the Nobel
Prize winning economist Wassily Leontief.*
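The eigenvector formulation of relative influence described above can be sketched as a power iteration over a toy four-page web; the link graph and damping factor below are illustrative, and every page is assumed to have at least one outlink.

```python
# PageRank-style influence ranking by power iteration: repeatedly
# redistribute each page's score along its outlinks (with a damping
# factor), converging to the dominant eigenvector of the link matrix.

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                new[target] += damping * rank[page] / len(outlinks)
        rank = new
    return rank

web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
ranks = pagerank(web)
# C accumulates the most influence: every other page links to it.
assert ranks["C"] == max(ranks.values())
```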
NOTES
1. According to the prominent historian of technology George Dyson, "there were four essential prophets whose mathematics brought us into the Information Age: Norbert Wiener, John von Neumann, Alan Turing and Claude Shannon." See "The Elegance of Ones and Zeroes," by George Dyson in The Wall Street Journal, 21 July 2017.
2. See Genius in
the Shadows: A Biography of Leo Szilard, by
William Lanouette (Scribner's, New York, 1992, p. 63).
3. See "Emil Post
and His Anticipation of Gödel and Turing," by John
Stillwell in Mathematics
Magazine (Mathematical Association of America,
Washington, DC, Vol. 77, No. 1, Feb. 2004, pp.
3-14). See also http://www-gap.dcs.st-and.ac.uk/~history/Mathematicians/Post.html.
4. See "Max Newman:
Mathematician, Codebreaker and Computer Pioneer," by
William Newman in Colossus:
The First Electronic Computer, edited by Jack
Copeland (Oxford, Oxford and New York, 2004).
5. Encyclopedia of Computer Science (Fourth
Edition), edited by Anthony Ralston, Edwin D. Reilly, and
David Hemmendinger (Wiley, Chichester, England, 2003, p.
1841).
6. See Turing's Cathedral: The Origins of the Digital
Universe, by George Dyson (Pantheon/Random House,
New York, 2012, p. 287).
7. Parallel Supercomputing in SIMD Architectures,
by R. Michael Hord (CRC Press, Boca Raton, FL, 1990).
8. Although most supercomputers are now based on MIMD
(multiple instruction, multiple data) architectures, their
individual processing nodes generally embody small-scale
SIMD capabilities. The still largely hypothetical
quantum computer can be thought of as an SIMD machine with
exponentially many virtual processors.
9.
See AI: The
Tumultuous History of the Search for Artificial
Intelligence, by Daniel Crevier (Basic Books, New
York, 1993, p. 26), or Encyclopedia of Computer Science (Fourth
Edition), edited by Anthony Ralston, Edwin D. Reilly, and
David Hemmendinger (Wiley, Chichester, England, 2003, p.
91).
10. The 2019 IEEE Xplore article
“Who Is the Father of Deep Learning?” states: “We conclude
that Frank Rosenblatt developed and explored all the basic
ingredients of the deep learning systems of today, and that
he should be recognized as a Father of Deep Learning,
perhaps together with Hinton, LeCun and Bengio
who have just received the Turing Award as the fathers of
the deep learning revolution.” Lawrence Fogel
initiated the field of evolutionary computing in 1960 and is
considered to be the “father of evolutionary programming,”
the first of four basic approaches that were eventually
formulated and subsequently merged into the field of
evolutionary computing.
11. See "Marshall
Rosenbluth and the Metropolis algorithm," by J. E.
Gubernatis, in Physics of Plasmas (12,
057303, 2005). According to this article, Nicholas
Metropolis'+ only contribution to the algorithm's
development was making available the use of MANIAC, the Los
Alamos computer he had constructed as a replica of the
computer that von Neumann designed and built at the
Institute for Advanced Study. Of the other co-authors
of the Metropolis algorithm paper, "arguably the most
significant publication in the history of computational
physics," Mici Teller initiated the assembly language
programming work to code the algorithm, Arianna Rosenbluth+
took that over and produced from scratch the actual program
used to test the algorithm, Edward Teller made "the crucial suggestion" to employ ensemble rather than temporal averaging (and indicated how to do so), and Marshall Rosenbluth
actually designed the algorithm, incorporating insights of
his own. According to Gubernatis, the "key," as opposed to
the "crucial," idea underpinning the algorithm's power was
the principle of detailed balance, which was implicit in the
original 1953 paper, but not made explicit until Rosenbluth
formulated a general proof of the algorithm's validity in
1956. Rosenbluth went on to become one of the world's
leading plasma theorists and a winner of both the Enrico
Fermi Award (1985) and the US National Medal of Science
(1997).
12. See http://www-gap.dcs.st-and.ac.uk/~history/Mathematicians/Dantzig_George.html.
13. Cooley and Tukey are
generally credited with invention of the "modern" FFT.
Their 1965 paper only referenced the prior work
of I. J. Good, whose FFT algorithm was both somewhat
different and less efficient. In a January 1992
paper in IEEE SP Magazine, entitled "How the FFT
Gained Acceptance," Cooley reviewed other prior work and
concluded that "it appears that Lanczos had the FFT
algorithm" in 1942. He holds out the possibility
that Gauss may have had it as early as 1805, however.
Gauss's work, written in Latin and in somewhat archaic notation that made it difficult to decipher, was published only posthumously, in his collected works in 1866. An analysis by
Michael Heideman, Don Johnson, and C. Sidney Burrus has
concluded that Gauss did indeed have the basic elements
of the modern FFT in 1805, but due to its obscurity, his
formulation appears to have had no influence on
subsequent work.
14. See the next-to-last
paragraphs in https://archive.siam.org/news/news.php?id=526
and in the obituary published in the November
2000 issue of Physics
Today
(pp. 75-76). See also the discussion in
the Appendix to Tracking
and Kalman Filtering Made Easy, by Eli Brookner
(Wiley, New York, 1998, pp. 383-387).
*
Georg Cantor and Herbert
Simon had Jewish fathers;
Simon's mother was of partial Jewish descent, which was also
the case, at a minimum, for the mother of Georg
Cantor. Max Newman
and Vinton Cerf had Jewish fathers and non-Jewish mothers,
while Richard Brodie, Wassily Leontief, Yuri Manin, John
McCarthy, Larry Page, and Lotfi
Zadeh have, or had, Jewish mothers. For more
information, see the footnotes to these and other listings
in Jewish Computer
and Information Scientists, or in
the cases of Leontief and Manin, in Jewish Economists and Jewish Mathematicians,
respectively. +
Non-Jewish.