Richard E. Bellman

From Wikipedia, the free encyclopedia
Richard Ernest Bellman[1]
Born: August 26, 1920
Died: March 19, 1984 (aged 63)
Alma mater: Princeton University; Johns Hopkins University; University of Wisconsin; Brooklyn College
Known for: Dynamic programming; Bellman equation; Hamilton–Jacobi–Bellman equation; Curse of dimensionality; Bellman–Ford algorithm
Awards: John von Neumann Theory Prize (1976); IEEE Medal of Honor (1979); Richard E. Bellman Control Heritage Award (1984)
Fields: Mathematics and control theory
Institutions: University of Southern California; RAND Corporation
Thesis: On the Boundedness of Solutions of Non-Linear Differential and Difference Equations[2]
Doctoral advisor: Solomon Lefschetz[2]
Doctoral students: Christine Shoemaker[2]

Richard Ernest Bellman[3] (August 26, 1920 – March 19, 1984) was an American applied mathematician who introduced dynamic programming in 1953 and made important contributions to other fields of mathematics.

Biography

Bellman was born in 1920 in New York City to non-practising[4] Jewish parents of Polish and Russian descent, Pearl (née Saffian) and John James Bellman,[5] who ran a small grocery store on Bergen Street near Prospect Park, Brooklyn.[6] He attended Abraham Lincoln High School, Brooklyn, in 1937,[5] and studied mathematics at Brooklyn College, where he earned a BA in 1941. He later earned an MA from the University of Wisconsin. During World War II he worked for a Theoretical Physics Division group at Los Alamos. In 1946 he received his Ph.D. at Princeton under the supervision of Solomon Lefschetz.[7] Beginning in 1949, Bellman worked for many years at the RAND Corporation, and it was during this time that he developed dynamic programming.[8]

Later in life, Richard Bellman's interests began to emphasize biology and medicine, which he identified as "the frontiers of contemporary science". In 1967, he became founding editor of the journal Mathematical Biosciences, which specialized in publishing applied mathematics research on medical and biological topics. In 1985, the Bellman Prize in Mathematical Biosciences was created in his honor; it is awarded every two years to the journal's best research paper.

Bellman was diagnosed with a brain tumor in 1973, which was removed but resulted in complications that left him severely disabled. He was a professor at the University of Southern California, a Fellow in the American Academy of Arts and Sciences (1975),[9] a member of the National Academy of Engineering (1977),[10] and a member of the National Academy of Sciences (1983).

He was awarded the IEEE Medal of Honor in 1979, "for contributions to decision processes and control system theory, particularly the creation and application of dynamic programming".[11] His key work is the Bellman equation.

Work

Bellman equation

A Bellman equation, also known as a dynamic programming equation, is a necessary condition for optimality associated with the mathematical optimization method known as dynamic programming. Almost any problem which can be solved using optimal control theory can also be solved by analyzing the appropriate Bellman equation. The Bellman equation was first applied to engineering control theory and to other topics in applied mathematics, and subsequently became an important tool in economic theory.[12]
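In discrete time, a representative form of the equation (in standard textbook notation, not Bellman's original symbols) is:

```latex
V(x) \;=\; \max_{a \,\in\, \Gamma(x)} \Bigl\{ F(x,a) \;+\; \beta\, V\bigl(T(x,a)\bigr) \Bigr\}
```

Here $x$ is the current state, $a$ an action chosen from the feasible set $\Gamma(x)$, $F(x,a)$ the one-period payoff, $T(x,a)$ the next state, $\beta \in (0,1)$ a discount factor, and $V$ the value function. The equation expresses Bellman's principle of optimality: the value of a state equals the best immediate payoff plus the discounted value of the state it leads to.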

Hamilton–Jacobi–Bellman equation

The Hamilton–Jacobi–Bellman equation (HJB) is a partial differential equation which is central to optimal control theory. The solution of the HJB equation is the 'value function', which gives the optimal cost-to-go for a given dynamical system with an associated cost function. Classical variational problems, for example the brachistochrone problem, can be solved using this method as well. The equation is a result of the theory of dynamic programming, which was pioneered in the 1950s by Richard Bellman and coworkers. The corresponding discrete-time equation is usually referred to as the Bellman equation. In continuous time, the result can be seen as an extension of earlier work in classical physics on the Hamilton–Jacobi equation by William Rowan Hamilton and Carl Gustav Jacob Jacobi.[13]
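For a cost-minimization problem with dynamics $\dot{x} = f(x,u)$, running cost $C(x,u)$, and terminal cost $D(x)$ at horizon $T$, a standard form of the HJB equation (one common textbook formulation; notation varies across sources) is:

```latex
\frac{\partial V(x,t)}{\partial t}
  \;+\; \min_{u} \Bigl\{ \nabla_x V(x,t) \cdot f(x,u) \;+\; C(x,u) \Bigr\} \;=\; 0,
\qquad V(x,T) \;=\; D(x)
```

Solving this PDE backwards in time from the terminal condition yields the value function $V$, and the minimizing control $u$ at each $(x,t)$ gives the optimal feedback policy.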

Curse of dimensionality

The curse of dimensionality is an expression coined by Bellman to describe the problem caused by the exponential increase in volume associated with adding extra dimensions to a (mathematical) space. One implication of the curse of dimensionality is that some methods for numerical solution of the Bellman equation require vastly more computer time when there are more state variables in the value function. For example, 100 evenly spaced sample points suffice to sample a unit interval with no more than 0.01 distance between points; an equivalent sampling of a 10-dimensional unit hypercube with a lattice with a spacing of 0.01 between adjacent points would require 10^20 sample points: thus, in some sense, the 10-dimensional hypercube can be said to be a factor of 10^18 "larger" than the unit interval. (Adapted from an example by R. E. Bellman, see below.)[14]
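The arithmetic in the example above can be sketched directly; `grid_points` is a hypothetical helper name, not anything from Bellman's book:

```python
def grid_points(dim, spacing=0.01):
    """Number of points in a regular lattice sampling the unit hypercube
    [0, 1]^dim with the given spacing between adjacent points along each axis.
    Grows exponentially in dim: (1/spacing)^dim."""
    per_axis = int(round(1 / spacing))  # 100 points per axis at spacing 0.01
    return per_axis ** dim

print(grid_points(1))   # 100 points for the unit interval
print(grid_points(10))  # 10**20 points for the 10-dimensional hypercube
```

The ratio `grid_points(10) // grid_points(1)` is 10^18, the factor by which the hypercube is "larger" in the sense of the text.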

Bellman–Ford algorithm

Though he discovered the algorithm after Ford, Bellman's name is attached to the Bellman–Ford algorithm, also sometimes referred to as the Label Correcting Algorithm, which computes single-source shortest paths in a weighted digraph where some of the edge weights may be negative. Dijkstra's algorithm solves the same problem with a lower running time, but requires edge weights to be non-negative.
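A minimal sketch of the algorithm in Python (variable names are illustrative, not from the original papers):

```python
def bellman_ford(n, edges, source):
    """Single-source shortest paths in a weighted digraph with n vertices.
    edges is a list of (u, v, w) triples; w may be negative.
    Returns a list of distances from source, or raises ValueError if a
    negative-weight cycle is reachable."""
    INF = float("inf")
    dist = [INF] * n
    dist[source] = 0
    # Relax every edge n-1 times: after pass k, all shortest paths that
    # use at most k edges have been found.
    for _ in range(n - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # One extra pass: any further improvement implies a negative cycle.
    for u, v, w in edges:
        if dist[u] + w < dist[v]:
            raise ValueError("graph contains a negative-weight cycle")
    return dist
```

For example, `bellman_ford(3, [(0, 1, 4), (0, 2, 5), (1, 2, -3)], 0)` returns `[0, 4, 1]`: the negative edge makes the two-hop route to vertex 2 cheaper than the direct one, which Dijkstra's algorithm would not handle.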

Publications

Over the course of his career he published 619 papers and 39 books. During the last 11 years of his life he published over 100 papers despite suffering from crippling complications of brain surgery (Dreyfus, 2003). A selection:[5]

  • 1957. Dynamic Programming
  • 1959. Asymptotic Behavior of Solutions of Differential Equations
  • 1961. An Introduction to Inequalities
  • 1961. Adaptive Control Processes: A Guided Tour
  • 1962. Applied Dynamic Programming
  • 1967. Introduction to the Mathematical Theory of Control Processes
  • 1970. Algorithms, Graphs and Computers
  • 1972. Dynamic Programming and Partial Differential Equations
  • 1982. Mathematical Aspects of Scheduling and Applications
  • 1983. Mathematical Methods in Medicine
  • 1984. Partial Differential Equations
  • 1984. Eye of the Hurricane: An Autobiography, World Scientific Publishing.
  • 1985. Artificial Intelligence
  • 1995. Modern Elementary Differential Equations
  • 1997. Introduction to Matrix Analysis
  • 2003. Dynamic Programming
  • 2003. Perturbation Techniques in Mathematics, Engineering and Physics
  • 2003. Stability Theory of Differential Equations (originally publ. 1953)[15]

References

  1. ^ Richard E. Bellman was elected in 1977 as a member of National Academy of Engineering for contributions to control theory and multistage decision procedures, including the techniques of dynamic programming.
  2. ^ a b c Richard E. Bellman at the Mathematics Genealogy Project
  3. ^ Richard Bellman's Biography
  4. ^ Robert S. Roth, ed. (1986). The Bellman Continuum: A Collection of the Works of Richard E. Bellman. World Scientific. p. 4. ISBN 9789971500900. He was raised by his father to be a religious skeptic. He was taken to a different church every week to observe different ceremonies. He was struck by the contrast between the ideals of various religions and the history of cruelty and hypocrisy done in God's name. He was well aware of the intellectual giants who believed in God, but if asked, he would say that each person had to make their own choice. Statements such as "By the State of New York and God ..." struck him as ludicrous. From his childhood he recalled a particularly unpleasant scene between his parents just before they sent him to the store. He ran down the street saying over and over again, "I wish there was a God, I wish there was a God."
  5. ^ a b c Salvador Sanabria. Richard Bellman profile at http://www-math.cudenver.edu; retrieved October 3, 2008.
  6. ^ Bellman biodata at history.mcs.st-andrews.ac.uk; retrieved August 10, 2013.
  7. ^ Mathematics Genealogy Project
  8. ^ Bellman, R. (1953). An Introduction to the Theory of Dynamic Programming. RAND Corp. Report. (Based on unpublished research from 1949; it contained the first statement of the principle of optimality.)
  9. ^ "Book of Members, 1780–2010: Chapter B" (PDF). American Academy of Arts and Sciences. Retrieved April 6, 2011.
  10. ^ "NAE Members Directory – Dr. Richard Bellman profile". NAE. Retrieved April 6, 2011.
  11. ^ "IEEE Medal of Honor Recipients" (PDF). IEEE. Retrieved April 6, 2011.
  12. ^ Ljungqvist, Lars; Sargent, Thomas J. (2012). Recursive Macroeconomic Theory (3rd ed.). MIT Press. ISBN 978-0-262-31202-8.
  13. ^ Kamien, Morton I.; Schwartz, Nancy L. (1991). Dynamic Optimization: The Calculus of Variations and Optimal Control in Economics and Management (2nd ed.). Amsterdam: Elsevier. pp. 259–263.
  14. ^ Richard Bellman (1961). Adaptive control processes: a guided tour. Princeton University Press.
  15. ^ Haas, F. (1954). "Review: Stability theory of differential equations, by R. Bellman". Bull. Amer. Math. Soc. 60 (4): 400–401. doi:10.1090/s0002-9904-1954-09830-0.
