Mathematical analysis

Figure: A strange attractor arising from a differential equation. Differential equations are an important area of mathematical analysis with many applications to science and engineering.

Mathematical analysis is the branch of mathematics dealing with limits and related theories, such as differentiation, integration, measure, infinite series, and analytic functions.[1][2]

These theories are usually studied in the context of real and complex numbers and functions. Analysis evolved from calculus, which involves the elementary concepts and techniques of analysis. Analysis may be distinguished from geometry; however, it can be applied to any space of mathematical objects that has a definition of nearness (a topological space) or specific distances between objects (a metric space).

History

Archimedes used the method of exhaustion to compute the area inside a circle by finding the area of regular polygons with more and more sides. This was an early but informal example of a limit, one of the most basic concepts in mathematical analysis.

Mathematical analysis formally developed in the 17th century during the Scientific Revolution,[3] but many of its ideas can be traced back to earlier mathematicians. Early results in analysis were implicitly present in the early days of ancient Greek mathematics. For instance, an infinite geometric sum is implicit in Zeno's paradox of the dichotomy.[4] Later, Greek mathematicians such as Eudoxus and Archimedes made more explicit, but informal, use of the concepts of limits and convergence when they used the method of exhaustion to compute the area and volume of regions and solids.[5] The explicit use of infinitesimals appears in Archimedes' The Method of Mechanical Theorems, a work rediscovered in the 20th century.[6] In Asia, the Chinese mathematician Liu Hui used the method of exhaustion in the 3rd century AD to find the area of a circle.[7] Zu Chongzhi established a method that would later be called Cavalieri's principle to find the volume of a sphere in the 5th century.[8] The Indian mathematician Bhāskara II gave examples of the derivative and used what is now known as Rolle's theorem in the 12th century.[9]

In the 14th century, Madhava of Sangamagrama developed infinite series expansions, like the power series and the Taylor series, of functions such as sine, cosine, tangent and arctangent.[10] Alongside his development of the Taylor series of the trigonometric functions, he also estimated the magnitude of the error terms created by truncating these series and gave a rational approximation of an infinite series. His followers at the Kerala School of Astronomy and Mathematics further expanded his works, up to the 16th century.

The modern foundations of mathematical analysis were established in 17th century Europe.[3] Descartes and Fermat independently developed analytic geometry, and a few decades later Newton and Leibniz independently developed infinitesimal calculus, which grew, with the stimulus of applied work that continued through the 18th century, into analysis topics such as the calculus of variations, ordinary and partial differential equations, Fourier analysis, and generating functions. During this period, calculus techniques were applied to approximate discrete problems by continuous ones.

In the 18th century, Euler introduced the notion of mathematical function.[11] Real analysis began to emerge as an independent subject when Bernard Bolzano introduced the modern definition of continuity in 1816,[12] but Bolzano's work did not become widely known until the 1870s. In 1821, Cauchy began to put calculus on a firm logical foundation by rejecting the principle of the generality of algebra widely used in earlier work, particularly by Euler. Instead, Cauchy formulated calculus in terms of geometric ideas and infinitesimals. Thus, his definition of continuity required an infinitesimal change in x to correspond to an infinitesimal change in y. He also introduced the concept of the Cauchy sequence and started the formal theory of complex analysis. Poisson, Liouville, Fourier and others studied partial differential equations and harmonic analysis. The contributions of these mathematicians and others, such as Weierstrass, led to the (ε, δ)-definition of limit, thus founding the modern field of mathematical analysis.

In the middle of the 19th century, Riemann introduced his theory of integration. The last third of the century saw the arithmetization of analysis by Weierstrass, who thought that geometric reasoning was inherently misleading, and introduced the "epsilon-delta" definition of limit. Then, mathematicians started worrying that they were assuming the existence of a continuum of real numbers without proof. Dedekind then constructed the real numbers by Dedekind cuts, in which irrational numbers are formally defined and serve to fill the "gaps" between rational numbers, thereby creating a complete set: the continuum of real numbers, which had already been developed by Simon Stevin in terms of decimal expansions. Around that time, the attempts to refine the theorems of Riemann integration led to the study of the "size" of the set of discontinuities of real functions.

Also, "monsters" (nowhere continuous functions, continuous but nowhere differentiable functions, space-filling curves) began to be investigated. In this context, Jordan developed his theory of measure, Cantor developed what is now called naive set theory, and Baire proved the Baire category theorem. In the early 20th century, calculus was formalized using an axiomatic set theory. Lebesgue solved the problem of measure, and Hilbert introduced Hilbert spaces to solve integral equations. The idea of normed vector space was in the air, and in the 1920s Banach created functional analysis.

Important concepts

Metric spaces

In mathematics, a metric space is a set where a notion of distance (called a metric) between elements of the set is defined.

Much of analysis happens in some metric space; the most commonly used are the real line, the complex plane, Euclidean space, other vector spaces, and the integers. Examples of analysis without a metric include measure theory (which describes size rather than distance) and functional analysis (which studies topological vector spaces that need not have any sense of distance).

Formally, a metric space is an ordered pair (M, d) where M is a set and d is a metric on M, i.e., a function

d : M × M → ℝ

such that for any x, y, z ∈ M, the following holds:

  1. d(x, y) = 0 if and only if x = y    (identity of indiscernibles),
  2. d(x, y) = d(y, x)    (symmetry), and
  3. d(x, z) ≤ d(x, y) + d(y, z)    (triangle inequality).

By taking the third property and letting z = x, it can be shown that d(x, y) ≥ 0    (non-negativity).
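
For illustration only, the following short Python sketch defines the ordinary Euclidean metric on the plane and spot-checks the three axioms on a handful of sample points (the name euclidean_metric and the chosen points are just for this example):

```python
import itertools
import math

def euclidean_metric(x, y):
    """Euclidean distance d(x, y) between points of R^2 given as tuples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

points = [(0.0, 0.0), (3.0, 4.0), (-1.0, 2.5)]
d = euclidean_metric

for x, y, z in itertools.product(points, repeat=3):
    # identity of indiscernibles: d(x, y) = 0 exactly when x = y
    assert (d(x, y) == 0) == (x == y)
    # symmetry: d(x, y) = d(y, x)
    assert d(x, y) == d(y, x)
    # triangle inequality, with a tiny tolerance for floating-point rounding
    assert d(x, z) <= d(x, y) + d(y, z) + 1e-12

print("All three metric axioms hold on the sample points.")
```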

Sequences and limits

A sequence is an ordered list. Like a set, it contains members (also called elements, or terms). Unlike a set, order matters, and exactly the same elements can appear multiple times at different positions in the sequence. Most precisely, a sequence can be defined as a function whose domain is a countable totally ordered set, such as the natural numbers.

One of the most important properties of a sequence is convergence. Informally, a sequence converges if it has a limit. Continuing informally, a (singly-infinite) sequence has a limit if it approaches some point x, called the limit, as n becomes very large. That is, for an abstract sequence (aₙ) (with n running from 1 to infinity understood) the distance between aₙ and x approaches 0 as n → ∞, denoted

lim (n → ∞) aₙ = x.
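
As a concrete sketch (an illustration, not a definition), the sequence aₙ = 1/n converges to 0: for any tolerance ε there is an index N beyond which every term lies within ε of the limit. A few lines of Python make this explicit (the tolerances below are arbitrary):

```python
def a(n):
    """The sequence a_n = 1/n, which converges to the limit 0."""
    return 1.0 / n

limit = 0.0

for eps in (0.1, 0.01, 0.001):
    # Find the first index N with |a_N - limit| < eps; since 1/n is
    # decreasing, the same bound then holds for every n >= N.
    N = 1
    while abs(a(N) - limit) >= eps:
        N += 1
    print(f"eps = {eps}: |a_n - 0| < eps for all n >= {N}")
```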

Main branches

Real analysis

Real analysis (traditionally, the theory of functions of a real variable) is a branch of mathematical analysis dealing with the real numbers and real-valued functions of a real variable.[13][14] In particular, it deals with the analytic properties of real functions and sequences, including convergence and limits of sequences of real numbers, the calculus of the real numbers, and continuity, smoothness and related properties of real-valued functions.

Complex analysis

Complex analysis, traditionally known as the theory of functions of a complex variable, is the branch of mathematical analysis that investigates functions of complex numbers.[15] It is useful in many branches of mathematics, including algebraic geometry, number theory, and applied mathematics, as well as in physics, including hydrodynamics, thermodynamics, mechanical engineering, electrical engineering, and, particularly, quantum field theory.

Complex analysis is particularly concerned with the analytic functions of complex variables (or, more generally, meromorphic functions). Because the separate real and imaginary parts of any analytic function must satisfy Laplace's equation, complex analysis is widely applicable to two-dimensional problems in physics.
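
For example, the analytic function f(z) = z² has real part u(x, y) = x² − y² and imaginary part v(x, y) = 2xy, and both satisfy Laplace's equation. A short symbolic check, written here as a sketch using the SymPy library:

```python
import sympy as sp

x, y = sp.symbols("x y", real=True)
z = x + sp.I * y

# The analytic function f(z) = z^2, expanded so its real and imaginary
# parts can be read off directly.
f = sp.expand(z**2)
u = sp.re(f)  # x**2 - y**2
v = sp.im(f)  # 2*x*y

# Laplace's equation: u_xx + u_yy = 0 and v_xx + v_yy = 0.
print(sp.simplify(sp.diff(u, x, 2) + sp.diff(u, y, 2)))  # 0
print(sp.simplify(sp.diff(v, x, 2) + sp.diff(v, y, 2)))  # 0
```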

Functional analysis

Functional analysis is a branch of mathematical analysis, the core of which is formed by the study of vector spaces endowed with some kind of limit-related structure (e.g. inner product, norm, topology, etc.) and the linear operators acting upon these spaces and respecting these structures in a suitable sense.[16][17] The historical roots of functional analysis lie in the study of spaces of functions and the formulation of properties of transformations of functions such as the Fourier transform as transformations defining continuous, unitary etc. operators between function spaces. This point of view turned out to be particularly useful for the study of differential and integral equations.
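
As a small finite-dimensional illustration (a sketch, not drawn from the sources above), a matrix can be viewed as a bounded linear operator between normed spaces, and its operator norm bounds how much it can stretch any vector:

```python
import numpy as np

# A 2x2 matrix viewed as a linear operator A : R^2 -> R^2.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Operator norm induced by the Euclidean norm (the largest singular value).
op_norm = np.linalg.norm(A, ord=2)

# The defining bound ||A x|| <= ||A|| * ||x|| holds for every vector x.
rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.normal(size=2)
    assert np.linalg.norm(A @ x) <= op_norm * np.linalg.norm(x) + 1e-9

print("Operator norm:", op_norm)
```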

Differential equations

A differential equation is a mathematical equation for an unknown function of one or several variables that relates the values of the function itself and its derivatives of various orders.[18][19][20] Differential equations play a prominent role in engineering, physics, economics, biology, and other disciplines.

Differential equations arise in many areas of science and technology, specifically whenever a deterministic relation involving some continuously varying quantities (modeled by functions) and their rates of change in space or time (expressed as derivatives) is known or postulated. This is illustrated in classical mechanics, where the motion of a body is described by its position and velocity as the time value varies. Newton's laws allow one (given the position, velocity, acceleration and various forces acting on the body) to express these variables dynamically as a differential equation for the unknown position of the body as a function of time. In some cases, this differential equation (called an equation of motion) may be solved explicitly.
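
A minimal numerical sketch (the explicit Euler method is chosen here purely for illustration): for a unit mass on a unit spring, Newton's second law gives the equation of motion x''(t) = −x(t), which can be integrated step by step from an initial position and velocity:

```python
import math

# Integrate x''(t) = -x(t) with the explicit Euler method,
# starting from x(0) = 1 and v(0) = 0.
dt = 0.001
x, v = 1.0, 0.0

for _ in range(int(1.0 / dt)):  # advance from t = 0 to t = 1
    a = -x                      # acceleration from the equation of motion
    x, v = x + v * dt, v + a * dt

print(x, math.cos(1.0))  # the numerical value is close to the exact solution cos(1)
```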

Measure theory

A measure on a set is a systematic way to assign a number to each suitable subset of that set, intuitively interpreted as its size.[21] In this sense, a measure is a generalization of the concepts of length, area, and volume. A particularly important example is the Lebesgue measure on a Euclidean space, which assigns the conventional length, area, and volume of Euclidean geometry to suitable subsets of the n-dimensional Euclidean space ℝⁿ. For instance, the Lebesgue measure of the interval [0, 1] in the real numbers is its length in the everyday sense of the word – specifically, 1.

Technically, a measure is a function that assigns a non-negative real number or +∞ to (certain) subsets of a set X. It must assign 0 to the empty set and be (countably) additive: the measure of a 'large' subset that can be decomposed into a finite (or countable) number of 'smaller' disjoint subsets is the sum of the measures of the "smaller" subsets. In general, if one wants to associate a consistent size to each subset of a given set while satisfying the other axioms of a measure, one only finds trivial examples like the counting measure. This problem was resolved by defining measure only on a sub-collection of all subsets; the so-called measurable subsets, which are required to form a σ-algebra. This means that countable unions, countable intersections and complements of measurable subsets are measurable. Non-measurable sets in a Euclidean space, on which the Lebesgue measure cannot be defined consistently, are necessarily complicated in the sense of being badly mixed up with their complement. Indeed, their existence is a non-trivial consequence of the axiom of choice.
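
A toy illustration of additivity (a sketch only; it uses finitely many intervals rather than a general countable decomposition): the Lebesgue measure of a disjoint union of intervals is simply the sum of their lengths.

```python
def interval_length(interval):
    """Lebesgue measure of a single interval (a, b), i.e. its length."""
    a, b = interval
    return max(0.0, b - a)

# Pairwise disjoint intervals making up a 'large' measurable set.
pieces = [(0.0, 0.5), (1.0, 1.25), (2.0, 3.0)]

# Additivity: the measure of the disjoint union is the sum of the pieces.
total = sum(interval_length(p) for p in pieces)
print(total)  # 0.5 + 0.25 + 1.0 = 1.75
```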

Numerical analysis

Numerical analysis is the study of algorithms that use numerical approximation (as opposed to general symbolic manipulations) for the problems of mathematical analysis (as distinguished from discrete mathematics).[22]

Modern numerical analysis does not seek exact answers, because exact answers are often impossible to obtain in practice. Instead, much of numerical analysis is concerned with obtaining approximate solutions while maintaining reasonable bounds on errors.
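
A standard example of this trade-off (sketched here for illustration; the function and step count are arbitrary) is the composite trapezoidal rule, whose error is controlled by the bound |error| ≤ (b − a)³ · max|f''| / (12 n²):

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule with n subintervals."""
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

# Approximate the integral of sin(x) over [0, pi]; the exact value is 2.
a, b, n = 0.0, math.pi, 100
approx = trapezoid(math.sin, a, b, n)

# Error bound (b - a)^3 * max|f''| / (12 n^2), with |f''| = |sin| <= 1 here.
bound = (b - a) ** 3 / (12 * n ** 2)

print(approx, abs(approx - 2.0), bound)  # the actual error stays below the bound
```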

Numerical analysis naturally finds applications in all fields of engineering and the physical sciences, but in the 21st century, the life sciences and even the arts have adopted elements of scientific computations. Ordinary differential equations appear in celestial mechanics (planets, stars and galaxies); numerical linear algebra is important for data analysis; stochastic differential equations and Markov chains are essential in simulating living cells for medicine and biology.

Other topics

Applications

Techniques from analysis are also found in other areas such as:

Physical sciences

The vast majority of classical mechanics, relativity, and quantum mechanics is based on applied analysis, and differential equations in particular. Examples of important differential equations include Newton's second law, the Schrödinger equation, and the Einstein field equations.

Functional analysis is also a major factor in quantum mechanics.

Signal processing

When processing signals, such as audio, radio waves, light waves, seismic waves, and even images, Fourier analysis can isolate individual components of a compound waveform, concentrating them for easier detection or removal. A large family of signal processing techniques consist of Fourier-transforming a signal, manipulating the Fourier-transformed data in a simple way, and reversing the transformation.[23]
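
A minimal NumPy sketch of this pattern of transforming, modifying, and inverting (the sample rate, tone frequency, and cutoff are arbitrary choices for illustration): a noisy 5 Hz tone is Fourier-transformed, its components above 20 Hz are zeroed, and the inverse transform yields a smoothed signal.

```python
import numpy as np

# A 5 Hz tone sampled at 1000 Hz for one second, contaminated with noise.
fs = 1000
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * rng.normal(size=t.size)

# Transform, zero every component above 20 Hz, then invert.
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
spectrum[freqs > 20] = 0
filtered = np.fft.irfft(spectrum, n=signal.size)

print(filtered[:5])  # a smoothed version of the original 5 Hz tone
```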

Other areas of mathematics

Techniques from analysis are used in many other areas of mathematics.


Notes

  1. ^ Edwin Hewitt and Karl Stromberg, "Real and Abstract Analysis", Springer-Verlag, 1965
  2. ^ Stillwell, John Colin. "analysis | mathematics". Encyclopædia Britannica. Retrieved 2015-07-31.
  3. ^ a b Jahnke, Hans Niels (2003). A History of Analysis. American Mathematical Society. p. 7. ISBN 978-0-8218-2623-2.
  4. ^ Stillwell (2004). "Infinite Series". Mathematics and its History (2nd ed.). Springer Science + Business Media Inc. p. 170. ISBN 978-0-387-95336-6. Infinite series were present in Greek mathematics, [...] There is no question that Zeno's paradox of the dichotomy (Section 4.1), for example, concerns the decomposition of the number 1 into the infinite series 1/2 + 1/2² + 1/2³ + 1/2⁴ + ... and that Archimedes found the area of the parabolic segment (Section 4.4) essentially by summing the infinite series 1 + 1/4 + 1/4² + 1/4³ + ... = 4/3. Both these examples are special cases of the result we express as summation of a geometric series
  5. ^ Smith 1958.
  6. ^ Pinto, J. Sousa (2004). Infinitesimal Methods of Mathematical Analysis. Horwood Publishing. p. 8. ISBN 978-1-898563-99-0.
  7. ^ Dun, Liu; Fan, Dainian; Cohen, Robert Sonné (1966). A comparison of Archimedes' and Liu Hui's studies of circles. Chinese studies in the history and philosophy of science and technology. 130. Springer. p. 279. ISBN 978-0-7923-3463-7.
  8. ^ Zill, Dennis G.; Wright, Scott; Wright, Warren S. (2009). Calculus: Early Transcendentals (3 ed.). Jones & Bartlett Learning. p. xxvii. ISBN 978-0-7637-5995-7.
  9. ^ Seal, Sir Brajendranath (1915), "The positive sciences of the ancient Hindus", Nature, 97 (2426): 177, Bibcode:1916Natur..97..177., doi:10.1038/097177a0
  10. ^ Rajagopal, C.T.; Rangachari, M.S. (June 1978). "On an untapped source of medieval Keralese Mathematics". Archive for History of Exact Sciences. 18 (2): 89–102. doi:10.1007/BF00348142 (inactive 2019-01-06).
  11. ^ Dunham, William (1999). Euler: The Master of Us All. The Mathematical Association of America. p. 17.
  12. ^ Cooke, Roger (1997). "Beyond the Calculus". The History of Mathematics: A Brief Course. Wiley-Interscience. p. 379. ISBN 978-0-471-18082-1. Real analysis began its growth as an independent subject with the introduction of the modern definition of continuity in 1816 by the Czech mathematician Bernard Bolzano (1781–1848)
  13. ^ Rudin, Walter. Principles of Mathematical Analysis. Walter Rudin Student Series in Advanced Mathematics (3rd ed.). McGraw–Hill. ISBN 978-0-07-054235-8.
  14. ^ Abbott, Stephen (2001). Understanding Analysis. Undergraduate Texts in Mathematics. New York: Springer-Verlag. ISBN 978-0-387-95060-0.
  15. ^ Ahlfors, L. (1979). Complex Analysis (3rd ed.). New York: McGraw-Hill. ISBN 978-0-07-000657-7.
  16. ^ Rudin, Walter (1991). Functional Analysis. McGraw-Hill Science. ISBN 978-0-07-054236-5.
  17. ^ Conway, J. B. (1994). A Course in Functional Analysis (2nd ed.). Springer-Verlag. ISBN 978-0-387-97245-9.
  18. ^ Ince, Edward L. (1956). Ordinary Differential Equations. Dover Publications. ISBN 978-0-486-60349-0.
  19. ^ Witold Hurewicz, Lectures on Ordinary Differential Equations, Dover Publications, ISBN 0-486-49510-8
  20. ^ Evans, L.C. (1998), Partial Differential Equations, Providence: American Mathematical Society, ISBN 978-0-8218-0772-9
  21. ^ Tao, Terence (2011). An Introduction to Measure Theory. American Mathematical Society. ISBN 978-0-8218-6919-2.
  22. ^ Hildebrand, F.B. (1974). Introduction to Numerical Analysis (2nd ed.). McGraw-Hill. ISBN 978-0-07-028761-7.
  23. ^ Rabiner, L.R.; Gold, B. (1975). Theory and Application of Digital Signal Processing. Englewood Cliffs, NJ: Prentice-Hall. ISBN 978-0-13-914101-0.
