The Algebraization of Analysis
Overview
While the principal concepts of the calculus were developed in the seventeenth century, a sound mathematical formulation of the subject would have to wait until the nineteenth. Critics of the early formulations pointed to the rather casual way in which infinitesimal and infinite quantities were defined. Lagrange attempted to eliminate the need for infinitesimals from the differential calculus by requiring that the difference between the values of a function at two values of its argument be expressed as a power series. The theory of power series would eventually play an important role in the solution of differential equations and in the machine calculation of special functions.
Background
In the seventeenth century, mathematicians had arrived at the notion of a function as a well-defined combination of mathematical operations that allows one quantity to be obtained from another. Thus f(x) = 2 sin(x) would denote the function that replaces a number x, called the argument, with twice its trigonometric sine. Calculus is concerned with the changes that occur in the value of a function as the argument is changed. It was developed to provide a mathematical framework for the new physics emerging from the work of Galileo and Isaac Newton (1642-1727). The differential calculus permitted the calculation of instantaneous rates of change, making it possible to calculate the velocity of a particle at any point in time, given its position as a function of time. The integral calculus permitted calculation of the change in the value of a function over a time interval, given its instantaneous rate of change as a function of time. The calculus was the nearly simultaneous, independent discovery of two mathematicians, the English scientist Isaac Newton and the German scientist Gottfried Wilhelm Leibniz (1646-1716).
Both the integral and differential forms of the calculus involved considering the behavior of quantities as they became infinitesimally small. The instantaneous rate of change, or fluxion in Newton's terminology, was to be calculated by considering the ratio of the change in the value of the function to the change in its argument, as both changes became smaller and smaller. Many mathematicians objected to this procedure because it suggested a ratio of 0/0, which was understood to be indeterminate, that is, not equal to any definite number.
Criticism of the calculus continued into the eighteenth century. Possibly the most outspoken critic was not a mathematician but the eminent George Berkeley (1685-1753), idealist philosopher and Anglican bishop. In a pamphlet published in 1734 he launched a major attack on the reasoning used in the calculus. Berkeley pointed out that no deductive basis had been provided for the calculus. He objected to the use of quantities that were arbitrarily close to zero and to the geometric interpretation given to the fluxion as the tangent to a curve.
A number of mathematicians rushed to the defense of the calculus. James Jurin (1684-1750) quickly published a rebuttal relying on geometric intuition. Berkeley then published a re-rebuttal pointing out that his objections had still not been met. In a treatise published in 1742, the Scottish mathematician Colin Maclaurin (1698-1746) attempted to establish the methods of calculus on a purely geometrical basis. In 1755, the influential Swiss mathematician Leonhard Euler (1707-1783) published an algebraic treatment, introducing some formal rules to eliminate the necessity of considering the indeterminate form 0/0.
The Italian-French mathematician Joseph-Louis Lagrange (1736-1813) took up the issue in a paper published in 1774 and in his book Theory of Analytic Functions, published in 1797 and revised in 1813. Lagrange argued that the difference in the value of a function at two nearby arguments could be represented by a power series of the form:

    f(x + h) = f(x) + a(x)h + b(x)h² + c(x)h³ + ...
where a(x), b(x), c(x), and so on are all functions of x. After making this assumption, which was certainly true for most of the functions that had received the attention of mathematicians, Lagrange noted that a(x) was simply the fluxion of f(x), that twice b(x) was the fluxion of a(x), that three times c(x) was the fluxion of b(x), and so on. In modern notation,

    a(x) = f'(x), b(x) = f''(x)/2!, c(x) = f'''(x)/3!,

and so on, where the exclamation point denotes the factorial, n!, the product of all positive integers less than or equal to n. Lagrange called the functions f'(x), f''(x), f'''(x), and so on derivative functions, from which we obtain the current term derivative.
To illustrate Lagrange's technique we will consider the simple function f(x) = x². By simple algebra, one can easily obtain the expansion

    f(x + h) = (x + h)² = x² + 2x·h + 1·h²,

from which we can readily recognize the first derivative as the coefficient of h, namely 2x, and the second derivative as 2! times the coefficient of h², namely 2.
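The same coefficient bookkeeping can be carried out by machine. The following is a minimal sketch in Python using the sympy symbolic-algebra library (the library and the function name are our own choices, offered only as an illustration of Lagrange's idea, not as his method): it expands f(x + h) in powers of h and recovers each derivative from the corresponding coefficient.

    # Recover derivatives from the power series expansion of f(x + h),
    # in the spirit of Lagrange's technique. Requires the sympy library.
    import sympy as sp

    x, h = sp.symbols('x h')
    f = x**2  # the worked example above; any polynomial will do

    # Expand f(x + h) as a polynomial in h.
    expansion = sp.expand(f.subs(x, x + h))
    poly = sp.Poly(expansion, h)

    # The coefficient of h^n, multiplied by n!, is the n-th derivative.
    for n in range(poly.degree() + 1):
        coeff = poly.coeff_monomial(h**n)
        print(f"f^({n})(x) = {sp.factorial(n) * coeff}")

Run on f(x) = x², the loop prints x**2, 2*x, and 2, matching the first and second derivatives found above.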
Lagrange had begun teaching at the École Polytechnique in 1794 and presumably taught his approach to his students and discussed it with his colleagues, who included many of the principal figures in French mathematics. The method was eventually abandoned, however, as it was realized that many functions of mathematical interest could not be expanded in the way assumed by Lagrange.
Impact
Lagrange is remembered today as a major mathematical thinker for his contributions to mathematical physics as well as numerous contributions to pure mathematics. His work on the algebraization of analysis is generally regarded as an unimportant byway. The American historian of mathematics Eric Temple Bell (1883-1960) wrote:
"The contrast between what passed for valid reasoning then and what is demanded now is violent..... Some of Newton's successors who strove to make sense out of the calculus are among the greatest mathematicians of all time. Yet, as we follow their reasoning, we can only wonder whether our own will seem as puerile to our successors...."
In effect, one cannot anticipate the judgment of history. The calculus was put on a firm basis beginning with the work of the French mathematician Augustin-Louis Cauchy (1789-1848), which provided a rigorous treatment of the limit concept. The notions of function, derivative, and integral have since been generalized and extended in a number of ways, sometimes opening entirely new fields for mathematical exploration.
The representation of functions by power series over small regions has come to play a role in modern practical mathematics. The solutions to most differential equations cannot be expressed in terms of simple functions like sines and cosines, so an important approach to solving them is to assume a power series solution valid in some region and to work out a general formula for the terms. One must then check that the series converges, or takes on a definite limiting value, at each point. Power series are also used in the machine generation of mathematical functions. When a key is pressed on a calculator to find the sine or the logarithm of a number, the microprocessor inside adds up enough terms in the series to produce an accurate estimate of the value. Power series expansions are also important in the field of mathematics known as approximation theory, in which one seeks the best choice of an easily evaluated function to approximate a more complicated function, or a function that is known only at certain points.
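To make the calculator scenario concrete, here is a minimal sketch in Python (a schematic illustration, not any particular calculator's actual firmware) that approximates the sine by summing terms of its power series sin(x) = x - x³/3! + x⁵/5! - ... until the remaining terms are negligibly small.

    # Approximate sin(x) by summing its power series about zero.
    import math

    def sine_by_series(x, tol=1e-15):
        term = x       # the first term, x/1!
        total = 0.0
        n = 1          # exponent of the current term
        while abs(term) > tol:
            total += term
            # Each successive term is the previous one times -x^2/((n+1)(n+2)).
            term *= -x * x / ((n + 1) * (n + 2))
            n += 2
        return total

    print(sine_by_series(1.0))  # ≈ 0.8414709848...
    print(math.sin(1.0))        # agrees to machine precision

In practice the argument is first reduced to a small interval around zero, so that only a handful of terms are needed.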
Modern readers may find it hard to imagine how developments in an area of mathematics could have been perceived as a threat to religion. It must be remembered, however, that the calculus was key to calculating the behavior of bodies in the new physics based on Newton's laws of motion. Before the publication of Newton's Mathematical Principles of Natural Philosophy, it was reasonable to view the motion of the planets as the result of divine will, or of angels appointed to push around the crystalline spheres bearing the planets. After Newton it was much more reasonable to consider the universe as a giant clockwork mechanism which, once set into motion, would continue according to the laws of motion without any further intervention. In short, there was not much left for God to do. Further, nothing that happened in the natural world was considered to be beyond human understanding. This view naturally clashed with the traditional teachings of the Christian churches about a personal God who had intervened in human history and answered prayers. It was not unreasonable for churchmen to perceive a threat, and to respond by seeking weaknesses in the arguments of their opponents.
The title of Berkeley's pamphlet is itself quite informative. It is "The Analyst, Or a Discourse Addressed to an Infidel Mathematician, Wherein It Is Examined Whether the Object, Principles, and Inferences of the Modern Analysis Are More Distinctly Conceived, or More Evidently Deduced, than Religious Mysteries and Points of Faith. 'First cast out the beam out of thine own Eye; and then shalt thou see clearly to cast out the mote out of thy brother's Eye.'" The infidel in question was Edmond Halley (1656-1742), the close friend and supporter of Isaac Newton who had encouraged Newton to publish his researches into the laws of motion. Berkeley's attack was on the reasoning used by some of the new Newtonians who felt that their logic led to a surer form of truth than the teachings of the church.
An echo of this conflict might be seen in the reception accorded Charles Darwin's Origin of Species in 1859, which, by suggesting a mechanism for the appearance of new species, eliminated in some minds the need to invoke God as an explanation for the existence of living creatures. Again, the challenge to the new science was led by an Anglican bishop, in this case Samuel Wilberforce (1805-1873), who attacked the logic and evidence of the evolutionists. Remnants of this controversy can still be seen today in the debate over creation science in American education.
DONALD R. FRANCESCHETTI
Further Reading
Bell, Eric Temple. The Development of Mathematics. New York: McGraw-Hill, 1945.
Boyer, Carl B. A History of Mathematics. New York: Wiley, 1968.
Kline, Morris. Mathematical Thought from Ancient to Modern Times. New York: Oxford University Press, 1972.