Recurrence relation


In mathematics, a recurrence relation is an equation according to which the $n$th term of a sequence of numbers is equal to some combination of the previous terms. Often, only $k$ previous terms of the sequence appear in the equation, for a parameter $k$ that is independent of $n$; this number $k$ is called the order of the relation. If the values of the first $k$ numbers in the sequence have been given, the rest of the sequence can be calculated by repeatedly applying the equation.

In linear recurrences, the $n$th term is equated to a linear function of the $k$ previous terms. A famous example is the recurrence for the Fibonacci numbers,

$F_n = F_{n-1} + F_{n-2},$

where the order $k$ is two and the linear function merely adds the two previous terms. This example is a linear recurrence with constant coefficients, because the coefficients of the linear function (1 and 1) are constants that do not depend on $n$. For these recurrences, one can express the general term of the sequence as a closed-form expression of $n$. As well, linear recurrences with polynomial coefficients depending on $n$ are also important, because many common elementary and special functions have a Taylor series whose coefficients satisfy such a recurrence relation (see holonomic function).

Solving a recurrence relation means obtaining a closed-form solution: a non-recursive function of $n$.

The concept of a recurrence relation can be extended to multidimensional arrays, that is, indexed families that are indexed by tuples of natural numbers.

Definition

A recurrence relation is an equation that expresses each element of a sequence as a function of the preceding ones. More precisely, in the case where only the immediately preceding element is involved, a recurrence relation has the form

$u_n = \varphi(n, u_{n-1}) \quad \text{for} \quad n > 0,$

where

$\varphi : \mathbb{N} \times X \to X$

is a function, where $X$ is a set to which the elements of the sequence must belong. For any $u_0 \in X$, this defines a unique sequence with $u_0$ as its first element, called the initial value.[1]

The definition is easily modified to obtain sequences starting from the term of index 1 or higher.

This defines a recurrence relation of first order. A recurrence relation of order $k$ has the form

$u_n = \varphi(n, u_{n-1}, u_{n-2}, \ldots, u_{n-k}) \quad \text{for} \quad n \ge k,$

where $\varphi$ is a function that involves $k$ consecutive elements of the sequence. In this case, $k$ initial values are needed for defining a sequence.
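
As an illustration of how such a definition is applied in practice, the following Python sketch (the function name and calling convention are illustrative, not part of the article) iterates a recurrence of order $k$ from its $k$ initial values.

```python
from typing import Callable, Sequence

def iterate_recurrence(phi: Callable[..., float],
                       initial: Sequence[float],
                       num_terms: int) -> list[float]:
    """Generate terms of u_n = phi(n, u_{n-1}, ..., u_{n-k}),
    where k = len(initial) is the order of the recurrence."""
    k = len(initial)
    u = list(initial)
    for n in range(k, num_terms):
        # phi receives the index n and the k preceding terms,
        # most recent first, mirroring phi(n, u_{n-1}, ..., u_{n-k}).
        u.append(phi(n, *u[-1:-k-1:-1]))
    return u

# Example: the order-2 Fibonacci recurrence u_n = u_{n-1} + u_{n-2}.
print(iterate_recurrence(lambda n, a, b: a + b, [0, 1], 10))
# [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```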

Examples

Factorial

The factorial is defined by the recurrence relation

$n! = n \, (n-1)!$

and the initial condition

$0! = 1.$

This is an example of a linear recurrence with polynomial coefficients of order 1, with the simple polynomial

$n$

as its only coefficient.

Logistic map

An example of a recurrence relation is the logistic map:

$x_{n+1} = r x_n (1 - x_n),$

with a given constant $r$; given the initial term $x_0$, each subsequent term is determined by this relation.
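
A minimal Python sketch of iterating this relation (the parameter value and initial term are arbitrary illustrative choices):

```python
def logistic_map(r: float, x0: float, num_terms: int) -> list[float]:
    """Iterate x_{n+1} = r * x_n * (1 - x_n) starting from x0."""
    xs = [x0]
    for _ in range(num_terms - 1):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Example: r = 3.5 and x0 = 0.5 (arbitrary illustrative values).
print(logistic_map(3.5, 0.5, 4))
# [0.5, 0.875, 0.3828125, 0.826934814453125]
```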

Fibonacci numbers

The recurrence of order two satisfied by the Fibonacci numbers is the canonical example of a homogeneous linear recurrence relation with constant coefficients (see below). The Fibonacci sequence is defined using the recurrence

$F_n = F_{n-1} + F_{n-2}$

with initial conditions

$F_0 = 0$
$F_1 = 1.$

Explicitly, the recurrence yields the equations

$F_2 = F_1 + F_0 = 1$
$F_3 = F_2 + F_1 = 2$
$F_4 = F_3 + F_2 = 3$

etc.

We obtain the sequence of Fibonacci numbers, which begins

0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, ...

The recurrence can be solved by methods described below, yielding Binet's formula, which involves powers of the two roots of the characteristic polynomial $x^2 - x - 1$; the generating function of the sequence is the rational function

$\frac{x}{1 - x - x^2}.$
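
As a quick numerical illustration of the closed form, the following sketch (illustrative, not from the article) compares direct iteration of the recurrence with Binet's formula $F_n = (\varphi^n - \psi^n)/\sqrt{5}$, where $\varphi$ and $\psi$ are the two roots of $x^2 - x - 1$:

```python
from math import sqrt

def fib_recurrence(n: int) -> int:
    """F_n computed directly from F_k = F_{k-1} + F_{k-2}."""
    a, b = 0, 1          # F_0, F_1
    for _ in range(n):
        a, b = b, a + b
    return a

def fib_binet(n: int) -> int:
    """F_n from the closed form using the roots of x^2 - x - 1."""
    phi = (1 + sqrt(5)) / 2
    psi = (1 - sqrt(5)) / 2
    return round((phi**n - psi**n) / sqrt(5))

assert all(fib_recurrence(n) == fib_binet(n) for n in range(30))
print([fib_recurrence(n) for n in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```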

Binomial coefficients

A simple example of a multidimensional recurrence relation is given by the binomial coefficients $\tbinom{n}{k}$, which count the ways of selecting $k$ elements out of a set of $n$ elements. They can be computed by the recurrence relation

$\binom{n}{k} = \binom{n-1}{k-1} + \binom{n-1}{k},$

with the base cases $\binom{n}{0} = \binom{n}{n} = 1$. Using this formula to compute the values of all binomial coefficients generates an infinite array called Pascal's triangle. The same values can also be computed directly by a different formula that is not a recurrence, but uses factorials, multiplication and division, not just additions:

$\binom{n}{k} = \frac{n!}{k!\,(n-k)!}.$

The binomial coefficients can also be computed with a uni-dimensional recurrence:

$\binom{n}{k} = \left( \binom{n}{k-1} \cdot (n - k + 1) \right) / k,$

with the initial value $\binom{n}{0} = 1$. (The division is not displayed as a fraction, to emphasize that it must be computed after the multiplication, so that no fractional numbers are introduced.) This recurrence is widely used in computers because it does not require building a table, as the two-dimensional recurrence does, and it does not involve the very large integers that appear in the formula with factorials (if one uses the symmetry $\binom{n}{k} = \binom{n}{n-k}$ so that $k \le n/2$, all involved integers are smaller than the final result).
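
A short Python sketch of this one-dimensional recurrence (the function name is illustrative); performing the multiplication before the division keeps every intermediate value an integer:

```python
def binomial(n: int, k: int) -> int:
    """Compute C(n, k) via C(n, j) = (C(n, j-1) * (n - j + 1)) // j."""
    k = min(k, n - k)              # use the symmetry C(n, k) = C(n, n - k)
    c = 1                          # C(n, 0) = 1
    for j in range(1, k + 1):
        # multiplication first, then an exact integer division by j
        c = c * (n - j + 1) // j
    return c

print(binomial(10, 3))  # 120
```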

Difference operator and difference equations

The difference operator is an operator that maps sequences to sequences, and, more generally, functions to functions. It is commonly denoted $\Delta$, and is defined, in functional notation, as

$(\Delta f)(x) = f(x+1) - f(x).$

It is thus a special case of finite difference.

When using the index notation for sequences, the definition becomes

$(\Delta a)_n = a_{n+1} - a_n.$

The parentheses around $\Delta f$ and $\Delta a$ are generally omitted, and $\Delta a_n$ must be understood as the term of index $n$ in the sequence $\Delta a$, and not $\Delta$ applied to the element $a_n$.

Given a sequence $a = (a_n)_{n \in \mathbb{N}}$, the first difference of $a$ is $\Delta a$.

The second difference is $\Delta^2 a = \Delta(\Delta a)$. A simple computation shows that

$(\Delta^2 a)_n = a_{n+2} - 2a_{n+1} + a_n.$

More generally, the $k$th difference is defined recursively as $\Delta^k a = \Delta(\Delta^{k-1} a)$, and one has

$(\Delta^k a)_n = \sum_{t=0}^{k} (-1)^t \binom{k}{t} a_{n+k-t}.$

This relation can be inverted, giving

$a_{n+k} = (\Delta^k a)_n + \binom{k}{1} (\Delta^{k-1} a)_n + \cdots + \binom{k}{k-1} (\Delta a)_n + a_n.$

A difference equation of order $k$ is an equation that involves the $k$ first differences of a sequence or a function, in the same way as a differential equation of order $k$ relates the $k$ first derivatives of a function.

The two above relations allow transforming a recurrence relation of order $k$ into a difference equation of order $k$, and, conversely, a difference equation of order $k$ into a recurrence relation of order $k$. Each transformation is the inverse of the other, and the sequences that are solutions of the difference equation are exactly those that satisfy the recurrence relation.

For example, the difference equation

$3\, \Delta^2 a_n + 2\, \Delta a_n + 7 a_n = 0$

is equivalent to the recurrence relation

$a_{n+2} = \frac{4 a_{n+1} - 8 a_n}{3}$

in the sense that the two equations are satisfied by the same sequences.
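
A small numerical check of this equivalence (a sketch; the initial values are arbitrary): iterating the recurrence and then evaluating the difference equation on the resulting sequence should give zero, up to rounding.

```python
def delta(a):
    """Forward difference of a finite sequence: (Δa)_n = a_{n+1} - a_n."""
    return [a[n + 1] - a[n] for n in range(len(a) - 1)]

# Build a sequence from the recurrence a_{n+2} = (4 a_{n+1} - 8 a_n) / 3
# with arbitrary initial values a_0 = 1, a_1 = 2.
a = [1.0, 2.0]
for _ in range(10):
    a.append((4 * a[-1] - 8 * a[-2]) / 3)

d1, d2 = delta(a), delta(delta(a))
# Check that 3 Δ²a_n + 2 Δa_n + 7 a_n = 0 along the sequence.
print(all(abs(3 * d2[n] + 2 * d1[n] + 7 * a[n]) < 1e-9 for n in range(len(d2))))
# True
```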

As it is equivalent for a sequence to satisfy a recurrence relation or to be the solution of a difference equation, the two terms "recurrence relation" and "difference equation" are sometimes used interchangeably. See Rational difference equation and Matrix difference equation for examples of the use of "difference equation" instead of "recurrence relation".

Difference equations resemble differential equations, and this resemblance is often exploited by mimicking methods for solving differential equations in order to solve difference equations, and therefore recurrence relations.

Summation equations relate to difference equations as integral equations relate to differential equations. See time scale calculus for a unification of the theory of difference equations with that of differential equations.

From sequences to grids

Single-variable or one-dimensional recurrence relations are about sequences (i.e. functions defined on one-dimensional grids). Multi-variable or $n$-dimensional recurrence relations are about $n$-dimensional grids. Functions defined on $n$-grids can also be studied with partial difference equations.[2]

Solving

Solving linear recurrence relations with constant coefficients

Solving first-order non-homogeneous recurrence relations with variable coefficients

Moreover, for the general first-order non-homogeneous linear recurrence relation with variable coefficients:

$a_{n+1} = f_n a_n + g_n, \qquad f_n \neq 0,$

there is also a nice method to solve it:[3]

$a_{n+1} - f_n a_n = g_n$

$\frac{a_{n+1}}{\prod_{k=0}^{n} f_k} - \frac{f_n a_n}{\prod_{k=0}^{n} f_k} = \frac{g_n}{\prod_{k=0}^{n} f_k}$

$\frac{a_{n+1}}{\prod_{k=0}^{n} f_k} - \frac{a_n}{\prod_{k=0}^{n-1} f_k} = \frac{g_n}{\prod_{k=0}^{n} f_k}$

Let

$A_n = \frac{a_n}{\prod_{k=0}^{n-1} f_k}.$

Then

$A_{n+1} - A_n = \frac{g_n}{\prod_{k=0}^{n} f_k}$

$A_{n+1} = A_0 + \sum_{m=0}^{n} \frac{g_m}{\prod_{k=0}^{m} f_k}$

$\frac{a_{n+1}}{\prod_{k=0}^{n} f_k} = a_0 + \sum_{m=0}^{n} \frac{g_m}{\prod_{k=0}^{m} f_k}$

$a_{n+1} = \left( \prod_{k=0}^{n} f_k \right) \left( a_0 + \sum_{m=0}^{n} \frac{g_m}{\prod_{k=0}^{m} f_k} \right)$

If we apply the formula to $a_{n+1} = (1 + h f(nh))\, a_n + h\, g(nh)$ and take the limit $h \to 0$, we get the formula for first-order linear differential equations with variable coefficients; the sum becomes an integral, and the product becomes the exponential function of an integral.
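
A numerical sanity check of this closed form (a sketch; the coefficient sequences $f_n$ and $g_n$ and the initial value are arbitrary illustrative choices):

```python
from math import prod

# Arbitrary illustrative coefficients f_n and g_n (f_n must be nonzero).
f = lambda n: n + 2
g = lambda n: 1.0 / (n + 1)
a0 = 1.0

def a_iterative(n: int) -> float:
    """a_n obtained by iterating a_{m+1} = f_m a_m + g_m from a_0."""
    a = a0
    for m in range(n):
        a = f(m) * a + g(m)
    return a

def a_closed_form(n: int) -> float:
    """a_{N+1} = (prod_{k=0}^{N} f_k) * (a_0 + sum_{m=0}^{N} g_m / prod_{k=0}^{m} f_k),
    evaluated for index n by taking N = n - 1."""
    if n == 0:
        return a0
    N = n - 1
    P = prod(f(k) for k in range(N + 1))
    S = sum(g(m) / prod(f(k) for k in range(m + 1)) for m in range(N + 1))
    return P * (a0 + S)

print(all(abs(a_iterative(n) - a_closed_form(n)) < 1e-6 * max(1.0, a_iterative(n))
          for n in range(10)))  # True
```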

Solving general homogeneous linear recurrence relations

Many homogeneous linear recurrence relations may be solved by means of the generalized hypergeometric series. Special cases of these lead to recurrence relations for the orthogonal polynomials, and many special functions. For example, the solution to

$J_{n+1} = \frac{2n}{z} J_n - J_{n-1}$

is given by

$J_n = J_n(z),$

the Bessel function, while

$(b - n) M_{n-1} + (2n - b + z) M_n - n M_{n+1} = 0$

is solved by

$M_n = M(n, b; z),$

the confluent hypergeometric series. Sequences which are the solutions of linear difference equations with polynomial coefficients are called P-recursive. For these specific recurrence equations algorithms are known which find polynomial, rational or hypergeometric solutions.
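
For instance, the Bessel recurrence above can be checked numerically, assuming SciPy is available (the chosen values of $n$ and $z$ are arbitrary):

```python
from scipy.special import jv  # Bessel function of the first kind, J_v(z)

z, n = 2.5, 3
lhs = jv(n + 1, z)
rhs = (2 * n / z) * jv(n, z) - jv(n - 1, z)
print(abs(lhs - rhs) < 1e-12)  # True: J_{n+1} = (2n/z) J_n - J_{n-1}
```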

Solving first-order rational difference equations

A first-order rational difference equation has the form $w_{t+1} = \frac{a w_t + b}{c w_t + d}$. Such an equation can be solved by writing $w_t$ as a nonlinear transformation of another variable $x_t$ which itself evolves linearly. Then standard methods can be used to solve the linear difference equation in $x_t$.

Stability

Stability of linear higher-order recurrences

The linear recurrence of order $d$,

$a_n = c_1 a_{n-1} + c_2 a_{n-2} + \cdots + c_d a_{n-d},$

has the characteristic equation

$\lambda^d - c_1 \lambda^{d-1} - c_2 \lambda^{d-2} - \cdots - c_d = 0.$

The recurrence is stable, meaning that the iterates converge asymptotically to a fixed value, if and only if the eigenvalues (i.e., the roots of the characteristic equation), whether real or complex, are all less than unity in absolute value.
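
A minimal numerical check of this criterion, assuming NumPy is available (the coefficients are arbitrary illustrative choices):

```python
import numpy as np

def is_stable(coeffs):
    """Stability of a_n = c_1 a_{n-1} + ... + c_d a_{n-d}:
    all roots of λ^d - c_1 λ^{d-1} - ... - c_d must lie inside the unit circle."""
    c = np.asarray(coeffs, dtype=float)
    char_poly = np.concatenate(([1.0], -c))   # [1, -c_1, ..., -c_d]
    return bool(np.all(np.abs(np.roots(char_poly)) < 1))

print(is_stable([0.5, 0.3]))  # True: both roots of λ² - 0.5λ - 0.3 lie inside the unit circle
print(is_stable([1.0, 1.0]))  # False: the Fibonacci recurrence diverges
```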

Stability of linear first-order matrix recurrences

In the first-order matrix difference equation

$[x_t - x^*] = A [x_{t-1} - x^*]$

with state vector $x$ and transition matrix $A$, $x$ converges asymptotically to the steady state vector $x^*$ if and only if all eigenvalues of the transition matrix $A$ (whether real or complex) have an absolute value which is less than 1.
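
The corresponding eigenvalue test, again assuming NumPy (the transition matrix is an arbitrary illustrative choice):

```python
import numpy as np

A = np.array([[0.5, 0.2],
              [0.1, 0.4]])               # arbitrary illustrative transition matrix

spectral_radius = max(abs(np.linalg.eigvals(A)))
print(spectral_radius < 1)               # True: the iterates converge to the steady state
```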

Stability of nonlinear first-order recurrences

Consider the nonlinear first-order recurrence

$x_n = f(x_{n-1}).$

This recurrence is locally stable, meaning that it converges to a fixed point $x^*$ from points sufficiently close to $x^*$, if the slope of $f$ in the neighborhood of $x^*$ is smaller than unity in absolute value: that is,

$|f'(x^*)| < 1.$

A nonlinear recurrence could have multiple fixed points, in which case some fixed points may be locally stable and others locally unstable; for continuous f two adjacent fixed points cannot both be locally stable.
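
For instance, for the logistic map $x_{n+1} = r x_n (1 - x_n)$ with the illustrative value $r = 2.5$, the nonzero fixed point is $x^* = 1 - 1/r$ and $f'(x^*) = 2 - r = -0.5$, so the fixed point is locally stable. A quick numerical confirmation (a sketch):

```python
r = 2.5                        # illustrative parameter for f(x) = r x (1 - x)
f = lambda x: r * x * (1 - x)
x_star = 1 - 1 / r             # nonzero fixed point: f(x_star) == x_star
slope = r * (1 - 2 * x_star)   # f'(x_star) = 2 - r = -0.5

x = 0.2                        # a starting point in the basin of attraction
for _ in range(100):
    x = f(x)
print(abs(slope) < 1, abs(x - x_star) < 1e-9)  # True True
```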

A nonlinear recurrence relation could also have a cycle of period $k$ for $k > 1$. Such a cycle is stable, meaning that it attracts a set of initial conditions of positive measure, if the composite function

$g(x) := f(f(\cdots f(x) \cdots))$

with $f$ appearing $k$ times is locally stable according to the same criterion:

$|g'(x^*)| < 1,$

where $x^*$ is any point on the cycle.

In a chaotic recurrence relation, the variable $x$ stays in a bounded region but never converges to a fixed point or an attracting cycle; any fixed points or cycles of the equation are unstable. See also logistic map, dyadic transformation, and tent map.

Relationship to differential equations

When solving an ordinary differential equation numerically, one typically encounters a recurrence relation. For example, when solving the initial value problem

$y'(t) = f(t, y(t)), \qquad y(t_0) = y_0,$

with Euler's method and a step size $h$, one calculates the values

$y_0 = y(t_0), \quad y_1 = y(t_0 + h), \quad y_2 = y(t_0 + 2h), \ \dots$

by the recurrence

$y_{n+1} = y_n + h f(t_n, y_n), \qquad t_n = t_0 + n h.$
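
A minimal sketch of this recurrence in Python, applied to the illustrative problem $y'(t) = -y(t)$, $y(0) = 1$, whose exact solution is $e^{-t}$:

```python
from math import exp

def euler(f, t0, y0, h, steps):
    """Iterate y_{n+1} = y_n + h f(t_n, y_n) with t_n = t0 + n h."""
    t, y = t0, y0
    for _ in range(steps):
        y = y + h * f(t, y)
        t = t + h
    return y

# y' = -y, y(0) = 1; approximate y(1) = exp(-1) with step h = 0.001.
approx = euler(lambda t, y: -y, 0.0, 1.0, 0.001, 1000)
print(approx, exp(-1))  # ≈ 0.3677 vs 0.367879...
```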

Systems of linear first order differential equations can be discretized exactly analytically using the methods shown in the discretization article.

Applications

Mathematical biology

Some of the best-known difference equations have their origins in the attempt to model population dynamics. For example, the Fibonacci numbers were once used as a model for the growth of a rabbit population.

The logistic map is used either directly to model population growth, or as a starting point for more detailed models of population dynamics. In this context, coupled difference equations are often used to model the interaction of two or more populations. For example, the Nicholson–Bailey model for a host-parasite interaction is given by

$N_{t+1} = \lambda N_t e^{-a P_t}$

$P_{t+1} = N_t (1 - e^{-a P_t}),$

with $N_t$ representing the hosts, and $P_t$ the parasites, at time $t$.
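
A sketch of iterating this coupled system (the parameter values and initial densities are arbitrary illustrative choices):

```python
from math import exp

lam, a = 2.0, 0.01          # illustrative growth rate and searching efficiency
N, P = 20.0, 5.0            # illustrative initial host and parasite densities

for t in range(5):
    # both updates use the old values of N and P, as the coupled recurrence requires
    N, P = lam * N * exp(-a * P), N * (1 - exp(-a * P))
    print(f"t={t + 1}: N={N:.2f}, P={P:.2f}")
```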

Integrodifference equations are a form of recurrence relation important to spatial ecology. These and other difference equations are particularly suited to modeling univoltine populations.

Computer science

Recurrence relations are also of fundamental importance in analysis of algorithms.[4][5] If an algorithm is designed so that it will break a problem into smaller subproblems (divide and conquer), its running time is described by a recurrence relation.

A simple example is the time an algorithm takes to find an element in an ordered vector with $n$ elements, in the worst case.

A naive algorithm will search from left to right, one element at a time. The worst possible scenario is when the required element is the last, so the number of comparisons is $n$.

A better algorithm is called binary search. However, it requires a sorted vector. It will first check if the element is at the middle of the vector. If not, then it will check whether the middle element is greater or less than the sought element. At this point, half of the vector can be discarded, and the algorithm can be run again on the other half. The number of comparisons will be given by

$c_1 = 1$

$c_n = 1 + c_{n/2},$

the time complexity of which will be $O(\log_2(n)).$
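
A sketch that counts comparisons under both approaches (the counting convention follows the recurrences above and is simplified for illustration):

```python
def linear_search_comparisons(n: int) -> int:
    """Worst case for left-to-right search: one comparison per element."""
    return n

def binary_search_comparisons(n: int) -> int:
    """Worst case following c_1 = 1, c_n = 1 + c_{n//2}."""
    return 1 if n == 1 else 1 + binary_search_comparisons(n // 2)

for n in (1, 8, 1024, 10**6):
    print(n, linear_search_comparisons(n), binary_search_comparisons(n))
# 1 1 1
# 8 8 4
# 1024 1024 11
# 1000000 1000000 20
```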

Digital signal processing

In digital signal processing, recurrence relations can model feedback in a system, where outputs at one time become inputs for future time. They thus arise in infinite impulse response (IIR) digital filters.

For example, the equation for a "feedforward" IIR comb filter of delay $T$ is:

$y_t = (1 - \alpha) x_t + \alpha\, y_{t-T},$

where $x_t$ is the input at time $t$, $y_t$ is the output at time $t$, and $\alpha$ controls how much of the delayed signal is fed back into the output. From this we can see that

$y_t = (1 - \alpha) x_t + \alpha \left( (1 - \alpha) x_{t-T} + \alpha y_{t-2T} \right)$

$y_t = (1 - \alpha) x_t + \alpha (1 - \alpha) x_{t-T} + \alpha^2 y_{t-2T}$

etc.
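
A sketch of this comb filter as a direct computation of the recurrence (the delay, coefficient, and input signal are illustrative choices; samples before time $T$ use a zero delayed value):

```python
def comb_filter(x, alpha, T):
    """y_t = (1 - alpha) * x_t + alpha * y_{t-T}, with the delayed term taken as 0 for t < T."""
    y = []
    for t, sample in enumerate(x):
        delayed = y[t - T] if t >= T else 0.0
        y.append((1 - alpha) * sample + alpha * delayed)
    return y

x = [1.0] + [0.0] * 9                  # an impulse as the input signal
print(comb_filter(x, alpha=0.5, T=3))  # the impulse echoes every T samples, scaled by alpha
# [0.5, 0.0, 0.0, 0.25, 0.0, 0.0, 0.125, 0.0, 0.0, 0.0625]
```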

Economics

Recurrence relations, especially linear recurrence relations, are used extensively in both theoretical and empirical economics.[6][7] In particular, in macroeconomics one might develop a model of various broad sectors of the economy (the financial sector, the goods sector, the labor market, etc.) in which some agents' actions depend on lagged variables. The model would then be solved for current values of key variables (interest rate, real GDP, etc.) in terms of past and current values of other variables.

See also

References

Footnotes

  1. ^ Jacobson, Nathan, Basic Algebra 2 (2nd ed.), § 0.4, p. 16.
  2. ^ Cheng, Sui Sun, Partial Difference Equations, CRC Press, 2003, ISBN 978-0-415-29884-1.
  3. ^ "Archived copy" (PDF). Archived (PDF) from the original on 2010-07-05. Retrieved 2010-10-19.{{cite web}}: CS1 maint: archived copy as title (link)
  4. ^ Cormen, T. et al., Introduction to Algorithms, MIT Press, 2009.
  5. ^ Sedgewick, R.; Flajolet, P., An Introduction to the Analysis of Algorithms, Addison-Wesley, 2013.
  6. ^ Stokey, Nancy L.; Lucas, Robert E. Jr.; Prescott, Edward C. (1989). Recursive Methods in Economic Dynamics. Cambridge: Harvard University Press. ISBN 0-674-75096-9.
  7. ^ Ljungqvist, Lars; Sargent, Thomas J. (2004). Recursive Macroeconomic Theory (Second ed.). Cambridge: MIT Press. ISBN 0-262-12274-X.

Bibliography

External links