# Harold Widom’s work in random matrix theory

## Abstract

This is a survey of Harold Widom’s work in random matrices. We start with his pioneering papers on the sine-kernel determinant, continue with his and Craig Tracy’s groundbreaking results concerning the distribution functions of random matrix theory, touch on the remarkable universality of the Tracy–Widom distributions in mathematics and physics, and close with Tracy and Widom’s influential work on the asymmetric simple exclusion process.

## 1. Introduction

The distributions of random matrix theory govern the statistical properties of a wide variety of large systems which do not obey the usual laws of classical probability. Such systems appear in many different areas of applied science and technology, including heavy nuclei, polymer growth, high-dimensional data analysis, and certain percolation processes. Four distribution functions play a particularly important role in the mathematical apparatus of random matrices. The first one describes the *emptiness formation probability* in the bulk of the spectrum of a large random matrix, and it is explicitly given in terms of the sine-kernel Fredholm determinant. The second, the third, and the fourth distributions are given in terms of the Airy-kernel Fredholm determinant, and they describe the edge fluctuations of the eigenvalues in the large size limit of the matrices taken from the three classical Gaussian ensembles, unitary (GUE), orthogonal (GOE), and symplectic (GSE). The last three of these distributions are now known as the Tracy–Widom distribution functions.

A key analytical observation concerning the distribution functions of random matrix theory, which was made on many occasions in the papers Reference 5, Reference 6, Reference 7, Reference 10, Reference 11, and Reference 12, is that they satisfy certain nonlinear integrable PDEs. This property, which generalizes the first result of this type, established in Reference JMMS for the sine-kernel determinant, follows in turn from a remarkable Fredholm determinant representation for the random matrix distributions. The existence of such representations in several important examples beyond the sine-kernel case was also first shown in the above-mentioned papers.

What follows is an overview of the principal contributions of Harold Widom to random matrix theory. We start with his papers Reference 3 and Reference 9 on the sine-kernel determinant, which brought mathematical rigor to the theory of this, the first universal distribution function of random matrix theory.

## 2. The sine-kernel determinant

Let $J=\bigcup_{j=1}^{m}(a_{2j-1},a_{2j})$ be a union of disjoint intervals in $\mathbb{R}$. Consider the Fredholm determinant

$$P(J)=\det\bigl(I-K_J\bigr),$$

where $K_J$ is the trace-class operator in $L^{2}(J)$ with kernel

$$K(x,y)=\frac{\sin(x-y)}{\pi(x-y)}.$$

The determinant $P(J)$ plays a central role in random matrix theory. Indeed, it is the probability of finding no eigenvalues in the union of intervals $\frac{1}{\pi}J$ for a random Hermitian matrix chosen from the Gaussian unitary ensemble (GUE), in the bulk scaling limit with mean spacing 1 (see Reference Meh and Figure 1). Moreover, in the one interval case, the second derivative of $P$ describes the distribution of normalized spacings of eigenvalues of large random GUE matrices (see, e.g., Reference Dei1). The determinant also appears in quantum and statistical mechanics. For instance, it describes the emptiness formation probability in the one-dimensional impenetrable Bose gas Reference Len and the gap probability in the one-dimensional Coulomb gas at inverse temperature $\beta=2$ (see, e.g., Reference Dys1). The key analytical issue related to $P(J)$ is its behavior as the intervals become large, i.e., the *large gap asymptotics*. Harold Widom made major contributions to the resolution of this question.

In the one interval case, after rescaling and translation, we may assume $J=(-s,s)$, $s>0$, and we write $P(s)\equiv P\bigl((-s,s)\bigr)$. For this case, in 1973, des Cloizeaux and Mehta Reference dCM showed that as $s\to\infty$,

$$\ln P(s)=-\frac{s^{2}}{4}-\frac{1}{4}\ln s+c+o(1)\tag{2.1}$$

for some constant $c$. In 1976, Dyson Reference Dys2 showed that $\ln P(s)$ in fact has a full asymptotic expansion of the form

$$\ln P(s)\sim-\frac{s^{2}}{4}-\frac{1}{4}\ln s+c_{0}+\sum_{n=1}^{\infty}\frac{c_{n}}{s^{n}}.\tag{2.2}$$

Dyson identified all the constants $c_{0}$, $c_{1}$, $c_{2},\dots$. Of particular interest is the constant $c_{0}$, which he found to be

$$c_{0}=\frac{1}{12}\ln 2+3\,\zeta'(-1),\tag{2.3}$$

where $\zeta(z)$ is the Riemann zeta-function. It should be noted that Dyson obtained this result using one of the early results of Widom Reference 1 on the asymptotics of Toeplitz determinants with symbols supported on circular arcs.

The results in Reference dCM and Reference Dys2 were not fully rigorous. In Reference 3, using an adaptation of Szegő’s classical method to the continuous analogues of orthogonal polynomials (the so-called *Krein functions*), Widom gave the first rigorous proof of the leading asymptotics in Equation 2.1, in the form

$$\ln P(s)=-\frac{s^{2}}{4}\bigl(1+o(1)\bigr)$$

as $s\to\infty$. Actually, Widom proved a slightly stronger result,

$$\frac{d}{ds}\ln P(s)=-\frac{s}{2}+o(s)$$

as $s\to\infty$. In addition, Widom computed the leading asymptotics of the ratio of the probability that there is at most one eigenvalue in the interval $(-s,s)$ to the probability $P(s)$ that there are none. In the subsequent paper Reference 9, Widom considered the multi-interval case, and showed that as $s\to\infty$,

$$\frac{d}{ds}\ln P(sJ)=c_{1}s+G(s)+o(1),$$

where $c_{1}$ is a negative constant and $G(s)$ is a certain bounded oscillatory function of $s$, which was described up to the solution of a Jacobi inversion problem.Footnote^{1} The method of Reference 9 is a further development of the approach in Reference 3. As in Reference 3, Widom also computed the leading asymptotics of the corresponding ratio of probabilities.

^{1}

An explicit formula for the oscillatory function, in terms of the Riemann theta-function, was obtained later in Reference DIZ.

The formula Equation 2.3 for the constant $c_{0}$ in Reference Dys2 was in the form of a conjecture: a rigorous proof of Equation 2.3 was only given in 2004. In fact, two proofs of this formula were given independently, in Reference Kra and in Reference Ehr. It is remarkable that the proof of Equation 2.3 in Reference Kra again uses (in conjunction with the nonlinear steepest descent method) Widom’s computation in Reference 1.

## 3. The Tracy–Widom distribution functions

The joint probability densities of the eigenvalues of random matrices from the GOE, GUE, and GSE ensembles are given by

$$P_{N\beta}(\lambda_{1},\dots,\lambda_{N})=\frac{1}{Z_{N\beta}}\prod_{1\le j<k\le N}|\lambda_{j}-\lambda_{k}|^{\beta}\prod_{j=1}^{N}e^{-\frac{\beta}{2}\lambda_{j}^{2}},\tag{3.1}$$

where $Z_{N\beta}$ is a normalization constant (or partition function) and $\beta=1$, $2$, and $4$ for GOE, GUE, and GSE, respectively.

The famous Tracy–Widom distribution functions, commonly denoted $F_{1}(s)$, $F_{2}(s)$, and $F_{4}(s)$, describe the edge fluctuations of the eigenvalues in the large $N$ limit. They are defined via the scaling limits,

$$F_{\beta}(s)=\lim_{N\to\infty}\mathbb{P}\Bigl(\bigl(\lambda_{\max}-\sqrt{2N}\bigr)\sqrt{2}\,N^{1/6}\le s\Bigr),\qquad\beta=1,2,4,\tag{3.2}$$

where $\lambda_{\max}$ is the largest eigenvalue drawn from the ensembles with density Equation 3.1. The central theme of the series of papers Reference 5, Reference 6, Reference 7, Reference 10, and Reference 12 is the following analytical description of these distributions, which links them to the theory of integrable systems.

Let $K_{\mathrm{Ai}}$ be the trace-class operator in $L^{2}(s,\infty)$ with kernel

$$K_{\mathrm{Ai}}(x,y)=\frac{\mathrm{Ai}(x)\,\mathrm{Ai}'(y)-\mathrm{Ai}'(x)\,\mathrm{Ai}(y)}{x-y},\tag{3.3}$$

where $\mathrm{Ai}(x)$ is the Airy function,

$$\mathrm{Ai}(x)=\frac{1}{\pi}\int_{0}^{\infty}\cos\Bigl(\frac{t^{3}}{3}+xt\Bigr)\,dt.$$

Then the Tracy–Widom distribution $F_{2}(s)$ is given by the *Airy-kernel* Fredholm determinant (see Reference For),

$$F_{2}(s)=\det\bigl(I-K_{\mathrm{Ai}}\bigr)\Big|_{L^{2}(s,\infty)}.\tag{3.4}$$

Moreover, as shown in Reference 6, the following formula is valid for the Fredholm determinant on the right-hand side of Equation 3.4:

$$F_{2}(s)=\exp\Bigl(-\int_{s}^{\infty}(x-s)\,q^{2}(x)\,dx\Bigr),\tag{3.5}$$

where $q(x)$ is the Hastings–McLeod solution of the *second Painlevé equation*, i.e., the solution of the ODE

$$q''=xq+2q^{3}\tag{3.6}$$

uniquely determined by the boundary condition

$$q(x)\sim\mathrm{Ai}(x)\quad\text{as }x\to+\infty.\tag{3.7}$$

Similar Painlevé representations for the other two Tracy–Widom distribution functions (see Reference 10) have the form

$$F_{1}(s)=\bigl(F_{2}(s)\bigr)^{1/2}e^{-\frac{1}{2}E(s)},\qquad F_{4}\bigl(s/\sqrt{2}\,\bigr)=\bigl(F_{2}(s)\bigr)^{1/2}\cosh\frac{E(s)}{2},\tag{3.8}$$

where

$$E(s)=\int_{s}^{\infty}q(x)\,dx,$$

and $q(x)$ is the same Hastings–McLeod solution of the second Painlevé equation.Footnote^{2} Painlevé equations are integrable in the sense of Lax pairs, which, in particular, implies that their solutions admit a Riemann–Hilbert (RH) representation. A Riemann–Hilbert representation can be viewed as a nonabelian analogue of the familiar integral representations of the classical special functions, such as the Bessel functions, the Airy function, etc. A key consequence of the RH representation is that the Painlevé functions possess one of the principal features of a classical special function: a mechanism, viz. the nonlinear steepest descent method, for evaluating explicitly the relevant asymptotic connection formulae (see, e.g., Reference FIKN). Specifically, in the case of the Hastings–McLeod solution, the integrability of the second Painlevé equation underlies the fact that, in addition to the asymptotic behavior Equation 3.7 at $x=+\infty$, one also knows the asymptotic behavior of $q(x)$ at $x=-\infty$, which is described in Reference HaMcFootnote^{3} as

$$q(x)=\sqrt{-\frac{x}{2}}\,\Bigl(1+O\bigl(|x|^{-3}\bigr)\Bigr),\qquad x\to-\infty.\tag{3.9}$$

^{2}

A similar representation for the sine-kernel determinant (involving, this time, a special solution of the fifth Painlevé equation) was given earlier in Reference JMMS.

^{3}

Hastings and McLeod derived Equation 3.9 using the inverse scattering transform, which was a precursor of the Riemann–Hilbert method.

Herein lies the importance of the Tracy–Widom formulae Equation 3.5 and Equation 3.8—they provide the key distribution functions of random matrix theory with *explicit* representations that are amenable to detailed asymptotic analysis.
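The equality of the right-hand sides of Equation 3.4 and Equation 3.5 can be checked against each other numerically. The sketch below is our own (SciPy is assumed; the truncation length, quadrature size, and the starting point $s_{0}=8$ for the backward integration are ad hoc choices): the determinant is approximated by a Gauss–Legendre Nyström discretization of the Airy kernel, while the Painlevé route integrates the Hastings–McLeod solution backwards from $s_{0}$, where it is numerically indistinguishable from $\mathrm{Ai}$.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.special import airy

def F2_fredholm(s, n=120, L=12.0):
    """Nystrom approximation of det(I - K_Ai) on L^2(s, infinity); the
    half-line is truncated to (s, s + L) since the Airy kernel decays fast."""
    t, w = np.polynomial.legendre.leggauss(n)
    x = s + 0.5 * L * (t + 1.0)                    # nodes mapped to (s, s + L)
    w = 0.5 * L * w
    ai, aip, _, _ = airy(x)
    D = x[:, None] - x[None, :]
    np.fill_diagonal(D, 1.0)                       # avoid 0/0; fixed below
    K = (ai[:, None] * aip[None, :] - aip[:, None] * ai[None, :]) / D
    np.fill_diagonal(K, aip ** 2 - x * ai ** 2)    # y -> x limit of the kernel
    A = np.eye(n) - np.sqrt(w[:, None] * w[None, :]) * K
    return float(np.exp(np.linalg.slogdet(A)[1]))

def F2_painleve(s, s0=8.0):
    """F2(s) = exp(-int_s^infinity (x - s) q(x)^2 dx), with q obtained by
    integrating q'' = x q + 2 q^3 backwards from s0 with Airy initial data."""
    ai0, aip0, _, _ = airy(s0)
    sol = solve_ivp(lambda x, y: [y[1], x * y[0] + 2.0 * y[0] ** 3],
                    [s0, s], [ai0, aip0], method="DOP853",
                    rtol=1e-12, atol=1e-16, dense_output=True)
    xs = np.linspace(s, s0, 4001)
    f = (xs - s) * sol.sol(xs)[0] ** 2
    integral = float(0.5 * np.sum((f[1:] + f[:-1]) * np.diff(xs)))
    return float(np.exp(-integral))                # tail beyond s0 is negligible
```

At, e.g., $s=-2$ the two routes agree to several digits, which is a direct check of the identity between Equation 3.4 and Equation 3.5.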

An immediate corollary of formula Equation 3.5 (and the known asymptotics Equation 3.9 of the Painlevé function) is an explicit formula for the large negative $s$ behavior of the distribution function $F_{2}(s)$:

$$F_{2}(s)=\frac{e^{-\frac{|s|^{3}}{12}}}{|s|^{1/8}}\bigl(c+o(1)\bigr)\quad\text{as }s\to-\infty.\tag{3.10}$$

The value of the constant $c$ was conjectured by Tracy and Widom in the paper Reference 6, by analogy with Dyson’s formula Equation 2.3 for the sine-kernel determinant, to be

$$c=2^{1/24}e^{\zeta'(-1)},\tag{3.11}$$

where $\zeta(z)$ is the Riemann zeta-function.Footnote^{4}

^{4}

This conjecture was independently proved in Reference DIK and Reference BBD. Also, in Reference BBD, similar asymptotic results were established for the other two Tracy–Widom distributions.

Another important advantage of the Tracy–Widom formulae Equation 3.5 and Equation 3.8 is that they make it possible to design, using the exact connection formulae Equation 3.7 and Equation 3.9, a very efficient scheme (see Reference Die, Reference DBT) for the numerical evaluationFootnote^{5} of the distribution functions $F_{\beta}(s)$.

^{5}

It should be noted that the Hastings–McLeod solution is very unstable; indeed, a small change in the pre-exponential numerical factor in the normalization condition Equation 3.7 yields a completely different type of behavior of $q(x)$ for negative $x$, including the appearance of singularities (see again Reference FIKN). Herein lies the importance of the knowledge of the behavior Equation 3.9 for adjusting the numerical procedure appropriately. Said differently, knowledge of the connection formulae makes it possible to transform an unstable ODE initial value problem into a stable ODE boundary value problem.
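The instability described in this footnote is easy to observe numerically. In the sketch below (our own construction; SciPy is assumed), the Airy data at $s_{0}=8$ is perturbed by the tiny absolute amount $10^{-13}$; by $x=-5$ the perturbation has been amplified by many orders of magnitude, while the unperturbed run still tracks $\sqrt{-x/2}$ from Equation 3.9.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.special import airy

def painleve2_backwards(delta=0.0, s0=8.0, s_end=-5.0):
    """Integrate q'' = x q + 2 q^3 from x = s0 down to x = s_end, starting
    from Airy initial data q(s0) = Ai(s0) + delta, q'(s0) = Ai'(s0)."""
    ai0, aip0, _, _ = airy(s0)
    sol = solve_ivp(lambda x, y: [y[1], x * y[0] + 2.0 * y[0] ** 3],
                    [s0, s_end], [ai0 + delta, aip0], method="DOP853",
                    rtol=1e-12, atol=1e-16)
    return float(sol.y[0, -1])                 # q(s_end)

q_ref = painleve2_backwards()                  # tracks the Hastings-McLeod solution
q_pert = painleve2_backwards(delta=1e-13)      # tiny perturbation of (3.7)
```

Here `q_ref` is close to $\sqrt{5/2}\approx 1.581$, in agreement with Equation 3.9, while `q_pert` differs from it by far more than the $10^{-13}$ that was put in; this is why in practice one imposes Equation 3.9 and solves a boundary value problem instead of shooting from $x=+\infty$.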

The appearance of the Painlevé equations in the Tracy–Widom formulae is not accidental. As follows from results in Reference 7, the connection to integrable systems is already encoded in the determinant formula Equation 3.4. Indeed, the Airy-kernel integral operator belongs to a class of integral operators with kernels of the form

$$K(x,y)=\frac{\phi(x)\psi(y)-\psi(x)\phi(y)}{x-y},\tag{3.12}$$

acting in $L^{2}(J)$, where $J$ is a union of intervals,

$$J=\bigcup_{j=1}^{m}(a_{2j-1},a_{2j}),$$

and $\phi$, $\psi$ are $C^{\infty}$-functions.Footnote^{6} This class of integral operators has appeared frequently in many applications related to random matrices and statistical mechanics. In Reference 7, it is shown that if the functions $\phi$ and $\psi$ satisfy a linear differential equation,

$$m(x)\,\frac{d\Phi}{dx}(x)=\Omega(x)\,\Phi(x),\qquad \Phi(x)=\begin{pmatrix}\phi(x)\\ \psi(x)\end{pmatrix},$$

^{6}

An integral operator with kernel Equation 3.12 is a special case of a so-called *integrable Fredholm operator*, i.e., an integral operator whose kernel is of the form

$$K(x,y)=\frac{\sum_{j=1}^{n}f_{j}(x)g_{j}(y)}{x-y},\tag{3.13}$$

with some functions $f_{j}$ and $g_{j}$ defined on a contour $\Sigma$. This type of integral operator was singled out as a distinguished class in Reference IIKS (see also Reference Dei2; in a different context, unrelated to integrable systems, these operators were also studied in the earlier work Reference Sak). A crucial property of a kernel of the form Equation 3.13 is that the associated resolvent kernel is again an integrable kernel. Moreover, the functions “$f$” and “$g$” corresponding to the resolvent are determined via an auxiliary matrix Riemann–Hilbert problem whose jump matrix is explicitly constructed in terms of the original $f$- and $g$-functions.

where

$$\Omega(x)=\begin{pmatrix}A(x) & B(x)\\ -C(x) & -A(x)\end{pmatrix},$$

and $m(x)$, $A(x)$, $B(x)$, and $C(x)$ are polynomials, then the Fredholm determinant $\det(I-K)$ can be expressed in terms of the solution of a certain system of nonlinear partial differential equations with the end points $a_{1},\dots,a_{2m}$ of $J$ as independent variables. This system is integrable in the sense of Lax,Footnote^{7} and in the case of the Airy kernel it reduces to a single ODE, the second Painlevé equation.
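For the Airy kernel itself, one natural choice of the data entering Equation 3.12 (using the Airy equation $\mathrm{Ai}''(x)=x\,\mathrm{Ai}(x)$) is the following:

```latex
% Airy-kernel specialization of the kernel (3.12):
%   phi = Ai and psi = Ai' satisfy a linear system with polynomial coefficients.
\[
  \phi(x)=\operatorname{Ai}(x),\qquad \psi(x)=\operatorname{Ai}'(x),\qquad m(x)=1,
\]
\[
  \frac{d}{dx}\begin{pmatrix}\phi\\ \psi\end{pmatrix}
  =\begin{pmatrix}0 & 1\\ x & 0\end{pmatrix}
   \begin{pmatrix}\phi\\ \psi\end{pmatrix},
\]
% i.e., polynomial coefficient data A(x) = 0, B(x) = 1, C(x) = -x, so the
% Tracy--Widom system of PDEs collapses to the Painleve II ODE in this case.
```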

^{7}

The Lax-integrability of the Tracy–Widom system associated with the kernel Equation 3.12 was proven in Reference Pal, where this system was identified as a special case of the isomonodromy deformation equations and the Fredholm determinant was identified as the corresponding $\tau$-function.

In Reference 8, the results of Reference 7 were extended to the general case, that is, to the kernels of the form

$$K(x,y)=\frac{\Phi^{t}(x)\,M\,\Phi(y)}{x-y},$$

acting in $L^{2}(J)$. Here, $J$ is, as before, a union of intervals, the constant $n\times n$ matrix $M$ is antisymmetric, and $\Phi(x)=\bigl(\phi_{1}(x),\dots,\phi_{n}(x)\bigr)^{t}$ is a vector of $C^{\infty}$-functions satisfying the linear differential equation

$$m(x)\,\frac{d\Phi}{dx}(x)=A(x)\,\Phi(x),$$

where $m(x)$ is a polynomial and $A(x)$ is an $n\times n$ polynomial matrix. The main result of Reference 8 is the derivation, for such operators, of an analogous integrable system of nonlinear partial differential equations, with the end points of $J$ as independent variables, for the corresponding Fredholm determinant.

## 4. Universality of the Tracy–Widom distributions

A remarkable fact is that the Tracy–Widom distribution functions arise in a wide range of problems well beyond the Gaussian matrix ensembles for which they were originally derived. This phenomenon is known as *universality*.

### 4.1. General invariant ensembles and the Wigner ensembles

The GUE, GOE, and GSE ensembles are, respectively, particular examples of unitary (UE), orthogonal (OE), and symplectic (SE) ensembles of random matrix theory. The joint probability density of the eigenvalues for these ensembles is given by (cf. Equation 3.1)

$$P_{N\beta}(\lambda_{1},\dots,\lambda_{N})=\frac{1}{Z_{N\beta}}\prod_{1\le j<k\le N}|\lambda_{j}-\lambda_{k}|^{\beta}\prod_{j=1}^{N}e^{-\frac{\beta}{2}V(\lambda_{j})},\tag{4.1}$$

where $V(\lambda)$ is a polynomial of even degree with positive leading coefficient. (More general potential functions are also allowed.) GUE, GOE, and GSE correspond to the choice $V(\lambda)=\lambda^{2}$. *Edge-universality* states that for a given polynomial potential $V$ there are $N$-dependent centering and scaling constants $e_{N}$ and $\gamma_{N}>0$ such that

$$\lim_{N\to\infty}\mathbb{P}\Bigl(\gamma_{N}\bigl(\lambda_{\max}-e_{N}\bigr)\le s\Bigr)=F_{\beta}(s),\qquad\beta=1,2,4,$$

where the limits are *exactly* the same Tracy–Widom distribution functions as in Equation 3.2.

Complementary to the invariant ensembles (UE, OE, and SE) are the so-called *Wigner matrix ensembles*, i.e., random matrices with independent identically distributed entries. Tracy–Widom edge universality for these ensembles is also valid, and this important fact was proved by Soshnikov Reference Sos. In addition, universality in the bulk of the spectrum for the Wigner ensembles is now established; see, e.g., Reference Erd for a comprehensive survey of the results and a historical review.
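Edge universality is easy to observe in simulation. The following sketch is ours (the GUE normalization, with off-diagonal entries of mean-square 1 and spectral edge near $2\sqrt{N}$, as well as the sample sizes, are our choices; `numpy` is assumed):

```python
import numpy as np

def gue_largest_eigenvalue(N, rng):
    """Largest eigenvalue of an N x N GUE matrix with E|H_jk|^2 = 1 off the
    diagonal; the spectrum fills (-2 sqrt(N), 2 sqrt(N)) as N grows."""
    A = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
    H = (A + A.conj().T) / 2.0
    return np.linalg.eigvalsh(H)[-1]

rng = np.random.default_rng(0)
N = 400
samples = np.array([gue_largest_eigenvalue(N, rng) for _ in range(8)])
# Center and scale as in the edge scaling limit: the scaled values fluctuate
# on the N^(-1/6) scale around 2 sqrt(N), approximately following F_2.
scaled = (samples - 2.0 * np.sqrt(N)) * N ** (1.0 / 6.0)
```

With many more samples, a histogram of `scaled` approaches the Tracy–Widom density $F_{2}'(s)$, whose mass is concentrated roughly on $(-5,2)$.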

The Tracy–Widom distribution appears also in *Hermitian matrix models with varying weights*. These are unitary ensembles with the potential $V(\lambda)$ replaced by $NV(\lambda)$, so that the eigenvalue density is proportional to $\prod_{j<k}|\lambda_{j}-\lambda_{k}|^{2}\prod_{j}e^{-NV(\lambda_{j})}$.

Under certain regularity conditions on $V$, the leading large $N$ asymptotics of the partition function $Z_{N}$ has the form

$$Z_{N}=e^{-N^{2}F+o(N^{2})},$$

where $F$ is the *free energy* of the model, and it is given explicitly in terms of the equilibrium measure corresponding to the potential $V$.

The formula for the free energy reads

$$F=\iint\log|x-y|^{-1}\,d\mu_{V}(x)\,d\mu_{V}(y)+\int V(x)\,d\mu_{V}(x),$$

where $\mu_{V}$ is the *equilibrium measure*, i.e., the minimizer of this logarithmic-energy functional over all probability measures on $\mathbb{R}$.
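For the quadratic potential $V(x)=x^{2}/2$, the equilibrium measure is the semicircle law $d\mu_{V}=\frac{1}{2\pi}\sqrt{4-x^{2}}\,dx$ on $[-2,2]$, whose logarithmic potential satisfies $\int\log|x-y|\,d\mu_{V}(y)=\frac{x^{2}}{4}-\frac{1}{2}$ on the support (the Euler–Lagrange condition for the minimization); substituting into the energy functional gives $F=\frac{3}{4}$ in this convention. The sketch below (the midpoint discretization and tolerances are our choices) checks the potential identity numerically:

```python
import numpy as np

def semicircle_log_potential(x, n=200_000):
    """Midpoint-rule approximation of  int log|x - y| rho(y) dy  for the
    semicircle density rho(y) = sqrt(4 - y^2) / (2 pi) on [-2, 2]."""
    edges = np.linspace(-2.0, 2.0, n + 1)
    y = 0.5 * (edges[:-1] + edges[1:])       # midpoints never coincide with x
    h = 4.0 / n
    rho = np.sqrt(4.0 - y ** 2) / (2.0 * np.pi)
    return float(np.sum(np.log(np.abs(x - y)) * rho) * h)
```

On the support the output should match $x^{2}/4-1/2$; for instance, the value at $x=0$ is $-1/2$.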

As shown in Reference DKM, in the case that the potential $V$ is real analytic, the equilibrium measure $\mu_{V}$ is supported on a finite union of intervals, and, generically, its density vanishes like a square root at the end points of the support. The number of intervals in the support of the equilibrium measure depends on the values of the parameters of the potential.

For

As