A Survey of Tables of Probability Distributions

This article is a survey of the tables of probability distributions published at about the time of, or after, the 1964 publication of the Handbook of Mathematical Functions, edited by Abramowitz and Stegun.


Introduction
Probabilities and percentiles of statistical probability distributions have historically been cited from reference tables published in books, journals, and other publications. Reference tables of probability distributions continued to be published from the 1920s through the 1980s and early 1990s. Some tables superseded their earlier counterparts. Abramowitz and Stegun [1] surveyed the tables published before 1964, and reproduced some of them. In particular, Abramowitz and Stegun [1] reproduced the tables of percentiles of chi-square, t-, and F-distributions from the 1954 edition of Pearson and Hartley [2]. Other collections of tables of probability distributions include Greenwood and Hartley [3] and Owen [4].
This article is a survey of the tables published about or after 1964. A few earlier tables are also mentioned when appropriate. Most of the tables abstracted in this article are referenced in Pearson and Hartley [2], Pearson and Hartley [5], Johnson, Kotz, and Kemp [6], Johnson, Kotz, and Balakrishnan [7], Johnson, Kotz, and Balakrishnan [8], Johnson, Kotz, and Balakrishnan [9], and Kotz, Balakrishnan, and Johnson [10]. The abstracts presented here have been verified from the original sources, and in some cases corrections and additions were made. The next three sections contain the abstracts for discrete univariate, continuous univariate, and multivariate probability distributions.
A random variable is denoted by X, and x denotes a particular value of X. The cumulative distribution function of X is F(x) = Pr{X ≤ x}. The survival function of X is F̄(x) = 1 − F(x) = Pr{X > x}. For a discrete random variable, f(x), interpreted as Pr{X = x}, is the probability mass function (pmf). For a continuous random variable, f(x), interpreted as dF(x)/dx, is the probability density function (pdf). A particular value x is the rth quantile of X when F(x) = r, for 0 ≤ r ≤ 1; the rth quantile is commonly referred to as the r × 100th percentile of X. The expected value (mean) and the variance of X are denoted by E(X) and V(X), respectively. The abbreviation nD, for an integer n, denotes n decimal places. An expression such as 0.01(0.02)0.09 denotes the sequence of numbers from 0.01 to 0.09 increasing in steps of 0.02. Log denotes the natural logarithm unless indicated otherwise.
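The range notation just defined can be expanded mechanically when checking a grid of arguments against a table. A minimal Python sketch (the helper name `expand_steps` is ours, not from the sources surveyed):

```python
def expand_steps(start, step, stop):
    """Expand the table-range notation start(step)stop, e.g. 0.01(0.02)0.09,
    into an explicit list of values.

    The number of steps is computed as an integer first, so repeated
    floating-point addition does not drift past the endpoint."""
    n = round((stop - start) / step)
    return [round(start + i * step, 10) for i in range(n + 1)]

# 0.01(0.02)0.09 -> [0.01, 0.03, 0.05, 0.07, 0.09]
print(expand_steps(0.01, 0.02, 0.09))
```

The same call handles integer grids such as 50(5)100 from the order-statistics tables cited later.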

Journal of Research of the National Institute of Standards and Technology, Volume 110, Number 1, January-February 2005

Standardized Stable Distributions
The pdfs of standardized stable distributions are unimodal with shape depending on the parameters β and α. Although the pdfs are rather complicated, they can be expressed as convergent series (Johnson, Kotz, and Balakrishnan [7]).

Chi-Square Distribution
The pdf is f(x) = x^(ν/2 − 1) e^(−x/2) / [2^(ν/2) Γ(ν/2)] for x > 0 and degrees of freedom ν > 0. If X1, X2, . . ., Xν have independent standard normal distributions, then X = X1^2 + X2^2 + . . . + Xν^2 has a chi-square distribution with ν degrees of freedom.
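Chi-square table entries can now be reproduced directly. The sketch below evaluates the cdf by the standard power series for the regularized lower incomplete gamma function, F(x; ν) = P(ν/2, x/2), using only the Python standard library (the function name `chi2_cdf` is ours):

```python
import math

def chi2_cdf(x, nu, tol=1e-12):
    """Chi-square CDF F(x; nu) = P(nu/2, x/2), where P is the regularized
    lower incomplete gamma function, evaluated by its power series.
    Adequate for moderate x."""
    if x <= 0:
        return 0.0
    a, y = nu / 2.0, x / 2.0
    # Leading term y^a e^{-y} / Gamma(a+1), computed on the log scale
    term = math.exp(a * math.log(y) - y - math.lgamma(a + 1.0))
    total = term
    n = 0
    while term > tol * total:
        n += 1
        term *= y / (a + n)  # next series term: multiply by y / (a + n)
        total += term
    return total

# For nu = 2 the chi-square CDF reduces to 1 - exp(-x/2)
print(round(chi2_cdf(2.0, 2), 6))  # -> 0.632121
```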

Standardized Weibull Distribution
The pdf is f(x) = γ x^(γ − 1) exp(−x^γ) for x > 0 and γ > 0, where γ is the shape parameter.
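The standardized Weibull pdf and cdf are elementary to evaluate; a small Python illustration (function names are ours):

```python
import math

def weibull_pdf(x, gamma):
    """Standardized Weibull pdf: f(x) = gamma * x**(gamma-1) * exp(-x**gamma), x > 0."""
    return gamma * x ** (gamma - 1.0) * math.exp(-(x ** gamma)) if x > 0 else 0.0

def weibull_cdf(x, gamma):
    """Standardized Weibull cdf: F(x) = 1 - exp(-x**gamma), x > 0."""
    return 1.0 - math.exp(-(x ** gamma)) if x > 0 else 0.0

# gamma = 1 reduces to the standard exponential distribution
print(round(weibull_cdf(1.0, 1.0), 6))  # -> 0.632121
```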

Standardized Extreme Value Distribution -Type 1
The pdf is f(x) = exp(−x − e^(−x)) for −∞ < x < ∞, with cdf F(x) = exp(−e^(−x)). Gumbel [43] tabulated f(x) and F(x), to 7D, over a grid of values of x. White [44] tabulated, to 7D, the means and variances of all order statistics for sample sizes 1(1)50 and 50(5)100. Extended tables of the means, variances, and covariances of all order statistics for sample sizes up to 30 have been provided by Balakrishnan and Chan [45] and Balakrishnan and Chan [46].
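Gumbel's tabulated functions are easy to spot-check today; for instance (function names are ours):

```python
import math

def gumbel_pdf(x):
    """Type 1 extreme value (Gumbel) pdf: f(x) = exp(-x) * exp(-exp(-x))."""
    return math.exp(-x - math.exp(-x))

def gumbel_cdf(x):
    """Type 1 extreme value (Gumbel) cdf: F(x) = exp(-exp(-x))."""
    return math.exp(-math.exp(-x))

# At the mode x = 0, both f(0) and F(0) equal exp(-1)
print(round(gumbel_cdf(0.0), 6))  # -> 0.367879
```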

Beta Distribution
The pdf is f(x) = x^(a − 1) (1 − x)^(b − 1) / B(a, b) for 0 < x < 1, where a > 0 and b > 0 are shape parameters and B(a, b) is the beta function. Harter [33] tabulated, to 7D, quantiles x such that F(x) = p for a = 1 (1)
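For integer shape parameters the beta cdf reduces to a finite binomial sum, which gives an exact, table-free check. A Python sketch under that integer-parameter assumption (the function name is ours):

```python
from math import comb

def beta_cdf_int(x, a, b):
    """Beta CDF F(x) = I_x(a, b) for integer shapes a, b >= 1, via the identity
    I_x(a, b) = sum_{j=a}^{a+b-1} C(a+b-1, j) * x**j * (1-x)**(a+b-1-j)."""
    n = a + b - 1
    return sum(comb(n, j) * x ** j * (1.0 - x) ** (n - j) for j in range(a, n + 1))

# a = b = 1 is the uniform distribution on (0, 1)
print(beta_cdf_int(0.3, 1, 1))  # -> 0.3
```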

F-Distribution
If X1 and X2 have independent chi-square distributions with degrees of freedom ν1 and ν2, respectively, then F = (X1/ν1)/(X2/ν2) has an F-distribution with ν1 (numerator) and ν2 (denominator) degrees of freedom.
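The F cdf follows from the incomplete beta function via F(x; ν1, ν2) = I_y(ν1/2, ν2/2) with y = ν1 x/(ν1 x + ν2). The sketch below restricts to even degrees of freedom so the incomplete beta is a finite binomial sum (the function name is ours):

```python
from math import comb

def f_cdf_even(x, nu1, nu2):
    """F-distribution CDF via F(x; nu1, nu2) = I_y(nu1/2, nu2/2),
    y = nu1*x / (nu1*x + nu2). Restricted to even nu1, nu2 so the
    incomplete beta reduces to a finite binomial sum."""
    assert nu1 % 2 == 0 and nu2 % 2 == 0, "even degrees of freedom only"
    a, b = nu1 // 2, nu2 // 2
    y = nu1 * x / (nu1 * x + nu2)
    n = a + b - 1
    return sum(comb(n, j) * y ** j * (1.0 - y) ** (n - j) for j in range(a, n + 1))

# nu1 = nu2 = 2: the ratio of two identically distributed variables, so F(1) = 0.5
print(round(f_cdf_even(1.0, 2, 2), 6))  # -> 0.5
```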
Pearson and Hartley [5] tabulated, to five significant digits, quantiles x such that F(x) = p for p = 0.

t-Distribution
If X1 has the standard normal distribution and X2 has an independent chi-square distribution with ν degrees of freedom, then T = X1/√(X2/ν) has a Student's t-distribution with ν degrees of freedom.
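For ν = 1 (the Cauchy case) and ν = 2 the t cdf has simple closed forms, which are handy for spot-checking table entries (the function name is ours):

```python
import math

def t_cdf(x, nu):
    """Student's t CDF for the two closed-form cases:
    nu = 1 (Cauchy): F(x) = 1/2 + arctan(x)/pi
    nu = 2:          F(x) = 1/2 + x / (2 * sqrt(2 + x^2))"""
    if nu == 1:
        return 0.5 + math.atan(x) / math.pi
    if nu == 2:
        return 0.5 + x / (2.0 * math.sqrt(2.0 + x * x))
    raise ValueError("closed form given only for nu = 1, 2")

# Cauchy case: F(1) = 1/2 + (pi/4)/pi = 0.75
print(round(t_cdf(1.0, 1), 6))  # -> 0.75
```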

Noncentral Chi Distribution
If X1 has a noncentral chi-square distribution, then the distribution of X = √X1 is referred to as the noncentral chi distribution.
Johnson and Pearson [55] tabulated, to four significant digits, quantiles x of chi distribution such that

Noncentral F-Distribution
If X1 has a noncentral chi-square distribution with ν1 degrees of freedom and noncentrality parameter λ, X2 has a chi-square distribution with ν2 degrees of freedom, and X1 and X2 are independently distributed, then F = (X1/ν1)/(X2/ν2) has a noncentral F-distribution with ν1 and ν2 degrees of freedom and noncentrality parameter λ.
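This construction can be simulated directly, giving a rough cross-check on tabulated noncentral F values. A Monte Carlo sketch in Python (names, seed, and sample sizes are ours):

```python
import random

def sample_noncentral_f(nu1, nu2, lam, rng):
    """One draw from the construction in the text: X1 is noncentral
    chi-square(nu1, lam), built as a sum of squared normals with the
    noncentrality placed in the first mean; X2 is central chi-square(nu2);
    F = (X1/nu1) / (X2/nu2)."""
    x1 = rng.gauss(lam ** 0.5, 1.0) ** 2
    x1 += sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(nu1 - 1))
    x2 = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(nu2))
    return (x1 / nu1) / (x2 / nu2)

rng = random.Random(12345)
draws = [sample_noncentral_f(3, 10, 2.0, rng) for _ in range(20000)]
mean = sum(draws) / len(draws)
# E(F) = nu2 * (nu1 + lam) / (nu1 * (nu2 - 2)) = 25/12 for these values
print(mean)
```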

Doubly Noncentral F-Distribution
If X1 has a noncentral chi-square distribution with ν1 degrees of freedom and noncentrality parameter λ1, X2 has a noncentral chi-square distribution with ν2 degrees of freedom and noncentrality parameter λ2, and X1 and X2 are independently distributed, then F = (X1/ν1)/(X2/ν2) has a doubly noncentral F-distribution with ν1 and ν2 degrees of freedom, and noncentrality parameters λ1 and λ2.

Noncentral t-Distribution
If X1 has the standard normal distribution and X2 has an independent chi-square distribution with ν degrees of freedom, then T = (X1 + δ)/√(X2/ν) has a noncentral t-distribution with ν degrees of freedom and noncentrality parameter δ.
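The construction can again be checked by simulation: since the denominator is positive, Pr{T ≤ 0} = Pr{X1 ≤ −δ} = Φ(−δ). A sketch (names, seed, and sample sizes are ours):

```python
import random

def sample_noncentral_t(nu, delta, rng):
    """One draw of T = (X1 + delta) / sqrt(X2/nu), with X1 standard normal
    and X2 an independent central chi-square with nu degrees of freedom."""
    x1 = rng.gauss(0.0, 1.0)
    x2 = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(nu))
    return (x1 + delta) / (x2 / nu) ** 0.5

rng = random.Random(2005)
draws = [sample_noncentral_t(10, 1.5, rng) for _ in range(20000)]
# The sign of T is the sign of X1 + delta, so the fraction of draws <= 0
# should be near Phi(-1.5), about 0.0668
frac_neg = sum(d <= 0 for d in draws) / len(draws)
print(frac_neg)
```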

Doubly Noncentral t-Distribution
If X1 has the standard normal distribution and X2 has an independent noncentral chi-square distribution with ν degrees of freedom and noncentrality parameter λ, then T = (X1 + δ)/√(X2/ν) has a doubly noncentral t-distribution with ν degrees of freedom, numerator noncentrality parameter δ, and denominator noncentrality parameter λ.

Distribution of the Sample Correlation Coefficient From Bivariate Normal Distribution
Suppose (Yi, Zi), for i = 1, 2, . . ., n, are independently distributed and have a common joint bivariate normal distribution with correlation coefficient ρ. Then the sample correlation coefficient r = Σ(Yi − Ȳ)(Zi − Z̄) / [Σ(Yi − Ȳ)² Σ(Zi − Z̄)²]^(1/2) has a distribution that depends only on the correlation coefficient ρ and the sample size n.
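The sample correlation coefficient itself is straightforward to compute; a minimal Python version (the function name is ours):

```python
import math

def sample_corr(y, z):
    """Pearson sample correlation coefficient r for paired observations:
    r = sum((yi - ybar)(zi - zbar)) / sqrt(sum((yi - ybar)^2) * sum((zi - zbar)^2))"""
    n = len(y)
    ybar = sum(y) / n
    zbar = sum(z) / n
    syz = sum((yi - ybar) * (zi - zbar) for yi, zi in zip(y, z))
    syy = sum((yi - ybar) ** 2 for yi in y)
    szz = sum((zi - zbar) ** 2 for zi in z)
    return syz / math.sqrt(syy * szz)

# A perfect linear relationship gives r = 1
print(sample_corr([1, 2, 3, 4], [2, 4, 6, 8]))  # -> 1.0
```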

Distribution of the Sample Multiple Correlation Coefficient From Multivariate Normal Distribution
If the random variables X1, . . ., XM have a joint multivariate normal distribution, then the smallest mean squared error linear predictor of X1 is the conditional expected value E(X1 | x2, . . ., xM). The multiple correlation coefficient R is the correlation between X1 and its smallest mean squared error linear predictor. The distribution of the sample multiple correlation coefficient r depends only on the population coefficient R, the number of variates M, and the sample size N.
Tong [62] tabulated equicoordinate one-sided and two-sided percentage points, to 4D, and probability integrals p, to 5D. The table of

Distribution of Wilks's Likelihood Ratio Test Statistic
Schatzoff [72], Pillai and Gupta [73], Lee [74], and Davis [75] tabulate multiplying factors C for obtaining upper percentage points of the distribution of Wilks's likelihood ratio test statistic −[n − p − (1/2)(m − r + 1)] log W from the percentage points of the chi-square distribution for multivariate analysis of variance. Muirhead [76] has consolidated these into one large table. Here, n is the number of multivariate measurements, p is the number of regression parameter vectors, n − p is the error degrees of freedom, m is the dimension of the multivariate measurements, and r is the degrees of freedom of the general linear hypothesis. Factors for the upper α × 100 percent points are tabulated for α = 0.100, 0.050, 0.025, and 0.005. The chi-square distribution has mr degrees of freedom. The degrees of freedom n − p − m + 1 equal 1(1)10, 10(2)20, 24, 30, 40, 60, 120, and ∞. Pairs (m, r) are such that m = 3(1)10, 12, and r ≥ m, where r is up to 22 for m = 3 and 4, up to 20 for m = 5, 6, and 7, and up to 18, 16, and 14 for m = 8, 9, and 10, respectively. Pairs (m, r) = (6, 11), (6, 13), and (10, 13) are excluded. For r ≤ m, make the substitutions m → r, r → m, and n − p → n + r − p − m.
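Computing the statistic itself is simple arithmetic once W, n, p, m, and r are in hand; the numerical values below are hypothetical, chosen purely for illustration:

```python
import math

def wilks_statistic(W, n, p, m, r):
    """Bartlett-type statistic from the text:
    -[n - p - (1/2)(m - r + 1)] * log(W),
    which (after the tabulated multiplying factor C) is referred to the
    chi-square distribution with m*r degrees of freedom."""
    return -(n - p - 0.5 * (m - r + 1)) * math.log(W)

# Hypothetical values for illustration: W = 0.5, n = 30, p = 4, m = 3, r = 2
# gives -(26 - 1) * log(0.5) = 25 * log 2
print(round(wilks_statistic(0.5, 30, 4, 3, 2), 4))  # -> 17.3287
```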

Zonal Polynomials
Probability density functions and moments of many multivariate distributions can be evaluated using zonal polynomials. Parkhurst and James [79] tabulate zonal polynomials of order 1 through 12 in terms of sums of powers and in terms of elementary symmetric functions.

Distributions of the Largest and Smallest Eigenvalues of Matrices of Sample Quantities
Heck [80] charts some upper percentage points of the distribution of the largest eigenvalue of certain matrices of sample quantities from multivariate normal distribution. Edelman [81] tabulates expected values of the smallest eigenvalue of random matrices of Wishart type.