Abstract
The main computational work in interior-point methods for linear programming (LP) is to solve a least-squares problem. The normal equations are often used, but if the LP constraint matrix contains a nearly dense column, the normal-equations matrix will be nearly dense. Assuming that the nondense part of the constraint matrix has full rank, the Schur complement can be used to handle dense columns. In this article we propose a modified Schur-complement method that relaxes this assumption. Encouraging numerical results are presented.
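The classical Schur-complement trick that the abstract builds on can be sketched as follows. This is a minimal dense-NumPy illustration of the standard technique (via the Sherman-Morrison-Woodbury identity), not the paper's modified method; all function and variable names are hypothetical, and it retains the full-rank assumption on the sparse part that the paper relaxes.

```python
import numpy as np

def solve_with_schur(A_s, A_d, d_s, d_d, b):
    """Solve (A_s D_s A_s^T + A_d D_d A_d^T) x = b using a Schur complement.

    A_s : the sparse part of the constraint matrix (m x n_s)
    A_d : the dense columns (m x n_d)
    d_s, d_d : positive diagonal scaling vectors (D_s, D_d)
    Assumes A_s has full row rank, so M_s = A_s D_s A_s^T is nonsingular;
    this is exactly the assumption the paper's modified method removes.
    """
    # Sparse-part normal matrix; in a real IPM code this would be
    # factored once per iteration by a sparse Cholesky factorization.
    M_s = (A_s * d_s) @ A_s.T
    Y = np.linalg.solve(M_s, A_d)        # M_s^{-1} A_d
    y = np.linalg.solve(M_s, b)          # M_s^{-1} b
    # Small (n_d x n_d) Schur complement for the dense columns.
    S = np.diag(1.0 / d_d) + A_d.T @ Y
    # Sherman-Morrison-Woodbury update of the sparse solve.
    return y - Y @ np.linalg.solve(S, A_d.T @ y)
```

Because only `M_s` is factored, the sparsity of the normal equations is preserved; the dense columns enter only through the small `n_d x n_d` system `S`.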