Abstract
The conventional mean-field theory of random-field systems is found to be in error for large but finite dimensionalities d. At larger applied fields, a separate glassy phase appears in the phase diagram of the bond-diluted antiferromagnet. For large d, the diluted antiferromagnet can be mapped onto an antiferromagnet with Gaussian randomness, and the phase diagram of this model is shown to have a de Almeida–Thouless line, marking the onset of the glassy phase through replica symmetry breaking. In the limit d→∞ the conventional mean-field theory is recovered.
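For orientation only: the de Almeida–Thouless line referred to above is, in its standard mean-field (Sherrington–Kirkpatrick) form, the locus where the replica-symmetric solution in a field h becomes unstable. The sketch below gives that generic condition as a reminder; it is not reproduced from this paper, and the symbols (J, q, h) follow the usual SK conventions rather than the model defined in the text.

% Generic de Almeida--Thouless stability condition for a mean-field
% spin glass in a uniform field h (illustrative reminder, not this
% paper's equation):
\begin{equation}
  \left(\frac{k_{B} T}{J}\right)^{2}
  = \int \frac{dz}{\sqrt{2\pi}}\, e^{-z^{2}/2}\,
    \operatorname{sech}^{4}\!\bigl[\beta\,(J\sqrt{q}\,z + h)\bigr]
\end{equation}
% Here q is the replica-symmetric Edwards--Anderson order parameter;
% replica symmetry is broken (glassy phase) on the low-temperature
% side of this line.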
Received 5 January 1987
DOI: https://doi.org/10.1103/PhysRevB.35.7267
©1987 American Physical Society