
Accidental nuclear war: the contribution of artificial intelligence

  • Viewpoint
  • Published in: Artificial Intelligence Review 1 (1987)

Abstract

The AI community is seriously considering what widespread military sponsorship would do to the prospect of carrying out basic research without, at the same time, putting the whole of our planet's population at risk. The SDI proposal of a defence shield that completely protects a nation from offensive missiles poses many technical questions concerning the reliability of the computers needed to control it. It has been argued that the complexity of such programs, and their sensitivity to the context of application, make their correct construction extremely difficult.

We examine the options which face AI researchers. Many have accepted that military money is necessary for the survival of the research community and that military intentions are an unavoidable evil. Others have decided to accept military money only if it does not involve the development, production or use of weapons of mass destruction. A third group goes further still and will not accept any form of military funding.

We do not subscribe to this last position, opting instead for the intermediate view. We feel that some aspects of work in AI can, perhaps, improve our understanding of the nature of accidents which occur as a result of interaction between humans and complex technological systems. Research in these areas could therefore have a positive effect in reducing the likelihood of a computer-generated Armageddon. The military should, therefore, be more far-sighted and support basic scientific research in AI.

More information on these issues can be obtained from Computer Professionals for Social Responsibility, P.O. Box 717, Palo Alto, CA 94301, USA and Computing and Social Responsibility, c/o Jane Hesketh, 3 Buccleuch Terrace, Edinburgh EH8 9NB, UK.




Cite this article

Yazdani, M., Whitby, B. Accidental nuclear war: the contribution of artificial intelligence. Artif Intell Rev 1, 221–227 (1987). https://doi.org/10.1007/BF00142294
