Abstract
Subsymbolic systems have been used successfully to model several aspects of human language processing. Such parsers are appealing because they allow the interpretation to be revised as words are processed incrementally. Yet it has proven very hard to scale them up to realistic language because of training time, limited memory, and the difficulty of representing linguistic structure. In this study, we show that it is possible to keep track of long-distance dependencies and to parse into deeper structures than before based on two techniques: a localist encoding of the input sequence and a dynamic unrolling of the network according to the parse tree. With these techniques, the system can nonmonotonically parse a corpus of realistic sentences into parse trees labelled with grammatical tags from a broad-coverage Head-driven Phrase Structure Grammar of English.
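The two techniques named above can be illustrated with a minimal sketch. The vocabulary, tree format, and weights below are invented for illustration; this is not the authors' actual architecture, only a toy showing (a) a localist (one-hot) encoding, where each word activates exactly one input unit, and (b) dynamic unrolling, where one shared network is applied recursively, once per constituent, following the shape of the parse tree.

```python
# Illustrative sketch only: vocabulary, tree, and weights are invented.
import math
import random

random.seed(0)

VOCAB = ["the", "dog", "chased", "a", "cat"]

def localist(word):
    """Localist (one-hot) encoding: one unit per vocabulary item."""
    vec = [0.0] * len(VOCAB)
    vec[VOCAB.index(word)] = 1.0
    return vec

HIDDEN = len(VOCAB)  # leaf and internal representations share one width

# A single weight matrix reused at every tree node: the network is
# "unrolled" dynamically, one application per constituent of the parse.
W = [[random.uniform(-0.5, 0.5) for _ in range(2 * HIDDEN)]
     for _ in range(HIDDEN)]

def encode(tree):
    """Recursively encode a binary parse tree into a fixed-width vector."""
    if isinstance(tree, str):           # leaf: a word
        return localist(tree)
    left, right = map(encode, tree)     # internal node: combine children
    x = left + right                    # concatenate child vectors
    return [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in W]

# (S (NP the dog) (VP chased (NP a cat)))
tree = (("the", "dog"), ("chased", ("a", "cat")))
vec = encode(tree)
print(len(vec))  # fixed-width representation, regardless of tree depth
```

Note the design point this is meant to convey: because the same weights are reapplied at every node, the unrolled network's depth adapts to each sentence's parse tree while the representation stays fixed-width.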
Cite this article
Mayberry, M.R., Miikkulainen, R. Broad-Coverage Parsing with Neural Networks. Neural Process Lett 21, 121–132 (2005). https://doi.org/10.1007/s11063-004-3423-4