Reconsidering Written Language

A number of elite thinkers in Europe during the 16th and 17th centuries pursued an agenda which historian Paolo Rossi calls the "quest for a universal language," a quest which was deeply interwoven with the emergence of the scientific method. From a modern perspective, one of the many surprising aspects of these efforts is that they relied on a diverse array of memorization techniques as foundational elements. In the case of Leibniz's universal calculus, the ultimate vision was to create a pictorial language that could be learned by anyone in a matter of weeks and which would contain within it a symbolic representation of all domains of contemporary thought, ranging from the natural sciences, to theology, to law. In this brief article, I explore why this agenda might have been appealing to thinkers of this era by examining ancient and modern memory feats. As a thought experiment, I suggest that a society built entirely upon memorization might be less limited than we might otherwise imagine, and furthermore, that cultural norms discouraging the use of written language might have had implications for the development of scientific methodology. Viewed in this light, the efforts of Leibniz and others seem significantly less surprising. I close with some general observations about the cross-cultural origins of scientific thought.

Was it inevitable that the scientific revolution took place when it did? Of the many complex and interrelated intellectual, cultural, and political events that conspired to give rise to such profound developments, is it possible to identify certain factors that would have precipitated institutional changes comparable in magnitude had they been adopted at a different place and time in human history?
The value of asking such a question is that it might help to shed some light on a range of contemporary scientific issues. We are currently in the midst of a profound period of scientific growth, with an explosion in the number of highly educated scientists and the emergence of multiple new interdisciplinary areas of study in which the basic paradigms of scientific research are being re-examined in light of novel computational techniques and data-driven methods for hypothesis generation and experimental design. In addition, we are witnessing rapid growth in the number of avenues for communicating scientific results via the Internet, in the form of new online-only journals, blogs, and highly domain-specific question-answering sites.1 During such a transformative period of growth, examining the history of the scientific method and asking counterfactual questions about the emergence of scientific principles may help to develop our intuition and aid our understanding of large shifts in institutional culture that take place over periods of decades and generations.
In a previous article,2 I argued that the growth of the scientific method in 17th century Europe took place amidst an unusual cultural milieu in which schools of thought on memory and memorization were at the forefront of intellectual developments. During this time period, memory was viewed not merely as a set of practices for remembering, but as a foundational methodology for structuring knowledge.
The core premise of this argument is that the development of the scientific method was not a discrete event. Principled reasoning and systematic investigation have always been part of human society, but during this time period, they became distilled and adopted at an institutional level. How did this transformation take place, and why had it not taken place before? In a sense, what constitutes scientific reasoning is a complex and rich set of cultural norms which, at its core, is rather mundane. By nature, people procrastinate; by nature, people cut corners; and by nature, people are not systematic. Scientific reasoning might be described as a collection of systematic efforts conducted in a context that prioritizes an attempt to uncover the basic principles governing the behavior of the natural world. How, then, did the adoption of a systematic approach to knowledge accelerate during the time period traditionally associated with the scientific revolution?
In my previous essay, I argued that the art of memory provided an inspiring vision of what could be accomplished with a systematic approach to knowledge. While the basic notions of scientific reasoning may not be abstract, they are not necessarily easy to put into practice, and furthermore, it is not always clear what precise steps would even constitute reasonable next actions for an institution attempting to transform itself into a more rigorous, long-term research establishment. In the absence of a critical mass of scientific accomplishments, major intellectual shifts would also have been difficult to justify. The notion of a journal system and peer review seems elementary to us, but these ideas are far from obvious and required real momentum to become part of the bedrock of institutional practice. At this critical juncture in human history, the art of memory provided a clear, tangible vision and concrete motivation for widespread adoption of a more systematic approach to knowledge. In a sense, the most ambitious paradigms of this time period did not materialize: it is difficult to imagine that Camillo's memory theater, Bruno's wheels, or Leibniz' universal calculus would have actually worked. But these efforts were absolutely inspiring, and Bacon's writing on the scientific method capitalized on this inspiration. The art of memory provided the vision that catalyzed the adoption of scientific reasoning on a grand scale, and allowed the more standard and widely discussed factors (patronage, the rise of a journal system, an explicitly articulated notion of hypothesis-driven investigation, and so on) to take root at an institutional level.
In modern times, there is a standard refrain about the role of memory and memorization in previous eras: namely, that in the absence of convenient methods for writing, a trained memory would have been practically significant in ways that it is not today. But something noteworthy about the European schools of thought on memory is that they continued to exist several centuries after the development of the printing press. In examining the culture of this time period, it is clear that memory was viewed in a very different light than by thinkers of our own era. Most broadly, memory was thought of by 17th century intellectuals as a foundational methodology for structuring knowledge; furthermore, a few key individuals, notably Gottfried Leibniz, Giordano Bruno, and René Descartes, conceived of memory as being intimately related to developing symbolic means for representing scientific concepts. For Leibniz, the mnemonic method was a critical element of his vision to develop a universal calculus, a symbolic language that would represent the entirety of human knowledge and eliminate logical contradictions from it.
If we accept the premise that these developments were central to the development of the scientific method, and not merely incidental, we arrive at a rather curious conclusion with regard to the question of whether a scientific revolution might have taken place at a much earlier time in human history. In particular, it seems as though, around the time when written language came into use, strong cultural norms discouraging its use might have precipitated developments analogous to those that took place in 17th century Europe. Whereas several millennia passed between the earliest use of mnemonic techniques in ancient Greece and their flowering in post-Renaissance Europe, a society which resisted the adoption of written language might have seen a comparable developmental trajectory compressed, out of necessity, into a much shorter time period. The result might have been an intellectual explosion, a scientific revolution that is, but of a very different kind than what ultimately came to pass in Europe during the 17th century.
But rather than being merely a counterfactual historical curiosity, there is reason to believe that such a set of cultural norms might actually have emerged. The critical observation is that written language, unlike its spoken counterpart, was an invention, and like all inventions, would have been met with some amount of resistance. Some of the animosity towards the written word came from highly informed, intellectual, and influential leaders, and had this resistance achieved a critical mass, the adoption of written language might have slowed considerably. The following beautiful passage, spoken by Socrates in Plato's Phaedrus, illustrates this point:

I heard, then, that at Naucratis, in Egypt, was one of the ancient gods of that country, the one whose sacred bird is called the ibis, and the name of the god himself was Theuth. He it was who invented numbers and arithmetic and geometry and astronomy, also draughts and dice, and, most important of all, letters. Now the king of all Egypt at that time was the god Thamus, who lived in a great city of the upper region, which the Greeks call the Egyptian Thebes, and they call the god himself Ammon. To him came Theuth to show his inventions, saying that they ought to be imparted to the other Egyptians. But Thamus asked what use there was in each, and as Theuth enumerated their uses, expressed praise or blame of the various arts which it would take too long to repeat; but when they came to letters, 'This invention, O king,' said Theuth, 'will make the Egyptians wiser and will improve their memories; for it is an elixir of memory and wisdom that I have discovered.' But Thamus replied, 'Most ingenious Theuth, one man has the ability to beget arts, but the ability to judge of their usefulness or harmfulness to their users belongs to another; and now you, who are the father of letters, have been led by your affection to ascribe to them a power the opposite of that which they really possess.
For this invention will produce forgetfulness in the minds of those who learn to use it, because they will not practise their memory. Their trust in writing, produced by external characters which are not part of themselves, will discourage the use of their own memory within them. You have invented an elixir not of memory but of reminding; and you offer your pupils the appearance of wisdom, not true wisdom, for they will read many things without instruction and will therefore seem to know many things, when they are for the most part ignorant and hard to get along with, since they are not wise, but only appear wise.'3

If skepticism like that of influential leaders such as King Thamus had been more widespread, and if opposition to written language had reached a critical mass, what might the resulting society have looked like? We would have to imagine that there would be a division of labor entirely devoted to the maintenance of different types of knowledge. Differences in memory that might appear extremely subtle to us would be brought to the forefront. There might be groups of people responsible for maintaining long-term knowledge (say, related to agriculture and medicine, or even literature) and others responsible for knowledge that is overturned more quickly, for example, the inventory of vendors in a marketplace. Furthermore, it seems quite likely that the need to maintain all knowledge in memory would have created a natural and organic selective pressure towards infrastructural simplicity, particularly in the construction of legal and political systems. To further lend evidence for the plausibility of such a society, I have listed in Table 1 various memory feats that have been performed in ancient and in modern times. The great epics of Greece and India, the Iliad and the Mahabharata, were largely carried down in an oral tradition and were unlikely to have been written down until long after they were composed.
These were almost certainly memorized using the standard techniques of repetition, although it is worth mentioning that the meter and poetic structure of these epics were significant aids to the memory as well.
In our own era, the art of memory has largely been confined to the domain of a rather peculiar set of "memory competitions"11 in which participants are challenged with a broad array of memorization tasks that seem almost superhuman to those unfamiliar with mnemonic techniques. In the context of the present essay, I mention these competitions to give an example of the highly diverse kinds of information that practitioners have developed specialized techniques to remember. For a society built upon memorization, techniques such as these would no doubt have been commonplace (as they were during the Middle Ages, the Renaissance, and the Enlightenment), and one might imagine that for the sake of redundancy and error correction, there would have been social norms encouraging different groups to use different techniques to maintain a given body of information. The ancient art of memory and its recent reincarnation in the form of the competitive memory circuit suggest that both long-term, slowly evolving knowledge, as well as information that is rapidly overturned, might have been recorded, processed, re-evaluated, and disseminated without the use of written language.
It is also worth noting that the consistent usage of mnemonic techniques would have had a measurable impact on brain function and neural development as well. It has been demonstrated, for example, that in contrast to memorization by ordinary repetition, use of the mnemonic method activates those regions of the brain otherwise responsible for spatial navigation.12 Unsurprisingly, this is due to the fact that the mnemonic method relies explicitly on the use of spatially located images to form memories. Indeed, it is interesting to note that during the Enlightenment, one would have been able to distinguish between disciples of the different schools of thought on memory simply via an fMRI! That is, whereas the dialectic method of Petrus Ramus and his followers would not have given rise to a strong response in the visual regions of the brain, we would expect strong signals from parts of the visual cortex in practitioners of the mnemonic method, as well as of Bruno's and Leibniz's hybrid methods. For a society whose very operational foundation was built upon these techniques, one would expect systematic deviations in cognitive organization from an otherwise normal population. One wonders if these cognitive differences in the capacity to generate intense imagery might have had derivative effects on creativity as well.

There is a more serious and scholarly purpose behind this somewhat whimsical thought experiment. If we accept the possibility of a society built upon memorization, and furthermore, that such a set of cultural norms might have precipitated a scientific revolution, we would also have to accept the conclusion that a scientific revolution in such circumstances would have looked very different from what took place in Europe during the 17th century. For instance, while we traditionally associate the scientific method with hypothesis-driven investigation, one of the primary innovations of the 17th century, and one which subsequently formed the bedrock of the physical sciences, was Newton and Leibniz' infinitesimal calculus: on its own, a strictly mathematical theory that is only incidentally related to experimental science and hypothesis-driven investigation. If a scientific revolution were to have taken place at a very different place and time in human history, what might it have actually looked like? What would have been the topics that received the most attention, and what would the consequences have been? Forcing ourselves to reason explicitly about such unusual circumstances might help to develop more precise models for what we mean by scientific reasoning, and help to disentangle larger principles from the specific set of historical circumstances in which those principles emerged. Therefore, while dethroning written language in favor of memorization is hardly a worthwhile endeavor, there may be real value in encouraging the writing of counterfactual scientific histories, and in particular, in asking the question of what a scientific revolution might have looked like at different places and times in human history. The intuition gained from such exercises could prove useful in developing policy recommendations for guiding the development of younger scientific institutions, particularly in the developing world.

11 Joshua Foer's memoir Moonwalking with Einstein is a beautiful first-person account of the history, techniques, and personalities of the "competitive memory circuit." Perhaps the primary lesson of Foer's book relevant to this essay is that most of the participants in the various international memory competitions do not claim to have strong natural memories. Rather, their memories were highly trained in the specific context of tasks relevant to competitive memory, for example, memorizing a long list of historical dates or the order of a deck of cards. This observation lends further plausibility to the idea of an entire society in which such techniques were commonplace and written language was discouraged: these techniques could easily have been learned by many, and need not have been restricted to a handful of elites.

12 Eleanor A. Maguire, Elizabeth R. Valentine, John M. Wilding, and Narinder Kapur. Routes to Remembering: The Brains Behind Superior Memory. Nature Neuroscience, 6(1), 90-95, 2003.