© Mark Aultman

SEARCHING FOR SOCIAL MEMORY

Mark Aultman

Introduction

This article consists of reviews of recent books that consider memory, social memory, or their development in some way, with additional analysis of their implications and interrelationships. The books are: 1) Daniel Schacter's The Seven Sins of Memory: How the Mind Forgets and Remembers (Houghton Mifflin Company, New York, 2001), considering individual memory and the human brain; 2) Juan Luis Arsuaga's The Neanderthal's Necklace: In Search of the First Thinkers (Four Walls Eight Windows, New York, 2002), considering why the Neanderthals, an intelligent species that coexisted with our ancestors, died out; 3) Alexander Stille's The Future of the Past (Farrar, Straus and Giroux, New York, 2002), considering the past from the point of view of modern communication structures and developments; 4) Eviatar Zerubavel's Time Maps: Collective Memory and the Social Shape of the Past (University of Chicago Press, 2003), concerning the cognitive patterns societies use to organize thoughts of the past; 5) Solomon Schimmel's Wounds Not Healed By Time: The Power of Repentance and Forgiveness (Oxford University Press, 2002), concerning how individuals and groups reconcile with one another for past wrongs; and 6) Catherine Merridale's Night of Stone: Death and Memory in Twentieth Century Russia (Viking Penguin, New York, 2001), concerning individual memory as shaped or repressed by authoritarian social structures.

The purpose of this article is twofold. Most obviously, I want to give the reader a sense of the books. But I also want to focus on the interrelationship between individual and social memory, and so there are several transitional and interpretive comments that go beyond the purposes of the individual authors. At one level the article may be viewed as an attempt to consider whether they can take that away from us, as in the Gershwin song:

The way you wear your hat
The way you sip your tea
The memory of all that
No they can't take that away from me.

I ultimately conclude, however, that this is not (quite) the right question to ask. Social memory rides on the coattails of language, separating itself from individual memory in a semantic universe where eventually no actual individuals remember actual events from long ago, but where the credibility of social memory still depends upon a connection with actual events. Social memory does not need to take away the memories of individuals in order to establish itself; all it needs to do is ignore them. In its selection process of what events are significant, however, social memory must still confront events that continue to have consequences or leave evidence. The workings of time in the natural world and in artifacts created by humans and their ancestors combine with the memories of individual persons, and consequences experienced as they live their lives, to create a social memory in a constant state of revision demanded by both novelty and consistency.

********************

The Way You Wear Your Hat, The Way You Sip Your Tea

In his 1996 book, Searching for Memory: The Brain, the Mind, and the Past (Basic Books, New York), Daniel L. Schacter summarized the then-latest developments in the study of human memory. Memories are not self-contained replicas of things past, he said, nor bits of data we retrieve as computers do. The constituents of memory, different aspects of experience, are stored in different parts of the brain, and are activated by combining, in the present, aspects of different systems. Rememberers tie together fragments and feelings into a coherent narrative or story.

Schacter drew distinctions between episodic memory (a recollection of personal events involving the rememberer), semantic memory (factual and conceptual knowledge) and procedural memory (skills and habits acquired), and further distinctions between explicit (consciously remembered) and implicit (retained in some way but not consciously remembered) memory.

Working memory holds small amounts of information for brief periods of time. To be retained longer and recalled in conscious memory, information must be encoded, made useful and meaningful by relating it to something known. More durable long-term memory involves protein synthesis -- structural change accompanied by growth of new synapses. Retrieval processes depend on interactions between different parts of the brain. Memories of life events (going to college, serving in the army) provide structure that aids retrieval of more specific memories. Semantic memory or knowledge also affects retrieval, aiding and biasing it, and in fact lifetime periods or general events can be viewed as an aspect of semantic memory rather than (event specific) episodic memory.

Other than by the passing of time, memory may be lost through dissociation (where the links among the systems needed to assemble a memory are broken), repression (a defensive process to protect the ego from threatening material), or inhibition (where some neural activity is ignored in order to focus on other activity). Semantic memory holds up better than episodic memory as people age. The years of early adulthood tend to be remembered better than the years immediately following. This is probably because these years form the core of emerging identity by which one's life story comes to be organized in memory, and thereafter reinforced with retelling.

Not all information, Schacter went on, is retained. Information no longer useful, with no reason to recall, is, with the passage of time, probably lost forever. If all the details in the world of experience had to be consciously remembered, we could not function -- forgetting is, at least partly, an adaptive function. In his more recent book The Seven Sins of Memory: How the Mind Forgets and Remembers (Houghton Mifflin Company, New York, 2001) Schacter studies the failures and errors of memory. These include both forgetting and remembering wrongly.

Schacter divides memory's malfunctions into seven categories. The first three are failures to bring to mind. Transience is a basic feature of memory -- its weakening or loss over time. Absent-mindedness (misplacing keys or forgetting an appointment) involves a lack of attention, either initially (we do not register in memory where the keys are because we are thinking of something else) or at the time we need to remember (the appointment). Blocking involves momentary or "tip of the tongue" forgetting -- we cannot recall a word or the name of a person with a familiar face.

Schacter's other four categories of memory malfunction are mostly errors, not lack, of memory. Some form of memory is present but it is incorrect or unwanted. Misattribution involves attributing a memory to the wrong source (mistaking what you read in a newspaper for what you saw) or mistaking fantasy for reality. Suggestibility is related to it -- memories may be implanted as a result of leading questions, comments or suggestions. Bias involves the "editing" or "rewriting" of memory to conform to present knowledge, experience or belief. Persistence involves not a lack of memory but unwanted memories -- the repeated recall of what we would prefer to forget.

Schacter is able to link many of these aspects of memory to neurological areas of the brain. He concludes that, though flawed as memory, these bothersome and dangerous traits are useful as evolutionary adaptive mechanisms. Like the seven deadly sins they can be viewed as exaggerations of traits necessary for survival.

With social memory society overcomes, to some extent, the transience involved in individual memory. It does this through language and media that constitute repositories or records of what might otherwise be forgotten. Language permits other people to remember the same event, increasing the likelihood of its being remembered. Writing, art and similar media do too. Moving the battle against transience to a social level, however, increases the risk of misattribution and the incidence of false memories created by bias. People tend to incorporate information from external sources into personal recollections. Overt suggestion and other means of memory reinforcement add to the problem. Eyewitnesses who receive positive feedback, for example, are confirmed in their statements and more confident in their recall.

Schacter notes five major kinds of bias: consistency, change, hindsight, egocentric, and stereotypical. With a consistency bias there is a tendency to view the past as more consistent with the present than it was, and with a change bias (as in self-help programs where change is desired) as less consistent. In both situations the tendency is to minimize current cognitive dissonance. Hindsight ("I knew it all along") biases are particularly evident in sports and politics. Stereotypical biases, including racial, ethnic, or gender biases, occur when we categorize observations based on generalities. Egocentric biases are common among couples who remember differently events at which both were present. This reflects the significant role that the self plays in organizing mental life.

Most of these biases, we should note, assume a social milieu by which the bias or memory is judged to be incorrect. With past feelings or purely private memories there may not be an objective way to judge (did she really love him?), but with most biases the memory is judged to have been incorrect from the perspective of other observers or participants with their own memories or observations of an event or object, or from the perspective of a recording device (a security camera, for example) deemed objective.

Thus when Schacter, quoting both George Orwell and Barbra Streisand, says that the way we were depends on the way we are, he is making a statement about how we remember, and our limitations, but not necessarily about past facts. In Orwell's 1984 the Ministry of Truth altered the public view of the past, asserting that past events contrary to current storylines have no objective existence and backing up its version of events with methods of reeducation that included torture. Streisand was more comforting in singing The Way We Were: "What's too painful to remember we simply choose to forget. For it's the laughter we will remember, whenever we remember..." But the main difference between the two views rests in who is inflicting the pain and for what purposes.

Some facts are simply too basic to be based on memory alone. That we might remember a particular person as a parent does not change genetic facts or the actuality of events resulting in one's birth. The past (as reflected in one's DNA, for example) maintains a stubborn consistency that exists apart from individual memory and is not merely an artifact of social memory. The events surrounding conception will usually be private memories. And though one's birth may be memorialized by formal birth records, social memory need not depend upon such a record. We do not remember our birth, but parents do and may pass it on to us in language and narrative. This is where social memory begins, in shared events and language, and it exists, at the beginning, without records, whether informal or official.

Schacter goes on to show how persistence is related to the evolutionary roots of memory. Persistence concerns memories we wish would go away, from the emotionally charged traumatic event to the annoying tune. Traumatic memories often induce dissociation, in which the rememberer experiences the self as an observer watching the event happen to someone else. Sometimes rememberers become stuck in the past in a persistent cycle of remembering. Survivors of trauma may report a disintegration where time has stopped or is no longer connected with past and future. The roots of persistence lie deep in the inner regions of the temporal lobe, in the amygdala, which releases hormones in the face of fear and thus regulates memory storage by allowing us to respond and remember vividly.

Schacter concludes, with respect to all the defects of memory, that they are by-products of evolution, protecting the organism even where, when considered from the point of view of accurate memory, they constitute imperfections. Persistence is a result of systems that arouse emotion and reaction in the face of life-threatening experiences. Inhibition prevents information overload. Transience permits new information to be encoded. Schacter notes research showing that when information has not been used for periods of time it becomes less likely to be needed in the future, and the further conclusion that our memory systems have, through the course of evolution, adapted to this principle.

Similarly, misattribution and suggestibility occur because we do not need to remember specifically all our sources of information; rather we tend to recall that which we are likely to need. The ability to remember the gist of experience, the generalities that work, is in the long run more important than remembering each detail. Gist information permits categorization and comprehension, allowing us to organize experience. False remembering is part of the price we pay for the ability to generalize.

The human brain, Schacter notes, has evolved so as to balance consistency and novelty. Misattribution often results from a strong general sense of familiarity, together with an absence of specific recollection. In left/right brain tests it was noted that the left brain contains an "interpreter" that continually draws upon general knowledge and past experience to bring order. The left hemisphere recognized novel incidents as being consistent with stereotypes, while the right brain almost never did. The left brain relies on inference, rationalization and generalization to relate past and present, thus contributing to bias. It confers a sense of order, but often at a cost of preventing us from seeing the novel. It is held in check, though, by right-brain constraints more attuned to the external world, responding to what is actually witnessed. In experiments the right brain almost never falsely recognized similar events or objects as the same.

Schacter speculates that transience, persistence, and the generalization bias are basic evolutionary adaptations. More specific errors seem to vary across cultures and are more likely a matter of cultural norms than biological evolution. Blocking, absent-mindedness, misattribution and suggestibility, he hypothesizes, may also be viewed as products of cultural evolution. With the exception of absent-mindedness (which appears to involve a mechanism for avoiding information overload and thus to be an aspect of transience) Schacter's hypotheses appear reasonable -- these errors involve language or information exchange with others.

*********************

Schacter's studies of individual memory suggest that event-specific observation, though it is needed to act as a counterbalance to overgeneralization and stereotyping, tends to be overridden by semantic memory. Not only do the individual memories of youth tend to be categorized in terms of one's internal life story or narrative, as with Streisand's The Way We Were, but the memories or narratives of others intrude, as with Orwell's 1984. We put our memories into categories to form more or less consistent narratives. We generalize from experience, creating the semantic memory into which events (and our lives) must fit. But why do we have semantic memory and where does it come from?

The Way They Held Their Knives

For this we turn to Juan Luis Arsuaga's The Neanderthal's Necklace: In Search of the First Thinkers. Arsuaga speculates, based upon a detailed knowledge of the fossil record, about why the Neanderthals, a hominid species that coexisted in Europe with the Cro-Magnons, died out while our ancestors did not. Primitive hominid fossils date from 4.5 million years ago. Early hominids had small brains relative to ours. Homo habilis, dated from about 2.3 million years ago, with a slightly larger brain than previous species, was the first not limited to a forest environment. Living in grasslands meant different diets and ways to survive -- traveling over distances to find meat and plants replaced the more predictable food patterns of tropical forests. The species developed a capacity to make mental maps of larger territories and to interpret animal tracks and other evidence of food. It may have developed a capacity to understand and anticipate earth's rhythms, such as changing seasons, and thereby to plan.

The primatologist Robin Dunbar studied the size of primate brains to see what variables correspond to larger brains. He was left with two hypotheses: the species' ecological niche and the size and complexity of its social group. His final results showed no relationship between the size of the neocortex (which enhances analytical functions) and ecological variables alone. There was, though, a close relationship between the complexity of primate social groups and size of the neocortex. The interplay between genetic and environmental forces at some point gave evolutionary advantage to larger and more complex social groups, facilitating language and the brain's capacity to use it.

Rudimentary intelligence produced an innovation, the first invention, a flaked stone tool, about 2.5 million years ago. Chimpanzees (our closest primate relatives) have been observed cracking nuts between stones, but no animal has been seen modifying a stone intentionally to produce a cutting edge. This tool, which helped establish the ecological niche of meat eaters, was only a small mental step -- whatever cutting edge emerged was sufficient and there was no mental model of a desired implement. It was, though, the beginning of a long slow process where mental leaps resulting in technologies began to co-exist and interact with genetic mutation as driving forces in human development.

There is a significant relationship between the size of the brain and the length of life stages in individual development -- childhood, adolescence, and life expectancy. The length of these stages for a chimpanzee is twice that for a macaque, which has a brain one-fourth the size. Human life cycles are still longer. Why such long developmental stages? During this early period the brain develops its mix of hardware and software that fosters language and socialization. Arsuaga refers to this as a "programming" period necessary for the eventual development of complex society and elaborate technology. It is also, not just incidentally, a period of enforced socialization.

About 1.6 million years ago the biface appeared. This was a stone instrument chipped on two surfaces with the obvious intention of being used as a tool. It also showed a concern for symmetry -- it was both useful and aesthetic. About 250,000 years ago a different ("Mode III") stone-flaking technology appeared. This involved meticulous preparation of a stone core for later production of desired instruments, a two-stage process. That is, the maker had to envision both the prepared core and the later instrument. This new complexity required a greater capacity for planning.

The Neanderthals developed in Europe 127,000 to 40,000 years ago, and spread to the Middle East and Southwest Asia. Our Cro-Magnon ancestors came from Africa to Europe about 40,000 years ago, and for 10,000 years or more coexisted with Neanderthals. Neanderthals used Mode III technology. When Cro-Magnons arrived in Europe they used a still higher ("Mode IV") level of technology (which Neanderthals either imitated or developed independently). The two-stage technology, however, is evidence enough of intelligence -- Neanderthals’ average cranial capacity probably equaled or exceeded our own -- though not in proportion to body weight.

Vegetable foods played an important role in hominid diets during less frigid periods. But animal fats and proteins were necessary for human survival in Europe, especially in cold periods. Stone marks on animal bones provide evidence of cooperative hunting and show a capacity to develop and execute complex hunting strategies based on seasonally predictable conditions. Human beings attained the capacity to anticipate natural events and the behavior of fellow beings.

From the observation that all creatures die, humans came to know that they too would die. But this observation requires a distinction between "I" and the others -- self-consciousness. It is not known how long ago the realization first occurred, but Arsuaga argues that it was, for some hominids, 300,000 years ago. The realization that we will die is also a realization that we are alive, and cause for celebration. Human beings celebrated through self-adornment, and commemorated and protected their dead from desecration by burying them or hiding them in caves. There is evidence the Neanderthals wore necklaces and placed their dead in protective caves. Only Cro-Magnons, though, buried their dead in the open and left evidence of ritual behavior accompanying the burials.

Arsuaga argues, however, that Neanderthal fossil finds show evidence of deliberate and intentional burial. The Neanderthals, then, acted consciously and purposefully, manufacturing stone tools, building fires, and burying or protecting their dead. But when a new ice age arrived, they died out and Cro-Magnon descendants flourished. Why the difference?

It is unlikely that early humans lived in very small groups completely isolated from one another. Small groups could not survive without exchanging either men or women with other groups to maintain sexual balance. More likely there was a network of small groups genetically and culturally interconnected over distances. One study concludes that based on the size of the neocortex humans will ideally have direct relationships and personal ties with about 150 individuals, not necessarily together most of the time. Families, clans, and larger tribes, though, were interconnected.

Chimpanzees seem to recognize themselves in a mirror and may have a rudimentary self-consciousness. The self-consciousness of humans probably evolved as a useful mechanism in the development of social behavior. To respond to actions or possible actions of another we tend to ask "What would I do in that situation?" -- that is, to represent the minds of others in our own.

Arsuaga notes Charles Peirce's classification of signs into icons, indices, and symbols. Icons are similar in appearance to their referents (a map). Indices do not resemble their referents, but are causally related to them (smoke to fire). Symbols, such as spoken and written words, are arbitrary, except as they become meaningful by social convention. Icons and indices can be more universally understood, but symbols (and thus languages) make sense only within a language community aware of the conventions.

All primates vocalize, and other group members react or respond. But the human brain is not just larger than a chimpanzee's. The prefrontal cortex developed such that humans can store information and keep it available for retrieval, thus permitting the extended series of gestures necessary for complex tasks such as producing a stone tool or playing a piano. The areas of the brain involved in language are located in the left hemisphere and there appears to be a connection between language, brain asymmetry, and lateralization.

The Neanderthals do not appear to have had a substantially different brain morphology from our ancestors and there is evidence of their lateralization. The Neanderthals, though, were large, and needed a large volume of air to provide oxygen. The air needed to be warmed and moistened in their oral and nasal cavities (especially given the colder European climate) and so the horizontal part of the vocal tract remained longer. Our ancestors grew smaller, reducing respiratory demand, permitting a smaller face size. The adult human pharynx is a vertical tube within which sounds from the vocal cords can be modified. The Neanderthal vocal tract permitted a wide range of the sounds of modern language, but not certain sounds of i, u, a, k and g. In spoken (as contrasted with written) language the lack of these sounds would make a difference. The vowels especially are among the most distinguishable and permit us to understand one another's speech in the presence of ambient noise and other activity. Neanderthals' mental capacity for language may have been similar to ours, but their vocal tract probably prevented them from producing sounds as distinct as ours.

In Europe 32,000 years ago modern humans, the Cro-Magnons, occupied most of the continent. They had developed a varied tool kit (Mode IV technology). At the same time the symbolic expression of Paleolithic art, as in cave drawings, flourished. The Neanderthals lost ground, sticking close to the evergreen forests, but as the cold came they were pushed closer to the sea. The Neanderthals used personal ornamentation, but not to the extent Cro-Magnons did. Cro-Magnons wore necklaces, belts, bracelets, armbands, and had objects sewn into their clothing. Adornment was not for purely aesthetic purposes -- it visually communicated information about individuals and thus social structure.

While biology limits us to social groups of around 150 or so, the visual code of ornamentation (of "image") expands social structure and permits interpersonal and group connections that foster the formation of and participation in varied social groups with common objectives. Personal appearance (as in headdress or uniforms) permits the transmission of identity, conveying to others one's membership in a group. Such markers are a symbolic code based, like language, on artificial convention. Natural selection works at the level of individuals, but human history is a competition among groups as well as individuals. Cro-Magnons lived in solidarity with other members of their group, and in competitive environments were better able to exclude and eliminate others.

Neanderthal ornamentation may have been an imitation -- perhaps they did not have the capacity for symbolic communication through abstraction and imitated without understanding. Arsuaga believes, though, they did have some such capacity. They had language, funerals, and advanced technology, but did not develop our extreme specialization in making and using symbols -- our unlimited creativity and unbounded powers of imagination. Though better adapted biologically to cold climates, the Neanderthals were not able to make the cultural-ecological adaptations and alliances with distant groups that Cro-Magnon symbolic systems facilitated. Modern humans in very cold climates learned to build shelters with mammoth bones and skins. The Neanderthals (and other species) never attained a high population density, and biological and cultural resources were spread too thinly for their survival.

Modern humans formed ever-larger groups, with population clusters that were reproductively viable and economically self-sufficient. They also were more culturally and biologically closed, rendering imitation by competing groups more difficult. Ethnicity, organized around shared symbols, developed. The shared stories and myths useful to small dispersed groups, however, became barriers when populations grew. The result, Arsuaga concludes, is that we have two identities, individual and group, one promoting an egoism that may be creative but at odds with the collective, and the other fostering a collective identity that may look upon individuality as a threat to the group.

*********************

One way of looking at human evolution while language was developing is that what had been individual/genetic evolution became social/cultural evolution, but this is too broad a conclusion. The human brain, and its capacity for language, is a product of evolution, and it continued to develop, and in fact its development accelerated, as humans came more and more to benefit by the advantages of social organization. It was not that there was no longer genetic mutation and adaptation to an environment, but rather that the environment had changed to include not only the natural world but human social groups, which became more and more significant as factors that determine survival. How other humans acted and responded became increasingly important in determining the survival of individuals.

But if symbolic communication, the creation of stories and myths and statements that may or may not have correspondence in reality, had advantages for survival it also had disadvantages. Mistakes and lies became possible, and as groups developed and expanded, conflicting interests as to maintaining collective solidarity arose, both consistent with and conflicting with the interests of group constituents (as well as other groups). It became possible for social groups to create their own narratives referring to pasts that did not correspond to actual memories, not only those of any currently existing individual people who constitute the group, but those of anyone who ever existed. It became possible, as in 1984, to create fictional pasts.

We will return to the problem of fictional narratives and their effects on social memory, but first we will consider some of the ways that social memory came to be transmitted. Parents (and others) talking to their children transmit social memory, but with technology, especially writing, a social memory developed outside the realm of oral communication.

The Memory Of All That

Alexander Stille's The Future of the Past (Farrar, Straus and Giroux, New York, 2002) consists mostly of vivid descriptions, more journalistic than analytical, of a number of representative situations, usually revolving around a central figure studying the past or involved in some way in preserving it, where evidence or memory of the past is being altered or destroyed. Stille argues that as the information age, with space radar, computer models, infrared photography, carbon dating, DNA and chemical analysis, resonance imaging, and CD-ROMs, gives unprecedented opportunities to study and preserve the past, we are paradoxically losing not only our historical memory but the natural environment and the artifacts in which the past, or our evidence of it, resides.

The problem is a combination of transience, information overload, and a kind of historical uncertainty principle where the mere act of discovering and observing changes the object of study. Physical objects deteriorate over time. Moore's law -- that computer speeds double every eighteen months -- means that not only are we producing more information, but also that we are bound to lose more information (there is more to lose) than ever before. A modern Egyptian archeologist sums up a problem with antiquities he tries to preserve: "You study it, you kill it."

Stille's first three chapters consider monuments or artifacts and their preservation -- in Egypt, China, and Italy. The next two (of less concern for this article), set in India and Madagascar, concern the natural environment. The three after that consider the role of language in preserving social memory, as viewed through traditionally oral cultures in New Guinea and Somalia and an effort to preserve Latin as a living language. The last three chapters (before the conclusion) concern writing and other technologies of communication, as maintained by old libraries and a modern archive collection dealing with obsolete communications technology.

Egypt. When a significant restoration of the Sphinx was undertaken around 1400 BC it was already an ancient monument over a thousand years old. When Romans cleared sand from the Sphinx for a visit by Nero, the Sphinx was then more ancient to them than Nero's Rome is to us today. Today tourists, development, and scientific studies take their toll on what is left, and preservation is a problem. Monuments this old raise the question of what is being preserved: the original or subsequent restorations that changed it.

Organic material, like bones or wood, can be carbon-dated. Stone cannot be, and dating by patterns of weathering is much trickier. The Egyptians had a dry climate with huge quarries of hard stone, but little wood. The Phoenicians had abundant wood, so fewer of their artifacts survive. Over millennia the Pyramids became mysterious as the day-to-day structures around them disappeared. Close inspection of the surrounding site, though, shows the human hands that built them and gives clues to a social structure of organizational, administrative, and engineering skill.

China. The Chinese think differently about preservation. To them a well-made copy may be just as significant as the original. Until recently palaces, temples, and houses were traditionally built of wood. Working with a perishable material such as wood permits a different conservation strategy: rotted parts can be replaced as needed. In Japan the Ise Shrine, originally built in the seventh century A.D., is ritually destroyed and rebuilt every twenty years. Though no piece of it is more than twenty years old, the Japanese think of it as being 1300 years old, in much the same way that we remain ourselves as cells regenerate and replace themselves over time. This different attitude is both philosophical and cultural, and not simply a matter of different materials. UNESCO has excluded the Ise Shrine from its World Heritage Sites because it is neither ancient nor original.

China and Japan have typically had a cyclical view of time. China has written and spoken the same language for about 3500 years, facilitating an imperial system of rising and falling dynasties. Years were counted by dynasties, with a new era for each. The language and the imperial system generated continuity but the dynastic system emphasized change. In a world both eternal and ever changing, rebuilding monuments makes perfect sense. Nothing material lasts forever so why not reproduce as best we can?

The system of preserving by copying or rebuilding works so long as artisan traditions and materials remain the same. The revolution of 1911, however, brought the Western calendar and industrial techniques to China. Craft traditions declined, and in the last several years China has adopted more Western conservation methods.

Italy. Looting the archeological treasures of Sicily's caves has been lucrative over the centuries, and today is big business. Prestigious museums have been caught up in the traffic of illegal antiquities, in a kind of "don't ask, don't tell" mentality where the museums must turn them into generic beautiful objects whose history is "unknown."

A serious problem with this looting and selling is that objects are moved from their original environment, destroying historical memory. Excavators at one site, expecting a more elaborate mosaic floor, found a dirt floor where an important treasure was buried. Treasure under a dirt floor in a farmer's home had a better chance of escaping detection by invading Romans. From a hasty inscription and surrounding archeological evidence they could create a picture of probable religious beliefs and the last days of an invasion where war was changing a culture. Narratives like this are lost, though, where objects are removed, and their history obliterated, to be admired in a museum that does not wish to know real origins.

 

The significance of language, and its relation to the decline of the cultures it facilitates, is a central focus in Stille's chapters on New Guinea, Somalia, and the erosion of Latin.

New Guinea. There are about 6,500 languages in the world, about one-sixth of them in the islands of the South Pacific. Half of the world's languages will probably disappear over the next century as globalization proceeds. Each language represents a culture, and purely oral cultures tend to die with little trace.

In 1973 the anthropologist Giancarlo Scoditti began his fieldwork on the remote island of Kitawa. He took meticulous notes, and the locals were impressed by his ability to recall the tiniest details of their everyday rituals. They gave him a name: "the man who remembers." He in turn was impressed at their ingenious strategies for conveying their traditions intact, in an oral culture, from generation to generation. In the past 25 years, though, the culture has begun to erode. Oral cultures are both robust and fragile: they can maintain their culture for hundreds, perhaps thousands, of years, but if the link between one generation and the next is broken, millennia of accumulated wisdom can be lost.

Scoditti noted a complex system, with checks and balances, for controlling and transmitting cultural knowledge. One clan was responsible for the dance, another for decorating the body, another for music, and each scrutinized the other. Transmitting important myths and stories was not left to chance. As the Kitawans now travel to other islands they want more objects of plastic or aluminum and less of their traditional goods. Trading in these goods puts a strain on the traditional economy. In traditional oral culture the elderly store a wealth of knowledge, through which the young gain prestige. This currency is devalued as objects outside the island are sought.

On a return trip to Kitawa Scoditti learned that farmers clearing a field had uncovered a series of megaliths, with human bones buried nearby, organized in an oval. A Kitawan foundation myth tells of ancestors who were buried under an oval formation of rocks. The islands are believed to have been settled 20,000 years ago, and if the bones prove to be that old it would mean that an actual historical event was transmitted to succeeding generations through those many centuries.

Somalia. After the bodies of US soldiers were dragged through the streets of Mogadishu in 1993, the UN withdrew. What used to be called Somalia no longer had an internationally recognized government, but it had several phone companies competing for business and a cellular phone system with low rates. Somali refugees are scattered around the globe, knit together by a combination of traditional kinship and electronic signals. Most of the population of Somalia is non-literate. There is very little in the way of billboards or other written signs. Shops tend to have painted fronts illustrating their goods and services. Poetry is one of the principal forms of mass communication as well as entertainment.

The Somali language was not written down until the 1970s. The road to written language traveled a tortuous path from Somali independence in 1960. The government agreed on the need for a common Somali script, but not on which one. Children educated in Muslim schools had exposure to Arabic, but the Latin alphabet had other advantages. In 1969 the dictator Barre took power and in 1972 chose the Latin alphabet as the official script.

The Latin alphabet stirred opposition. Muslim clerics coined a clever slogan based on similarities between the word "Latin" and the Somali word for "godless." Nationalists objected that Latin reflected colonial domination. Groups prepared their own indigenous scripts, but clans distrusted scripts proposed by those from other clans. The inexpensive tape recorder arrived at about the same time as the new alphabet, creating political opposition in the form of oral poetry and "poetic duels." The government tried to ban the poetry, but the tapes circulated everywhere, rousing opposition. In 1991 Barre fell from power, and the country was carved into pieces governed by competing warlords.

Scholars speculate that the lack of a written tradition -- of objective norms applying to all segments of society -- has caused many African countries to collapse into ethnic and clan warfare and massacres. Radio broadcasts urging one group to kill another are thought to have contributed to the genocide in Rwanda. Hitler and Mussolini used radio propaganda to stir the masses, and some feel that the immediacy of oral discourse makes it well suited to stirring up crowds. After the nation-state collapsed, bands of illiterate teenagers with rifles and Sony Walkmen roamed the streets.

Somali refugees are now returning from the West or sending money for relatives to invest. A network of wire transfer and courier companies, facilitated by modern communications technology, serves in the place of a central bank and moves money in and out of the country. The strength of clans remains a problem, but by keeping government small there is less patronage to distribute, and communication technology makes government more accessible.

The poet Hadrawi, though, does not share McLuhanist optimism about the decentralizing effects of borderless electronic media. Now, he says, a person is dependent on pen and paper or other technology, and knowledge is carried outside oneself. The new technologies -- the tape recorder and video -- are conduits for foreign culture. Somalia is a part of global culture, but this means that it absorbs much more than it transmits. Somali nomads who produced poetry do not produce technology. The West does. "Poetry is alive," Hadrawi says, "but the conditions of life it expresses are at an end."

Rome. Latin is a dead language which a priest in Rome, Father Foster, tries to keep alive. Latin, he insists, must be learned as an oral language -- a language used in literature and experience. When one learns Italian it is learned in context -- from people in the stores, from billboards, from television. Foster takes his students on tours of Rome, where they read everything from ancient graffiti to inscriptions on buildings and monuments. These walks blend past and present, where language, location, and structure seem to come together again. It is as if the language still lives, or is brought back to life.

When Foster was trained the Mass and his seminary courses were in Latin. The Second Vatican Council, which modernized the Church, ushered in the decline of Latin. The office of Latin secretary to the Pope dates back to Saint Jerome, who served Pope Damasus (366-84), and as recently as 20 years ago the Vatican could communicate in Latin with the expectation that most prelates around the world would understand. Now, though Latin is still the official language of the Church, it is mostly ceremonial. A language that for two thousand years conveyed the thoughts of the ancients and Renaissance humanists will be less available to living minds who use it and more and more confined to texts and inscriptions created only in the past.

Writing and other communication technologies preserve and extend thought, and thus memory, but they are subject to their own peculiar kinds of transience, as Stille shows in his next chapters.

Alexandria. For 700 years, from the time of Alexander the Great in 332 B.C., Alexandria, now a poor, crowded, dilapidated Egyptian city, was the intellectual capital of the Western world, a Mediterranean port city at the crossroads of civilization. With its location at the mouth of the Nile and its superb harbors it was the world's first great Western metropolis, with a population of 600,000 three centuries before Christ.

The Ptolemies who ruled Alexandria after 323 B.C. created the Great Library to attract great scholars and books. They decreed that every ship passing through the port must hand over manuscripts and scrolls to be copied, returning the copy rather than the original. A think tank as well as a library, it housed scholars who mapped and measured the world, produced the greatest work of ancient astronomy, and virtually invented scholarship.

The Great Library was eventually destroyed and much of its accumulated knowledge dispersed or lost. Who destroyed the library, Caesar, Christians, or Arabs, and how it was destroyed, by fire, war, or ravages of time, is still debated, but its loss has contributed to much romantic "what if/but for" speculation by scholars. Some historians speculate, for example, that the destruction delayed the Industrial Revolution by 1000 years.

There were, however, many libraries in the ancient world, and all of their documents suffered substantially the same fate. Papyrus does not hold up well. Every major disaster, fire, flood, foreign invasion and political upheaval tended to contribute to the loss of books. Works that survived were usually the most popular ones more likely to be recopied. Parchment, made of more durable animal skin, replaced papyrus by the third or fourth century A.D., but these works also had to be recopied. The oldest complete version of Homer is a medieval copy made nearly 1800 years after Homer wrote. One modern scholar puts the ratio of lost to surviving texts at 40 to 1, a ratio which does not take into account the many more texts of which there is no record at all.

It was not until the print revolution of the mid-fifteenth century that the long-term preservation of many texts became possible. The Ottoman sultan banned printing in 1516 in order to keep control of knowledge and its interpretation in the hands of Ottoman bureaucrats and clerics. The modern Egyptian government, though, determined not to miss out on today's digital revolution, is building a new version of the Ancient Library of Alexandria. The project of recreating "the lost library of Alexandria" has romantic and intellectual appeal that has generated support from librarians worldwide and from UNESCO, but it is not clear what it means to "revive" an ancient library in the age of the Internet.

The project to revive the great library faces many contradictions. The architecture is impressive. A large glass roof in the shape of a disk rises from below ground, like the sun rising from the nearby sea. An enormous wall is inscribed with letters from all the scripts and alphabets of the world's languages. But Egypt is relatively poor, and ideological and religious tensions result in contradictory efforts to open the country to foreign investment, with modern television and the Internet, along with efforts to censor and ban books and arrest journalists for libel. The government must placate religious fundamentalists and control information to protect its power.

The library was to be opened and have a collection of 2 million books by 2000. A large state university in the U.S., by contrast, might contain as many as 8 million volumes. The world's largest, the Library of Congress in Washington, with some 119 million items, still finds it must be selective. The Internet revolution further threatens the identity of the Egyptian library. Libraries today combine books, audiovisual content, and Internet access.

What sense does it make to collect all this information in one place in an age of globalization? Many argue that the more globalization there is, the more important it is to have one space where people can assemble and conserve the collective memory of a community or country -- otherwise its identity disappears, like the Egyptian artifacts now found in foreign museums. Several countries, including France and the U.S., have national digitalization projects, and each tends to favor its own history and preserve its own roots. This argument, however, has little to do with the vision of a universal knowledge that animated the ancient Great Library of Alexandria.

 

Preserving national identity, moreover, may conflict with preserving the historical past. Egyptians are proud of only two periods in their history -- ancient Egypt and the Arab conquest. Roman, Greek, and European periods are viewed as periods of foreign domination. When Nasser took over, cultural institutions were nationalized along with everything else. There was land reform and education was opened to the masses, but at a cost of an imposed monoculturalism. Opening to Europe, and to foreign investment, opens old wounds. In a country so ancient, the past has many faces and not everyone wants to remember all of them.

One last irony: as in most ancient cities, excavations for modern development uncover ancient treasures, which developers may prefer to hide or destroy in order to avoid costly delays. While the new library was being constructed ancient mosaic floors were uncovered, but bulldozers were secretly brought to work in the middle of the night to dig the new foundation. No one is sure exactly where the ancient Great Library existed, but it is possible that the revival of the Great Library of Alexandria is burying the old one forever.

The Vatican. In 1451 the humanist Pope Nicholas V established the Vatican library for "the common convenience of the learned." It was part of an ambitious Christian humanism, to reform society by reviving both pagan and Christian antiquity. This was just two years before the fall of Constantinople to the Ottoman Turks and meant that many manuscripts there, in the only part of the old Roman Empire not sacked by barbarians, survived and went to Rome. The popes sent out representatives to obtain copies by means both legitimate and not, and preserved many ancient documents, some of them part of the foundations of modern science.

The Vatican collection not only preserves ancient texts but also provides an invaluable record, from marginal notes and annotations, of how they were used and who might have read and been influenced by them. Nonetheless, until the nineteenth century many of the books were under lock and key. A Spanish priest referred to the library as "a cemetery of books, not a library." In the 1920s the U. S. Library of Congress offered to begin a catalogue project, eventually resulting in the cataloguing of 3 million printed books and a modern conservation laboratory, but there remain tens of thousands of manuscripts listed only by handwritten notations in Latin, Italian, and French. Moreover, since many of the documents are listed as parts of collections obtained over the centuries, it can take scholars months to discover what is what.

Prior to 1984 the Vatican still had a reputation for secrecy, or at best indifference, regarding making its collections available, and stringent rules and conditions for access. When Father Boyle was appointed to head the Vatican Library, he opened access, doubling the hours, relaxing the dress code, and initiating a project of making images from the library (necessary for studying marginal notes or illuminated manuscripts) available on computer disk.

 

The latter was a prototype for a much larger project: a joint venture with IBM to digitize every page of the Vatican's 150,000 rare manuscripts. Boyle, however, had two problems in his efforts to modernize the library. He worked in an ancient institution that moved slowly, with old-timers who preferred old ways. And his modernization projects were expensive. The combination eventually led to his downfall in a financial scandal.

The joint venture with IBM, however, which remains unfinished with no prospects for financing, offers a glimpse of a potential future. In a 1996 exhibition that Boyle and IBM put together a musical manuscript was digitized so that one could listen to music while following the notes on the page. Digitalization, Boyle said before he died, is just part of another age for the Vatican library, as Nicholas V's opening the library was for his.

Some ancient texts survived, then, despite deterioration, because of recopying. With paper, printing, and digitalization, reproduction became increasingly easy and texts proliferated. The ancient discipline of recopying, which tended to mean that information considered important or useful was preserved, was replaced by a regime where virtually any information could be preserved. The evolution of memory which Schacter notes (useful information is more likely to be remembered), and which ancient copying tendencies preserved, entered the age of overload.

Washington, D.C. In the vast new futuristic National Archives building outside Washington, D.C., there is a laboratory in the Department of Special Media Preservation where old technologies are preserved -- an old Edison phonograph or the steel wire machine that recorded Truman's whistle-stop speeches in 1948. These are modern media by historical standards, but they now are obsolete and long gone from general circulation.

The National Archives and Records Administration was created in the 1930s on the premise that government could keep all its important records indefinitely, an act of collective memory. The National Archives Building on Pennsylvania Avenue, opened in 1935, quickly had its center courtyard converted to storage space. High ceiling spaces were cut in half, creating twenty-one short floors of stacks. One term of the U.S. Supreme Court now generates as much paper as forty years did in the early 1800s. The new Archives building, opened in 1994 with almost two million square feet of space, is near capacity.

In theory computers should help but so far they only exacerbate the problem. As a result of a lawsuit preventing the White House from destroying electronic documents in the Iran-Contra scandal, federal agencies must preserve computer files and e-mail. But government offices use different kinds of computers, software, and formats. It took the Archives two and a half years just to make a secure record of the Reagan White House e-records, but they were still unintelligible gibberish. The data in computer files is meaningless without the software that created it, and hundreds of major software programs have already been discarded in the computer revolution. To resolve this problem, the Archives issued an order permitting agencies to print e-mail onto paper for permanent storage, thus reestablishing the paper avalanche.

Between 1972 and 1975 the State Department generated over 1,250,000 electronically stored diplomatic cables. The Reagan-Bush White House e-mail generated some 200,000 electronic files for the Archives. The State Department now averages about a million messages a year and the White House about six million electronic files, but even the earlier lower volume created problems. White House e-mail was not designed for long term storage, so the Archive computer read each e-mail (rather than the entire storage tape) as a single file to be opened and closed. A tape that normally should take fifteen minutes to copy could not be copied in fifty hours.

The refrain of politicians that the era of Big Government is over does not translate into the end of the era of the Big Data Bank -- that is just beginning. Downsizing magnifies the problem for official government memory when the agencies get rid of record-keepers and deliver records to the Archives. When the Pentagon closed an Air Force base with its huge motion picture storage warehouse, the Archives' video holdings doubled. While the overall volume of new data coming into the Archives has increased tenfold, its budget for dealing with the new data has fallen.

Overload is not the only problem. Lost data, for reasons of incompatibility and deterioration, is another. A 1996 study concluded that at current staffing levels it would take the Archives 120 years to transfer the existing backlog of non-textual materials (photos, video, film, audiotapes, microfilm) to a more stable format. But some of these media are expected to last only 20 years. There were twenty-eight different kinds of movie sound-tracking systems in the 1930s and 40s, most incompatible and unique. They recorded both the trivial and the Nuremberg trials.

In theory the computer age offers historians the dream of infinite memory with permanent access. But as the speed of technological progress increases, so does the speed at which old technology is supplanted. And new media tend to be fragile. We can still read clay tablets from ancient Sumer and parchment manuscripts from medieval times. Renaissance paper is still readable while modern books with acidic paper turn to dust. Black-and-white photos may last a couple of centuries, but color becomes unstable in 30 to 40 years. Videotapes generally last twenty years, traditional movie film much longer. Digital storage tape is safe for about ten years and is much more an "all or nothing" matter. Analogue technology ages more gracefully -- old vinyl records may be faded and scratchy but still audible. With its precise mathematical coding digital technology generally either works perfectly or not at all. As larger amounts of data are stored in smaller spaces, the technology becomes more precise, complex and fragile.

With information overload, data fade and deterioration, and incompatibility all affecting what information will be retained, how does someone, or a society, decide? During the Vietnam War the Pentagon had hundreds of people shoot combat film. Much of it is of genuine historical interest, but it would take several lifetimes for a technician to copy or a researcher to study. The head of the Media Preservation lab believes that choices will be by default. What researchers happen to request, and what happens to be copied to new media for many reasons (such as lawsuits) will be preserved. This sorting out may resemble the process that determined what books now remain from antiquity. Homer and Virgil survived because of their popularity and multiple copies made at different times; others were studied by scholars who made a copy for their purposes. Some, though, including works of greatness, disappeared forever.

In evolutionary theory, developments that do not contribute to survival tend to get lost and leave fewer later traces than those that do contribute. Different ways of doing things, in other words, may compete for a while, but the ones that prove better suited to the particular environment are more likely to leave traces. (As a modern corollary, more information survives in VHS format than in Betamax.) While ease of use may cause information to be recorded in one form more frequently than another (writing on papyrus rather than inscribing on stone), long-run intelligibility for later generations depends on factors other than temporary ease of use. Information recording is different from information retention and maintenance. And as information recording becomes routine and institutionalized, it may lose its link with usefulness in determining what survives.

Stille's last chapter is entitled "Writing and the Creation of the Past," and it draws conclusions wisely avoided in earlier chapters. The thread Stille chooses as his point of focus is the effect of writing and information technologies on the recording of history and our views of the past. Writing, he says, created history as we know it. Writing conferred a seeming ability to stop time and preserve things for eternity. The documentation and record-keeping made possible by writing and then print facilitated chronological dating. Writing renewed interest in past texts, which print reinforced, while at the same time creating a more critical attitude toward tradition -- written texts could be more easily scrutinized and criticized. In this last chapter, though, Stille tends to ignore an important distinction -- between the past, what actually happened, and history, what is selected from the past for preservation and memory -- thus mistaking social narrative for the past.

*********************

On That Bumpy Road Through Time

Eviatar Zerubavel avoids this problem in Time Maps: Collective Memory and the Social Shape of the Past. Zerubavel considers the cognitive patterns we, as societies, use to organize thoughts of the past. The study of memory, he says, is quite distinct from a study of what actually happened in the past, but it is not a series of different perspectives, à la Rashomon, by individuals with their personal reconstructions of the past. Social memory is not a mere reproduction of objective facts, but it is not entirely subjective either. Zerubavel does not make clear exactly what he means by distinguishing social memory from what actually happened in the past, but it probably makes the most sense if we assume it corresponds to, or is at least based upon, the distinction in individual memory between the semantic and the episodic.

Being social, and identifying with any social group, Zerubavel says, involves an ability to experience events that happened to groups before we joined them, or before we even existed, as if those events were part of our own personal past. Groups try to acquaint members with their past, creating group memories and individual identifications with the group.

A community's collective memory, Zerubavel goes on, includes only those memories shared by its members as a group, invoking a common past which members seem to recall but of which some or all of them may have no personal recollection. They may, as on a holiday such as Good Friday, set aside a time when they recall together -- no animals other than humans have accomplished this synchronization. There are social norms of remembering that tell us, sometimes subtly, what should be remembered and what forgotten. Social remembering can be very formalized, as in history class, or informal, as when a young child returns from a shopping trip with a parent and hears the parent describe to others what happened during the day -- what was worth remembering and passing on.

It was language that freed human memory from being stored only in the brains of individuals. Language permits memories to be transmitted as disembodied impersonal recollections (to third persons or later generations) and writing and other records make it possible to bypass oral communication entirely. Paintings, statues, photos, and CDs, as well as calendars, tombstones, war memorials and museums, all transmit social memory.

We provide historical events with meaning by connecting them in storylike narratives: the foundation of Israel as a response to the Holocaust, for example. The plotlines are not objective recreations of actual events, nor are the visions universal (Palestinians view Israel differently). One such plotline is progress or development (as from savagery to civilization) and another is decline (a pessimistic view of deterioration from the higher standards of past times or forebears). Another is the zigzag, or rise-and-fall, narrative, involving a major turning point.

These are unilinear, successive narratives. Evolutionary narratives may be teleological stories of becoming, ladders where higher levels are built on lower. They may also be multilinear narratives, with dead ends rather than one level replacing another, as when Neanderthals are viewed as coexisting humans who died out -- more treelike than ladderlike. Narratives may be legato, flowing more or less seamlessly and emphasizing continuity, or staccato, where discrete episodes are separated by abrupt changes. The present and past cannot be entirely separated: patterns create traditions and the present is an accumulation of past residues.

Linear narratives, "uni" or "multi," assume time moving forward. Cyclical views see time as repeating itself. In recurrence narratives times (seasons, Sundays) come again. This may involve fusing separate but similar historical figures or events together -- viewing all popes or wars as essentially the same. Linear and cyclical narratives are not contradictory, and can co-exist in practice whatever problems they create in the abstract -- just as Schacter's generalization/stereotyping tendency exists alongside experiencing and integrating the ongoing and the novel.

Time (at least in history and semantic memory) has not only trajectories but also densities, mountains and valleys with long periods where little or nothing happens. When mathematicized, time is homogeneous -- every second like every other -- but in history and practice we divide it into "chunks," some of which we invest with more significance than others: overtime and Sundays are different from their more regular counterparts. Some centuries are viewed as essentially empty (the "Dark Ages"). In American history books the 1770s are considered eventful while the 1740s are much less so. National holidays worldwide, Zerubavel points out, tend to commemorate events that are either very old (the birth of Buddha) or within the last two hundred years.

Memories, depending on identity, may unite or divide: Texans remember the Alamo and the Irish Cromwell. Accelerating change may create a conservative or nostalgic impulse. How can identities persist? No cell in the human body was there forty years ago, and no Frenchman alive today participated in the French Revolution. Zerubavel says: "Continuous identities are thus products of a mental integration of otherwise disconnected points in time into a seemingly historical whole."

It is memory that makes the integration possible, and it frequently involves mental bridging to fill in the gaps. Constancy of place, buildings and architecture is one way to provide a sense of permanence. But relics and memorabilia, not tied to a particular place, do too. Replication or reproduction provides iconic connection, whether of objects or by ritual. Court dances, courtroom procedure, or military drills try to integrate into a collective past through imitation. Holidays and holydays connect then and now into the "same" time. There is a tendency to invoke the past as analogy. Appeasement is "another Munich" or the Iraq war "another Vietnam." Images of continuity are invoked by authority: Hitler's "Third Reich" as successor to former empires.

Ancestry and descent, linking through actual people, is another bridging technique. Founding Fathers, pedigrees, and bloodlines are used to establish legitimacy. We use generational links to establish closeness: only forty parent/child links will take us back to the Norman Conquest. There is intergenerational overlap (great grandchildren/grandparents co-exist) and this fosters the mental persistence of social entities. In families and nations changes in membership are usually gradual and thus imperceptible, so the continuity persists in memory. Organizations often try to keep turnover low to maintain continuity.

Common descent is an anchor in traditional forms of social solidarity. The further back we go, though, the more inclusive our identity. Both genetics and archeology indicate that all humans descended from a common ancestor -- human genomes are 99.9% identical -- and that race came later. Classifiers tend to "lump" or "split," that is, downplay or emphasize differences. Splitters, for example, would regard Neanderthals as a separate species from Cro-Magnons. Language, by giving different names, can reinforce splitting, but both lumping and splitting involve a choice between noticing similarities or differences, which may be a matter of social convention.

Historical discontinuity involves punctuation, that is, dividing the past into periods (Stone Age, Bronze Age), in a staccato narrative marked by watersheds. "Temporal discontinuity is a form of mental discontinuity, and the way we cut up the past is a manifestation of the way we cut up mental space in general." "Pre-Columbian" and post-1492 America mark the cultural difference between native and European influence, but of course 1493 was very much like 1491.

Classifying into periods involves both assimilation and differentiation. We normally allow perceived similarities within a cluster to outweigh its internal differences, while inflating the differences from outside clusters. This may be done with history and prehistory, or politically with "new beginnings." Political leaders may try to reset their mnemonic communities' historical chronometers at zero, as in the French or Russian revolutions. This may involve mnemonic obliteration of entire populations, as with pre-Columbian America.

As with statutes of limitation, we agree by social convention to put parts of the past "behind us." When any period of history begins is a matter of convention. Serbs and Albanians in Kosovo construct different narratives dating significant historical beginnings anywhere from 1690 to 1912, depending on which eras they wish to legitimize.

The difference between national or organizational "founding moments" and wedding anniversaries, says Zerubavel, is a matter of scale. Historical depth solidifies identity as well as legitimacy. Establishing deep pedigrees gives rise to claims of priority, as groups try to "out-past" one another, but that may not resolve issues (as with Palestinians in Israel or Native Americans in the U.S.). Each group will try to impose its own narrative as correct, but a view of multiple narratives may be necessary. This does not necessarily mean rejecting the veracity of what has been remembered, but rather recognizing that facts may be true yet selected from a particular perspective. Outright lies are not the only problem in much political discourse; so is the tendency to ignore narratives from perspectives other than one's own.

**********************

To summarize to this point: Individual memory became, through evolution and language, a combination of the semantic and the episodic. Social memory, in the form of group symbolic communication, achieved the potential to separate itself, through semantic generalization, from individual memory and create its own narratives, obtaining evolutionary advantage. For Zerubavel the process of history is linked to the process of formation of group identity, where groups create narratives of their own past, of which members become a part, feeling themselves to be part of, or a result of, events that happened in the past before they existed. But this idea of a group identity, and a collective memory based upon it, has serious limitations, as we shall see when considering Solomon Schimmel's views on forgiveness and reconciliation.

But I'll Always, Always Keep the Memory Of...

Solomon Schimmel's Wounds Not Healed By Time: The Power of Repentance and Forgiveness (Oxford University Press, 2002) is a thoughtful and promising book. It asks how inflicted hurts endure in memory over time, and how we can forgive them or become reconciled to them. Should Jews forgive Nazis? Other Germans? South African blacks their apartheid oppressors? Descendants of slaves the descendants of slaveowners? Using biblical and psychological insights Schimmel reviews the processes involved in interpersonal wrongs, forgiveness, and reconciliation. When he attempts to apply these insights to group and social interactions, though, and especially to historical wrongs, his analysis breaks down.

When Simon Wiesenthal was a prisoner in a Nazi camp, a dying SS officer confessed terrible crimes against Jews and asked Wiesenthal for forgiveness. Wiesenthal said he could not forgive, later agonized over whether he had acted properly, and asked others for their opinions. What was striking, says Schimmel, was the difference in response between the Jewish and Christian traditions. Jews tended to feel that only a victim can grant forgiveness, and that in the absence of repentance (including remorse, confession, apology, and reparation) there was no obligation to forgive. Christians tended to feel that a party other than the victim could and usually should forgive if there was true confession and remorse, even without reparation or apology to the victim.

Christians, Schimmel goes on, may carry the notion of forgiveness to a point others consider immoral, as with the nun who refuses to testify against her rapists, preventing prosecution. Citing biblical texts Schimmel shows that forgiveness and reconciliation have deep roots in Jewish tradition, but that justice and thus reparation play more significant roles. Remorse, confession, and apology are generally necessary to restore interpersonal bonds, but there are instances where this is impossible -- as when the perpetrator has died. Much of the book, generally its stronger sections, is devoted to interpersonal wrongs both minor and serious (thoughtless verbal slights, sexual abuse) and to inter-family harmony where people continue to interact.

Schimmel objects to the tendency toward a facile forgiveness that ignores justice and thus sweeps under the rug problems that fester, thereby continuing to hinder real reconciliation. But as he considers collective or group forgiveness his community-centered, perpetrator/victim view of forgiveness does not hold up, especially when time and later generations must be taken into account.

In the Old Testament there are references to the importance of forgiveness and reconciliation among community members (among whom one lives and generally comes in contact), with inferences that the obligations may not apply, or may apply to a lesser extent, to outsiders. Schimmel shows, however, that both Jewish and Christian traditions developed canons of forgiveness for outsiders, though with different emphases. The Christian view is an aspect of Christ's having died to forgive sins, which, when linked to the doctrine of original sin, implies both collective guilt and collective forgiveness existing alongside individual responsibility for transgressions. The Hebrew Bible and the rabbinic traditions also provide for collective responsibility -- members of the community of Israel are morally and spiritually responsible for one another -- but obligations to forgive outsiders are not as strong.

Schimmel recognizes the problems in holding all Germans, especially those of later generations, responsible for the Holocaust. But when he considers the problem of group wrongs against other groups, especially historical ones (reparations for slavery, for example) he loses sight of his focus on individual responsibility. He distinguishes offenses by individuals (which may affect third parties such as a victim's family members), offenses by current groups involved in violent conflict with other groups, and offenses committed by one's ancestors or by a member of a community with which one identifies. These last two categories, however, become increasingly irrelevant to perpetrator apology and reparation as time passes.

Schimmel frames the problem of historical reparations nicely. His parents came to America in the early twentieth century. He was born in 1941 and was not a participant, silent collaborator, or apathetic bystander in slavery. Schimmel does not accept that the sins of ancestors must be visited upon descendants, but notes that some parents who commit wrongs against members of other groups pass their prejudices on to their children. (Other parents, though, take pains to hide wrongdoing from their children, and insulate children from their transgressions.) Descendants of slaveowners may not only be unaware of a heritage they do not consider theirs, but may have been educated to genuinely oppose the institution. To suggest that they are somehow guilty will give legitimate offense.

Schimmel gives two reasons why considerations like these may not end the matter: 1) there is residual benefit to a group (that is, people living in the prosperous United States still benefit from earlier wealth-creating work of slaves), and 2) groups with whom persons identify (to which they "belong" in some sense) have done wrong. His statements: "Although I am not legally or morally culpable for the crime, I might be morally obligated to make some restitution to African Americans from whose oppression I ultimately benefited, albeit unwittingly and unintentionally." "To the extent that I see myself as a member of the American family, I cannot disassociate myself from the history of my family."

This is dangerous reasoning when placed in Schimmel's context of forgiveness. As he points out, there are reasons in justice why there are communal obligations to better the lives of those still suffering the effects of past wrongs. Still, there are difficulties in linking the obligation to particular individuals. While not a majority, many African Americans are prosperous. Of those who are poor, some are descendants of slaves and some are not. Worse yet, some black Americans were themselves slaveholders. On the other side of the racial coin, some whites remain poor. The better educated (and thus wealthier) tend to have less overt racial prejudice than many poor whites. Some whites are descendants of slaveholders, but others are descended from abolitionists or participants in the Underground Railroad. Finally, with the passing of generations there are descendants of slaves who are mostly white. How does one, over a century later, disentangle the effects of ongoing racial prejudice and stereotyping from past slaveholding practices?

Schimmel tends, as does Zerubavel, to use the concept of group identity to work around parts of his problem, but this has serious limitations. Jews may identify with Israel, and citizens with their countries. Muslims identify with Islam, and Christians with their churches. What is done in the name of a group, however, may be opposed by group members who are neither participants, silent collaborators, nor apathetic bystanders. Disapproval or opposition, depending on one's ability to influence events, may be silent or active. Not all Catholics approve of the Inquisition, nor all Muslims of terrorism, all Israelis of their country's treatment of Palestinians, or all U.S. citizens of their country's wars. Moreover, group identities may conflict -- religious affiliation may lead citizens to oppose or support national wars.

We all benefit in some way by being part of groups, some people more than others, and some groups more than others. When Schimmel points out that living in the U.S. confers benefits, though, it does not follow that all group members need to apologize or make reparations for past wrongs. Leaders who symbolize a collective may legitimately apologize for past wrongs on behalf of the collective (as when a U.S. president apologizes for slavery, or the Pope for the Inquisition), but this is an expression of a collective or historical memory that does not impose obligations on all current members of the group as it has evolved over time. The less a group consists of constituents living in a community with sustained interpersonal contact, the less there can be personal obligations for apology, reparation, and reconciliation.

Institutional wrongs, like institutional memories, are different from personal ones. We can legitimately say that an institution such as slavery was wrong, but this alone does not translate into wrongdoing by any presently existing individuals. When Zerubavel points out, then, that groups create their own history and identities, encouraging members to identify, it must be remembered that groups are partly mental creations, abstractions that may not, at any particular time, be undertaking any activity in concert. Belonging to a group has consequences, but individuals are not responsible for all that groups do.

From an evolutionary point of view, the advantage of groups was at least twofold: 1) the ability to act in concert, undertaking activities beyond the capacity of the individual, and 2) the ability to communicate, which permits an individual to take action based on information from others that he would not otherwise have had. There are limitations to group identity: people can belong to, and identify with, a group and still not be a part of, and can even oppose, some of its activities. Moreover, people can belong to more than one group whose activities conflict.

The concept of group identity, then, does not resolve the problem of social or collective memory -- it just becomes a part of it. Coordinated activity through groups may help enhance the power of individuals, but different groups can have conflicting purposes and conflicting memories, just as different individuals can. Moreover, groups, just as individuals, can create narratives that are false.

*******************

As in Orwell's 1984, social memory can eventually override the memory of the individuals who make up a collective, picking the facts and memories it needs to sustain its narrative. Social groups or collectives, official or not, can pick facts selectively, and can lie, making up facts that never happened. Group narratives tend to be biased by the need to sustain group purposes, and thus the power, authority, or privilege of those who benefit from group activities. An antidote to this tendency lies in the actual memories of the people, within and without the group, who are affected by group activity. Catherine Merridale provides an illustration.

The Way You Haunt My Dreams

Catherine Merridale's Night of Stone: Death and Memory in Twentieth Century Russia (Viking Penguin, New York, 2001) examines how twentieth century Russians coped with and remembered their experiences in an extraordinary century in which fifty million of their people died in war, revolution, famine, and political purges.

Merridale is concerned primarily with death, describing in vivid detail a century of Russian history where death was ever-present, whether out in the open, with corpses strewn about cities plagued by siege or famine, or hidden from view, but with whispers and rumors that instilled fear, in the mass graves of the Gulag.

The facts Merridale describes are too much to bear, and tend to become abstractions. Between 1.6 and 2 million Russians died in the Great War from 1914 to 1917. In the Civil War of 1918-1921, resulting in the triumph of the Bolsheviks, and the famine years immediately following, 9 to 14 million people died from war, disease, famine, or cold. About half the deaths were a direct result of war and accompanying epidemics before the end of 1920; the rest came from the famine of 1921-1922. The collectivization of peasant land and the Great Famine of 1929 to 1933 resulted in another 5 to 7 million deaths. The Great Patriotic War of 1941-1945 is estimated to have resulted in more than 25 million deaths, between 8 and 11 million battlefield deaths and the rest civilians. The size of Stalin's Gulag, dating from around 1929, is still unknown. Estimates range from 2 to 15 million.

After Stalin died the regime of terror and death was replaced by the era of "developed socialism," a time of submerged conflict where there were not so many deaths but the regime could not be forthcoming about what had happened in the past. This was an era of war commemoration and official celebrations of the glories of past wars and revolutions. But this was still within the living memory of people who had lived through the wars, revolutions, and famines, who had seen the dead bodies in their streets and fields, and remembered relatives, friends, and acquaintances who had died or disappeared. Merridale conducted interviews to determine what they would remember before history was fixed in stone.

Merridale's interviews show the interplay between individual and social memory. World War I was not part of the Bolshevik foundation myth -- it was the civil war afterwards that mattered. There was no Soviet monument to the First World War: official history had no reason to consider it as anything other than another imperial war consigned to the dustbin of history. Landmarks, buildings, streets and even fields that defined public space had been destroyed, and in the chaos of the civil war and the myth of the new Soviet future there was little reason to reconstruct. The amputated limbs, lost sight, or other injuries of war remained, but these evoked private memories without a public framework. There was no simple story of the war for those who had suffered so much, and the public framework into which stories of heroism had to fit was the Civil War and the triumph of Bolshevism.

When interviewees were asked to name Russia's most deadly twentieth-century wars, virtually no one remembered World War I -- it was the forgotten war. The death and violence of the civil war were more widespread -- its callousness and brutality affected a larger part of the population. People acted out of fear, under orders, in panic, anger, revenge, or ideological zeal. When it was over some continued to suffer from the trauma, but most were able to remake their lives and adjust to peace, domesticity, and a job. For most people even the extreme events of the civil war were assimilated, though in edited form. They remembered being cold and numb, but more disturbing memories were partially edited, distorted, and conformed to fit the official narrative and a manageable view of continuous living human experience.

A culture of censorship and secrecy explains some of what happened but this was only part of the story. A new mentality was being created by a new public language that described the world in ideological terms, using hyperbole and euphemism, diluting language of its existential significance and creating a chasm between the idealized public discourse and real lives of suffering and memory. The Bolsheviks would try to dispense with the afterlife of religion but they could not do without the afterlife of memory -- their revolutionary heroes had to be enshrined and memorialized.

The famine of 1921-22 could be a part of the revolutionary foundation myth, but the great famine of 1929-33 that followed the campaign for collectivization of the peasantry could not be. Stalin undertook the census of 1937, twenty years after the revolution, with great fanfare, but ended up having to suppress the results as a state secret. Too many people had died. Starvation in famine, however, is not a private matter, either in its suffering and disease before death or in the number of deaths that occur in public, often among a public too weak or with too few resources to remove the dead bodies. There had to be private memories of public starvation, death, and "dekulakization."

Those defined as "kulaks" were usually ordinary people who had hired labor. Frequently it was the weaker members of a community who needed help with the harvest who were categorized as "exploiters of the landless poor." Resistance often meant execution on the spot. Those who participated took refuge in official lies and distortions. Outside the famine areas there was willful evasion, with newspapers full of stories of Soviet success.

Those whose families suffered often later remembered the details matter-of-factly, a narrative of physical facts, but with their significance obscured by fifty years of active, socially conditioned forgetting. They had since made their lives as Soviet citizens. What they remembered of the collectivization and famine years were fragments saved in private memory, but with distorted chronologies and significance. They tended to view themselves as exceptions, with memories and subsequent lives driven by a survivor's need to justify, with a lingering guilt over having been part of a family defined as exploitative, or over adjusting to the society that destroyed that family -- it was hard to tell which.

The silence of the Gulag, however, often meant there was little to remember. People disappeared. Families of executed prisoners could be told they had been sentenced to "ten years hard labor without right of correspondence." Survivors might, after ten years, plan for an awaited return that never came. Burial sites and execution methods were secret yet vaguely known, creating a fear which could not focus. What is most notable for survivors and witnesses is that so many people (at least in Russia if not in other countries of the former Soviet Union) still believe in the guilt of the victims. The meaning of repressive politics -- the stories those in authority tell to justify their existence -- is not easily corrected among those who have adjusted their lives to conform to the narratives. More objective, though not necessarily more real, history comes later, when perpetrators and victims are gone, and the human suffering of real people can become abstractions, statistics, and numbers.

Memories of World War II, Russia's Great Patriotic War, are different from those of World War I. When people talk of the later war it has a public frame of reference: the German invasion, the siege of Leningrad, the defense of Moscow. These are stories of heroism. Veterans usually edit out the panic, noise, and boredom. Stalin himself distrusted public remembrances of war because the memory of fighting can be personally liberating, but the public consciousness, reinforced afterwards by state-led remembrances and memorials, fit those memories into an official story. Veterans remember the war nostalgically -- it was a time of certainty, when their lives had meaning and what they did contributed and mattered. They remember the heroism and tend to forget the brutality.

Soviet survivors of the Great Patriotic War see themselves as heroes, not victims. Postwar euphemisms and myths, patriotic kitsch, serve a purpose. Just as individuals tend to ignore the death and blood, replaying mental loops of heroism and sentiment to forge a manageable story, so it is with war poetry, movies, monuments, and other public means of war glorification. Societies use them to create a collective fantasy world of escape from the realities of war. In Russia these were mostly fantasies of survival and endurance. Some could not endure the trauma of the memories and were exiled to clinics for the mentally ill, but for most the collective glorification of the war, coupled with the enormous pressure to survive, to work and make the world again, made forgetting possible.

The culture of denial and nostalgic patriotism, with the Communist Party trying to substitute itself for Mother Russia, continued under post-Stalin "developed socialism." But the skeletons from mass graves eventually appeared and came to public attention. In the late 1980s the human rights group Memorial was formed to keep alive the memory of the victims of repression. Gorbachev's regime adopted a policy of glasnost (openness) in a failed attempt to confine the debate within the limits of single party rule. But the 1986 Chernobyl disaster had made it clear that the regime's problems were not all in the past. Memorial assembled testimonies and provided documentation of the victims of the Gulag. It also pushed for public memorials, a different process which requires public consensus and which, to some extent, diminishes individual responsibility for remembering, turning people into history.

An example of Merridale's observation occurred with the World Trade Center memorials. Public memorials at the state level, with some exceptions such as the Vietnam memorial in Washington, tend to turn the death and suffering of many varied individuals into an historical event into which they are subsumed. Those who remember victims personally may be at odds with officials who wish to emphasize collective heroism or public narrative. Memorials and monuments are ways of both remembering and forgetting; that is, they are selected narratives of past events from which individual details, usually of necessity, are left out. A friend of two victims who were on one of the World Trade Center planes noticed a memorial listing in which the husband was placed some distance from the wife, who used a different last name. "Odd," he said, "I knew no one closer during their lives, and now the memorial separates them."

The process of forgetting the dead is inescapable, but when it begins too soon, on false notes as if people never existed or disappeared suddenly for reasons that are unclear, the normal adjustment to death does not take place. History, or at least a manageable view of the past, is difficult in Russia. After three generations of distortion it is harder to fit memories into a public framework. Merridale observes:

Usable history usually emerges from competing forms of collective memory, and it comes well after the individual's sense of shock, of personal exceptionalism, has been exposed and shared. It is a process that is based on negotiation, on crosschecks, and on documents. Fifty years of censorship have delayed all this in Russia, and the usual structures of civil society, the ones that work to build remembrance.... have only just begun to form.

After being distorted for fifty years -- after, in other words, being converted to semantic memory that is partly false -- what of individual memory survives? Merridale suggests that people's memories, still extant in some form, must be shared with others to form a collective narrative that can compete with other collective narratives to form a history that is usable.

It is a common observation that victors and survivors write history. Narratives that cover up genocide or other crimes are used to sustain power or authority. Social narratives coming from official sources are likely to be phrased in terms of images and intended consequences designed to convince an audience. Unintended consequences that cannot be protected by veils of secrecy tend, even if predictable, to be consigned to categories such as "collateral damage." Once we talk of intended and unintended consequences, however, we are talking of more than memory -- we are talking about activities or events expected to have consequences in the future. Memory re-minds us of what is, and was, predictable.

**********************

CONCLUSION--THE WAY YOU CHANGED MY LIFE

The separation of social memory from the living memory of real people takes place not only because time passes and new generations emerge, and not only because communication comes to be abstracted in technologies such as writing, television, or computers, but because of language itself. When the mind notes consistencies and generalizes, it separates the general not only from the novel but also from the particular. Thus "China" can be both an imperial system going back thousands of years and a modern industrializing nation, with millions of constituent people in almost entirely different circumstances over space and time. Historical continuity can add authority and legitimacy, but historical priority does not necessarily mean staying power: witness American Indians.

Those who forget the past may be condemned to repeat it, but those who only remember it run the risk of becoming stuck in it. The persistence of memories can be a problem for societies as much as for individuals. As groups became more significant in human evolution, and came to battle more and more with other groups, individuals obtained evolutionary advantage by identifying with their particular groups. The Old Testament emphasized what was best for intragroup functioning -- individuals should strive to get along with others in their group, and try to forgive and reconcile. There was less evolutionary advantage to forgiving outsiders (and in fact it could be dangerous to the functioning of the group).

By the time of the Roman Empire, however, isolation in groups was becoming more difficult. The ideal of a universal order, imposed through force and language, was no longer unimaginable. As competing groups obtained sufficient organization, power, mobility and the ability to inflict significant damage on one another, the idea of just retribution for every wrong created significant risks for all societies. The New Testament idea of universal sin and universal forgiveness was necessary to ameliorate the destructive potential in the new social environment. There needed to be a way to reconcile with distant groups who had done wrong, and to deal with past wrongs that do not fade from memory. Thus when Christ died for sins, forgiveness was opened to all.

This view of sin and forgiveness preserved both responsibility for individual action and a way to acknowledge collective or institutional wrongs for which there may no longer be any individual persons to be held personally responsible. It was necessary for people in society to remember wars or slavery so that history would not repeat itself and bring them back, but it was also necessary to forget and move on so as to avoid the effects of persistence -- a debilitating process of recrimination, counter-recrimination, and justification of different views of respective histories that can never leave the past behind.

The group identities possible for the time of which Arsuaga writes, before settled agriculture, were very limited. Collective identities are less fixed now, and there are many different groups to which we belong and from which we derive benefit. We have little control over some, such as the families into which we are born, but others we choose to join and others we form on our own. In a globalizing world there are many collectivities with which to identify, and it is not an identity with any one group that is necessarily the most formative. Individual minds come together to regroup and to re-form societies.

The formation of societies is not simply a matter of competing narratives by competing groups. Groups and organizations and institutions create narratives to justify their existence, all of them selective, many of them untrue. Organized activity has consequences, some good some bad, for both constituents of a group and for outsiders. People, in groups and on their own, act and react to one another. The past leaves evidence -- fossils, artifacts, buildings, and communications -- that constitutes a kind of social memory and serves as a check for credibility. But it is the human mind that is projected in understanding the significance of the evidence, and that turns evidence into narrative.

We need individual memory, with all its biases and imperfections, to judge whether social narratives, and thus social memory, are true or not. Because memory is not just an objective recreation of the past, but projects into a future of potential consequences, we need to be reminded what actual consequences have been. It is by protecting and taking into account the integrity of the memories of individuals, in wide variety, so that the actual consequences of social activity are made known and kept in mind, that social memory comes to be formed in a way that maintains its credibility.