Content created: 2006-07-07
As a general term, the word "evolution" (from Latin evolvere, "to unroll") simply refers to gradual transformation over time. We say that Latin slowly evolved into the modern Romance languages, for example. Or that cell phone etiquette, Russian foreign policy, teen fashions, and the AIDS virus are rapidly evolving. We speak of a person's viewpoint evolving over the course of a lifetime.
Not all gradual changes are necessarily called evolution. We don't normally speak of a flower wilting, wallpaper fading, or a nail rusting as evolution. The concept refers to change, but not normally to change which is merely deterioration. Similarly we don't usually use the phrase for changes which are mere fluctuations, such as the rise and fall in the price of a stock.
This essay considers three kinds of evolution that have had theoretical significance in university pursuits: linguistic evolution (which forms an intellectual background for other models of evolution), biological evolution (where the idea has been most interesting), and cultural evolution (where the idea has never worked very well).
Methodical comparisons among European languages, and between these and Indian languages, resulted by the early 1800s in the creation of a formal model of the process of language change. The most frequently cited study is by Jacob Grimm —of Brothers Grimm fame— who in 1822 formulated "Grimm's Law," describing sound changes regularly occurring over time as the Germanic languages emerged from an earlier common ancestor: the P of Latin pater, for example, corresponds regularly to the F of English father. (The law was modified to something like its modern form by Karl Adolf Verner, who in 1875 demonstrated the interaction of Grimm's Law with regularities in syllable stress.)
Among other things, the concept of linguistic evolution over time builds on the fact that every language is spoken in a range of dialects peculiar to separate speech communities, and that these dialects tend to drift apart if they are not in constant contact with each other. Thus the consonant that the Romans wrote with the letter C was pronounced in Rome like our K. Before I or E, however, it gradually came to be pronounced like our "CH" in Italy and like our "S" in the region that became France. French and Italian words from the same Latin parent term, even if they are spelled identically, are therefore pronounced differently.
From examples such as this, European scholars in the XVIIth, XVIIIth, and XIXth centuries gradually developed a model of linguistic evolution which correctly predicted forms that had not yet actually been seen. This allowed them to "reconstruct" purely hypothetical parent languages for which there was no known written documentation. The expression "comparative method" was coined to refer to the inference of historical relationships among languages based on comparison of their living representatives.
The model of linguistic evolution assumed an artificially unitary and hypothetical "parent" language from which various "daughter" languages had historically derived through the operation of consistent changes in the pronunciation of words (or in other features) in different, more or less isolated, speech communities.
When two speech forms are mutually comprehensible, they are considered to be different dialects of the same language; when they are different enough not to be mutually comprehensible, they are considered to be different languages. (There are, of course, complications, but comprehensibility remains the usual way to think about the difference.) Any language at any time, being made up of multiple speech communities, is subject to having its dialects become separate enough to be languages in their own right. For example, even though Romanian, Spanish, and Catalan all derive from Latin, Spanish and Catalan have a nearer common ancestor not shared with Romanian.
The model accommodated this phenomenon by the idea of "branching," on the analogy of a tree or a genealogy: a parent language had several daughter languages, some of which in their turn had other daughter languages of their own, and so on. (In the model, these tend to be artificially represented as lacking internal variation, but of course it is internal variation that produces daughter languages, so it is oversimplifying to think of them as unitary.)
It was thought that underlying most European and northern Indian languages was an ultimate prehistoric tongue, Proto-Indo-European. Proto-Indo-European left no texts, but the model implies that it must have existed, because that is the most efficient way to explain the relations among its modern descendants. The study of historical relations among languages is today referred to as "historical linguistics."
In the world of XIXth-century scientific thought, the brilliant success of historical linguistics was only one sign that "evolution" was in the air, much encouraged by discoveries in geology showing that the earth was far older than had once been thought. Another field caught up in the idea was biology, where the rapidly progressing discovery of new life forms (including fossils and casts of extinct life forms) was challenging efforts to understand their interrelationships.
It had long been clear that there were closer and more distant relations among animals: a deer and an antelope just have to be more closely related than either is to a toad. On the analogy of languages, one can imagine a great branching chart by which all modern animals are ultimately related to each other, with hypothetical ancestors and hypothetical branching linking the modern living forms. And just as languages no longer spoken sometimes survive in written texts, extinct plants and animals sometimes survive as fossils, allowing for periodic confirmation of predictions.
Many scholars were involved with the exciting project of trying to develop a better understanding of the relations among living things. The name most often associated with the application of the model of ancestral forms and branchings is of course Charles Darwin, although Darwin did not satisfactorily explain everything involved or all of the implications. For one thing, he lacked a modern understanding of genetics. For another, he had very little fossil evidence showing what extinct life forms looked like.
Biological evolution as it is understood today involves populations of plants or animals with slight genetic variation from one individual to the next. Genetic variation in the population can increase through occasional, non-lethal genetic mutations. And it naturally decreases with lethal genetic mutations (birth defects) or when any individual dies.
An individual organism (or a population) that produces offspring is referred to as "successful." One that does not is "unsuccessful." The more surviving offspring, the more "successful" it is said to be. (Successful at what? At reproducing. That is the ONLY meaning of "success" in this technical application of the word.)
Biological evolution consists of a heritable change in the genetic make-up of a genetically diverse population. A population containing little or no genetic diversity has little capacity to evolve. An example is sorghum, discussed in the Neolithic essay of this web site. Two processes are responsible.
Over time, a single initial population (timber wolves —Canis lupus— for example), subjected to different environmental constraints (including deliberate manipulation by humans), can produce a wide range of very different populations (varieties of domestic dogs —Canis lupus familiaris).
None of this was particularly new. People had been breeding dogs and cultivating varieties of corn and cabbage and tulips for centuries. What was new was proposing that the process was evolutionary (that is, that it eventually produced permanent differences), and that it was an inevitable characteristic of all living forms, including ourselves. (This led to opposition from religious "creationists.")
Something else was new: In this model, evolution is held to be the origin of new species out of increasingly separate biological varieties, just as linguistic evolution posits the gradual development of new languages out of increasingly separate dialects.
The working definition of a species in biology is a plant or animal form in which male and female members can mate to produce fertile offspring. Pigs and goats are different species because they cannot be crossed. But Irish setters and German shepherds are two variants of the same species, since they can produce fertile puppies.
The model of biological evolution holds that if a population is divided into two non-interacting sub-populations in different environments, then the differences in the genotypes in the two populations will eventually become so great that members of the first population will not be able to mate and produce fertile offspring with members of the second population. That is, that the two populations will have become two species.
Darwin proposed that over the vast reaches of time available, this happened over and over, eventually producing the range of modern (and extinct) life forms.
The logic of the argument is quite similar to the logic that guided the model of linguistic evolution.
The idea that changes in the composition of a population can be attributed to changes in its environment has been extremely useful. With this perspective, we can coherently account for the distribution of forms in the fossil record, for example.
However, great interest also lies in species-formation (speciation), which occurs when populations that for some reason cease to interbreed become different enough to lose all ability to interbreed. (In actual life, the boundaries between species are not always absolute —varieties only gradually evolve into species, after all— and it has been found that occasional fertile crosses between closely similar species do occur, if rarely. The rule-of-thumb of fertile offspring as defining species remains the way most biologists think of it, but the possible exceptions have generated complexities in classification and in modeling gene flow.)
A population split in half by an uncrossable barrier can over time develop substantial differences between its two parts. The most famous example of this process, called "allopatric" —"different country"— speciation, is the squirrels of the Grand Canyon, which have diverged conspicuously on the two sides of the Canyon (whether the two populations now count as separate species or merely as well-marked varieties is itself a matter of debate).
But when the genotype of a single, inbreeding population changes gradually over time, at what point should we say it has turned into a new species? Our "rule of thumb" about producing fertile offspring can hardly be applied to two different points in the history of the same population. And yet if the differences between early and late specimens are substantial, can we be confident in calling them the same species?
As an intellectual matter, it is difficult to know when one continuously changing population should be regarded as a different species from its remote ancestors. As a practical matter, it is also difficult to know how specimens available for study are actually related to each other anyway. If two fossil snake teeth look similar, but neither looks exactly like any modern snake, were they separate species, or were they merely two slightly different variants of the same species? And is either of them ancestral to any modern snake?
Many specialists in prehistoric life —paleontologists— tend to assume that speciation occurs relatively easily when plant or animal communities are separated, and they argue that two specimens should by default be assumed to be different species until they can be convincingly shown to be the same (which is hard to demonstrate with extinct forms).
Others argue that speciation is in fact relatively difficult. Barring intervention with atomic bombardment, we have never succeeded in changing a genotype enough to make a whole new species through selective breeding alone. (That is why all dogs are still the same species.) On this view, the best assumption when looking at prehistoric forms is that two similar specimens belong to the same species until they are shown to differ from each other more than the members of a modern population of similar animals do, if such a population exists.
All of this becomes particularly controversial in the case of proto-humans. Darwin did not take up the evolution of humans in On the Origin of Species by Means of Natural Selection, published in 1859; indeed, only one clearly pre-human hominid fossil had been discovered by that time. Since humans look a lot like modern Great Apes, it seemed logical that we should have a common ancestor with them, but that was about as much as one could say.
In some respects, the exact ways in which we draw the lines among species may not make much difference. The more we learn about genetics, the more obvious it becomes that what matters most is not the label given to a form, but our understanding of the over-all process of evolutionary transformation that is continually going on. Geneticists are constantly refining ever more detailed "family trees" of biological forms based on the inheritance of specific gene mutations. In these family trees, it makes no significant difference where one species stops and the next begins. What matters is where mutations occur that are inherited by future generations.
Since Darwin's time, many fossil forms have turned up that look more like us than they (or we) look like modern apes. It is impossible to demonstrate that any given prehistoric bone is ancestral to any modern creature, of course, but there is a clear sequence of forms: Earlier ones tend to look more like the hypothetical common ancestor that we might share with modern apes; more recent ones look more and more like us.
Chancy as biological classification is with the incomplete fragments of extinct animals, virtually all biologists today are satisfied with the general picture: humans, like other living species, are descended from earlier and quite different forms.
Some fossil forms are objects of great controversy. For example, the Neanderthals that roamed Europe in the last ice age and before are clearly Homo, but it has never been entirely clear whether they were a separate species or should be classed as Homo sapiens.
If Neanderthals count as Homo sapiens, then they belong to the sub-species "Homo sapiens neanderthalensis" and we belong to the sub-species "Homo sapiens sapiens." If not, then they are "Homo neanderthalensis," and we are simply "Homo sapiens," since there is no other Homo sapiens variety with which we need to be contrasted. In a logically similar way, Homo floresiensis forms have an unclear relationship to Homo erectus, with which they overlapped in time. And so on.
The implications go beyond body form alone. Many of the capacities of modern humans are not broadly shared with other animals, and we see them appear, one by one, in the succession of earlier human-like forms.
If we assume that we are descended from these species or still undiscovered species like them, then many of our abilities are explicable as part of that heritage. Most specialists therefore consider the study of prehistoric hominids (and, for that matter, the study of contemporary apes) to be a window on what is "hard-wired" in the human condition.
The term "cultural evolution" is constantly being reinvented to describe the gradual changes in human customs (or sometimes artifacts). Not surprisingly, in the second half of the XIXth century, when biological evolution was very much the topic of the day, scholars in several countries proposed schemes to arrange the record of human variation into an "evolutionary" sequence. Attempts to develop a general model of cultural evolution, however, have repeatedly failed to produce broadly accepted results.
The underlying assumption was that somehow culture —a society's shared understandings— must work the same way biology does, but the analogy was false, since biological evolution takes place through genes, while culture has nothing to do with biological genes and lacks any obviously comparable unit.
The analogy between biological and linguistic evolution works better, at least if one considers the language as a system to be what evolves, rather than the people who speak it. Just as species boundaries in biology create perpetually different forms of plants and animals, intelligibility boundaries create enduringly different forms of languages. (But the important difference remains that an individual speaker can learn a radically different language, or even several, but cannot assume a radically different biological form, let alone more than one.)
Barriers to interaction similar to intelligibility do not seem to exist in non-language aspects of culture. It is true that isolated populations develop different traditions, and some have argued that the "genes" of cultural evolution are customs, which, like biological genes, can store behavioral information, are subject to mutation through transmission errors, and may be adaptive to an environment. But the analogy inevitably collapses because of the ease with which knowledge spreads from one person to another, while each individual's biological genetic makeup is fixed. "Speciation," which is so central to the discussion of biological and linguistic evolution, cannot be directly relevant to cultural evolution.
That said, human culture obviously does undergo gradual transformations over time, and even if cultural evolution can't be modelled the way biological evolution is, it is perfectly reasonable to speak of "evolutionary trends" in everything from etiquette to theatrical lighting, from food preferences to marriage practices.
Furthermore, "mutations" do occur as cultural knowledge is imperfectly transmitted, occasionally producing accidentally brilliant innovations.
And environmental constraints (or changes in environmental constraints) can make some customs useful or counterproductive in ways that affect the ability of a sociocultural system to survive and prosper.
In other words, there are some processes that encourage us to continue thinking about cultural change in evolutionary terms, even if we don't have units like genes or species to clarify things. Useful studies have been devoted to how societies of various scales (roughly, population sizes) are integrated, how empires rise and fall, how trade networks come into existence, and so on. And in particular it is possible to see a regular set of significant differences between small-scale societies and large-scale ones.
The role of culture as an adaptation to an ever-changing environment is a crucial part of understanding what culture does. Most of the challenges of the modern world involve our relationship to our social, biological, and geophysical environments. As the prime human means of adapting to these environments, culture has the central role in meeting those challenges. It is obviously critical to understand how culture works. How much an evolutionary model will help us understand culture is still unclear.
Something which is clear, however, is that our culture does affect how biologically "successful" we are, both as individuals and as groups. That means that culture needs to be part of our model of biological evolution. And as we seek to work this out, the complexities, of course, multiply.